Human Generated Data

Title

Untitled (llama jumping over a seated camel)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7807

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Chair 95.6
Furniture 95.6
Clothing 89.6
Apparel 89.6
Person 88.6
Person 88.1
Horse 86.3
Animal 86.3
Mammal 86.3
Wood 83.5
Horse 80.5
Table 79
Meal 77.7
Food 77.7
Plywood 70.1
Person 65.6
People 64.5
Leisure Activities 64
Face 61.5
Person 60.4
Dining Table 58.3
Shorts 57.4
Person 56.4
Female 55.8
Person 48.1
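
The labels above follow the output shape of AWS Rekognition's DetectLabels API, which returns label names paired with confidence scores. A minimal sketch of the kind of call that could produce such tags, using boto3; the file name and region are placeholders for illustration, not part of the record:

    import boto3

    # Rekognition client; the region is an assumption for illustration.
    client = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph as raw bytes (placeholder file name).
    with open("steinmetz_llama_camel.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns entries like "Person 99.2" above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=45,  # the lowest label in this record is 48.1
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')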

Clarifai
created on 2023-10-26

cavalry 99.9
people 99.8
seated 98.5
group together 98.2
man 97.3
group 97.2
monochrome 96.9
adult 95.9
many 94.9
mammal 94.9
chair 93.6
recreation 93.1
three 92.4
camel 92.3
seat 92.2
two 91.9
woman 91.6
art 91.2
racehorse 90.3
child 89.6

Imagga
created on 2022-01-09

world 20.6
people 19
man 18.1
person 17.2
newspaper 17.1
musical instrument 16.3
black 14.4
sky 14
male 13.5
product 13.3
construction 12.8
percussion instrument 12.7
structure 12.3
industry 12
work 11.9
silhouette 11.6
billboard 11.1
travel 10.6
old 10.4
men 10.3
creation 10.3
architecture 10.2
water 10
stringed instrument 9.8
building 9.8
business 9.7
sitting 9.4
device 9.4
adult 9.1
signboard 9
day 8.6
outdoor 8.4
power 8.4
industrial 8.2
worker 8.1
lady 8.1
machine 8.1
sunset 8.1
light 8
night 8
working 8
businessman 7.9
love 7.9
couple 7.8
gas 7.7
city 7.5
technology 7.4
equipment 7.4
office 7.3
statue 7.3
grand piano 7.3
color 7.2
landmark 7.2
religion 7.2

Microsoft
created on 2022-01-09

text 100
horse 88.3
black and white 81.7
old 41.1

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 82.5%
Confused 47.2%
Disgusted 27.3%
Sad 8.5%
Happy 6.2%
Calm 4.4%
Surprised 2.5%
Angry 2.3%
Fear 1.5%
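
The age range, gender, and per-emotion percentages above match Rekognition's DetectFaces response when all facial attributes are requested. A hedged sketch, reusing the client and image bytes from the DetectLabels example:

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotion types arrive as e.g. "CONFUSED" with a confidence score.
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')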

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
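
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision client; the file name is again a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("steinmetz_llama_camel.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # face_detection yields likelihood enums for surprise, anger, sorrow,
    # joy, headwear, and blur, as listed above.
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)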

Feature analysis

Amazon

Person 99.2%
Horse 86.3%

Text analysis

Amazon

41264
9
MOON
PISA
AABAYS MOON
AABAYS
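
The strings above are raw OCR detections of text in the photograph, misreads included, in the shape returned by Rekognition's DetectText. A sketch, again reusing the client and image bytes from the DetectLabels example:

    # DetectText returns both full lines and individual words, which is
    # why "AABAYS MOON" and "AABAYS" can both appear.
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        print(detection["DetectedText"])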

Google

1
カ9て1カ
9