Human Generated Data

Title

Untitled (llama jumping over camel)

Date

1956

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8891

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 95.8
Horse 89
Mammal 89
Animal 89
Furniture 76.9
Chair 76.9
Horse 73.1
Clothing 68.5
Apparel 68.5
Couch 67.7
Person 65.6
Person 61.8
Wood 61.6
Horse 61.2
Person 60.7
Shorts 57.8
Meal 57.7
Food 57.7
Female 57.7
Outdoors 56.2
Person 44.4
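
These label names and confidence scores have the shape of AWS Rekognition DetectLabels output (the heading above names Amazon, and the face analysis below is explicitly AWS Rekognition). A minimal sketch of how such tags could be reproduced with boto3; the file name is illustrative, and submitting raw bytes rather than an S3 object is an assumption, not from the source.

    import boto3

    def detect_tags(image_path, min_confidence=40.0):
        """Return (label, confidence) pairs from AWS Rekognition DetectLabels."""
        client = boto3.client("rekognition")  # assumes AWS credentials are configured
        with open(image_path, "rb") as f:     # "steinmetz_llama.jpg" is a hypothetical name
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MinConfidence=min_confidence,
            )
        return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

    for name, confidence in detect_tags("steinmetz_llama.jpg"):
        print(f"{name} {confidence:.1f}")

Rekognition reports broad parent labels (Mammal, Animal) alongside specific ones (Horse), and the repeated Person entries with differing confidences likely correspond to individual detected instances.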

Clarifai
created on 2023-10-26

cavalry 99.7
people 99.4
monochrome 97.4
group 95.1
many 95
adult 94.1
art 94.1
man 94
mammal 93.8
seated 92.2
carriage 91.5
group together 91.2
transportation system 88.5
camel 88.3
illustration 87.9
chair 87.8
wagon 86.8
woman 83.7
street 83.3
vehicle 81.3
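
Clarifai serves its taggers through a predict endpoint. A sketch assuming the v2 REST API and a general-recognition model; the model ID, API key, and image URL below are placeholders, not taken from the source.

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder credential
    MODEL_ID = "general-image-recognition"       # assumed general tagging model
    IMAGE_URL = "https://example.org/image.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()
    # Clarifai scores concepts in 0-1; scale by 100 to match the listing above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")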

Imagga
created on 2022-01-15

camel 54
horse 26.9
ungulate 23
sky 23
billboard 22.6
sunset 18
structure 17.6
signboard 17.5
travel 16.2
landscape 15.6
saddle 15
silhouette 14.9
city 14.1
sun 13.7
tourism 13.2
architecture 12.6
stock saddle 12
sunrise 11.2
old 11.1
clouds 11
vacation 10.6
people 10.6
statue 10.6
truck 10.4
garbage truck 10.3
equestrian 9.8
summer 9.6
sea 9.4
stable gear 9.3
seat 9.3
outdoor 9.2
building 9.1
saddle blanket 9
sand 8.8
riding 8.8
equipment 8.7
rock 8.7
water 8.7
sculpture 8.6
dusk 8.6
farm 8
night 8
rural 7.9
urban 7.9
person 7.7
construction 7.7
industry 7.7
beach 7.6
evening 7.5
desert 7.5
ocean 7.5
bridle 7.4
man 7.4
brown 7.4
light 7.4
historic 7.3
protection 7.3
industrial 7.3
cowboy 7.2
transportation 7.2
history 7.2
country 7
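
Imagga's tags come from its REST tagging endpoint, authenticated with HTTP Basic credentials, and its confidences are already on a 0-100 scale. A sketch assuming the v2 /tags endpoint; the key, secret, and image URL are placeholders.

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"                  # placeholder credentials
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/image.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    )
    response.raise_for_status()
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")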

Microsoft
created on 2022-01-15

text 99.9
statue 91.2
black and white 83.5
horse 67.6
sky 61.7
old 53.4

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 97.3%
Happy 90.8%
Fear 4.1%
Calm 1.7%
Surprised 1.5%
Sad 0.7%
Angry 0.5%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 22-30
Gender Female, 96.1%
Calm 46.8%
Happy 34.8%
Angry 4.4%
Sad 4.3%
Fear 4.1%
Disgusted 2.9%
Confused 1.6%
Surprised 1.1%

AWS Rekognition

Age 23-33
Gender Male, 88.1%
Calm 65%
Sad 16.4%
Happy 7.7%
Fear 3%
Confused 2.4%
Angry 2.4%
Disgusted 1.6%
Surprised 1.5%

AWS Rekognition

Age 14-22
Gender Male, 99.6%
Calm 75.7%
Fear 9.5%
Sad 8.2%
Angry 4.5%
Disgusted 0.7%
Confused 0.6%
Happy 0.5%
Surprised 0.3%
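
Each block above mirrors one entry in AWS Rekognition DetectFaces output: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A sketch of reading those fields back with boto3; the file name is illustrative.

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_llama.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")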

Feature analysis

Amazon

Person 99.4%
Horse 89%

Captions

Microsoft
created on 2022-01-15

a vintage photo of a person 71.4%
a vintage photo of a person 69.2%
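
The Microsoft captions are consistent with the Azure Computer Vision Describe operation, which returns ranked caption candidates with confidences in 0-1. A sketch assuming the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.org/image.jpg"                     # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 2},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")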

Text analysis

Amazon

10
FILM
KODAK
SAFETY
41263
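
The detected strings ("KODAK SAFETY FILM" plus a frame number) are most likely edge markings printed on the film stock itself. They match the shape of AWS Rekognition DetectText output; a sketch, again with an illustrative file name.

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_llama.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_text(Image={"Bytes": f.read()})

    # DetectText returns both whole LINEs and individual WORDs; the listing
    # above resembles word-level detections.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])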

Google

4-1263
4-1263