Human Generated Data

Title

Untitled (two cheetahs on a see-saw)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4750

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 88.6
Wheel 86.5
Machine 86.5
People 71.2
Leisure Activities 65.2
Prison 55.9

Clarifai
created on 2023-10-26

people 99.8
group together 97.9
two 97.2
man 97
adult 96.9
group 96.1
recreation 95.4
three 93.5
woman 91.2
one 89.5
print 87.7
administration 87.6
child 87.1
web 85.4
fence 84.4
several 83.3
wear 82.6
home 81.4
sports equipment 81.3
chair 79.8

Imagga
created on 2022-01-23

shopping cart 80.5
handcart 61.3
wheeled vehicle 46.5
container 33.5
sketch 23.6
drawing 23.3
conveyance 20.5
cart 16.8
shopping 16.5
building 15.9
architecture 13.4
snow 13.2
buy 13.1
empty 12.9
metal 12.9
urban 12.2
business 12.1
construction 12
grunge 11.9
house 11.7
city 11.6
shop 11.6
old 11.1
representation 11.1
winter 11.1
supermarket 10.5
sky 10.2
dishwasher 9.9
trolley 9.9
market 9.8
basket 9.7
black 9.6
retail 9.5
wall 9.4
equipment 9.4
sale 9.2
inside 9.2
transport 9.1
structure 8.8
store 8.5
design 8.4
white goods 8.2
river 8
interior 8
people 7.8
male 7.8
table 7.8
cold 7.7
travel 7.7
modern 7.7
trade 7.6
chairlift 7.5
landscape 7.4
style 7.4
man 7.4
exterior 7.4
home 7.2
steel 7.1
chair 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

playground 94.6
outdoor 90.6
black and white 87.9
text 78.8
person 75.2
street 52.8

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 41.7%
Confused 31.7%
Angry 17.6%
Sad 3.2%
Disgusted 2.2%
Surprised 1.4%
Happy 1.3%
Fear 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Person 88.6%
Wheel 86.5%
