Human Generated Data

Title

Untitled (two men holding hats behind a desk with hats)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8273

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 98.6
Helmet 98.5
Clothing 98.5
Apparel 98.5
Chef 93.4
Meal 59.4
Food 59.4
Person 54.7

Clarifai
created on 2023-10-25

people 99.4
science 98.5
scientist 98.1
group 97.9
monochrome 96.8
adult 95.7
medicine 94.8
medical practitioner 92.7
education 90.5
man 90
war 89.9
laboratory 89.5
indoors 89.2
furniture 89.1
three 88.9
research 88.5
room 88.3
group together 88
biology 87.6
container 87.5

Imagga
created on 2022-01-08

case 28.1
man 23.5
shop 20.9
home 19.9
people 19.5
adult 19.5
person 18.2
male 16.4
barbershop 15.9
old 15.3
mercantile establishment 15.1
kitchen 14.3
interior 14.1
indoors 14
musical instrument 13.7
happy 12.5
portrait 12.3
room 12.1
house 11.7
lifestyle 11.5
senior 11.2
men 11.1
vintage 10.7
family 10.7
couple 10.4
business 10.3
sitting 10.3
black 10.2
place of business 10.1
holding 9.9
cheerful 9.7
banjo 9.1
retro 9
worker 8.2
women 7.9
chair 7.9
table 7.9
smile 7.8
happiness 7.8
mother 7.8
face 7.8
gift 7.7
fashion 7.5
human 7.5
stringed instrument 7.5
kin 7.4
playing 7.3
aged 7.2
decoration 7.2
smiling 7.2
looking 7.2
antique 7.1
window 7.1
child 7.1
businessman 7.1
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.7
indoor 94.7
person 94.4
bottle 94
black and white 86.6
tableware 85.1
clothing 83.8
vase 75.9
man 63.4
monochrome 56.2
table 34

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-51
Gender Male, 57.2%
Calm 59%
Happy 34.7%
Surprised 1.9%
Sad 1.8%
Fear 1.2%
Disgusted 0.6%
Confused 0.6%
Angry 0.3%

AWS Rekognition

Age 24-34
Gender Male, 98.3%
Calm 86.1%
Surprised 5.7%
Sad 4.8%
Confused 2.1%
Happy 0.5%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%

Feature analysis

Amazon

Person 99.3%
Helmet 98.5%

Text analysis

Amazon

9323
9323.
MJI3
MJI3 AZOA
AZOA

Google

9323. 9323.
9323.