Human Generated Data

Title

Untitled (two men reading on train while seated on couch)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8190

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Clothing 80.9
Apparel 80.9
Person 76.3
Hat 74.1
Home Decor 70.8
Furniture 67.5
Face 62.3
Door 57.4
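
The Amazon tags above are flat label/confidence pairs. As a minimal sketch of how such pairs might be consumed downstream (the threshold and list structure are illustrative assumptions, not part of this record), one could filter for distinct high-confidence labels:

```python
# Machine-generated tags as (label, confidence) pairs, copied from the record above.
amazon_tags = [
    ("Person", 98.8), ("Human", 98.8), ("Clothing", 80.9), ("Apparel", 80.9),
    ("Person", 76.3), ("Hat", 74.1), ("Home Decor", 70.8), ("Furniture", 67.5),
    ("Face", 62.3), ("Door", 57.4),
]

def confident_labels(tags, threshold=75.0):
    """Return the distinct labels whose confidence meets the threshold, in order."""
    seen = []
    for label, confidence in tags:
        if confidence >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))  # → ['Person', 'Human', 'Clothing', 'Apparel']
```

Note that "Person" appears twice in the raw output (two detected people, one per detection); deduplicating keeps only the first, highest-confidence occurrence.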

Clarifai
created on 2023-10-25

people 99.9
adult 98.3
monochrome 98
two 97.1
man 96.9
three 96.2
group 95.3
medical practitioner 92.2
wear 92.1
administration 91.8
veil 90
one 88.3
group together 88.1
vehicle 88.1
portrait 86.9
hospital 86.2
room 85.8
four 84
woman 83.6
street 82

Imagga
created on 2022-01-08

surgeon 46.9
hospital 45
man 39.6
patient 28.3
person 28.1
male 26.9
people 26.2
medical 22.9
work 21.2
home 20.7
room 20.4
adult 20.2
passenger 19.5
health 19.4
doctor 17.9
worker 17.8
medicine 17.6
equipment 17.3
business 16.4
men 16.3
professional 16.3
illness 16.2
working 15.9
indoors 15.8
shop 15.7
barbershop 15.7
office 14.7
computer 14.4
inside 13.8
to 12.4
job 12.4
care 11.5
hairdresser 11.3
happy 11.3
smiling 10.8
uniform 10.8
surgery 10.7
nurse 10.7
sick 10.6
technology 10.4
sitting 10.3
lifestyle 10.1
treatment 10.1
occupation 10.1
interior 9.7
businessman 9.7
disease 9.7
clinic 9.7
chair 9.6
bed 9.5
mercantile establishment 9.5
industry 9.4
senior 9.4
home appliance 9.3
newspaper 9.3
communication 9.2
horizontal 9.2
holding 9.1
case 9
operation 8.9
looking 8.8
couple 8.7
emergency 8.7
profession 8.6
pain 8.6
face 8.5
monitor 8.5
portrait 8.4
color 8.3
phone 8.3
back 8.3
indoor 8.2
women 7.9
surgical 7.9
mask 7.8
microwave 7.8
desk 7.8
modern 7.7
exam 7.7
talking 7.6
college 7.6
adults 7.6
appliance 7.5
device 7.5
refrigerator 7.4
cheerful 7.3
laptop 7.3
industrial 7.3
family 7.1
happiness 7
oxygen mask 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 94.6
clothing 94.3
text 91.1
man 78.7
black and white 61.4
human face 57.9

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.2%
Calm 89.3%
Sad 7.8%
Confused 1%
Surprised 0.6%
Angry 0.3%
Disgusted 0.3%
Fear 0.3%
Happy 0.3%
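
The emotion scores above form a confidence distribution over categories. A short sketch (the dictionary is copied from the record; the selection logic is an illustrative assumption about how such output might be summarized) of picking the dominant emotion:

```python
# Emotion confidences from the AWS Rekognition face analysis above.
emotions = {
    "Calm": 89.3, "Sad": 7.8, "Confused": 1.0, "Surprised": 0.6,
    "Angry": 0.3, "Disgusted": 0.3, "Fear": 0.3, "Happy": 0.3,
}

# The dominant emotion is the highest-confidence entry.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```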

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Hat 74.1%

Captions

Microsoft
created on 2022-01-08

a man sitting in front of a window 62.9%
a man sitting next to a window 58.3%

Text analysis

Amazon

5154
CCAG
CCAG nachanso
nachanso

Google

5154
5154