Human Generated Data

Title

Untitled (three adults with plate of food sitting on floor with dog)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11656

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.2
Person 99.2
Person 99
Person 98.1
Apparel 97.9
Clothing 97.9
Person 97.6
Sitting 94.3
Furniture 90.4
Couch 89
Chair 81.1
Female 67
Text 64.9
Flooring 63.2
Person 62.6
Fashion 61.7
Gown 61.7
Robe 60.5
Evening Dress 55.8
Musician 55.1
Musical Instrument 55.1

Imagga
created on 2022-01-15

person 40.9
man 40.3
laptop 37.4
computer 36.6
people 33.5
office 31.5
working 30.1
work 29.8
male 29.8
business 29.2
patient 27.2
job 23.9
indoors 22
sitting 21.5
businessman 20.3
barbershop 20.2
adult 20
professional 19.3
case 19
shop 18.9
corporate 18.9
executive 18.6
sick person 18.4
home 18.4
worker 17.9
chair 17.7
men 17.2
notebook 17
communication 16
meeting 15.1
table 14.4
teacher 14.2
technology 14.1
hairdresser 13.8
occupation 13.8
smiling 13.7
lifestyle 13
room 12.9
mercantile establishment 12.8
businesswoman 12.7
happy 12.5
passenger 12.4
equipment 12
women 11.9
desk 11.6
busy 11.6
barber chair 11.3
looking 11.2
teamwork 11.1
portrait 11
seat 10.7
keyboard 10.3
senior 10.3
casual 10.2
phone 10.1
place of business 10
interior 9.7
together 9.6
education 9.5
businesspeople 9.5
two 9.3
manager 9.3
presentation 9.3
indoor 9.1
monitor 8.9
machine 8.9
employee 8.7
reading 8.6
adults 8.5
hand 8.4
confident 8.2
device 8.1
success 8
businessmen 7.8
modern 7.7
30s 7.7
industry 7.7
workplace 7.6
talking 7.6
student 7.6
house 7.5
hospital 7.5
holding 7.4
alone 7.3
group 7.3
smile 7.1
face 7.1
furniture 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

black and white 97.5
street 94.6
person 85.8
monochrome 84.2
text 84.2
clothing 62.6
dog 53.8

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 97.5%
Calm 44.2%
Happy 39.4%
Sad 8.4%
Surprised 4.3%
Disgusted 1.7%
Angry 1%
Fear 0.5%
Confused 0.5%

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

8
7794.
M 117
9994
M 117 YT37A2 А7АА
YT37A2
А7АА

Google

MI
A3A
7794. 7714 MI YT33A2 A3A
7794.
7714
YT33A2