Human Generated Data

Title

Untitled (two girls making cookies at kitchen table)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8160

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.3
Person 99.3
Person 97.5
Meal 92
Food 92
Dish 80.3
Restaurant 77.4
Sitting 76.5
Electronics 73.8
Screen 73.8
Furniture 73.7
Display 72.2
Monitor 72.2
Cafeteria 68.7
Table 68.1
LCD Screen 66.5
Couch 66.1
Finger 58.4

Imagga
created on 2022-01-08

stove 77.2
equipment 32.1
computer 28.3
table 25.9
mixer 23.9
disk jockey 22.7
technology 22.3
office 20.5
interior 20.3
laptop 20.2
work 19.6
business 18.8
electronic equipment 18.3
broadcaster 18.2
modern 17.5
people 17.3
device 16.3
desk 16.2
keyboard 16.1
person 15.7
sitting 15.5
monitor 15
working 15
kitchen 14.8
hand 14.4
home 14.4
communicator 13.6
communication 13.4
furniture 13.2
electronic 13.1
lifestyle 13
man 12.9
executive 12
indoor 11.9
businesswoman 11.8
music 11.7
house 11.7
indoors 11.4
sound 11.2
party 11.2
adult 11
black 10.8
record 10.7
room 10.6
male 10.6
information 10.6
digital 10.5
men 10.3
entertainment 10.1
light 10
cooking 9.6
corporate 9.4
professional 9.3
heater 9.3
phone 9.2
occupation 9.2
control 9
jockey 8.9
job 8.8
happy 8.8
disc 8.8
mix 8.6
mobile 8.5
contemporary 8.5
design 8.4
restaurant 8.4
display 8.4
telephone 8.3
style 8.2
home appliance 8
decor 8
smiling 8
appliance 8
women 7.9
turntable 7.9
food 7.9
mixing 7.9
nightclub 7.8
education 7.8
luxury 7.7
chair 7.7
finance 7.6
notebook 7.6
screen 7.5
floor 7.4
cook 7.3
smile 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 93.8
person 91.7
text 88.5
black and white 80.6

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 93%
Calm 70.3%
Happy 26.4%
Sad 2.6%
Surprised 0.3%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a person cooking in a kitchen 73.8%
a person standing in front of a stove 73.7%
a person standing in front of a stove 65.3%

Text analysis

Amazon

SUGAR
OUR

Google

DUR
DUR SUGAR
SUGAR