Human Generated Data

Title

Untitled (two girls seated at kitchen table with cookies)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8159

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Chair 99.6
Furniture 99.6
Human 98.4
Person 98.4
Table 94.9
Dining Table 93.1
Room 90.2
Indoors 90.2
Screen 83.7
Electronics 83.7
Chair 83
Meal 82.5
Food 82.5
Monitor 79
Display 79
Restaurant 76
Living Room 75.8
Shelf 72.8
Person 71.6
Couch 68.9
Cafeteria 66
Dish 65.6
Wood 65.2
Undershirt 61.3
Clothing 61.3
Apparel 61.3
Desk 60.4
Bookcase 58.8
LCD Screen 58
Sitting 56.9
Flooring 55.3

Imagga
created on 2022-01-08

room 37
table 35.6
restaurant 32.9
interior 30.9
kitchen 29.3
home 27.1
man 25.5
indoors 23.7
chair 23.4
furniture 22.3
stove 21.6
people 20.1
house 20
indoor 19.2
computer 18.9
desk 18.6
building 18.4
sitting 18
person 17.6
office 17.3
lifestyle 16.6
classroom 16.4
food 16.4
male 16.3
disk jockey 16.2
inside 15.6
work 14.1
laptop 14.1
modern 14
broadcaster 13.9
cook 13.7
cafeteria 13.7
oven 13.4
wood 13.3
contemporary 13.2
cooking 13.1
dinner 12.9
adult 12.5
working 12.4
women 11.9
design 11.8
glass 11.7
structure 11.6
dining 11.4
men 11.2
business 10.9
smiling 10.8
meal 10.7
communicator 9.7
counter 9.7
style 9.6
education 9.5
luxury 9.4
meeting 9.4
happiness 9.4
happy 9.4
floor 9.3
communication 9.2
drink 9.2
equipment 9.1
holding 9.1
technology 8.9
businessman 8.8
together 8.8
couple 8.7
light 8.7
class 8.7
talking 8.6
friends 8.5
eating 8.4
teamwork 8.3
student 8.3
team 8.1
group 8.1
color 7.8
dining table 7.7
studio apartment 7.7
plate 7.6
reading 7.6
teacher 7.6
togetherness 7.6
keyboard 7.5
enjoyment 7.5
device 7.5
coffee 7.4
wine 7.4
cheerful 7.3
businesswoman 7.3
decoration 7.2
looking 7.2
monitor 7.1
decor 7.1
refrigerator 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 95.1
person 91
black and white 90.9
furniture 90.7
chair 65.8
text 62.4
table 33.7
dining table 9

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 78.2%
Calm 99.8%
Disgusted 0.1%
Surprised 0%
Sad 0%
Happy 0%
Angry 0%
Confused 0%
Fear 0%

Feature analysis

Amazon

Chair 99.6%
Person 98.4%

Captions

Microsoft

a person sitting at a table in front of a window 84.1%
a person sitting at a table in front of a window 78.6%
a person sitting at a table 78.5%

Text analysis

Amazon

YT77A2
جلس YT77A2 АЗАА
АЗАА
جلس

Google

4000 and
4000
and