Human Generated Data

Title

Untitled (two women standing at table of preserves)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6712

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 98.1
Human 98.1
Person 98.1
Furniture 95.3
Flooring 85.3
Table 84.4
Apparel 82.3
Clothing 82.3
Floor 71.4
Couch 68
Face 66.5
Food 64.5
Meal 64.5
Reception 63.5
Pub 62.5
Wood 60.8
Bar Counter 60.2
Plywood 59.3
Living Room 58.9
Indoors 58.9
Room 58.9
Shelf 55.3

Clarifai
created on 2019-11-16

people 99.8
group 99
adult 98.1
room 97.7
furniture 97.3
group together 97.3
woman 97.1
man 96.4
chair 96.1
indoors 91.4
table 89.1
desk 87.2
leader 85
many 84.2
sit 83.4
meeting 82.8
several 82.7
seat 82.3
vehicle 80.1
education 79.3

Imagga
created on 2019-11-16

room 62.9
interior 46
table 42.9
classroom 37.8
chair 34.6
modern 31.6
percussion instrument 30.2
furniture 28.8
home 27.1
house 25.9
indoors 25.5
musical instrument 25.5
decor 22.1
apartment 22
floor 21.4
indoor 20.1
kitchen 20.1
design 19.7
office 19.4
wood 18.3
inside 17.5
man 17.5
business 16.4
architecture 15.6
desk 14.7
grand piano 14.6
window 14.1
glass 14
piano 13
sitting 12.9
people 12.8
light 12.7
domestic 12.7
counter 12.6
businessman 12.4
living 12.3
lifestyle 12.3
male 12.1
stringed instrument 12
luxury 12
cabinet 11.8
communication 11.8
sofa 11.6
smiling 11.6
group 11.3
contemporary 11.3
wall 11.2
empty 11.2
person 11
board 10.9
decoration 10.9
3d 10.8
restaurant 10.7
steel 10.6
residential 10.5
dining 10.5
meeting 10.4
keyboard instrument 10.2
hall 10.1
vibraphone 10.1
metal 9.7
comfortable 9.6
education 9.5
women 9.5
executive 9.5
laptop 9.5
corporate 9.5
men 9.4
adult 9.4
blackboard 9.3
building 9.2
seat 9.2
teacher 9.1
stove 9
oven 9
chairs 8.8
class 8.7
work 8.6
day 8.6
lamp 8.6
nobody 8.6
device 8.5
structure 8.5
computer 8.3
style 8.2
job 8
tile 7.9
20-24 years 7.9
leisure activity 7.8
couple 7.8
full length 7.8
vase 7.7
content 7.7
marimba 7.7
estate 7.6
enjoyment 7.5
professional 7.5
holding 7.4
coffee 7.4
cheerful 7.3
school 7.3
businesswoman 7.3

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 98
indoor 96.1
table 93.6
furniture 93.2
whiteboard 93.1
handwriting 83.5
clothing 80
woman 79.7
person 74.9
chair 63.9
desk 56.4
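
The tag lists above pair each label with a confidence score on a 0-100 scale. A minimal sketch of filtering such a list by a confidence threshold (the threshold value and the `filter_tags` helper are illustrative choices, not part of the record):

```python
# Minimal sketch: keep only machine-generated tags at or above a
# confidence threshold. The tag data is copied from the Microsoft
# list in the record; the 90.0 cutoff is an arbitrary example value.

def filter_tags(tags, threshold=90.0):
    """Return the (label, score) pairs whose score meets the threshold."""
    return [(label, score) for label, score in tags if score >= threshold]

# A few of the Microsoft tags from the record above.
microsoft_tags = [
    ("wall", 98.0),
    ("indoor", 96.1),
    ("table", 93.6),
    ("furniture", 93.2),
    ("whiteboard", 93.1),
    ("handwriting", 83.5),
    ("clothing", 80.0),
]

print(filter_tags(microsoft_tags))
# keeps the five tags scored 90 or above
```
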

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 51.3%
Surprised 45.8%
Happy 45.1%
Sad 46.2%
Fear 48.3%
Calm 47%
Confused 46.6%
Disgusted 45.2%
Angry 45.8%

AWS Rekognition

Age 7-17
Gender Male, 53.1%
Calm 52.2%
Disgusted 45.1%
Surprised 45.4%
Fear 45%
Happy 46.4%
Angry 45.5%
Confused 45.3%
Sad 45.1%

Microsoft Cognitive Services

Age 2
Gender Male

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft

a group of people standing in a kitchen 82%
a group of people standing in a room 81.9%
a group of people in a kitchen 81.8%

Text analysis

Amazon

uye