Human Generated Data

Title

Untitled (children eating at table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17280

Machine Generated Data

Tags

All values below are model confidence scores on a 0-100 scale.

Amazon
created on 2022-02-26

Furniture 99.9
Chair 99.8
Chair 99.5
Room 98.7
Indoors 98.7
Person 97.5
Human 97.5
Chair 96.7
Person 96.5
Person 96.5
Person 94.1
Person 93.7
Dining Room 93.2
Dining Table 93.1
Table 93.1
Restaurant 93.1
Person 86.4
Food 86.3
Meal 86.3
Sitting 86.1
Dish 76.9
Person 76.7
People 71.2
Person 70.7
Chair 68.6
Cafe 68.1
Person 65.3
Person 61.4
Cafeteria 61.3
Photo 60.9
Photography 60.9
Tabletop 59.1
Living Room 59.0
Home Decor 57.2
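
These labels match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of how such a list could be produced, assuming boto3 and a hypothetical local copy of the photograph:

import boto3

# Rekognition client; region and credentials come from the standard AWS config.
client = boto3.client("rekognition")

# Hypothetical local path to the image.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# matching entries such as "Furniture 99.9" and "Chair 99.8" above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')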

Imagga
created on 2022-02-26

chair 53.9
room 40.1
seat 31.4
table 27.3
salon 26
furniture 25.3
restaurant 23.3
interior 23
indoors 21.9
classroom 21.8
home 21.5
man 20.8
people 19
rocking chair 17.8
chairs 16.6
sitting 16.3
lifestyle 15.2
male 15
person 14.8
grandfather 14
floor 13.9
dinner 13.7
dining 13.3
modern 12.6
wood 12.5
lunch 12.1
men 12
indoor 11.9
design 11.8
smiling 11.6
office 11.4
kin 11.3
senior 11.2
old 11.1
hospital 11
drink 10.9
tables 10.8
cafeteria 10.8
adult 10.6
happy 10.6
comfortable 10.5
computer 10.5
couple 10.4
empty 10.3
women 10.3
mature 10.2
coffee 10.2
casual 10.2
glass 10.1
relax 10.1
house 10
meal 9.9
family 9.8
food 9.7
business 9.7
hall 9.5
decoration 9.4
architecture 9.4
bar 9.2
life 9.1
urban 8.7
desk 8.6
wall 8.5
culture 8.5
eat 8.4
building 8.3
leisure 8.3
inside 8.3
style 8.2
structure 8.2
work 8.1
decor 7.9
patient 7.9
day 7.8
scene 7.8
elderly 7.7
health 7.6
relaxation 7.5
place 7.4
cheerful 7.3
group 7.2
smile 7.1
together 7
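
Imagga exposes tagging as a REST endpoint rather than an SDK. A minimal sketch, assuming the v2 /tags endpoint, HTTP basic auth, and hypothetical credentials and file path:

import requests

API_KEY = "<imagga-api-key>"      # hypothetical credentials
API_SECRET = "<imagga-api-secret>"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry pairs a 0-100 confidence with a language-keyed tag name,
# e.g. "chair 53.9" as in the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')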

Google
created on 2022-02-26 (no tags recorded)

Microsoft
created on 2022-02-26

person 97.9
chair 96.2
text 96.1
furniture 92.7
table 81.9
clothing 72.3
drawing 66.4
people 60.9
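
These tags are consistent with the Azure Computer Vision tag operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file path are hypothetical:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<key>"),
)

# Tag the image from a local stream; each tag carries a 0-1 confidence,
# shown above scaled to 0-100 (e.g. "person 97.9").
with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")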

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 98.5%
Happy 90.8%
Calm 3.2%
Angry 1.5%
Surprised 1.3%
Sad 1.3%
Disgusted 0.9%
Confused 0.7%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 80.0%
Calm 66.4%
Sad 26.3%
Confused 3.0%
Happy 1.2%
Fear 0.9%
Angry 0.9%
Disgusted 0.7%
Surprised 0.6%

AWS Rekognition

Age 26-36
Gender Female, 52.3%
Calm 93.8%
Disgusted 2.0%
Happy 1.5%
Sad 0.9%
Surprised 0.9%
Confused 0.4%
Angry 0.4%
Fear 0.2%
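
The age range, gender, and ranked emotion scores above follow the shape of Rekognition's DetectFaces response. A minimal sketch, assuming boto3 and a hypothetical file path:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')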

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
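
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the readings above say "Very unlikely" or "Very likely". A minimal sketch with the google-cloud-vision client; the file path is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)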

Feature analysis

Amazon

Chair 99.8%
Person 97.5%

Captions

Microsoft

a group of people standing in front of a window 63.7%
a group of people standing around a table 63.6%
a group of people in a room 63.5%
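
Ranked candidate sentences with confidences like these match the Azure Computer Vision describe operation. A minimal sketch, reusing the hypothetical client setup from the tags sketch above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # hypothetical resource
    CognitiveServicesCredentials("<key>"),
)

# Request several candidate captions; each has a 0-1 confidence.
with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")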

Text analysis

Amazon

80

Google

ODVK-EE.
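
Both OCR readings can be reproduced with the vendors' text-detection calls: Rekognition's DetectText and Google Vision's text detection. A combined sketch with a hypothetical file path:

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition DetectText returns line- and word-level detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])  # e.g. "80"

# Google Vision's first text annotation is the full recovered text block.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
if response.text_annotations:
    print(response.text_annotations[0].description)  # e.g. "ODVK-EE."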