Human Generated Data

Title

Untitled (man at desk with products)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20247

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.1
Human 99.1
Meal 94.8
Food 94.8
Clothing 87.4
Apparel 87.4
Mammal 86.3
Horse 86.3
Animal 86.3
Dish 83.6
Person 83.2
Restaurant 74.9
Tie 63.2
Accessories 63.2
Accessory 63.2
Coat 61.8
Cafeteria 57
Scientist 56.5
Screen 55.2
Electronics 55.2
Chair 55
Furniture 55
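Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale, and low-confidence labels (such as Horse at 86.3 here) can be misclassifications. A minimal sketch of thresholding such output, using a few scores copied from this record (the threshold value and variable names are illustrative, not part of the source data):

```python
# A few (label, confidence) pairs from the Rekognition output above.
labels = [
    ("Person", 99.1),
    ("Meal", 94.8),
    ("Horse", 86.3),
    ("Tie", 63.2),
    ("Chair", 55.0),
]

# Keep only labels at or above an assumed confidence cutoff of 90.
confident = [name for name, score in labels if score >= 90]
print(confident)  # ['Person', 'Meal']
```

Raising or lowering the cutoff trades recall for precision; at 90 the likely-spurious Horse tag is excluded.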

Imagga
created on 2022-03-05

musical instrument 29.9
wedding 19.3
bride 19.2
flowers 19.1
celebration 17.5
party 17.2
table 16.8
decoration 15.4
interior 15
glass 14.4
marriage 14.2
man 14.1
bouquet 13.7
chair 13.6
wind instrument 13.2
stringed instrument 12.9
male 12.8
people 12.8
accordion 12.8
salon 12.7
window 12.3
shop 12.1
event 12
restaurant 11.3
couple 11.3
men 11.2
groom 11.1
dinner 10.9
business 10.9
bridal 10.7
romantic 10.7
keyboard instrument 10.3
indoor 10
romance 9.8
wed 9.8
modern 9.8
person 9.8
ceremony 9.7
food 9.7
setting 9.6
adult 9.6
service 9.2
flower 9.2
plate 9.2
black 9
home 8.8
love 8.7
luxury 8.6
dining 8.6
drink 8.3
happy 8.1
art 8
decor 8
banquet 7.8
napkin 7.8
banjo 7.7
old 7.7
two 7.6
elegance 7.6
house 7.5
mercantile establishment 7.4
guitar 7.4
glasses 7.4
tradition 7.4
wine 7.4
room 7.3
dress 7.2
holiday 7.2
women 7.1
family 7.1
happiness 7
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

window 99.8
indoor 93.7
person 89.2
black and white 82.1
clothing 78.4

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 100%
Happy 79.4%
Surprised 7.5%
Confused 6.3%
Calm 2.9%
Fear 1.6%
Disgusted 0.8%
Sad 0.8%
Angry 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Horse 86.3%
Tie 63.2%

Captions

Microsoft

a group of people standing in front of a window 80.1%
a group of people in front of a window 79.7%
a group of people sitting at a table in front of a window 72.3%

Text analysis

Amazon

d-CON
d-co
d-c
d-0
-ХАСОЯ
d-COBI-CON
دار

Google

E.
-CON
-co
E. -CON d-CO -CON -co -Co-CON
-Co-CON
d-CO