Human Generated Data

Title

Untitled (men looking at vacuum display inside shop window)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14551

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
Person 99.6
Person 99.2
Person 98.9
Clothing 98.5
Apparel 98.5
Appliance 81.8
Building 77.3
Metropolis 77.3
City 77.3
Urban 77.3
Town 77.3
Coat 67.9
Overcoat 67.9
Dish 58.5
Food 58.5
Meal 58.5
Suit 58.5
Hat 56.9
Steamer 56.9
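
The label-plus-confidence list above matches the output shape of Amazon Rekognition's DetectLabels operation. As a minimal sketch of how such tags could be reproduced with boto3 (assuming AWS credentials are configured; the filename is a hypothetical placeholder for the digitized photograph):

import boto3

# Rekognition client; region and credentials come from the standard AWS config.
client = boto3.client("rekognition")

# Placeholder filename for the digitized print.
with open("gould_vacuum_display.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence percentages,
# comparable to the "Person 99.6 ... Steamer 56.9" list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')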

Imagga
created on 2022-01-29

brass 27.5
glass 24.3
device 21.8
wind instrument 21.1
interior 17.7
kitchen 16.3
home 15.9
musical instrument 15.8
equipment 15.6
food 15.2
table 14.8
dinner 14.5
trombone 14.3
room 14
technology 13.3
modern 12.6
case 12
plate 11.8
people 11.7
drink 11.7
decoration 11.6
medical 11.5
luxury 11.1
industry 11.1
machine 10.8
man 10.7
medicine 10.6
party 10.3
work 10.3
restaurant 10
house 10
clean 10
business 9.7
metal 9.6
setting 9.6
wine 9.2
celebration 8.8
research 8.6
gramophone 8.4
person 8.4
service 8.3
alcohol 8.3
inside 8.3
occupation 8.2
silver 7.9
indoors 7.9
life 7.8
napkin 7.8
male 7.8
adult 7.8
record player 7.8
waiting 7.7
meal 7.7
lunch 7.7
communication 7.5
fashion 7.5
place 7.4
glasses 7.4
design 7.3
steel 7.1
working 7.1
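
The Imagga list has the same tag-plus-confidence shape; Imagga exposes this through its v2 REST tagging endpoint. A sketch using the requests library, assuming an Imagga API key/secret pair (placeholders below, same hypothetical filename as before):

import requests

API_KEY = "your_imagga_api_key"        # placeholder credentials
API_SECRET = "your_imagga_api_secret"

# POST the image to the v2 tagging endpoint with HTTP basic auth.
with open("gould_vacuum_display.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries a confidence score and a language-keyed tag,
# e.g. {"confidence": 27.5, "tag": {"en": "brass"}}.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')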

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.4
clothing 89.6
person 86.9
musical instrument 81.7
drum 66.4
man 61.9
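
The Microsoft tags correspond to Azure Computer Vision's image-tagging feature. A sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint and key:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for the Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_azure_key"),
)

with open("gould_vacuum_display.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidences come back in the 0-1 range; shown above as percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")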

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 98.9%
Calm 94.2%
Sad 1.6%
Disgusted 0.9%
Angry 0.9%
Confused 0.8%
Surprised 0.7%
Happy 0.6%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Male, 100%
Calm 94.2%
Surprised 4.1%
Sad 0.8%
Happy 0.2%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 96.6%
Calm 37.4%
Happy 29.2%
Sad 14.9%
Surprised 8.7%
Fear 4.8%
Confused 2.3%
Disgusted 1.5%
Angry 1.2%

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 99.4%
Disgusted 0.3%
Surprised 0.1%
Confused 0.1%
Angry 0%
Sad 0%
Fear 0%
Happy 0%
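
The four age/gender/emotion blocks above are per-face results from Rekognition's DetectFaces operation with full attributes enabled. A sketch of the call that yields them (same hypothetical filename as in the earlier examples):

import boto3

client = boto3.client("rekognition")

with open("gould_vacuum_display.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion scores.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions sorted by confidence, matching the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')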

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely
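
The Google blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages; that is the shape of the Cloud Vision face-detection response. A sketch with the google-cloud-vision client, assuming credentials are set via GOOGLE_APPLICATION_CREDENTIALS:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_vacuum_display.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value (e.g. VERY_UNLIKELY, POSSIBLE),
# matching the "Surprise Very unlikely ... Blurred Very unlikely" rows.
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())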

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people standing in front of a store 47.4%
a group of people standing in a store 43.3%
a person standing in front of a store 43.2%
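
The three ranked captions are typical of Azure Computer Vision's Describe Image operation, which returns caption candidates with confidences. A sketch, with the same placeholder endpoint and key as in the tagging example:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_azure_key"),
)

with open("gould_vacuum_display.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidates arrive ranked; confidences are 0-1 (shown above as percentages).
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")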

Text analysis

Amazon

QUEEN
REMINGTON
CHR
MJI7
MJI7 YESTAD
DISPA
YESTAD
LOURS
DE
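
The fragmentary strings above (likely signage, price cards, and mirrored lettering in the window glass) are raw output from Rekognition's DetectText operation. A sketch:

import boto3

client = boto3.client("rekognition")

with open("gould_vacuum_display.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the strings listed above; reversed or
# partially occluded signage explains fragments like "MJI7" and "YESTAD".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])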

Google

MJI7 YT3RA REMINGTON
MJI7
YT3RA
REMINGTON
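
Google's strings come from the Cloud Vision text-detection annotator, whose first annotation aggregates all detected text and is followed by the individual words, which matches the list above. A sketch:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_vacuum_display.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full block ("MJI7 YT3RA REMINGTON");
# the rest are its component words.
for annotation in response.text_annotations:
    print(annotation.description)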