Human Generated Data

Title

Untitled (man waving from back of train; seen from below)

Date

c. 1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14426

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.4
Clothing 99.4
Human 98.6
Person 97.8
Fashion 87.8
Robe 87.8
Face 87.1
Gown 82.4
Wedding 80.1
Female 75.3
Vehicle 74.0
Transportation 74.0
Aircraft 74.0
Airplane 74.0
Photo 69.6
Portrait 69.6
Photography 69.6
Wedding Gown 68.1
Bridegroom 64.6
Bride 56.1
Sleeve 55.9
Woman 55.7
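
The label list above pairs each machine-generated tag with a confidence score. A minimal sketch (not part of the catalog record) of how such a list could be filtered to the higher-confidence tags; the labels and scores below are copied from the Amazon list, and the 70.0 cutoff is an arbitrary assumption:

```python
# Subset of the Amazon tags above, as (label, confidence) pairs.
labels = [
    ("Apparel", 99.4),
    ("Clothing", 99.4),
    ("Human", 98.6),
    ("Person", 97.8),
    ("Airplane", 74.0),
    ("Photo", 69.6),
    ("Bride", 56.1),
]

def confident_labels(pairs, threshold=70.0):
    """Return label names at or above the threshold, highest confidence first."""
    kept = [(name, score) for name, score in pairs if score >= threshold]
    return [name for name, _ in sorted(kept, key=lambda p: -p[1])]

print(confident_labels(labels))
# ['Apparel', 'Clothing', 'Human', 'Person', 'Airplane']
```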

Imagga
created on 2022-01-29

adult 31.2
people 30.1
person 27.6
male 24.9
office 23.2
man 22.9
home 19.9
smiling 18.8
portrait 18.8
happy 18.2
business 17.6
work 17.3
sitting 17.2
indoors 16.7
computer 15.3
clothing 14.4
day 14.1
laptop 14
professional 13.9
men 13.7
room 13.5
working 13.2
20s 12.8
looking 12.8
businessman 12.4
indoor 11.9
lifestyle 11.6
smile 11.4
one person 11.3
one 11.2
attractive 11.2
negative 11.1
casual 11
happiness 11
holding 10.7
handsome 10.7
job 10.6
modern 10.5
building 10.4
women 10.3
businesswoman 10
face 9.9
mother 9.7
dishwasher 9.4
architecture 9.4
light 9.4
house 9.2
table 8.9
film 8.9
desk 8.9
home appliance 8.8
interior 8.8
medical 8.8
white goods 8.8
worker 8.7
standing 8.7
30s 8.7
pretty 8.4
camera 8.3
occupation 8.2
life 8.1
clinic 8
model 7.8
corporate 7.7
horizontal 7.5
groom 7.5
clothes 7.5
appliance 7.4
technology 7.4
student 7.4
inside 7.4
chair 7.4
back 7.3
cheerful 7.3
stucco 7.1
travel 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

clothing 89.7
person 87.1
text 86.4
man 79.8
black and white 78.3
appliance 69.3
white goods 65.0
refrigerator 60.4
wedding dress 59.8
human face 59.3
smile 57.2

Face analysis

Amazon

Google

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Happy 46.9%
Sad 38.3%
Surprised 8.6%
Disgusted 1.8%
Calm 1.7%
Confused 1.7%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 92.2%
Happy 88.6%
Surprised 2.8%
Calm 2.7%
Confused 2.6%
Disgusted 1.2%
Angry 1.1%
Sad 0.6%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
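
Each Rekognition face block above reports a full emotion distribution. A hedged sketch of reducing such a distribution to its dominant emotion; the values are copied from the first AWS Rekognition block, and the helper name is our own:

```python
# Emotion distribution from the first AWS Rekognition face block above.
emotions = {
    "Happy": 46.9, "Sad": 38.3, "Surprised": 8.6, "Disgusted": 1.8,
    "Calm": 1.7, "Confused": 1.7, "Angry": 0.6, "Fear": 0.3,
}

def dominant_emotion(dist):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(dist.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))
# ('Happy', 46.9)
```

Note that here the top two emotions (Happy 46.9%, Sad 38.3%) are close, so a single dominant label hides real ambiguity in the model's output.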

Feature analysis

Amazon

Person 97.8%
Airplane 74%

Captions

Microsoft

a man standing in front of a refrigerator 55.7%
a man standing in front of a mirror posing for the camera 39.8%
a man that is standing in front of a refrigerator 39.7%

Text analysis

Amazon

M 117
M 117 YT37A2 АЗАА
АЗАА
YT37A2
29118845

Google

M113 YT33A2 A32A
A32A
M113
YT33A2