Human Generated Data

Title

Untitled (nuns looking at machine)

Date

1959

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17936

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 99.8
Person 99.8
Clothing 97.2
Apparel 97.2
Clinic 85.8
Table 77
Furniture 77
Person 74.8
Accessories 74.3
Accessory 74.3
Sunglasses 74.3
Sleeve 73.1
Photography 63.5
Face 63.5
Photo 63.5
Portrait 63.5
Coat 56.5
Lab 56.4
Long Sleeve 55.8

Imagga
created on 2022-03-04

man 32.2
person 32.1
people 29.6
work 29
office 29
male 26.9
professional 26.7
computer 26.6
nurse 26.1
working 25.6
adult 25.5
worker 24.2
business 22.5
laptop 19.5
job 19.5
indoors 18.4
businesswoman 18.2
happy 17.5
smiling 17.4
corporate 17.2
desk 17.1
communication 15.9
medical 15.9
specialist 15.8
patient 15.5
men 15.5
portrait 14.9
hospital 14.7
room 14
device 13.9
sitting 13.7
team 13.4
businesspeople 13.3
businessman 13.2
senior 13.1
table 13.1
executive 13.1
looking 12.8
home 12.8
women 12.6
lifestyle 12.3
occupation 11.9
technology 11.9
smile 11.4
doctor 11.3
education 11.3
clinic 11
medicine 10.6
talking 10.5
health 10.4
camera 10.2
equipment 9.9
kitchen 9.8
assistant 9.7
meeting 9.4
happiness 9.4
teamwork 9.3
successful 9.1
modern 9.1
confident 9.1
iron lung 8.9
group 8.9
instrument 8.8
colleagues 8.7
clothing 8.7
chemical 8.7
elderly 8.6
profession 8.6
salon 8.6
casual 8.5
mature 8.4
human 8.2
cheerful 8.1
suit 8.1
handsome 8
bright 7.9
day 7.8
surgery 7.8
chemistry 7.7
old 7.7
screen 7.6
notebook 7.6
illness 7.6
horizontal 7.5
keyboard 7.5
surgeon 7.4
coat 7.4
phone 7.4
case 7.3
mask 7.2
respirator 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 98.4
wall 98
drawing 89
indoor 86.2
person 85.6
clothing 74.5
sketch 56.5
human face 56.1
old 41.7

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 95.7%
Happy 95.4%
Calm 2.9%
Angry 0.4%
Surprised 0.4%
Confused 0.3%
Sad 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 90.7%
Surprised 71.3%
Calm 24.9%
Happy 1.6%
Disgusted 0.6%
Angry 0.5%
Confused 0.5%
Sad 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Sunglasses 74.3%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 62.3%
a person standing in front of a mirror 62.2%
a person that is standing in front of a mirror posing for the camera 51%

Text analysis

Amazon

the
for
2
the Wo
pray for
Wo
St.
pray
St. JOSEP
JOSEP
us

Google

St JosEP the W pray for NAGON
the
St
pray
JosEP
W
for
NAGON