Human Generated Data

Title

Untitled (Dr. Herman M. Juergens with nurses and toddler in patient's room)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.483.2

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.4
Person 99.4
Person 99.4
Person 98.8
Person 98.3
Apparel 97.7
Clothing 97.7
Coat 83.6
Accessory 79.9
Accessories 79.9
Sunglasses 79.9
Shorts 70.7
Lab Coat 69
Face 67.2
Head 67
Tie 62.9
Kid 60.1
Child 60.1
Woman 56.5
Teen 56.5
Female 56.5
Blonde 56.5
Girl 56.5
Scientist 55.4

Clarifai
created on 2019-08-09

people 99.9
adult 99.6
two 98.6
group 97.8
group together 97.7
three 97.6
man 97.5
woman 97.2
wear 96.8
administration 96.6
four 94.9
scientist 94.8
leader 93.6
several 92
vehicle 90.9
facial expression 90.2
medical practitioner 89
five 88
portrait 86.5
one 84.9

Imagga
created on 2019-08-09

megaphone 37.9
man 36.3
device 33.9
acoustic device 29.2
people 27.3
equipment 24.2
male 22.7
punching bag 22.5
person 19.1
men 18
worker 17.8
work 17.3
adult 16.9
industrial 15.4
helmet 14.8
job 14.1
game equipment 13.5
protection 12.7
safety 12
industry 11.9
occupation 11.9
professional 11.9
women 11.9
business 11.5
human 11.2
building 11.1
black 10.8
smile 10.7
uniform 10.6
urban 10.5
portrait 9.7
weapon 9.7
training 9.2
danger 9.1
suit 9
working 8.8
happy 8.8
repair 8.6
hat 8.6
city 8.3
inside 8.3
transport 8.2
sport 8.1
vehicle 8.1
tool 8.1
active 8.1
activity 8.1
mask 8
lifestyle 7.9
hand blower 7.7
skill 7.7
telephone 7.6
car 7.5
air 7.4
metal 7.2
life 7.2
home 7.2
transportation 7.2

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

person 97.5
wall 96.3
indoor 86.6
toddler 83.2
human face 82.6
clothing 82.3
text 80.7
smile 80.5
boy 75.7
black and white 68.2
baby 67.5
posing 49.4
preparing 42.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 56%
Fear 3.2%
Calm 38.2%
Disgusted 0.7%
Happy 23.5%
Surprised 11.4%
Confused 10.7%
Sad 11.5%
Angry 0.8%

AWS Rekognition

Age 23-37
Gender Female, 68.2%
Surprised 47.8%
Calm 10.9%
Disgusted 1.3%
Happy 0.8%
Sad 4.9%
Angry 5%
Fear 16.1%
Confused 13.2%

AWS Rekognition

Age 33-49
Gender Male, 52.5%
Confused 45.1%
Sad 54.4%
Angry 45.3%
Surprised 45%
Happy 45%
Calm 45.1%
Fear 45.1%
Disgusted 45%

AWS Rekognition

Age 39-57
Gender Male, 50.2%
Fear 45.1%
Sad 53.2%
Disgusted 45%
Happy 45.1%
Surprised 45%
Angry 45.7%
Calm 45.8%
Confused 45%

Feature analysis

Amazon

Person 99.4%
Sunglasses 79.9%
Tie 62.9%