Human Generated Data

Title

Untitled (interior view of New Year's party with young woman and man in bowler cap in center of crowd)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9422


Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.6
Person 99.6
Person 99.5
Person 98.1
Clothing 97.1
Apparel 97.1
Person 96.5
Hat 95.5
Person 93.2
Person 82.4
Clinic 74.8
Person 74.3
Accessories 69.9
Sunglasses 69.9
Accessory 69.9
Tie 68.8
People 67.2
Indoors 63.3
Room 62.6
Coat 58.2
Furniture 56.9
Chair 56.9
Sleeve 56.4
Tie 55
Person 49.9
Person 46.4
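Labels like those above come back from Amazon Rekognition as name/confidence pairs, and a downstream consumer typically keeps only the high-confidence ones. A minimal sketch of that filtering step, using a few values copied from the list above (the pair structure and the 90% threshold are assumptions for illustration, not part of the record):

```python
# Rekognition-style labels as (name, confidence) pairs; sample values
# are taken from the tag list above.
labels = [
    ("Person", 99.6),
    ("Clothing", 97.1),
    ("Hat", 95.5),
    ("Clinic", 74.8),
    ("Sunglasses", 69.9),
    ("Tie", 68.8),
    ("Chair", 56.9),
]

def confident_labels(labels, min_confidence=90.0):
    """Keep only label names at or above the confidence threshold."""
    return [name for name, conf in labels if conf >= min_confidence]

print(confident_labels(labels))  # ['Person', 'Clothing', 'Hat']
```

Note that the same label name can appear more than once (e.g. "Tie" at 68.8 and 55), since each detected instance is reported separately.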

Imagga
created on 2022-01-23

nurse 38.6
man 34.3
professional 34.2
medical 31.8
people 31.2
doctor 31
person 29.1
lab 28.2
laboratory 27
male 26.2
coat 26.2
work 25.9
medicine 25.5
men 24.9
scientist 23.5
worker 23.3
adult 22.2
research 21.9
test 21.2
business 20
biology 19.9
chemical 19.4
chemistry 19.3
health 18.1
science 17.8
student 17.6
hospital 17.1
team 17
clinic 16.8
businessman 16.8
scientific 16.5
human 15.7
surgeon 15.6
occupation 15.6
working 15
lab coat 14.2
job 14.2
education 13.9
chemist 13.8
patient 13.7
technology 13.4
happy 13.2
teamwork 13
portrait 12.9
development 12.4
instrument 12.3
study 12.1
clothing 12.1
biotechnology 11.8
musical instrument 11.7
assistant 11.7
serious 11.4
indoors 11.4
office 11.3
looking 11.2
film 11
microscope 10.9
smiling 10.8
equipment 10.8
experiment 10.7
bright 10.7
smile 10.7
exam 10.5
profession 10.5
group 10.5
corporate 10.3
women 10.3
suit 10
care 9.9
negative 9.8
technician 9.8
world 9.8
optical 9.7
day 9.4
hand 9.1
researcher 8.9
microbiology 8.9
percussion instrument 8.8
colleagues 8.7
businesspeople 8.5
face 8.5
casual 8.5
two 8.5
marimba 8.5
clothes 8.4
shop 8.1
handsome 8
picket fence 8
uniform 8
biochemistry 7.9
analysis 7.8
expertise 7.8
tube 7.7
window 7.5
holding 7.4
focus 7.4
color 7.2
black 7.2
wind instrument 7
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 96.4
person 94
man 92.9
text 92.5
drawing 82.7

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 86.5%
Happy 69.3%
Surprised 11%
Confused 6.4%
Sad 4.4%
Disgusted 3.8%
Calm 3.2%
Angry 1%
Fear 0.9%

AWS Rekognition

Age 22-30
Gender Male, 77.6%
Surprised 43.2%
Happy 26.2%
Calm 18.2%
Confused 5.7%
Sad 3.5%
Disgusted 1.7%
Fear 0.8%
Angry 0.6%

AWS Rekognition

Age 29-39
Gender Female, 97.9%
Calm 95.1%
Surprised 3.3%
Confused 1%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
Sad 0.1%
Fear 0.1%
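Each AWS Rekognition face entry above lists a confidence score per emotion; the face's dominant emotion is simply the highest-scoring one. A small sketch of that selection, using the third face's scores from above (the dict layout is an assumption for illustration):

```python
# Per-face emotion scores in the Rekognition style; the values below are
# the third detected face's scores from the analysis above.
emotions = {
    "Calm": 95.1,
    "Surprised": 3.3,
    "Confused": 1.0,
    "Happy": 0.2,
    "Disgusted": 0.2,
    "Angry": 0.1,
    "Sad": 0.1,
    "Fear": 0.1,
}

def dominant_emotion(emotions):
    """Return the emotion name with the highest confidence score."""
    return max(emotions, key=emotions.get)

print(dominant_emotion(emotions))  # Calm
```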

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Hat 95.5%
Sunglasses 69.9%
Tie 68.8%

Captions

Microsoft

a group of people standing in front of a window 83.9%
a group of people standing in front of a store window 74.5%
a group of people standing next to a window 74.4%

Text analysis

Amazon

8
80A
de 8 <<<<
de
KODAK-SLA
<<<<