Human Generated Data

Title

Untitled (men shaking hands across long table)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19342

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 98.2
Person 98.2
Person 96.6
Dance Pose 94.1
Leisure Activities 94.1
Accessories 93.4
Accessory 93.4
Tie 93.4
Clothing 89.2
Apparel 89.2
Food 86.3
Meal 86.3
Tie 84.7
Person 73.3
Suit 72.2
Coat 72.2
Overcoat 72.2
Stage 71.9
Shirt 69.7
Dish 60.7
Person 60.6
Dance 59
Performer 58.9
Tuxedo 58.4

Imagga
created on 2022-03-05

person 42.8
man 40.3
people 32.9
professional 30.3
male 29.2
adult 27.1
medical 23
office 22.6
indoors 22
nurse 21
room 20.3
home 19.1
patient 18.9
doctor 18.8
hospital 18.2
smiling 18.1
business 17.6
worker 17.1
working 16.8
men 16.3
happy 16.3
computer 16.1
businessman 15.9
work 15.7
sitting 15.5
health 15.3
desk 15.1
coat 14.4
talking 13.3
senior 13.1
smile 12.8
student 12.3
casual 11.9
teacher 11.8
clinic 11.5
job 11.5
portrait 11
lifestyle 10.8
team 10.8
care 10.7
medicine 10.6
mature 10.2
teamwork 10.2
colleagues 9.7
clothing 9.7
group 9.7
together 9.6
standing 9.6
education 9.5
shirt 9.3
horizontal 9.2
occupation 9.2
waiter 9.1
black 9
family 8.9
interior 8.8
doctors 8.8
looking 8.8
two people 8.7
table 8.7
exam 8.6
happiness 8.6
corporate 8.6
employee 8.5
lab coat 8.4
uniform 8.4
color 8.3
laptop 8.3
planner 8.3
holding 8.3
indoor 8.2
businesswoman 8.2
stethoscope 8.1
scholar 8
practitioner 7.9
couple 7.8
child 7.8
discussion 7.8
lab 7.8
executive 7.7
laboratory 7.7
30s 7.7
elderly 7.7
profession 7.7
illness 7.6
case 7.6
businesspeople 7.6
meeting 7.5
mother 7.5
one 7.5
phone 7.4
inside 7.4
classroom 7.3
friendly 7.3
suit 7.2
handsome 7.1
to 7.1
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.7
text 95.7
clothing 92.2
man 91.9

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 50.8%
Sad 44.5%
Calm 27.7%
Surprised 10.5%
Confused 4.1%
Fear 3.8%
Disgusted 3.6%
Angry 3.1%
Happy 2.5%

AWS Rekognition

Age 54-62
Gender Male, 98.8%
Calm 69.9%
Happy 22.4%
Surprised 5.5%
Confused 0.9%
Disgusted 0.5%
Angry 0.4%
Sad 0.3%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 93.9%
Confused 33.2%
Sad 25.6%
Calm 11.9%
Happy 10.7%
Surprised 10.3%
Fear 3.6%
Disgusted 3.2%
Angry 1.6%

AWS Rekognition

Age 50-58
Gender Male, 97.8%
Happy 82%
Sad 13.2%
Confused 1.9%
Disgusted 1.6%
Surprised 0.5%
Angry 0.4%
Calm 0.2%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Male, 91.9%
Happy 57.9%
Calm 32.2%
Sad 3.8%
Confused 3.3%
Surprised 0.9%
Angry 0.8%
Disgusted 0.7%
Fear 0.4%

AWS Rekognition

Age 24-34
Gender Female, 59.3%
Calm 68.5%
Sad 17%
Confused 6.1%
Surprised 5.9%
Happy 0.8%
Disgusted 0.7%
Fear 0.6%
Angry 0.4%

AWS Rekognition

Age 41-49
Gender Male, 80.2%
Calm 93.8%
Sad 2.3%
Happy 1.4%
Disgusted 0.6%
Fear 0.6%
Angry 0.5%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 57.7%
Sad 39.3%
Confused 0.7%
Happy 0.7%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 43-51
Gender Male, 97.4%
Sad 70.7%
Angry 13%
Confused 9.5%
Calm 2.8%
Disgusted 1.9%
Fear 0.8%
Happy 0.7%
Surprised 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.6%
Tie 93.4%

Captions

Microsoft

a group of people standing in a room 85.2%
a group of people in a room 85.1%
a group of people standing next to a man 67.5%

Text analysis

Amazon

23