Human Generated Data

Title

Untitled (male graduate in cap and gown receiving diploma)

Date

1948

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2968

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Human 99.3
Person 99.3
Person 98.8
Person 98.6
Stage 98
Musical Instrument 96.2
Musician 96.2
Person 93.1
Person 91.6
Crowd 82.2
Person 81.7
Person 80.8
Person 78
Person 70.4
Person 69.8
Music Band 68.2
Person 66.8
Apparel 66.2
Clothing 66.2
Person 63.5
Person 62.9
Person 60.5
Percussion 59.5
Drum 59.5
Text 57.1
Overcoat 56.1
Coat 56.1
Suit 56.1

Imagga
created on 2022-01-21

percussion instrument 83.8
musical instrument 70
marimba 59.4
stage 29.9
people 26.8
steel drum 26.4
man 25.5
male 24.8
business 23.7
platform 23.4
businessman 22.1
person 21.5
men 20.6
group 20.1
office 17.7
silhouette 17.4
job 15
adult 14.8
meeting 14.1
vibraphone 13.8
worker 12.5
room 12.3
table 12.1
executive 12.1
women 11.9
teacher 11.8
work 11.8
team 11.6
professional 11.3
communication 10.9
black 10.8
classroom 10.6
interior 10.6
chair 10.4
manager 10.2
device 9.7
corporate 9.4
sitting 9.4
indoors 8.8
happy 8.8
couple 8.7
smiling 8.7
lifestyle 8.7
education 8.7
crowd 8.6
businesswoman 8.2
board 8.1
suit 8.1
working 7.9
life 7.9
class 7.7
modern 7.7
desk 7.6
house 7.5
holding 7.4
teamwork 7.4
wind instrument 7.3
music 7.3
indoor 7.3
smile 7.1
to 7.1
day 7.1
paper 7.1
sky 7
together 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

person 95.2
man 88.5
clothing 84.3
text 62.3

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 87.1%
Calm 98.7%
Sad 0.9%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Calm 94.7%
Sad 3.2%
Happy 0.6%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 99.5%
Calm 87%
Disgusted 7.2%
Confused 2.2%
Sad 1.2%
Surprised 0.8%
Angry 0.7%
Fear 0.7%
Happy 0.2%

AWS Rekognition

Age 28-38
Gender Male, 71.1%
Calm 45.9%
Happy 21.2%
Sad 19.3%
Fear 6.8%
Surprised 2.4%
Disgusted 1.9%
Angry 1.7%
Confused 0.8%

AWS Rekognition

Age 21-29
Gender Female, 94.9%
Calm 80.6%
Confused 6.5%
Surprised 4.3%
Sad 2.6%
Angry 1.6%
Fear 1.5%
Happy 1.5%
Disgusted 1.3%

AWS Rekognition

Age 24-34
Gender Male, 97.9%
Sad 50.8%
Calm 22.4%
Confused 19.1%
Angry 5.5%
Fear 0.8%
Surprised 0.7%
Disgusted 0.6%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in front of a store 56.4%
a group of people standing in front of a building 56.3%
a person standing in front of a store 49.2%