Human Generated Data

Title

Untitled (three men in top hats on podium)

Date

1946, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.158

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.9
Human 98.9
Person 98.5
Military 94.2
Clothing 92.5
Apparel 92.5
Military Uniform 91.2
Officer 88.4
Person 88.2
Person 80.4
People 80.3
Coat 77.4
Overcoat 77.4
Person 75.5
Person 72.7
Person 68.1
Person 67.6
Soldier 67.3
Crowd 66.2
Armored 59.1
Army 59.1
Suit 58.1
Person 43.2
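
The Amazon list above mixes repeated labels and a wide spread of confidence scores. As an illustration only (the label/score pairs are copied from the list above; the threshold and helper name are arbitrary and not part of any vendor's pipeline), such output is often deduplicated and filtered like this:

```python
# Rekognition-style (label, confidence) pairs, taken from the list above.
labels = [
    ("Person", 98.9), ("Human", 98.9), ("Person", 98.5),
    ("Military", 94.2), ("Clothing", 92.5), ("Apparel", 92.5),
    ("Military Uniform", 91.2), ("Officer", 88.4), ("Person", 88.2),
    ("Suit", 58.1), ("Person", 43.2),
]

def top_labels(pairs, threshold=90.0):
    """Keep each label's highest confidence, then return those at or
    above the threshold, sorted best-first."""
    best = {}
    for name, conf in pairs:
        best[name] = max(conf, best.get(name, 0.0))
    return sorted(
        ((n, c) for n, c in best.items() if c >= threshold),
        key=lambda nc: nc[1],
        reverse=True,
    )

print(top_labels(labels))
# Duplicate "Person" entries collapse to one at 98.9; "Officer" (88.4)
# and below drop out at the 90.0 threshold.
```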

Imagga
created on 2021-12-14

silhouette 38.9
stage 32.7
man 28
people 27.9
platform 25.7
male 24.1
sunset 23.4
person 22.5
men 18.9
businessman 18.5
business 15.2
sport 14
black 14
photographer 13.2
group 12.9
disk jockey 12.9
player 12.7
sea 11.7
park 11.5
sun 11.3
suit 10.9
sky 10.8
team 10.7
musical instrument 10.7
travel 10.6
ball 10.5
landscape 10.4
broadcaster 10.3
billboard 10.1
relax 10.1
field 10
adult 9.8
kick 9.8
job 9.7
success 9.7
laptop 9.7
goal 9.6
crowd 9.6
couple 9.6
water 9.3
evening 9.3
event 9.2
outdoor 9.2
leisure 9.1
ocean 9.1
shoot 8.7
match 8.7
lifestyle 8.7
soccer 8.7
world 8.4
lights 8.3
competition 8.2
technology 8.2
tourist 8.2
happy 8.1
communicator 8
love 7.9
holiday 7.9
computer 7.9
audience 7.8
stadium 7.8
championship 7.8
signboard 7.7
youth 7.7
muscular 7.6
beach 7.6
relaxation 7.5
training 7.4
vacation 7.4
alone 7.3
athlete 7.2
shadow 7.2
equipment 7.2
screen 7.1
summer 7.1
work 7.1
boy 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.6
man 97.1
funeral 91.8
clothing 90.8
person 90.2
standing 78.1
old 74.1
black and white 70.6
black 66.2
ship 51.8
posing 49.1

Face analysis

AWS Rekognition

Age 36-54
Gender Male, 99.5%
Calm 99.1%
Happy 0.5%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 29-45
Gender Male, 99.3%
Angry 84.1%
Calm 9.7%
Disgusted 2.4%
Confused 1.4%
Fear 0.9%
Sad 0.6%
Surprised 0.6%
Happy 0.4%

AWS Rekognition

Age 26-40
Gender Male, 93.8%
Calm 96.9%
Happy 1.1%
Angry 0.6%
Sad 0.4%
Confused 0.4%
Surprised 0.3%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 27-43
Gender Female, 73.4%
Calm 45.6%
Confused 23.2%
Sad 15.1%
Angry 11.9%
Surprised 1.9%
Happy 0.9%
Fear 0.8%
Disgusted 0.5%

AWS Rekognition

Age 28-44
Gender Female, 64.4%
Calm 84.4%
Angry 5.7%
Sad 3%
Happy 2.5%
Disgusted 2%
Surprised 0.9%
Fear 0.8%
Confused 0.6%

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
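
Each AWS Rekognition face record above reports scores for all eight emotions, but usually only the dominant one matters. A minimal sketch (the scores are copied from the first two face records above; the helper name is illustrative, not an API call):

```python
# Emotion scores (percent) for two of the faces listed above.
face_1 = {"Calm": 99.1, "Happy": 0.5, "Sad": 0.1, "Angry": 0.1,
          "Confused": 0.1, "Disgusted": 0.0, "Surprised": 0.0, "Fear": 0.0}
face_2 = {"Angry": 84.1, "Calm": 9.7, "Disgusted": 2.4, "Confused": 1.4,
          "Fear": 0.9, "Sad": 0.6, "Surprised": 0.6, "Happy": 0.4}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_1))
print(dominant_emotion(face_2))
```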

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a couple of people that are standing in an old photo of a man 76.4%
a group of people standing in front of a military man 76.3%
a group of people posing for a photo 76.2%