Human Generated Data

Title

Untitled (men at podium with old water pump)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1465

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.9
Human 98.9
Person 98.6
Person 98
Person 97.5
Military 92.8
Military Uniform 91.3
Person 90.1
Person 86.5
Person 86
Officer 79.9
Person 79.9
Crowd 75.8
Musical Instrument 71.3
Clothing 71.3
Apparel 71.3
Brass Section 66.1
Overcoat 64.4
Coat 64.4
People 63.5
Army 59.4
Armored 59.4
Soldier 58.8
Suit 57.2
Musician 56.7
Horn 56.1

Imagga
created on 2022-01-23

musical instrument 31.7
wind instrument 30.2
brass 29.5
man 27.5
trombone 24.3
mask 22.3
person 21.3
soldier 20.5
military 20.3
protection 20
adult 19.2
male 19.1
people 18.9
war 18.3
weapon 17.8
gun 16.1
danger 15.4
clothing 15.4
men 13.7
musician 13.3
bassoon 12.7
portrait 12.3
uniform 12.3
stringed instrument 12.1
black 12.1
music 12
safety 12
industrial 11.8
camouflage 11.8
radiation 11.7
toxic 11.7
protective 11.7
oboe 11.6
gas 11.5
rifle 11.4
human 11.2
equipment 11.1
guitar 10.9
warrior 10.7
statue 10.7
banjo 10.4
sword 10.3
playing 10
helmet 9.8
radioactive 9.8
disaster 9.8
army 9.7
concert 9.7
armor 9.7
chemical 9.6
outfit 9.5
industry 9.4
smoke 9.3
entertainment 9.2
bass 9.2
vintage 9.1
old 9
fashion 9
photographer 9
performer 9
looking 8.8
shield 8.8
nuclear 8.7
rock 8.7
device 8.7
business 8.5
holding 8.2
instrument 8.2
history 8
urban 7.9
destruction 7.8
drum 7.8
model 7.8
play 7.8
artist 7.7
wall 7.7
power 7.5
player 7.5
clothes 7.5
protective covering 7.4
environment 7.4
occupation 7.3
stage 7.3
singer 7.2
dirty 7.2
dress 7.2
sexy 7.2
suit 7.2
body 7.2
women 7.1

Google
created on 2022-01-23

Coat 92.9
Motor vehicle 86.2
Suit 82.6
Hat 73.9
Crew 73.7
Event 73
Vintage clothing 72.9
Classic 69.4
Formal wear 69.2
Tie 67.9
White-collar worker 66.9
Musician 66.8
Room 66.3
Sun hat 65.9
Stock photography 64.3
Team 64.1
History 63.7
Chair 61.5
Monochrome 61.5
Monochrome photography 57.9

Microsoft
created on 2022-01-23

person 99.4
musical instrument 96.4
brass 87.1
trumpet 81.8
music 80.2
clothing 75.9
saxophone 73.3
group 70.3
text 66.7
drum 62.3
brass instrument 59.6
man 59.1
horn 57.9
wind instrument 54.9
posing 49.2
concert band 25.2

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 98.2%
Sad 23%
Calm 19.7%
Angry 18%
Confused 16.1%
Disgusted 14.6%
Surprised 3.8%
Fear 3%
Happy 1.8%

AWS Rekognition

Age 48-56
Gender Male, 100%
Happy 100%
Angry 0%
Surprised 0%
Fear 0%
Calm 0%
Disgusted 0%
Confused 0%
Sad 0%

AWS Rekognition

Age 64-74
Gender Male, 100%
Happy 96.5%
Disgusted 1.1%
Surprised 0.9%
Confused 0.5%
Calm 0.4%
Angry 0.3%
Sad 0.2%
Fear 0.2%

AWS Rekognition

Age 56-64
Gender Male, 99.9%
Happy 98.3%
Calm 0.3%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.2%
Sad 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Happy 98.2%
Angry 1.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Confused 0%
Sad 0%
Calm 0%

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 98.7%
Confused 0.9%
Angry 0.1%
Sad 0.1%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Male, 98.2%
Calm 94.4%
Sad 3.5%
Surprised 0.8%
Confused 0.3%
Fear 0.3%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 24-34
Gender Male, 99.8%
Happy 99.4%
Calm 0.3%
Surprised 0.1%
Fear 0.1%
Sad 0.1%
Disgusted 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Happy 99.9%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Calm 0%
Fear 0%
Sad 0%

Microsoft Cognitive Services

Age 51
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people posing for a photo 94.6%
a group of people posing for the camera 94.5%
a group of people posing for a picture 94.4%

Text analysis

Amazon

KBI
1