Human Generated Data

Title

Untitled (Greek orphans lined up in front of ship waiting to board)

Date

1948

People

Artist: David Seymour, Polish, 1911–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Stephen Nicholas, P2004.29

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.7
Person 99.7
Person 99.7
Person 99.6
Person 99.6
Person 99.3
Clothing 97.8
Apparel 97.8
Military 94.1
People 90.8
Person 89.8
Footwear 87.9
Shoe 87.9
Shoe 87.5
Military Uniform 87.3
Person 87.2
Person 86.9
Person 85.4
Person 84.4
Transportation 82.2
Vehicle 81.8
Person 81.3
Overcoat 80.7
Coat 80.7
Person 77.2
Shoe 76.7
Person 75.9
Person 74
Person 73.6
Person 73
Armored 73
Army 73
Shoe 62.3
Soldier 61.8
Shoe 61.5
Vessel 59.6
Watercraft 59.6
Cruiser 58.7
Ship 58.7
Navy 58.7
Troop 58.7
Shoe 56.9
Shoe 54.7
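The Amazon tags above pair each detected label with a model-confidence score out of 100. A minimal sketch of how such a list can be thresholded (the label subset and the cutoff of 90 are illustrative, not part of the record):

```python
# Hypothetical sketch: a few of the (label, confidence) pairs from the
# Amazon tag list above, thresholded to keep high-confidence detections.
labels = [
    ("Human", 99.7), ("Person", 99.7), ("Clothing", 97.8),
    ("Military", 94.1), ("People", 90.8), ("Footwear", 87.9),
    ("Ship", 58.7),
]

def high_confidence(tags, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(labels))  # → ['Human', 'Person', 'Clothing', 'Military', 'People']
```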

Imagga
created on 2022-01-23

hut 39.7
shelter 32.4
structure 27.6
world 25.2
man 20.8
building 17.9
city 17.5
person 17.2
people 16.2
nuclear 14.5
outdoor 14.5
dark 14.2
tourist 13.9
dirty 13.5
travel 13.4
street 12.9
destruction 12.7
urban 12.2
mask 11.9
protection 11.8
industrial 11.8
silhouette 11.6
walking 11.4
architecture 11
danger 10.9
chemical 10.6
traveler 10.4
pedestrian 10.3
male 9.9
sunset 9.9
radioactive 9.8
radiation 9.8
toxic 9.8
protective 9.7
military 9.7
explosion 9.6
gas 9.6
seller 9.6
tourism 9.1
old 9
vacation 9
outdoors 9
sky 8.9
stalker 8.9
accident 8.8
symbol 8.7
prison 8.4
power 8.4
summer 8.4
snow 8.1
soldier 7.8
disaster 7.8
factory 7.7
winter 7.7
sign 7.5
park 7.5
uniform 7.2
house 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.8
outdoor 99.7
sky 98.3
clothing 93.9
standing 92.8
group 91.9
man 91.1
people 78.8
text 60.1
old 59.9
black and white 59.1
posing 37.6

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 89.3%
Calm 90.6%
Happy 3.6%
Fear 1.6%
Sad 1.5%
Confused 1.3%
Angry 0.6%
Surprised 0.5%
Disgusted 0.4%

AWS Rekognition

Age 27-37
Gender Female, 54.1%
Calm 63.4%
Happy 27%
Sad 3.8%
Angry 1.7%
Disgusted 1.3%
Confused 1%
Surprised 0.9%
Fear 0.8%

AWS Rekognition

Age 19-27
Gender Male, 97.4%
Calm 56.9%
Sad 18.7%
Happy 9.5%
Fear 4.5%
Angry 3.6%
Confused 3%
Disgusted 2.4%
Surprised 1.6%

AWS Rekognition

Age 19-27
Gender Male, 93.4%
Calm 80.5%
Sad 8.7%
Confused 6%
Happy 1.3%
Disgusted 1%
Fear 0.9%
Angry 0.8%
Surprised 0.7%

AWS Rekognition

Age 9-17
Gender Male, 97.3%
Calm 60.4%
Sad 29.2%
Angry 4.8%
Confused 4.2%
Happy 0.5%
Disgusted 0.4%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 16-24
Gender Male, 70.9%
Calm 86.3%
Sad 6.5%
Happy 2.8%
Fear 1.7%
Confused 1.2%
Angry 0.8%
Disgusted 0.4%
Surprised 0.2%

AWS Rekognition

Age 22-30
Gender Male, 98.4%
Calm 91.2%
Sad 6.8%
Happy 0.7%
Confused 0.5%
Fear 0.3%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 18-26
Gender Male, 99.6%
Calm 80.6%
Sad 11.9%
Confused 3.4%
Fear 1.2%
Angry 1%
Happy 0.9%
Disgusted 0.5%
Surprised 0.3%

AWS Rekognition

Age 6-16
Gender Female, 55.7%
Calm 91.8%
Sad 2.8%
Happy 2.3%
Fear 0.9%
Angry 0.8%
Confused 0.5%
Disgusted 0.4%
Surprised 0.4%

AWS Rekognition

Age 13-21
Gender Male, 69%
Calm 90.8%
Angry 4.5%
Sad 1.9%
Fear 1%
Happy 0.7%
Disgusted 0.5%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 18-26
Gender Male, 88.9%
Calm 83%
Sad 9.9%
Angry 2.5%
Confused 2%
Fear 1%
Disgusted 0.6%
Happy 0.6%
Surprised 0.4%

AWS Rekognition

Age 13-21
Gender Male, 78.5%
Calm 98.6%
Happy 0.9%
Sad 0.2%
Confused 0.2%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 11-19
Gender Male, 91.8%
Calm 71.9%
Sad 23.3%
Angry 2.2%
Happy 1.2%
Surprised 0.5%
Fear 0.4%
Confused 0.3%
Disgusted 0.3%
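Each AWS Rekognition face entry above ranks eight emotions by confidence. A minimal sketch of extracting the dominant emotion from one such entry, assuming it has been loaded into a dict (the values below copy the first entry in the list):

```python
# Hypothetical sketch: one Rekognition-style emotion breakdown,
# taken from the first face entry above.
face = {
    "Calm": 90.6, "Happy": 3.6, "Fear": 1.6, "Sad": 1.5,
    "Confused": 1.3, "Angry": 0.6, "Surprised": 0.5, "Disgusted": 0.4,
}

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # → ('Calm', 90.6)
```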

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 87.9%

Captions

Microsoft

a group of people standing in front of a building 97.8%
a group of people standing next to a building 97.4%
a group of people standing outside of a building 97.3%

Text analysis

Amazon

MO

Google

MO
MO