Human Generated Data

Title

Races, Jews: United States. New Jersey. Woodbine. Baron de Hirsch Agricultural and Industrial School: Woodbine Settlement and School, Woodbine, N.J. Baron de Hirsch Fund.: 208. Class of 1903.

Date

1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.3563.3

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2019-06-05

Person 99.2
Human 99.2
Person 98.5
Person 98.1
Military 97.8
Military Uniform 97.5
Person 96.8
Person 96.7
Person 95
Person 93.6
People 93
Person 92.2
Army 90.1
Armored 90.1
Person 89.4
Officer 87.1
Troop 85.1
Person 79.1
Person 74.8
Soldier 73.4
Person 71.3
Person 69.6
Person 64.1
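
The labels above follow the name-and-confidence format of AWS Rekognition's DetectLabels operation. Below is a minimal sketch, not the project's actual pipeline, of how such a list is typically produced with boto3; the region and image file name are placeholders.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:   # placeholder image file
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=60,                # only return labels scored >= 60
    )

    # Each label carries a name and a confidence score (0-100), matching
    # entries such as "Person 99.2" and "Military 97.8" above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))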

Clarifai
created on 2019-06-05

people 99.9
many 99.5
group 99.2
group together 98.9
child 96.5
adult 96
man 94.7
wear 93.7
military 92.7
administration 92.5
uniform 91.3
outfit 91
woman 89.5
leader 89.2
war 88.9
portrait 87.3
several 85.9
retro 85.3
soldier 84.7
boy 83.8
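
Clarifai's concept tags like those above can be requested from its "general" model over the v2 REST API. The sketch below is hedged: the endpoint and model ID reflect the API as of roughly 2019 and may have changed, and the API key and image URL are placeholders.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"                         # placeholder
    GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"     # assumed ID of the "general" model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a name and a 0-1 value; scaling by 100 gives
    # the "people 99.9", "many 99.5" style scores shown above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))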

Imagga
created on 2019-06-05

military uniform 58.5
uniform 50.7
statue 41.8
clothing 35.9
sculpture 33.7
consumer goods 23.7
monument 22.4
art 21.5
covering 20.5
history 19.7
religion 18.8
old 18.8
ancient 18.2
architecture 17.2
tourism 16.5
kin 16.3
travel 16.2
helmet 15.4
stone 14.3
city 14.1
historical 14.1
religious 14
historic 13.7
soldier 12.7
man 12.4
person 11.9
antique 11.6
commodity 11.6
famous 11.2
people 10.6
building 10.3
culture 10.3
column 9.8
warrior 9.8
army 9.7
holy 9.6
god 9.6
church 9.2
armor plate 9.2
carving 9.1
tourist 9.1
landmark 9
catholic 8.9
detail 8.8
marble 8.7
military 8.7
war 8.7
world 8.6
male 8.5
ruler 8.3
vintage 8.3
teacher 8.2
memorial 8.1
decoration 8
face 7.8
figure 7.8
weapon 7.4
adult 7.2
sky 7
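
Imagga's tags above are the kind returned by its v2 tagging endpoint. A hedged sketch using the requests library follows; the credentials and image URL are placeholders, and the response layout follows Imagga's documented v2 format.

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"          # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"    # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each tag has an English label and a confidence score, matching
    # entries such as "military uniform 58.5" above.
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))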

Google
created on 2019-06-05

People 95.8
Team 64.1
Crew 64.1
Family 58.9
History 54.1
Troop 54
Vintage clothing 50.7
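
The Google entries above correspond to Cloud Vision label detection. A minimal sketch, assuming the google-cloud-vision client library, is shown below; the image path is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:       # placeholder image file
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are returned in the 0-1 range; scaling by 100 matches values
    # such as "People 95.8" and "Team 64.1" above.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))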

Microsoft
created on 2019-06-05

posing 99.4
clothing 99
group 94.6
person 93.7
smile 93.6
man 91.8
old 91.3
human face 67.5
school 52.5
woman 50.9
team 46.7
clothes 20.6
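
The Microsoft tags above match the output of the Azure Computer Vision "analyze" REST endpoint with the Tags feature. The sketch below is hedged: the endpoint host, API version, and key are placeholders, and the response layout follows Microsoft's documented schema, where confidences are 0-1 values.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"   # placeholder
    SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"                       # placeholder

    with open("photo.jpg", "rb") as f:       # placeholder image file
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Tags such as "posing 99.4" and "clothing 99" above correspond to
    # these name/confidence pairs scaled to percentages.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))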

Face analysis

AWS Rekognition

Age 20-38
Gender Male, 54.2%
Surprised 45.2%
Sad 46.3%
Happy 45.2%
Disgusted 45.4%
Confused 45.5%
Calm 51.8%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Calm 51.7%
Confused 45.4%
Happy 45.2%
Disgusted 45.3%
Angry 45.4%
Surprised 45.2%
Sad 46.8%

AWS Rekognition

Age 23-38
Gender Male, 54.5%
Sad 45.8%
Disgusted 45.1%
Surprised 45%
Happy 45%
Confused 45.2%
Angry 45.1%
Calm 53.8%

AWS Rekognition

Age 15-25
Gender Male, 54.3%
Surprised 45%
Confused 45.1%
Sad 46.1%
Calm 53.3%
Happy 45.2%
Disgusted 45.1%
Angry 45.1%

AWS Rekognition

Age 26-43
Gender Male, 51.8%
Sad 54%
Disgusted 45.1%
Calm 45.3%
Confused 45.2%
Angry 45.3%
Surprised 45.1%
Happy 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Angry 45.4%
Calm 52.5%
Confused 45.2%
Sad 46.5%
Disgusted 45.2%
Happy 45.1%
Surprised 45%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Happy 45.1%
Disgusted 45.1%
Sad 45.5%
Confused 45.1%
Angry 45.2%
Surprised 45.1%
Calm 53.9%

AWS Rekognition

Age 35-52
Gender Male, 52.8%
Confused 45.4%
Disgusted 45.4%
Angry 45.5%
Sad 50.4%
Surprised 45.2%
Happy 45.2%
Calm 47.9%

AWS Rekognition

Age 26-43
Gender Female, 51.9%
Confused 45.1%
Surprised 45.2%
Happy 45.1%
Sad 45.2%
Calm 53.3%
Disgusted 45.9%
Angry 45.2%

AWS Rekognition

Age 17-27
Gender Female, 50.6%
Angry 45.5%
Surprised 45.2%
Confused 45.1%
Sad 45.4%
Disgusted 45.2%
Happy 45.3%
Calm 53.3%

AWS Rekognition

Age 26-44
Gender Male, 53.8%
Happy 54.3%
Angry 45.1%
Surprised 45.1%
Calm 45.3%
Sad 45.1%
Confused 45%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 52.5%
Happy 45.2%
Sad 50.5%
Surprised 45.3%
Confused 45.3%
Calm 46.2%
Angry 46.3%
Disgusted 46.2%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Angry 45.4%
Confused 45.3%
Calm 52%
Sad 46%
Happy 45.6%
Disgusted 45.3%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 53.5%
Confused 45.3%
Calm 49.9%
Happy 45.1%
Surprised 45.2%
Sad 47.1%
Angry 46.1%
Disgusted 46.2%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Angry 45.4%
Happy 45.5%
Surprised 45.3%
Calm 46.2%
Confused 45.3%
Sad 51.7%
Disgusted 45.6%

AWS Rekognition

Age 19-36
Gender Male, 51.3%
Surprised 45.7%
Confused 45.6%
Disgusted 45.7%
Calm 48.7%
Angry 46.3%
Sad 46.9%
Happy 46.1%

AWS Rekognition

Age 20-38
Gender Male, 51.9%
Calm 45.6%
Confused 45.2%
Surprised 45.2%
Sad 52.3%
Happy 45.2%
Angry 45.9%
Disgusted 45.5%

AWS Rekognition

Age 29-45
Gender Female, 54.8%
Confused 45.5%
Disgusted 45.3%
Surprised 45.2%
Sad 51.3%
Angry 45.3%
Calm 46.4%
Happy 46.1%
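
The per-face estimates above (age range, gender, and emotion scores) follow the format of AWS Rekognition's DetectFaces operation. A minimal sketch with boto3 follows, not the project's actual pipeline; the image file is a placeholder.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:       # placeholder image file
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],                  # request age, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]               # e.g. {"Low": 20, "High": 38}
        gender = face["Gender"]              # e.g. {"Value": "Male", "Confidence": 54.2}
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:     # each emotion carries a 0-100 confidence
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")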

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 26
Gender Male
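
The single age and gender values above are the kind returned by the Azure Face API detect call. The sketch below is hedged: the endpoint and key are placeholders, and the availability of these attributes in the Face API has changed since 2019.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"   # placeholder
    SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"                       # placeholder

    with open("photo.jpg", "rb") as f:       # placeholder image file
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Each detected face carries a point estimate of age and a gender label,
    # matching the "Age 28 / Gender Female" style entries above.
    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].title()}")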

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
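
The Google Vision face entries above report likelihoods (surprise, anger, sorrow, joy, headwear, blur) rather than numeric scores. A minimal sketch, assuming the google-cloud-vision client library, is shown below; the image path is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:       # placeholder image file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), which is
    # why the entries above read "Very unlikely" rather than percentages.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)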

Feature analysis

Amazon

Person 99.2%
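
The "Person 99.2%" feature entry corresponds to instance-level detections: AWS Rekognition's DetectLabels returns a bounding box per detected person in addition to the overall label score. A minimal sketch follows; the image file is a placeholder.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:       # placeholder image file
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=60)

    for label in response["Labels"]:
        if label["Name"] == "Person":
            # Each instance is one detected person with its own confidence and
            # a bounding box expressed as ratios of the image dimensions.
            for instance in label["Instances"]:
                box = instance["BoundingBox"]
                print(f"Person {instance['Confidence']:.1f}% at "
                      f"left={box['Left']:.2f}, top={box['Top']:.2f}, "
                      f"w={box['Width']:.2f}, h={box['Height']:.2f}")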

Categories

Imagga

interior objects 94.2%
paintings art 5.2%
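
Category scores like "interior objects" and "paintings art" above are the kind returned by Imagga's v2 categorization endpoint. The sketch below is hedged: the categorizer ID, credentials, and image URL are placeholders, and the response layout follows Imagga's documented format.

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"          # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"    # placeholder
    CATEGORIZER = "personal_photos"          # assumed categorizer ID

    response = requests.get(
        f"https://api.imagga.com/v2/categories/{CATEGORIZER}",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each category has an English name and a confidence score (0-100).
    for category in response.json()["result"]["categories"]:
        print(category["name"]["en"], round(category["confidence"], 1))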