Human Generated Data

Title

Untitled (nine men standing with large metal containers)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6051

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.2
Person 98.9
Military 97.2
Military Uniform 96.7
People 91.1
Armored 85.5
Army 85.5
Funeral 83.7
Person 82.5
Officer 81.9
Soldier 81
Clothing 80.2
Apparel 80.2
Helmet 77.8
Crowd 70.3
Person 66.5
Overcoat 61.9
Coat 61.9
Person 59.3
Suit 57.8
Troop 55.6
Person 47.4
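
The rows above have the shape of an AWS Rekognition DetectLabels response: a label name followed by a 0-100 confidence score, with the repeated "Person" rows corresponding to per-instance detections. A minimal sketch of how such tags could be regenerated with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph at image.jpg (not a path from this record):

```python
import boto3

# Assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

# "image.jpg" is a hypothetical local copy of the photograph.
with open("image.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=45,  # the lowest score listed above is 47.4
    )

for label in response["Labels"]:
    # One row per label, mirroring the "Name score" rows above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated "Person" rows above come from per-instance detections.
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")
```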

Clarifai
created on 2019-05-30

people 99.6
group 98.1
many 98
group together 97.5
man 96.9
adult 94.5
monochrome 93.8
war 92.9
wear 91.7
child 91.6
crowd 91.5
vehicle 91.2
woman 91
cavalry 89.3
military 88.7
street 88.3
soldier 86.1
recreation 85.7
police 85
outfit 84.3
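
The Clarifai scores above look like output from the v2 "general" prediction model, which returns concepts with confidences in the 0-1 range (scaled to percentages here). A minimal sketch against the v2 REST endpoint, assuming a hypothetical API key and image URL; the model ID shown is, to the best of my knowledge, Clarifai's public general model:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # hypothetical credential
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # public "general" model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    # Hypothetical image URL standing in for the photograph.
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 value; scale to match the rows above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```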

Imagga
created on 2019-05-30

carriage 31.3
silhouette 29
people 22.3
man 22.2
sunset 21.6
male 19.1
transportation 14.3
black 14.1
sport 13.9
men 12.9
sky 12.8
horse 12.3
snow 12.2
outdoor 12.2
travel 12
beach 11.8
person 11.7
military uniform 11.2
landscape 11.2
city 10.8
dusk 10.5
sun 10.5
boy 10.4
adult 10.4
clothing 10
activity 9.8
outdoors 9.8
walking 9.5
transport 9.1
mountain 8.9
urban 8.7
dawn 8.7
uniform 8.6
outside 8.6
evening 8.4
field 8.4
support 8.2
device 8.2
group 8.1
building 8
ride 7.8
extreme 7.7
winter 7.7
passenger 7.6
world 7.6
power 7.6
dark 7.5
sunrise 7.5
harness 7.5
hill 7.5
cart 7.4
light 7.4
water 7.3
alone 7.3
active 7.2
night 7.1
summer 7.1
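
Imagga's list has the same name-plus-confidence shape; its v2 tagging endpoint authenticates with an API key/secret pair over HTTP Basic auth. A minimal sketch, assuming hypothetical credentials and image URL:

```python
import requests

# Hypothetical Imagga credentials, passed as HTTP Basic auth.
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    # Hypothetical image URL standing in for the photograph.
    params={"image_url": "https://example.com/image.jpg"},
    auth=auth,
)
response.raise_for_status()

# Tags arrive with a 0-100 confidence and a per-language name; English here.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```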

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 53.2%
Calm 53.4%
Confused 45.3%
Happy 45.1%
Disgusted 45.1%
Surprised 45.7%
Angry 45.2%
Sad 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.1%
Calm 51.5%
Sad 47.5%
Angry 45.4%
Surprised 45.1%
Disgusted 45.1%
Confused 45.3%
Happy 45.1%

AWS Rekognition

Age 27-44
Gender Male, 50.5%
Disgusted 49.7%
Surprised 49.5%
Angry 49.8%
Sad 49.6%
Happy 49.6%
Confused 49.6%
Calm 49.7%

AWS Rekognition

Age 35-52
Gender Male, 50.5%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%
Angry 49.5%
Calm 49.6%
Sad 49.5%
Happy 50.4%

AWS Rekognition

Age 23-38
Gender Male, 50.5%
Disgusted 49.5%
Sad 49.6%
Angry 49.5%
Surprised 49.5%
Calm 50.2%
Happy 49.5%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Sad 49.5%
Surprised 49.5%
Confused 49.5%
Calm 50.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 27-44
Gender Male, 50.4%
Sad 49.5%
Angry 49.5%
Disgusted 49.5%
Calm 50.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Calm 49.6%
Disgusted 49.7%
Surprised 49.6%
Confused 49.6%
Happy 49.7%
Sad 49.6%
Angry 49.7%

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Confused 45.4%
Happy 45.8%
Angry 45.7%
Surprised 45.6%
Calm 49.2%
Disgusted 46.1%
Sad 47.1%
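
Each of the nine blocks above matches a FaceDetails entry from the AWS Rekognition DetectFaces API when all facial attributes are requested: an estimated age range, a gender guess with confidence, and a per-category emotion confidence (as the rows show, the emotion scores are reported independently and do not sum to 100%). A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local copy of the image:

```python
import boto3

# Assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

# "image.jpg" is a hypothetical local copy of the photograph.
with open("image.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotion estimates
    )

# Reproduce the per-face blocks above: age range, gender, then emotions.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Types arrive uppercase (e.g. "CALM"); match the casing above.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```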

Feature analysis

Amazon

Person 99.7%
Helmet 77.8%
