Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1863

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Hat 100
Person 99.1
Person 98.5
Person 98
Person 97.9
Person 97.5
Person 97.4
Adult 97.4
Male 97.4
Man 97.4
Person 96.3
Person 95.9
Adult 95.9
Male 95.9
Man 95.9
Person 95.2
Terminal 94.9
Person 94.8
Person 94.5
Person 92.9
Adult 92.9
Male 92.9
Man 92.9
Railway 91.7
Train 91.7
Train Station 91.7
Transportation 91.7
Vehicle 91.7
Man 90.8
Person 90.8
Adult 90.8
Male 90.8
Cap 89.8
Person 89.3
Person 88.8
Face 86.9
Head 86.9
Bride 85.8
Female 85.8
Wedding 85.8
Woman 85.8
Person 85.8
Adult 85.8
Person 85.6
Person 84.5
Person 82.9
Baby 81.4
Person 81.4
Hardhat 79.3
Helmet 79.3
Person 77.7
Person 75
Jeans 67.9
Pants 67.9
People 65.4
Coat 61.5
Worker 56.4
Sun Hat 55.8
Baseball Cap 55.5
Footwear 55.2
Shoe 55.2
Jeans 55

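The label/confidence pairs above are the kind of output Amazon Rekognition's label detection returns. A minimal sketch of how similar tags could be generated with boto3 follows; the local file name "photo.jpg", the region, and the confidence cutoff are assumptions for illustration, not details taken from this record.

```python
# Minimal sketch: image labels via Amazon Rekognition (boto3).
# Assumes AWS credentials are configured and "photo.jpg" is a local copy of the image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the cutoff seen in the list above
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

Rekognition scores each label independently, which is why near-synonyms such as Man, Male, and Adult all appear above with their own confidences.
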
Clarifai
created on 2018-05-11

people 99.9
many 98.7
group 98.6
group together 98.1
military 98
soldier 96.8
war 96.6
adult 96.3
administration 95.4
man 94.5
uniform 92.1
police 89.6
crowd 89
woman 88.5
wear 86.7
child 86.2
outfit 82.5
leader 79.9
several 79.5
boy 78.4

Imagga
created on 2023-10-05

uniform 48.9
military uniform 43.9
clothing 34.9
man 28.2
people 24.5
male 23.4
soldier 21.5
covering 19.8
military 19.3
consumer goods 18.7
spectator 18.2
person 17.8
adult 15
weapon 14.6
gun 14.6
war 14.4
city 14.1
protection 12.7
army 12.7
battle 11.7
group 11.3
outdoors 11.2
danger 10.9
camouflage 10.8
statue 10.7
travel 10.6
old 10.4
men 10.3
hat 10.2
tourism 9.9
mask 9.8
history 9.8
horse 9.5
passenger 9.1
commodity 8.7
senior 8.4
sport 8.4
guy 8.4
leisure 8.3
street 8.3
engineer 8.1
world 8.1
game 8
family 8
to 8
warrior 7.8
portrait 7.8
photographer 7.7
equipment 7.7
outdoor 7.6
two 7.6
monument 7.5
vacation 7.4
national 7.2
recreation 7.2
activity 7.2
player 7
architecture 7

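Imagga exposes its tagger as a REST endpoint, so tag/score pairs like those above can be requested over plain HTTP. A hedged sketch against the v2 /tags endpoint follows; the API key, API secret, and image URL are placeholders.

```python
# Minimal sketch: requesting tags from the Imagga v2 API.
# API key/secret and image URL below are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key and secret
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```
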
Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
outdoor 97.2
military uniform 84
group 82.9
people 74.2
posing 64.9
crowd 0.9

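The Microsoft tags resemble output from Azure's Computer Vision image-analysis service. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the 2018 tags on this page may have come from an earlier API version.

```python
# Minimal sketch: image tags via Azure Computer Vision.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

analysis = client.analyze_image(
    "https://example.org/photo.jpg",
    visual_features=[VisualFeatureTypes.tags],
)

for tag in analysis.tags:
    # Azure reports confidence in the 0-1 range; scale to percent for comparison.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```
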
Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.7%
Sad 58.1%
Confused 40.6%
Angry 14.1%
Calm 8.3%
Surprised 6.5%
Fear 6.1%
Disgusted 6%
Happy 1%

AWS Rekognition

Age 16-24
Gender Male, 93.5%
Sad 98.4%
Calm 41.7%
Surprised 6.3%
Fear 6%
Confused 0.9%
Angry 0.7%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 96%
Surprised 6.4%
Fear 5.9%
Sad 2.5%
Confused 1.2%
Angry 0.7%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 27-37
Gender Female, 62.1%
Sad 100%
Calm 8.3%
Surprised 6.6%
Fear 6%
Confused 0.9%
Disgusted 0.7%
Angry 0.5%
Happy 0.4%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 87.8%
Sad 8.2%
Surprised 6.6%
Fear 6.2%
Confused 0.5%
Happy 0.2%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 25-35
Gender Male, 93.1%
Disgusted 48.2%
Angry 16.8%
Calm 13.5%
Sad 12.5%
Surprised 7.8%
Fear 6.4%
Confused 3.4%
Happy 1%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Sad 92.4%
Calm 38.1%
Angry 9.9%
Surprised 6.9%
Fear 6.9%
Confused 1.6%
Happy 1.2%
Disgusted 1.2%

AWS Rekognition

Age 31-41
Gender Male, 81.2%
Calm 95%
Surprised 6.9%
Fear 5.9%
Sad 2.5%
Happy 1%
Angry 0.6%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 47-53
Gender Male, 53.1%
Calm 74.5%
Sad 16.2%
Surprised 6.6%
Fear 6%
Angry 5.3%
Confused 2.6%
Disgusted 0.8%
Happy 0.6%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Calm 63.5%
Happy 24.1%
Surprised 6.8%
Fear 6%
Sad 5.7%
Angry 2%
Disgusted 1.2%
Confused 0.7%

AWS Rekognition

Age 28-38
Gender Male, 61.6%
Fear 65.3%
Sad 30.6%
Happy 16.2%
Surprised 7.1%
Disgusted 2.9%
Angry 2.7%
Calm 2.6%
Confused 2.4%

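Each block above has the shape of an Amazon Rekognition face record: an estimated age range, a gender guess with confidence, and independent emotion scores. A minimal sketch of retrieving those attributes with boto3 follows; the file name and region are assumptions.

```python
# Minimal sketch: face attributes (age range, gender, emotions) via Amazon Rekognition.
# Assumes AWS credentials are configured and "photo.jpg" is a local copy of the image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```

Because each emotion is scored independently, the percentages within a single face record do not sum to 100.
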
Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

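Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A hedged sketch with the google-cloud-vision client follows; the file name and credential setup are assumptions.

```python
# Minimal sketch: face likelihoods (joy, sorrow, anger, surprise, headwear, blur)
# via Google Cloud Vision. Assumes application credentials are configured and
# "photo.jpg" is a local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood = vision.Likelihood  # VERY_UNLIKELY ... VERY_LIKELY
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```
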
Feature analysis

Amazon

Person 99.1%
Adult 97.4%
Male 97.4%
Man 97.4%
Bride 85.8%
Female 85.8%
Woman 85.8%
Baby 81.4%
Jeans 67.9%
Coat 61.5%
Shoe 55.2%

Categories

Imagga

paintings art 87.3%
people portraits 10.8%