Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2994

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
People 99.8
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
City 97.9
Person 97.4
Adult 97.4
Male 97.4
Man 97.4
Person 97
Male 97
Boy 97
Child 97
Person 96.8
Adult 96.8
Male 96.8
Man 96.8
Person 96.4
Adult 96.4
Female 96.4
Woman 96.4
Person 96.3
Person 96.2
Person 95.9
Adult 95.9
Male 95.9
Man 95.9
Person 94.3
Adult 94.3
Male 94.3
Man 94.3
Person 94.3
Person 93
Person 91.5
Road 91
Street 91
Urban 91
Person 88.9
Person 88.7
Person 88.6
Person 88.2
Face 88
Head 88
Person 87.5
Adult 87.5
Male 87.5
Man 87.5
Person 86.7
Person 86.3
Person 83.9
Person 82.9
Person 81.4
Car 80.8
Transportation 80.8
Vehicle 80.8
Person 72.9
Person 71.9
Crowd 70
Person 69.4
Person 69.2
Hat 65.3
Person 64.2
Car 63.6
Person 63
Person 61.6
Hat 59
Overcoat 57.5
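
The repeated "Person" entries with distinct scores likely correspond to individual detected instances of the same label. A minimal sketch of producing tags like these with the Amazon Rekognition DetectLabels API via boto3 (the file name is hypothetical; AWS credentials are assumed to be configured):

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)
    for label in response["Labels"]:
        # Confidence is a percentage; label["Instances"] carries bounding boxes.
        print(f"{label['Name']} {label['Confidence']:.1f}")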

Clarifai
created on 2018-05-10

people 99.9
many 99.5
group 98.5
group together 97.6
military 96.4
adult 96.2
war 95.7
administration 95
crowd 93.7
soldier 92.1
man 92.1
woman 88.4
uniform 85.3
child 84.9
leader 84.6
vehicle 84.2
police 83.9
wear 80.9
skirmish 79
chair 78.1
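
Tags in this shape could come from Clarifai's general recognition model; a sketch against the v2 REST API (the API key, model name, and image URL are placeholders, and the 2018 tags may predate this model version):

    import requests

    # Placeholders: a real API key and a reachable image URL are required.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are 0-1 floats; scale to match the scores above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")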

Imagga
created on 2023-10-06

photographer 41.1
city 22.4
uniform 20.7
man 18.1
building 17.8
people 17.8
architecture 16.4
street 15.6
military uniform 14.6
male 13.6
military 13.5
clothing 12.6
industrial 11.8
war 11.6
person 11.6
old 11.1
house 10.9
factory 10.8
army 10.7
outdoor 10.7
adult 10.4
weapon 10.3
protection 10
danger 10
vehicle 9.8
soldier 9.8
nuclear 9.7
gun 9.5
history 8.9
destruction 8.8
urban 8.7
machine 8.6
industry 8.5
power 8.4
smoke 8.4
sky 8.3
outdoors 8.2
dirty 8.1
camouflage 8
home 8
day 7.8
conflict 7.8
travel 7.7
roof 7.6
equipment 7.6
dark 7.5
leisure 7.5
spectator 7.5
vintage 7.4
vacation 7.4
group 7.2
music 7.2
activity 7.2
to 7.1
work 7.1
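
Imagga's v2 tagging endpoint returns confidence-scored tags of this form; a sketch (the API key, secret, and image URL are placeholders):

    import requests

    # Imagga uses HTTP basic auth with an API key/secret pair.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")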

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.7
outdoor 99.1
group 94.4
people 90.2
white 83.4
black 66.4
old 57.3
crowd 21.7
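
Tags like these match Azure Computer Vision's Analyze operation; a sketch against the current v3.2 endpoint (the 2018 tags would have come from an earlier API version; endpoint, key, and image URL are placeholders):

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.org/image.jpg"},
    )
    for tag in resp.json()["tags"]:
        # Confidences are 0-1 floats; scale to match the scores above.
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")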

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 79.2%
Angry 13.6%
Surprised 6.4%
Fear 6%
Confused 3.8%
Sad 2.3%
Disgusted 1.4%
Happy 0.9%

AWS Rekognition

Age 51-59
Gender Female, 100%
Sad 91.8%
Calm 49.2%
Fear 6.7%
Surprised 6.6%
Happy 2.4%
Disgusted 0.9%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 6-14
Gender Female, 88.4%
Happy 51.8%
Calm 45.7%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 1.1%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 19-27
Gender Female, 100%
Calm 83.1%
Surprised 6.7%
Fear 6.2%
Sad 5.9%
Disgusted 3.3%
Confused 2.2%
Happy 1.8%
Angry 0.5%

AWS Rekognition

Age 19-27
Gender Female, 99.9%
Sad 46.9%
Fear 36.9%
Calm 26.9%
Surprised 6.8%
Angry 6.6%
Disgusted 3.8%
Confused 1.6%
Happy 0.8%

AWS Rekognition

Age 29-39
Gender Female, 87.1%
Calm 48%
Sad 25.1%
Confused 14.1%
Happy 9.6%
Surprised 6.7%
Fear 6.2%
Disgusted 4.4%
Angry 3.1%

AWS Rekognition

Age 23-31
Gender Male, 91.3%
Fear 75.8%
Calm 28.9%
Surprised 7.2%
Sad 3.4%
Happy 3.1%
Disgusted 2.2%
Confused 1.5%
Angry 0.9%

AWS Rekognition

Age 28-38
Gender Male, 90.9%
Sad 99.8%
Calm 14.6%
Fear 6.8%
Happy 6.8%
Surprised 6.7%
Angry 3%
Confused 2.6%
Disgusted 1.1%
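
The age ranges, gender estimates, and emotion scores above have the shape of Amazon Rekognition DetectFaces output with all facial attributes requested; a minimal boto3 sketch (file name hypothetical):

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Each emotion appears to be scored independently, which would explain why a single face's emotion confidences can sum to well over 100%.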

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male
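
Age and gender estimates like these match the (since-retired) attributes of the Azure Face API detect operation; a sketch against the historical v1.0 endpoint (endpoint, key, and image URL are placeholders):

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.org/image.jpg"},
    )
    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")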

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
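
The likelihood buckets above (Very unlikely through Very likely) are how Google Cloud Vision reports face attributes; a sketch using the google-cloud-vision client (file name hypothetical; Google Cloud credentials assumed to be configured):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())
    for face in client.face_detection(image=image).face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)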

Feature analysis

Amazon

Person 98.7%
Adult 98.7%
Male 98.7%
Man 98.7%
Boy 97%
Child 97%
Female 96.4%
Woman 96.4%
Car 80.8%
Hat 65.3%
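
This list appears to be the subset of the Amazon labels above that carry localized instances (bounding boxes); a self-contained sketch of that filtering with boto3 (file name hypothetical):

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_labels(Image={"Bytes": f.read()})
    for label in response["Labels"]:
        if label["Instances"]:  # keep only labels with bounding-box detections
            print(f"{label['Name']} {label['Confidence']:.1f}%")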

Text analysis

Amazon

MBGCCO&CO
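
A string like this has the shape of Amazon Rekognition DetectText output, read from signage in the photograph; a minimal boto3 sketch (file name hypothetical):

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"])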