Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.41

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Person 99.5
Human 99.5
Person 99.1
Person 99
Person 98.9
Person 98.2
Coat 98.1
Clothing 98.1
Apparel 98.1
Military 98.1
Person 97.9
Person 97
Person 96.6
Military Uniform 95.8
Officer 94.4
Person 94
Person 91.1
Person 90.7
Overcoat 85.8
Person 79.3
Person 74.6
Funeral 65.5
Shoe 64.6
Footwear 64.6
Person 61.5
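
The Amazon entries above pair a detected label with a confidence score in percent. A minimal sketch of how such tags could be produced with the AWS Rekognition DetectLabels API through boto3; the region and file name are assumptions, not part of this record:

import boto3

# Assumed region and local file; Rekognition also accepts S3 objects.
client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_lower_east_side.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,  # the list above bottoms out around 61%
    )

# Each label carries a name and a confidence percentage,
# matching the "Person 99.5" style of the entries above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))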

Clarifai
created on 2023-10-29

people 100
many 98.9
group together 98.7
group 98.7
man 97.7
adult 97.6
street 97.5
woman 95
administration 94.9
military 94
leader 94
war 92
child 88.3
chair 86
crowd 85.2
uniform 83.9
soldier 83.8
portrait 80.2
veil 79.6
wear 79.3

Imagga
created on 2022-06-11

military uniform 50.5
clothing 45
uniform 44.5
people 33.5
man 32.2
business 26.7
male 23.4
covering 23.2
consumer goods 23
adult 22.7
city 22.4
street 21.2
men 20.6
person 20.5
urban 20.1
pedestrian 18.6
businessman 17.7
women 16.6
group 16.1
world 15.5
crowd 15.4
suit 15.2
fashion 14.3
black 13.8
corporate 13.7
walking 13.3
briefcase 13.1
work 12.6
job 12.4
walk 12.4
travel 12
garment 11.9
lifestyle 11.6
industrial 10.9
worker 10.7
life 10.5
standing 10.4
stand 10.4
architecture 10.2
protection 10
holding 9.9
attractive 9.8
human 9.7
office 9.7
day 9.4
industry 9.4
passenger 9.3
robe 9.3
shopping 9.2
commodity 9.1
building 8.8
attendant 8.7
bag 8.7
military 8.7
motion 8.6
professional 8.5
clothes 8.4
portrait 8.4
outdoor 8.4
occupation 8.2
window 8.2
success 8
working 8
tourist 7.9
rush 7.9
smile 7.8
mall 7.8
model 7.8
mask 7.8
modern 7.7
move 7.7
blurred 7.7
casual 7.6
store 7.6
jacket 7.5
danger 7.3
coat 7.2
activity 7.2

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

text 99.8
person 99.7
clothing 97.8
standing 97.2
man 94.1
outdoor 87.8
group 86.6
people 86.2
black and white 68.4
posing 44.7
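
The Microsoft entries above are image tags with confidences. A minimal sketch using the Azure Computer Vision SDK; the endpoint, key, and file name are placeholders, and this particular client is an assumption about how the tags were generated:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("shahn_lower_east_side.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is returned on a 0-1 scale; scale it to match the list above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))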

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 97.1%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Happy 0.6%
Confused 0.6%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 34-42
Gender Male, 100%
Angry 51.4%
Calm 37.4%
Confused 7.1%
Surprised 6.5%
Fear 6%
Sad 3.2%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Calm 97.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 1.4%
Disgusted 0.8%
Angry 0.2%
Happy 0%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 66.1%
Sad 27.3%
Confused 11.9%
Surprised 6.6%
Fear 6%
Angry 0.7%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Male, 77.3%
Calm 66.5%
Happy 18.3%
Surprised 8.1%
Fear 6.8%
Angry 4.3%
Sad 2.6%
Confused 2.3%
Disgusted 1.6%

AWS Rekognition

Age 38-46
Gender Male, 95.9%
Angry 86.9%
Surprised 6.8%
Fear 6%
Sad 4.3%
Happy 3.6%
Calm 1.7%
Disgusted 0.9%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Male, 94.7%
Calm 68.3%
Fear 11.8%
Surprised 11.7%
Disgusted 3.7%
Happy 3.5%
Sad 2.7%
Confused 2.4%
Angry 1.1%

AWS Rekognition

Age 21-29
Gender Male, 94.3%
Calm 59.3%
Happy 16.8%
Surprised 8.7%
Fear 7.5%
Sad 6.9%
Angry 3.3%
Disgusted 2.8%
Confused 1.1%

AWS Rekognition

Age 19-27
Gender Female, 93.6%
Sad 99.7%
Calm 14.5%
Fear 8%
Surprised 6.8%
Angry 4.7%
Disgusted 3.2%
Confused 2.6%
Happy 1.3%
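
Each AWS Rekognition block above corresponds to one face in a DetectFaces response: an estimated age range, a gender call with its confidence, and emotions that are scored independently, which is why the percentages do not sum to 100. A minimal sketch with boto3; the file name is assumed:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_lower_east_side.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" is needed for age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Sort emotions by confidence, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")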

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
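
The Google Vision blocks above report bucketed likelihoods (Very unlikely through Very likely) per detected face rather than numeric scores. A minimal sketch with the google-cloud-vision client; the file name is assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_lower_east_side.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum: VERY_UNLIKELY, UNLIKELY,
# POSSIBLE, LIKELY, or VERY_LIKELY - the buckets shown above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)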

Feature analysis

Amazon

Person 99.5%
Coat 98.1%
Shoe 64.6%

Text analysis

Amazon

TL
TL L 36231.4
L 36231.4

Google

TL 36231.4 XXXX
TL
36231.4
XXXX
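
The text entries above are OCR detections of signage in the photograph, returned as both full lines and the word fragments inside them, which is why pieces repeat. A minimal sketch of the Amazon side with boto3 (file name assumed); the Google results would come from the Vision API's text_detection method:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_lower_east_side.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns LINE detections and the WORD pieces within
# them, each with its own confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], repr(detection["DetectedText"]))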