Human Generated Data

Title

Untitled (Sixth Avenue, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4251

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.6
Apparel 99.6
Human 99.4
Person 99.4
Person 99.3
Person 99.2
Person 97.1
Person 96.1
Person 95.6
Hardhat 93.2
Person 91.2
Electronics 90.9
Display 90.9
Screen 90.9
Person 87.4
Hat 86.6
Person 83.2
Person 82.4
Person 79.3
Helmet 78.8
Monitor 65.8
Person 64.6
TV 64.4
Television 64.4
Lighting 63.8
Face 61.9
Person 61.3
Crowd 58.9
Sailor Suit 58.8
Person 57.5
Crash Helmet 56.4
Hat 55.6
Building 55.6
Town 55.6
Urban 55.6
City 55.6

Imagga
created on 2022-01-08

cowboy hat 40.3
hat 40.2
headdress 26.8
clothing 23.7
man 20.8
male 18.4
black 17.4
people 15.6
world 15.2
person 14.4
covering 14.2
business 14
architecture 11.7
consumer goods 11.6
billboard 11.5
adult 11
car 9.9
businessman 9.7
portrait 9.7
men 9.4
work 9.4
industry 9.4
signboard 9.3
building 9.2
entertainment 9.2
holding 9.1
musical instrument 9
one 8.9
structure 8.9
looking 8.8
smiling 8.7
smile 8.5
percussion instrument 8.4
old 8.4
city 8.3
silhouette 8.3
job 8
equipment 7.9
uniform 7.9
sitting 7.7
happy 7.5
vehicle 7.5
outdoors 7.5
marimba 7.3
protection 7.3
suit 7.2
night 7.1
television 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99
hat 98.8
person 95.4
fashion accessory 91.4
fedora 91
man 87.2
clothing 84.3
cowboy hat 80.2
white 61.5
old 40

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 97.5%
Calm 90.8%
Sad 4.5%
Happy 1.6%
Confused 1%
Disgusted 0.8%
Surprised 0.6%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 53.1%
Sad 38.5%
Fear 3%
Disgusted 2.5%
Angry 1.6%
Confused 0.5%
Surprised 0.4%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 98%
Calm 93.9%
Confused 4.7%
Angry 0.5%
Surprised 0.3%
Sad 0.2%
Fear 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 38-46
Gender Male, 99.3%
Calm 97.5%
Happy 1%
Sad 1%
Surprised 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Calm 92.3%
Sad 3.5%
Angry 1.4%
Confused 1.2%
Surprised 0.7%
Fear 0.5%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 23-33
Gender Female, 63%
Angry 58.6%
Calm 18.5%
Sad 8.7%
Happy 4.8%
Fear 3.6%
Surprised 3.5%
Disgusted 1.4%
Confused 1%

AWS Rekognition

Age 20-28
Gender Male, 69.4%
Angry 62.5%
Surprised 15.3%
Happy 9.9%
Calm 3.7%
Sad 3.4%
Fear 2.4%
Disgusted 1.6%
Confused 1.1%

AWS Rekognition

Age 14-22
Gender Male, 50.1%
Sad 43.7%
Angry 26.4%
Calm 18.4%
Fear 4.8%
Happy 3.2%
Disgusted 1.9%
Surprised 0.9%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Female, 82%
Calm 71.7%
Angry 24%
Surprised 1.5%
Disgusted 0.7%
Sad 0.6%
Confused 0.6%
Happy 0.5%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Hat 86.6%
Helmet 78.8%
Monitor 65.8%

Captions

Microsoft

a group of people standing in front of a store 78.1%
an old photo of a man 78%
a group of people in front of a store 76%

Text analysis

Amazon

DO
DO BONT
BONT

Google

d
B
t
S
3
S B 83 3 t d
83