Human Generated Data

Title

Untitled (Sixth Avenue, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.196

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 99.6
Clothing 99.5
Apparel 99.5
Person 99.4
Person 99.1
Person 98.1
Pedestrian 92.1
Person 91.5
Person 83.6
Overcoat 83.2
Coat 83.2
Face 82.6
Suit 76.9
Building 72.5
Crowd 67.7
Vehicle 67.5
Transportation 67.5
Hat 66.7
Car 66.5
Automobile 66.5
Person 65.5
Factory 55.7

Clarifai
created on 2023-10-15

people 99.8
street 96.9
many 96.4
group 95.7
train 95.3
monochrome 94.4
man 93.7
group together 93.6
railway 93.3
transportation system 92.2
adult 91.9
woman 86.8
portrait 83.5
subway system 82.3
lid 76.3
police 75.9
veil 75.5
retro 73.3
war 71.9
uniform 71.6

Imagga
created on 2021-12-15

building 24.6
people 24
architecture 22.6
office 20.8
city 20.8
business 20.6
silhouette 18.2
man 17.5
travel 15.5
urban 14.8
male 14.2
structure 13.4
tourism 13.2
person 13.1
passenger 12.9
men 12.9
room 12.5
black 12.1
old 11.1
construction 11.1
industry 11.1
glass 10.9
classroom 10.4
work 10.4
light 10
landmark 9.9
night 9.8
businessman 9.7
window 9.5
world 9.3
street 9.2
house 9.2
modern 9.1
life 8.9
interior 8.8
steel 8.8
corporate 8.6
lamp 8.6
finance 8.4
sky 8.3
clothing 8.2
transport 8.2
indoor 8.2
industrial 8.2
center 8.2
transportation 8.1
group 8.1
worker 8
wall 7.7
bridge 7.6
famous 7.4
vacation 7.4
inside 7.4
equipment 7.3
water 7.3
historic 7.3
women 7.1
airport 7.1
factory 7.1
job 7.1
line 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.7
black and white 90.9
person 89
man 87.1
clothing 84.1
street 83.2
vehicle 78.1
land vehicle 69.5
monochrome 68.3
crowd 0.7

Face analysis

AWS Rekognition

Age 36-52
Gender Male, 96.6%
Calm 79.8%
Sad 19.1%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 32-48
Gender Male, 99.6%
Calm 92.2%
Angry 5.5%
Sad 0.8%
Happy 0.5%
Disgusted 0.4%
Confused 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 31-47
Gender Male, 99.4%
Calm 87.3%
Happy 7.7%
Angry 1.5%
Surprised 1.2%
Confused 1.1%
Sad 0.5%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 34-50
Gender Male, 91.5%
Happy 60.8%
Calm 26.2%
Angry 3.7%
Sad 3.2%
Disgusted 2%
Confused 1.6%
Surprised 1.5%
Fear 0.9%

AWS Rekognition

Age 7-17
Gender Male, 67%
Calm 51.7%
Happy 24.4%
Sad 13%
Fear 3.5%
Angry 2.4%
Surprised 1.9%
Confused 1.7%
Disgusted 1.5%

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.6%
Car 66.5%

Text analysis

Amazon

CIGA
DRUG
RESTAURANT
PENNSYLVANIA DRUG
ODA
PENNSYLVANIA
BAR RESTAURANT
OMPANY CIGA
BAR
OMPANY

Google

PENNSYLVANIA =DRUG OMPANY CIGA BAR RESTAURANT ODA
PENNSYLVANIA
=DRUG
OMPANY
CIGA
BAR
RESTAURANT
ODA