Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3151

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 99.2
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Person 98.7
Person 98.2
Machine 94.2
Wheel 94.2
Spoke 91.6
Footwear 81.1
Shoe 81.1
Bicycle 80.5
Transportation 80.5
Vehicle 80.5
Hat 77.4
Cycling 75.2
Sport 75.2
Overcoat 70.4
Shoe 67.6
Shoe 61.9
Hat 60.2
Shoe 58.2
Shoe 57.9
City 57.5
Shoe 57.4
Urban 57.3
Road 57.1
Street 57.1
Shoe 57
Shop 55.7
Formal Wear 55.4
Suit 55.4

Clarifai
created on 2018-05-10

people 100
group together 99.3
group 98.9
adult 98.2
many 98.2
vehicle 96.1
man 96.1
military 93.9
wear 93.8
woman 91.5
administration 90.7
transportation system 90.5
one 90.2
street 89.1
several 88.4
outfit 87.1
two 87
police 86.4
war 86.1
uniform 83.8

Imagga
created on 2023-10-06

man 23.5
world 21.2
people 20.6
adult 16.8
person 14.6
city 14.1
building 13.5
black 12.7
wheelchair 12.5
urban 12.2
street 12
machinist 11.8
old 11.8
dark 11.7
wheeled vehicle 11.4
male 10.7
vehicle 10.4
danger 10
outdoor 9.9
dirty 9.9
destruction 9.8
house 9.2
night 8.9
light 8.7
architecture 8.6
stage 8.6
industry 8.5
walking 8.5
travel 8.4
passenger 8.4
outdoors 8.4
safety 8.3
window 8.2
protection 8.2
industrial 8.2
transportation 8.1
chair 8
stall 8
weapon 8
accident 7.8
portrait 7.8
child 7.8
men 7.7
leisure 7.5
cart 7.4
jinrikisha 7.4
symbol 7.4
seat 7.4
historic 7.3
tricycle 7.3
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

building 99.8
outdoor 99.8
road 98.5
person 98.3
people 83.4
street 76.8
store 40.6
past 40.4

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Calm 90.5%
Surprised 7.6%
Fear 5.9%
Sad 3.5%
Confused 2.4%
Angry 0.5%
Disgusted 0.4%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Female, 53.7%
Calm 32.9%
Angry 23.8%
Happy 19.4%
Surprised 8.3%
Confused 7.5%
Fear 6.6%
Sad 6.4%
Disgusted 2.9%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Wheel 94.2%
Shoe 81.1%
Hat 77.4%

Text analysis

Amazon

SHOE
218
220
CLEANING
HAT
34
HAT CLEANING 218 И
И
3.P.M
SHOE SHININGAPLOR
SHININGAPLOR

Google

2 20 218N SHOE SHINING PARLOR
2
20
218N
SHOE
SHINING
PARLOR