Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2993

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Coat 100
City 100
Road 100
Street 100
Urban 100
Cap 99.9
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Adult 99
Male 99
Man 99
Person 99
Neighborhood 97.4
Person 95.6
Photography 94.8
Head 94.6
Car 94.4
Transportation 94.4
Vehicle 94.4
Face 94.2
Person 91.7
Baseball Cap 91.3
Portrait 89
Person 85.9
Hat 85.5
Machine 81.7
Wheel 81.7
People 80.9
Officer 79
Wheel 78.6
Person 74.5
Person 69.3
Outdoors 68.7
Wheel 63.2
Captain 57.6
Overcoat 57.3
Smoke 56.9
Accessories 56.3
Sunglasses 56.3
Military 56.1
Formal Wear 56
Suit 56
Metropolis 55.6
Military Uniform 55.2

Clarifai
created on 2018-05-10

people 99.9
adult 99.1
group 98.5
group together 98
man 96.7
administration 96.2
leader 95.9
wear 95.5
two 94.7
outfit 94
woman 92.1
police 92
military 91.9
uniform 91.5
three 91.1
several 91.1
vehicle 90
war 89.7
many 89.5
portrait 89

Imagga
created on 2023-10-05

man 42.4
male 34.8
person 29.8
people 28.4
adult 23.3
face 22.7
portrait 22.6
hat 21.3
men 20.6
harmonica 19
wind instrument 18.8
world 18.3
black 17.5
old 17.4
free-reed instrument 15.2
senior 15
handsome 14.3
musical instrument 13.5
human 13.5
couple 13.1
hair 12.7
businessman 12.4
scholar 12.2
business 12.1
guy 12.1
looking 12
attractive 11.9
love 11.8
head 11.8
megaphone 11.5
look 11.4
grandfather 11.2
casual 11
device 10.9
lifestyle 10.8
painter 10.8
suit 10.3
expression 10.2
mature 10.2
acoustic device 9.8
intellectual 9.7
serious 9.5
model 9.3
smile 9.3
hand 9.1
beard 9
one 9
sexy 8.8
happy 8.8
women 8.7
happiness 8.6
sitting 8.6
clothing 8.5
pretty 8.4
glasses 8.3
eye 8
cute 7.9
corporate 7.7
youth 7.7
elderly 7.7
fashion 7.5
dark 7.5
city 7.5
phone 7.4
emotion 7.4
alone 7.3
brass 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.5
man 91.1
outdoor 89.3
black 68.7
white 64.4
people 62.2
crowd 0.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 61-71
Gender Male, 99.8%
Calm 63.3%
Happy 23%
Surprised 6.8%
Fear 6.1%
Sad 4.7%
Confused 3.2%
Angry 2.2%
Disgusted 1.1%

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Car 94.4%
Hat 85.5%
Wheel 81.7%