Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3661

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Clothing 98.9
Pants 98.9
Home Decor 98.5
Window 98.1
Sitting 97
Curtain 91.4
Face 87.7
Head 87.7
Undershirt 83.8
Footwear 70.9
Shoe 70.9
Photography 70.5
Portrait 70.5
Device 60.8
Window Shade 57.4
Shutter 56.9
Body Part 56.7
Finger 56.7
Hand 56.7
City 56.5
T-Shirt 55.8
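
The Amazon scores above are label-detection confidences on a 0-100 scale (the Feature analysis section below shows the same values with percent signs). A minimal sketch of how such tags can be reproduced with the AWS Rekognition DetectLabels API via boto3; the file name, region, and the 55-point confidence floor are assumptions chosen to match the lowest tag listed:

import boto3

# Rekognition client; the region is an assumption for illustration.
client = boto3.client("rekognition", region_name="us-east-1")

# Read the photograph from a local file (hypothetical path).
with open("shahn_untitled_nyc.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns labels with confidence scores on a 0-100 scale.
# MinConfidence=55 roughly matches the lowest tag shown above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")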

Clarifai
created on 2018-05-10

people 99.8
one 98.6
adult 98.4
man 96.9
woman 94.7
music 94.5
indoors 94
wear 93.7
street 93.2
portrait 93
window 92.6
room 92.4
shadow 92.1
two 89.2
monochrome 87.9
piano 87.2
book series 86.9
musician 83.2
child 83.1
boy 80.6
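
The Clarifai scores are likewise concept confidences from its general-purpose image model. A hedged sketch against the Clarifai v2 REST predict endpoint; the model ID, image URL, and credential are placeholders, and a record created in 2018 may have used an earlier model version:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public general model ID

url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}"}

resp = requests.post(url, json=payload, headers=headers, timeout=30)
resp.raise_for_status()

# Concepts come back with values in 0-1; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")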

Imagga
created on 2023-10-06

scholar 27.4
person 25.9
man 24.2
intellectual 21.9
adult 19.7
male 18.7
people 17.9
chair 17.4
musical instrument 16.2
black 14.3
old 13.9
portrait 12.9
building 12.9
business 12.8
world 12
wind instrument 11.5
accordion 11
silhouette 10.8
human 10.5
seat 10.5
statue 10.5
sitting 10.3
alone 10
dark 10
keyboard instrument 9.9
religion 9.9
posing 9.8
businessman 9.7
one 9.7
room 9.4
historic 9.2
suit 9.1
couple 8.7
architecture 8.6
men 8.6
culture 8.5
historical 8.5
travel 8.5
religious 8.4
monument 8.4
city 8.3
dirty 8.1
dress 8.1
history 8.1
looking 8
love 7.9
boy 7.8
barbershop 7.8
traditional 7.5
leisure 7.5
clothing 7.5
window 7.4
holding 7.4
tradition 7.4
office 7.4
professional 7.3
device 7.3
wheelchair 7.2
lifestyle 7.2
body 7.2
women 7.1
indoors 7
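
The Imagga tags come from its auto-tagging service, again with confidences on a 0-100 scale. A minimal sketch against the Imagga v2 /tags REST endpoint, which authenticates with HTTP basic auth; the credentials and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a confidence (0-100) and a localized tag name.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")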

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

man 98.4
person 94.3
window 81.9
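
The Microsoft tags follow the shape of the Azure Computer Vision analyze response, which reports tag confidences in 0-1. A hedged sketch using the v3.2 analyze endpoint; a record created in 2018 would have used an earlier API version, and the endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Tags carry confidences in 0-1; scale to match the percentages above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")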

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-51
Gender Female, 62.3%
Calm 77.8%
Angry 15.5%
Surprised 6.5%
Fear 6.4%
Sad 2.9%
Disgusted 1.5%
Confused 0.8%
Happy 0.4%
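
The age range, gender estimate, and ranked emotions above match the shape of the AWS Rekognition DetectFaces response when all facial attributes are requested. A minimal boto3 sketch; the file name and region are placeholders:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("shahn_untitled_nyc.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and more.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are reported with per-emotion confidences; sort descending.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")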

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%