Human Generated Data

Title

Untitled (Greenwich Village, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3828

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Boy 99.1
Child 99.1
Male 99.1
Person 99.1
Male 99
Person 99
Adult 99
Man 99
Photography 99
Male 98.8
Person 98.8
Adult 98.8
Man 98.8
Boy 98.7
Child 98.7
Male 98.7
Person 98.7
Person 98
Baby 98
Person 97.7
Baby 97.7
Male 95.6
Person 95.6
Adult 95.6
Man 95.6
Clothing 93.2
Footwear 93.2
Shoe 93.2
Shoe 93.2
Shoe 92.7
Person 89.7
Baby 89.7
People 88.3
Shoe 83.8
Face 76
Head 76
Outdoors 69.7
Accessories 67.6
Belt 67.6
Shoe 62.4
Shoe 60.2
Belt 58.9
Hat 56.9
City 56.8
Road 56.6
Street 56.6
Urban 56.6
Pants 56.4
Shorts 56.2
Coat 56.1
Window 56
Art 55.8
Collage 55.8
Photographer 55.4
Bench 55
Furniture 55
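
The label tags above, with their confidence percentages, have the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced is shown here; it assumes boto3 with configured AWS credentials, and the filename is a hypothetical stand-in for a digitized copy of the photograph.

# Minimal sketch: Rekognition-style label tags for a local image file.
# Assumes AWS credentials are configured for boto3; "shahn_greenwich.jpg"
# is a hypothetical filename, not part of this record.
import boto3

client = boto3.client("rekognition")

with open("shahn_greenwich.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the tags above bottom out around 55%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")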

Clarifai
created on 2018-11-02

group 99.3
people 98.5
group together 97.7
movie 94.4
man 94.3
negative 93.2
woman 88.8
teamwork 88.8
retro 86.8
support 85.4
partnership 82.9
desktop 81.8
filmstrip 80.9
adult 80.5
actor 79.1
squad 77.5
old 76.1
three 74.6
military 72.1
cooperation 71.7
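
The Clarifai concepts above resemble output from Clarifai's general image-recognition model. The sketch below targets Clarifai's public v2 REST API; the model id, API-key placeholder, and image URL are assumptions rather than values taken from this record, so consult current Clarifai documentation before relying on it.

# Minimal sketch against Clarifai's v2 REST API (model id, key placeholder,
# and image URL are assumptions).
import requests

CLARIFAI_KEY = "YOUR_CLARIFAI_API_KEY"                 # placeholder
MODEL_ID = "general-image-recognition"                 # assumed public general model id
IMAGE_URL = "https://example.org/shahn_greenwich.jpg"  # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")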

Imagga
created on 2018-11-02

negative 72.8
film 59.7
photographic paper 40.2
photographic equipment 26.8
building 24.6
architecture 24.6
old 23
city 21.6
urban 16.6
tourism 15.7
sky 15.3
travel 14.8
silhouette 14.1
hole 13.6
people 13.4
town 13
house 13
symbol 12.8
retro 12.3
black 12
culture 12
stone 11
vintage 10.7
cityscape 10.4
church 10.2
design 10.1
art 9.9
history 9.8
crowd 9.6
ancient 9.5
buildings 9.4
grunge 9.4
business 9.1
landmark 9
religion 9
tower 8.9
text 8.7
facade 8.6
street 8.3
window 8.2
group 8.1
flag 7.9
audience 7.8
movie 7.7
famous 7.4
graphic 7.3
border 7.2
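
The Imagga tags above follow the shape of responses from Imagga's /v2/tags endpoint. A minimal sketch, with placeholder credentials and a hypothetical image URL:

# Minimal sketch for Imagga-style tags via the /v2/tags endpoint (key, secret,
# and image URL are placeholders; response shape follows Imagga's documented format).
import requests

API_KEY = "YOUR_IMAGGA_KEY"                            # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"                      # placeholder
IMAGE_URL = "https://example.org/shahn_greenwich.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")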

Google
created on 2018-11-02

Color Analysis

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 96.6%
Calm 53%
Confused 31.2%
Surprised 7.8%
Fear 7.8%
Sad 4.3%
Angry 1.5%
Disgusted 1.5%
Happy 0.6%

AWS Rekognition

Age 16-24
Gender Female, 95.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 3%
Angry 0.9%
Disgusted 0.6%
Confused 0.3%
Happy 0%

AWS Rekognition

Age 27-37
Gender Male, 99.7%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 3.7%
Confused 1.1%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
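
The age range, gender, and per-emotion confidences above match the output of Amazon Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch follows; the filename is again a hypothetical stand-in for the digitized image.

# Minimal sketch: age range, gender, and emotion scores per detected face,
# as returned by Rekognition DetectFaces with Attributes=["ALL"].
import boto3

client = boto3.client("rekognition")

with open("shahn_greenwich.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")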

Microsoft Cognitive Services

Age 6
Gender Male

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male
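
Age and gender estimates of this kind were returned by the Azure Face API detect call. Microsoft has since restricted these attributes, so the following is only a historical sketch; the endpoint, key, and image URL are placeholders.

# Historical sketch of the Azure Face API v1.0 detect call that returned age and
# gender estimates (these attributes are now restricted).
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_FACE_API_KEY"                                       # placeholder
IMAGE_URL = "https://example.org/shahn_greenwich.jpg"           # hypothetical image URL

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")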

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
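
The per-face likelihood labels above (Very unlikely through Very likely) match Google Cloud Vision face detection output. A minimal sketch with the google-cloud-vision client, assuming application default credentials and a hypothetical local filename:

# Minimal sketch: Cloud Vision face detection returning likelihood labels
# (VERY_UNLIKELY ... VERY_LIKELY) per detected face.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_greenwich.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)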

Feature analysis

Amazon

Boy 99.1%
Child 99.1%
Male 99.1%
Person 99.1%
Adult 99%
Man 99%
Baby 98%
Shoe 93.2%
Belt 67.6%

Categories