Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2841

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence scores in percent)

Amazon
created on 2023-10-06

Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Face 99.2
Head 99.2
Photography 99.2
Portrait 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Clothing 99.1
Pants 99.1
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Undershirt 98.4
Sitting 96.6
Brick 93
Shorts 87.2
Smoke 74.2
Person 71.3
Jeans 64.1
Body Part 57.9
Finger 57.9
Hand 57.9
Door 57.5
Footwear 56.9
Shoe 56.9
Couch 56.4
Furniture 56.4
Accessories 55.5

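These labels follow the output shape of Amazon Rekognition's DetectLabels API: each label name is paired with a model confidence score in percent, and the same label (Adult, Male, Man, Person) can recur once per detected instance. A minimal sketch of how tags like these could be produced with the boto3 client is below; the filename is hypothetical and AWS credentials are assumed to be configured.

import boto3

# Minimal sketch: request Rekognition labels for a local image file.
# "untitled_nyc.jpg" is a hypothetical filename standing in for this photograph.
client = boto3.client("rekognition")
with open("untitled_nyc.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Print "Name Confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
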
Clarifai
created on 2018-05-10

people 100
adult 99.6
man 98.8
two 98.7
group 98.1
one 96.9
group together 95.9
three 95.8
portrait 94.2
wear 92.5
four 91.7
several 91.6
sit 91.3
military 88.8
administration 87.8
sitting 87.2
five 86.7
actor 86.1
war 85.7
facial hair 84.8

Imagga
created on 2023-10-06

statue 40.6
sculpture 32
old 25.8
ancient 21.6
man 21.6
person 21.3
religion 20.6
hairdresser 20.2
male 19.9
art 19.4
fan 17.8
monk 16.5
architecture 16.4
barbershop 16
religious 15.9
antique 15.6
culture 15.4
follower 14.4
monument 14
people 13.9
adult 13.7
god 13.4
city 13.3
shop 13.2
portrait 12.9
stone 12.6
history 12.5
famous 12.1
face 12.1
sitting 12
human 12
marble 11.6
vintage 11.6
historic 11
mercantile establishment 10.4
one 9.7
pray 9.7
world 9.5
landmark 9
building 8.8
home 8.8
love 8.7
historical 8.5
travel 8.4
head 8.4
symbol 8.1
detail 8
catholic 7.8
men 7.7
temple 7.7
spiritual 7.7
hand 7.6
traditional 7.5
tourism 7.4
decoration 7.2
black 7.2
looking 7.2

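Imagga returns tags in the same name-plus-confidence form via its REST API. A sketch assuming Imagga's v2 /tags endpoint follows; the key, secret, and image URL are placeholders, not real values.

import requests

# Sketch assuming Imagga's v2 tagging endpoint; credentials and the
# image URL below are placeholders.
IMAGGA_KEY = "api_key_placeholder"
IMAGGA_SECRET = "api_secret_placeholder"
image_url = "https://example.org/untitled_nyc.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each tag arrives as {"confidence": ..., "tag": {"en": ...}}.
for t in resp.json()["result"]["tags"]:
    print(f"{t['tag']['en']} {t['confidence']:.1f}")
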
Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.9
outdoor 94.8
man 94.4

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 92.7%
Surprised 6.4%
Fear 6%
Angry 4.1%
Sad 2.4%
Confused 1.2%
Disgusted 0.5%
Happy 0.2%

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 95.4%
Surprised 6.3%
Fear 5.9%
Confused 2.6%
Sad 2.3%
Angry 1.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Male, 97.1%
Calm 96.4%
Surprised 6.4%
Fear 6%
Sad 2.3%
Happy 1.7%
Angry 0.6%
Disgusted 0.2%
Confused 0.2%

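The three face records above match the shape of Rekognition's DetectFaces response when all attributes are requested: an estimated age range, a gender guess with confidence, and a per-emotion confidence score. A sketch, again with a hypothetical filename:

import boto3

# Minimal sketch: request full face attributes for one image.
client = boto3.client("rekognition")
with open("untitled_nyc.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "CALM").
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
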
Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

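Unlike the services above, Google Vision reports face attributes as bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision Python client, with a hypothetical filename and application default credentials assumed:

from google.cloud import vision

# Minimal sketch: detect faces and print bucketed likelihoods per face.
client = vision.ImageAnnotatorClient()
with open("untitled_nyc.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
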
Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Jeans 64.1%
Shoe 56.9%
