Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2996

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.7
Head 99.7
Face 99.7
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Person 96
Person 91.7
Adult 91.7
Male 91.7
Man 91.7
Person 89.7
Hat 88.7
Smoke 74.7
People 64.6
Photography 63.7
Portrait 63.7
Cap 57.8
Coat 57.7
Hat 57.7
Body Part 57.3
Finger 57.3
Hand 57.3
Captain 57.2
Officer 57.2
Smoking 55.8
Baseball Cap 55.2

Clarifai
created on 2018-05-10

people 99.9
group together 99.2
group 98.7
adult 97.5
administration 96.5
many 96.2
several 95.5
man 94.7
military 93.8
leader 93.1
vehicle 92.3
woman 90.7
wear 90.1
war 89.5
police 86.7
outfit 86.5
transportation system 82.5
veil 82.2
watercraft 82.1
five 81.4

Imagga
created on 2023-10-06

cowboy hat 100
hat 100
headdress 78.1
clothing 62.4
man 37.6
male 31.9
consumer goods 30
covering 29.7
people 27.9
person 24.8
sombrero 23.7
senior 21.5
uniform 19
adult 18.2
portrait 17.5
work 16.5
men 16.3
cowboy 15.7
outdoors 15.7
happy 15.7
old 15.3
two 14.4
worker 14.1
shirt 13.1
surgeon 13
mature 13
guy 12.9
smile 12.8
hand 12.1
face 12.1
leisure 11.6
elderly 11.5
smiling 10.8
western 10.6
hospital 10.6
couple 10.4
professional 10.4
military uniform 9.9
together 9.6
looking 9.6
mask 9.6
love 9.5
doctor 9.4
industry 9.4
casual 9.3
occupation 9.2
patient 9
style 8.9
nurse 8.9
job 8.8
medical 8.8
hair 8.7
enjoy 8.5
attractive 8.4
health 8.3
glasses 8.3
hold 8.3
care 8.2
equipment 8.2
room 8.2
activity 8.1
handsome 8
home 8
medicine 7.9
standing 7.8
surgery 7.8
sitting 7.7
helmet 7.7
outside 7.7
expression 7.7
married 7.7
husband 7.6
loving 7.6
illness 7.6
wife 7.6
fun 7.5
cheerful 7.3
industrial 7.3
color 7.2
building 7.1
women 7.1
family 7.1
happiness 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 100
outdoor 93.8
people 88.2
group 76.2
old 56.5
crowd 25.8

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 5.4%
Confused 0.2%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 29-39
Gender Male, 100%
Sad 99%
Confused 27.9%
Calm 9.6%
Surprised 6.7%
Fear 6%
Angry 1.3%
Disgusted 1%
Happy 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.5%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 48-54
Gender Male, 93.9%
Sad 68.9%
Calm 55.5%
Happy 9%
Surprised 6.5%
Fear 6%
Angry 1.2%
Confused 0.9%
Disgusted 0.3%

Microsoft Cognitive Services

Age 42
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Adult 98.8%
Male 98.8%
Man 98.8%
Hat 88.7%