Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2207

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Captain 99.9
Officer 99.9
Clothing 99.9
Coat 99.9
Person 99.7
Adult 99.7
Male 99.7
Man 99.7
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Face 91.2
Head 91.2
Outdoors 86.7
Sailor Suit 81.1
Hat 70.2
Toy 65.2
Photography 64
Portrait 64
Car 60.8
Transportation 60.8
Vehicle 60.8
Jacket 56.8
Sitting 55.1

Clarifai
created on 2018-05-10

people 99.9
adult 98.9
man 97.5
group together 97.4
outfit 97
two 96.1
group 96
watercraft 95.8
three 95.6
portrait 95.4
wear 94.5
actor 92.4
administration 91.5
four 91.4
military 90.5
leader 90.5
vehicle 89.9
veil 89.6
sit 88.9
lid 88

Imagga
created on 2023-10-05

man 34.2
male 22.8
people 22.3
person 21.9
outdoors 21
adult 20.7
water 16.7
men 15.4
outside 14.5
happiness 14.1
happy 13.8
equipment 13.5
hand 12.9
summer 12.8
hat 12.6
hair 11.9
sea 11.8
sport 11.8
relaxation 11.7
lifestyle 11.6
senior 11.2
fun 11.2
sitting 11.2
fisherman 11.1
smiling 10.8
seller 9.7
fishing 9.6
elderly 9.6
sky 9.6
device 9.5
rope 9.4
industry 9.4
boat 9.3
smile 9.3
face 9.2
outdoor 9.2
travel 9.1
portrait 9.1
old 9
couple 8.7
work 8.7
love 8.7
day 8.6
business 8.5
leisure 8.3
ocean 8.3
industrial 8.2
activity 8.1
looking 8
holiday 7.9
world 7.8
hands 7.8
play 7.7
pretty 7.7
attractive 7.7
joy 7.5
child 7.5
city 7.5
holding 7.4
reel 7.4
active 7.4
vacation 7.4
recreation 7.2
to 7.1
together 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.5
outdoor 96.2
man 90.9

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 62.4%
Sad 36%
Fear 8.8%
Surprised 6.6%
Confused 3.9%
Angry 2.9%
Disgusted 0.6%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.5%
Confused 0.5%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 34-42
Gender Male, 100%
Angry 53.9%
Disgusted 11.7%
Confused 10.8%
Sad 10.7%
Surprised 8%
Fear 6.5%
Calm 4.7%
Happy 2.2%

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Adult 99.7%
Male 99.7%
Man 99.7%
Car 60.8%

Text analysis

Amazon

ELE
LUSTIAN x ELE
LUSTIAN x