Human Generated Data

Title

Untitled (James Temple, strawberry picker, Hammond, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2514

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Jacket 98.6
Face 96.7
Head 96.7
Photography 96.7
Portrait 96.7
Hat 87.5
Formal Wear 58
Suit 58
Overcoat 57.3
Architecture 56.8
Building 56.8
Factory 56.8
Smoke 56.7
Blazer 55.3
Manufacturing 55.3
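
The Amazon tags above are confidence-scored labels of the kind returned by AWS Rekognition's DetectLabels operation. The following is a minimal sketch of how such labels could be retrieved with boto3; the file name and thresholds are illustrative assumptions, not a description of the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("P1970.2514.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the list above contains 20 tags
    MinConfidence=55.0,  # the lowest tag shown is about 55.3
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")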

Clarifai
created on 2018-05-10

people 100
adult 99.3
man 98
group 96.6
furniture 96.2
two 96
wear 95.4
one 95.1
room 93.9
actor 91.4
portrait 90.8
administration 89.8
facial hair 88.7
sit 88.3
group together 85.7
three 85.5
outfit 84.2
woman 83.7
music 82.5
indoors 82.3

Imagga
created on 2023-10-06

person 25.1
man 21.5
people 20.6
musical instrument 20.6
male 20.6
black 18.1
stringed instrument 14.9
adult 14.4
portrait 13.6
dark 13.4
human 12.7
protection 12.7
art 12.6
sexy 12
body 12
device 11.7
model 11.7
guitar 11.5
light 11.4
danger 10.9
silhouette 10.8
hand 10.6
fashion 10.6
percussion instrument 10.4
men 10.3
symbol 10.1
music 10
sensuality 10
one 9.7
grunge 9.4
electric guitar 9.1
religion 9
dance 8.8
play 8.6
disk jockey 8.5
power 8.4
smoke 8.4
dress 8.1
night 8
working 8
lifestyle 7.9
shovel 7.8
equipment 7.8
sitting 7.7
attractive 7.7
old 7.7
holding 7.4
style 7.4
water 7.3
suit 7.3
lady 7.3
sensual 7.3
hair 7.1
women 7.1
posing 7.1
player 7.1

Microsoft
created on 2018-05-10

person 95.4
man 92.3

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Fear 76.6%
Calm 21.6%
Surprised 6.9%
Confused 6.8%
Disgusted 4.4%
Sad 4%
Angry 1.5%
Happy 0.9%
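
Age range, gender, and emotion estimates like those above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, again assuming configured AWS credentials and a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("P1970.2514.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")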

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
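
The Google Vision face results report likelihood buckets rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming Google Cloud credentials and a hypothetical local copy of the image (older versions of the library expose the enum as vision.enums.Likelihood instead):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google Cloud credentials are configured

with open("P1970.2514.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)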

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%

Text analysis

Amazon

115%F.
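
The string above is the raw output of optical text detection on the photograph. A minimal sketch of how such text could be extracted with AWS Rekognition's DetectText operation via boto3, under the same assumptions as the sketches above:

import boto3

rekognition = boto3.client("rekognition")

with open("P1970.2514.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip individual WORD detections
        print(detection["DetectedText"])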