Human Generated Data

Title

Untitled (James Temple, strawberry picker, Hammond, Louisiana)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3419

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Hat 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Face 96.9
Head 96.9
Photography 96.9
Portrait 96.9
Pottery 87.9
Jacket 77.7
Accessories 62.9
Bag 62.9
Handbag 62.9
Cookware 58.5
Pot 57.9
Sun Hat 57.6
Smoke 56.4
Formal Wear 56.1
Suit 56.1
Overcoat 56
Cowboy Hat 55.8
Furniture 55
Table 55
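
The number after each tag is a confidence score on a 0-100 scale. Label/score pairs like these are what Amazon Rekognition's DetectLabels operation returns; a minimal sketch of how such tags could be reproduced, assuming configured AWS credentials and a hypothetical local copy of the print named shahn_temple.jpg:

# Sketch: generating Rekognition-style label tags for a digitized print.
# AWS credentials are assumed to be configured; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_temple.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,          # roughly matches the number of tags listed above
    MinConfidence=55,      # the lowest score shown above is about 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")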

Clarifai
created on 2018-05-10

people 100
adult 99.1
man 98.1
group 97.3
furniture 96.9
one 96.1
two 96
wear 95.9
group together 95
administration 93.7
outfit 93.7
room 93.2
military 92.5
war 90.6
music 90.3
facial hair 90.2
three 90.2
portrait 89.9
actor 89.7
leader 89.2
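
A comparable request against Clarifai's v2 prediction API might look like the sketch below; the API key, image URL, and model id are assumptions, not values taken from this record:

# Sketch: requesting Clarifai concept predictions for a hosted image.
# The API key, image URL, and model id are placeholders.
import requests

CLARIFAI_KEY = "YOUR_API_KEY"
MODEL_ID = "general-image-recognition"   # assumed public "general" model id

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/shahn_temple.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values on a 0-1 scale; the list above shows them scaled to 0-100.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")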

Imagga
created on 2023-10-06

man 22.8
person 21.7
people 19.5
black 19.2
male 19.1
adult 17.6
blackboard 14.4
x-ray film 13.3
hand 12.9
light 12.7
art 12.5
film 12.5
television 12.3
human 12
portrait 11.6
dark 10.9
silhouette 10.8
model 10.1
symbol 10.1
dress 9.9
fashion 9.8
business 9.7
one 9.7
body 9.6
mask 9.5
men 9.4
fire 9.4
protection 9.1
danger 9.1
sensuality 9.1
music 9
sexy 8.8
equipment 8.6
face 8.5
grunge 8.5
musical instrument 8.5
old 8.4
industrial 8.2
photographic paper 8.2
working 8
lifestyle 7.9
telecommunication system 7.9
love 7.9
horror 7.8
sign 7.5
smoke 7.4
style 7.4
metal 7.2
women 7.1
posing 7.1
night 7.1
businessman 7.1
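
Imagga exposes auto-tagging through its v2 REST API; a rough sketch using HTTP Basic authentication, with placeholder credentials and image URL:

# Sketch: requesting Imagga auto-tags for a hosted image.
# The API key/secret and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn_temple.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")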

Microsoft
created on 2018-05-10

person 95.2
man 91.8
old 81.7
black 74.7
white 67.9
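
Tags like these can be requested from Azure's Computer Vision service. The sketch below targets the current Tag Image operation (v3.2), which may differ from the API version in use when this record was generated in 2018; the endpoint, key, and image URL are placeholders:

# Sketch: requesting image tags from the Azure Computer Vision "Tag Image" operation.
# Endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",
        "Content-Type": "application/json",
    },
    json={"url": "https://example.org/shahn_temple.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is reported on a 0-1 scale; the values above are scaled to 0-100.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")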

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Fear 83.6%
Sad 12.3%
Calm 8.8%
Surprised 6.8%
Confused 4.2%
Disgusted 3.8%
Happy 2%
Angry 1.5%
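
The age range, gender, and emotion percentages above match the per-face attributes returned by Rekognition's DetectFaces operation; a minimal sketch, again assuming AWS credentials and a hypothetical local file:

# Sketch: per-face attributes (age range, gender, emotions) from Rekognition DetectFaces.
# AWS credentials are assumed; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_temple.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],      # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")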

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
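
Google Cloud Vision reports face attributes as categorical likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages; a minimal sketch with the Python client, assuming Google Cloud credentials and a hypothetical local file:

# Sketch: face-detection likelihoods from the Google Cloud Vision API.
# Google Cloud credentials are assumed; the filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_temple.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Vision returns categorical likelihoods, not numeric confidences.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)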

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Handbag 62.9%

Text analysis

Amazon

117%F
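
Detected text fragments such as this are the output of Rekognition's DetectText operation; a minimal sketch, assuming AWS credentials and a hypothetical local file:

# Sketch: detected text (like the "117%F" fragment above) from Rekognition DetectText.
# AWS credentials are assumed; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_temple.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":      # skip word-level duplicates
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")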