Human Generated Data

Title

Untitled (Marked Tree, Arkansas?)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1157

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 99
Baby 99
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Photography 97.2
Face 96.6
Head 96.6
Adult 94.1
Male 94.1
Man 94.1
Person 94.1
Hat 91.3
Footwear 87.6
Shoe 87.6
Body Part 87.6
Finger 87.6
Hand 87.6
Portrait 81.7
Shoe 80.2
Shoe 70
Shoe 60.4
Architecture 57.7
Building 57.7
Outdoors 57.7
Shelter 57.7
Sun Hat 57.5
Cap 56.4
Transportation 55.6
Vehicle 55.6
Canopy 55.5
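
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how a comparable list could be generated with boto3 follows; the image file name, MaxLabels, and MinConfidence values are illustrative assumptions, not values taken from this record.

```python
import boto3

# Hypothetical reproduction of a tag list like the one above.
# The file name, MaxLabels, and MinConfidence are assumptions,
# not values taken from this museum record.
rekognition = boto3.client("rekognition")

with open("shahn_marked_tree_1935.jpg", "rb") as f:  # assumed local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # roughly the floor of the scores listed above
)

for label in response["Labels"]:
    # Each label pairs a name with a 0-100 confidence score,
    # e.g. "Clothing 100" or "Adult 99.3" in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

The bare numbers in the tag lists above are confidence scores on this 0-100 scale.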

Clarifai
created on 2018-05-11

people 100
two 99.5
child 99.3
group 98.9
one 98.8
adult 98.7
group together 97.6
three 95.1
boy 94.2
four 93.6
wear 93.5
home 93.3
man 93.2
woman 92.1
several 90.7
veil 89.8
vehicle 89.5
war 89.4
many 88.6
offspring 88.2

Imagga
created on 2023-10-05

child 24.8
device 21.7
man 17.5
musical instrument 16.9
male 16.5
ventilator 15.9
adult 15.5
building 15
happiness 14.1
accordion 13.9
person 13.9
happy 13.8
outside 13.7
love 13.4
people 12.8
old 12.5
smiling 12.3
keyboard instrument 11.7
outdoors 11.2
vehicle 10.7
smile 10.7
wall 10.5
portrait 10.3
structure 9.7
couple 9.6
sitting 9.4
wind instrument 9.4
house 9.2
vintage 9.1
fun 9
architecture 8.8
together 8.8
lifestyle 8.7
casual 8.5
outdoor 8.4
joy 8.3
alone 8.2
dress 8.1
family 8
looking 8
kid 8
face 7.8
hut 7.7
pretty 7.7
attractive 7.7
swing 7.7
cute 7.2
hair 7.1
parent 7
car 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97.2
person 93.1

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 99.8%
Sad 99.9%
Calm 20.1%
Surprised 6.4%
Fear 6.1%
Confused 1%
Happy 0.4%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Male, 88.7%
Sad 99.9%
Calm 10%
Surprised 7.3%
Fear 6.5%
Disgusted 2.6%
Confused 2.1%
Angry 1%
Happy 0.9%

AWS Rekognition

Age 28-38
Gender Female, 53.7%
Sad 100%
Surprised 6.3%
Fear 6%
Disgusted 1.2%
Calm 0.6%
Confused 0.3%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 18-24
Gender Male, 95.3%
Calm 94.1%
Surprised 6.5%
Fear 5.9%
Sad 2.8%
Angry 2.1%
Confused 0.8%
Disgusted 0.3%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 99.6%
Calm 56.1%
Sad 49.4%
Happy 9.9%
Surprised 6.9%
Fear 6.1%
Disgusted 2.3%
Angry 1.8%
Confused 1.6%
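
The five AWS Rekognition entries above (estimated age range, gender, and per-emotion confidences) mirror the FaceDetails structure returned by the DetectFaces operation when all facial attributes are requested. A minimal sketch with boto3, again with an assumed file name:

```python
import boto3

# Hypothetical sketch; the file name is an assumption.
rekognition = boto3.client("rekognition")

with open("shahn_marked_tree_1935.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# in addition to the default bounding-box data.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```

Emotion confidences are scored independently, which is why a single face can show both Sad 99.9% and Calm 20.1% above.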

Microsoft Cognitive Services

Age 10
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
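
The Google Vision values above are likelihood buckets rather than numeric scores. A minimal sketch using the google-cloud-vision client (version 2.x or later assumed; the file name and credentials are illustrative):

```python
from google.cloud import vision  # assumes google-cloud-vision 2.x+

# Hypothetical sketch; the file name is an assumption and credentials
# are taken from the environment.
client = vision.ImageAnnotatorClient()

with open("shahn_marked_tree_1935.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # corresponding to the "Very unlikely" / "Likely" values above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```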

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Baby 99%
Hat 91.3%
Shoe 87.6%

Text analysis

Amazon

OFFICE
OFFICE O
O
ARELES
ANE
FOR
the
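
The Amazon text results above are the kind of output produced by AWS Rekognition's DetectText operation, which returns detected lines and words with confidence scores; fragments such as "ARELES" and "ANE" are typical for partially legible signage. A minimal sketch with boto3 (file name assumed):

```python
import boto3

# Hypothetical sketch; the file name is an assumption.
rekognition = boto3.client("rekognition")

with open("shahn_marked_tree_1935.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Results include both full LINEs and individual WORDs; fragments
    # like "ARELES" or "ANE" are typical of partially legible signage.
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```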