Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1825

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Body Part 99.6
Finger 99.6
Hand 99.6
Person 99.5
Child 99.5
Female 99.5
Girl 99.5
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Sad 93.1
Person 67.5
Baby 67.5
Crying 65.2
Neck 57.3
Accessories 56.6
Earring 56.6
Jewelry 56.6
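
The scores above read as Rekognition label-detection confidences on a 0-100 scale, with the same label repeated once per detected instance. As a hedged illustration only (this is not the museums' actual pipeline, and the file name is a placeholder), tags of this shape can be produced with the boto3 DetectLabels call:

    import boto3

    rekognition = boto3.client("rekognition")

    # Placeholder file name; the source image is not distributed with this record.
    with open("shahn_ozarks.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the lowest score shown above is 56.6
        )

    # Print each label with its confidence, one per line, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")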

Clarifai
created on 2018-05-11

people 100
adult 99.2
portrait 98.6
two 98.3
one 96.5
actress 95.8
child 95.8
woman 95.7
man 95.6
group 94.3
music 94.2
sit 92.6
art 92.1
print 90.5
actor 89.4
wear 89.1
musician 86.9
furniture 86
facial expression 84.8
book series 81.7

Imagga
created on 2023-10-06

television 35.7
telecommunication system 26.4
one 17.9
portrait 17.5
adult 17.5
child 17.5
face 17
sexy 16.9
man 16.2
people 16.2
person 16.1
happy 14.4
male 14.3
love 14.2
cadaver 14.1
attractive 14
body 13.6
pretty 13.3
smile 12.8
smiling 12.3
cute 12.2
world 11.9
hair 11.9
happiness 11.8
black 11.4
brunette 11.3
couple 11.3
fashion 11.3
human 11.2
billboard 11.2
skin 11.2
sensuality 10.9
model 10.9
dress 10.8
lying 10.3
relaxation 10
sitting 9.4
expression 9.4
youth 9.4
casual 9.3
signboard 9.1
cheerful 8.9
sofa 8.8
women 8.7
sexual 8.7
bride 8.6
close 8.6
culture 8.5
erotic 8.5
passion 8.5
lady 8.1
posing 8
home 8
kid 8
nude 7.8
eyes 7.7
hand 7.6
art 7.5
vintage 7.4
water 7.3
sensual 7.3
blond 7.2
little 7.1

Google
created on 2018-05-11

(no tags returned)

Microsoft
created on 2018-05-11

person 97.4

Color Analysis

Face analysis

AWS Rekognition (face 1)

Age 4-10
Gender Female, 99.9%
Sad 98.6%
Calm 13.6%
Angry 11.7%
Happy 7.7%
Surprised 7.6%
Fear 7.2%
Disgusted 3.3%
Confused 1.2%

AWS Rekognition (face 2)

Age 43-51
Gender Male, 99.6%
Calm 75.4%
Surprised 8.4%
Fear 8.4%
Angry 5.5%
Sad 5.1%
Disgusted 1.5%
Happy 1%
Confused 0.9%
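
Both blocks have the shape of Rekognition DetectFaces output with all facial attributes requested: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch, again with a placeholder file name and not the museums' actual code:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_ozarks.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    # One FaceDetails entry per detected face; two faces were found here.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # types come back as SAD, CALM, etc.
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")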

Microsoft Cognitive Services

Age 34
Gender Female

Feature analysis

Amazon

Person 99.5%
Child 99.5%
Female 99.5%
Girl 99.5%
Adult 98.4%
Male 98.4%
Man 98.4%
Baby 67.5%

Categories

Captions

Microsoft
created on 2018-05-11

a person sitting on a table 90.4%
a girl sitting on a table 80.4%
a person sitting at a table 80.3%
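
These captions have the shape of the Azure Computer Vision image-description output: candidate sentences, each with a confidence. A hedged sketch using the azure-cognitiveservices-vision-computervision Python SDK (which postdates the 2018 run shown here); the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-region>.api.cognitive.microsoft.com",  # placeholder endpoint
        CognitiveServicesCredentials("<subscription-key>"),   # placeholder key
    )

    # Placeholder URL; the source image is not distributed with this record.
    description = client.describe_image("https://example.com/shahn_ozarks.jpg")

    # Each candidate caption carries a confidence in [0, 1]; scale to match above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")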