Human Generated Data

Title

Untitled (Bernarda Bryson Shahn, Horse Dance, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5127

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 100
Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Person 97.7
Person 97.4
Adult 97.4
Female 97.4
Woman 97.4
Person 97.3
Person 96.9
Female 96.9
Child 96.9
Girl 96.9
Person 95.5
Person 95.2
Person 95.2
Female 95.2
Child 95.2
Girl 95.2
Crowd 93
Person 92.9
Person 90.8
Adult 90.8
Male 90.8
Man 90.8
Person 89.7
Adult 89.7
Female 89.7
Woman 89.7
Bride 89.7
Wedding 89.7
Person 82.6
Adult 82.6
Female 82.6
Woman 82.6
Person 80.1
Adult 80.1
Male 80.1
Man 80.1
Clothing 78.2
Hat 78.2
Person 76.3
Female 76.3
Child 76.3
Girl 76.3
Person 72.2
Architecture 62.5
Building 62.5
Classroom 62.5
Indoors 62.5
Room 62.5
School 62.5
Audience 57.8
Footwear 56.4
Shoe 56.4
Reading 56.2
Hospital 55.5
Machine 55.4
Wheel 55.4
Couch 55.2
Furniture 55.2
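
The Amazon values above are machine-generated labels with confidence scores (0-100) of the kind returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags can be produced with boto3; the image path and the MaxLabels/MinConfidence settings are illustrative placeholders, not values recorded here.

import boto3

# Minimal sketch: label/confidence tags via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are already configured; "photo.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # assumed cap on returned labels
        MinConfidence=55.0,  # assumed cutoff; the lowest tag listed above is ~55
    )

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))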

Clarifai
created on 2018-05-10

people 100
group 99.7
many 99.7
group together 99.1
adult 97.2
woman 95.8
man 95.2
child 95.1
recreation 94.9
wear 93.8
crowd 92.9
military 92.9
administration 92.5
seat 89.9
furniture 89.3
spectator 88.9
several 88.7
boy 86.4
audience 86.4
leader 86.2

Imagga
created on 2023-10-07

people 30.1
happy 26.9
man 26.2
person 25
male 25
together 23.6
love 22.9
child 21.8
smiling 21.7
couple 20.9
adult 17.9
mother 17.8
family 17.8
group 16.9
fun 15.7
smile 15.7
portrait 15.5
lifestyle 15.2
women 15
senior 15
men 14.6
happiness 14.1
two 13.5
friends 13.1
sitting 12.9
kin 12.1
kid 10.6
husband 10.6
old 10.4
boy 10.4
world 10.1
bride 9.6
home 9.6
wife 9.5
day 9.4
teacher 9.3
father 9.2
outdoors 9.2
dress 9
looking 8.8
celebration 8.8
hair 8.7
married 8.6
cute 8.6
parent 8.6
black 8.6
enjoying 8.5
youth 8.5
outdoor 8.4
fashion 8.3
children 8.2
photographer 8.2
classroom 8.2
lady 8.1
team 8.1
little 7.9
grandmother 7.8
hands 7.8
school 7.8
life 7.7
attractive 7.7
drinking 7.6
marriage 7.6
spectator 7.6
friendship 7.5
teamwork 7.4
wedding 7.3
cheerful 7.3
girls 7.3
success 7.2
holiday 7.2
face 7.1
summer 7.1
indoors 7
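
The Imagga tags are similar name/score pairs. Below is a sketch using Imagga's v2 REST tagging endpoint; the API key, secret, and image URL are placeholders, and the response layout shown is an assumption based on Imagga's public documentation.

import requests

# Sketch: tag an image with Imagga's v2 /tags endpoint (REST, basic auth).
# Key, secret, and URL are placeholders; the JSON shape is assumed from docs.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))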

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.8
text 90.3
people 63.9
group 56.6
picture frame 8
crowd 3
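
The Microsoft tags come from the Azure Computer Vision service, whose per-tag confidences can be scaled to percentages like those above. Below is a sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the call shown is an assumption about how such tags are typically obtained.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch: image tags via Azure Computer Vision. Endpoint and key are placeholders.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    # The service reports confidence in [0, 1]; scale to match the list above.
    print(tag.name, round(tag.confidence * 100, 1))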

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 67.4%
Sad 95.2%
Calm 50.7%
Surprised 6.3%
Fear 6%
Confused 0.8%
Disgusted 0.2%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 43-51
Gender Female, 92%
Calm 75.7%
Sad 7.9%
Surprised 7.6%
Fear 6.4%
Confused 5.6%
Happy 2.1%
Disgusted 1.6%
Angry 1.6%

AWS Rekognition

Age 21-29
Gender Female, 70.6%
Calm 97.8%
Surprised 6.3%
Fear 5.9%
Sad 2.9%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 6-14
Gender Female, 98.8%
Calm 63.5%
Sad 33.3%
Surprised 16.5%
Fear 5.9%
Disgusted 0.9%
Happy 0.4%
Angry 0.3%
Confused 0.2%
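
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. Below is a sketch of how such per-face attributes can be obtained with boto3's detect_faces; the image path is a placeholder.

import boto3

# Sketch: per-face age, gender, and emotion estimates with Amazon Rekognition.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")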

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
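
The Google Vision results express face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. Below is a sketch using the google-cloud-vision client library; the image path is a placeholder.

from google.cloud import vision

# Sketch: face-attribute likelihoods with the Google Cloud Vision API.
# Assumes application credentials are configured; "photo.jpg" is a placeholder.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)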

Feature analysis

Amazon

Person 97.7%
Adult 97.4%
Female 97.4%
Woman 97.4%
Child 96.9%
Girl 96.9%
Male 90.8%
Man 90.8%
Bride 89.7%
Shoe 56.4%
Wheel 55.4%

Categories