Human Generated Data

Title

Untitled (school, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1115

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.2
Person 99.2
Female 99.2
Woman 99.2
Adult 99.1
Person 99.1
Female 99.1
Woman 99.1
Person 98.9
Female 98.9
Child 98.9
Girl 98.9
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Photography 98.7
Male 98.4
Person 98.4
Child 98.4
Boy 98.4
Face 97.7
Head 97.7
Portrait 97.7
People 94.8
Architecture 93.2
Building 93.2
Classroom 93.2
Indoors 93.2
Room 93.2
School 93.2
Crowd 56.1
Blackboard 56
Electrical Device 55.3
Microphone 55.3

Clarifai
created on 2018-05-11

people 99.9
group 99.7
child 99.3
adult 98.1
woman 97.5
group together 96.1
man 95.6
four 95.3
education 95.3
teacher 94.6
family 91.8
school 91
three 90.6
classroom 90.2
offspring 89.1
room 88.3
five 87.1
several 86
adolescent 85.7
boy 85.5

Imagga
created on 2023-10-05

couple 34
people 32.9
mother 29.8
adult 28.3
happy 26.3
love 26
man 24.9
bride 24
portrait 23.3
dress 22.6
person 22.5
male 21.6
happiness 21.1
smiling 20.2
women 19.8
cheerful 18.7
family 18.7
together 18.4
two 17.8
parent 17.6
wedding 16.5
kin 16.5
groom 16.4
face 16.3
fashion 15.8
attractive 15.4
child 15.2
home 15.1
smile 14.2
bouquet 14.1
room 13.4
lady 13
husband 12.8
pretty 12.6
daughter 12.5
marriage 12.3
indoors 12.3
lifestyle 12.3
senior 12.2
world 12
romantic 11.6
married 11.5
relationship 11.2
men 11.2
drink 10.9
romance 10.7
20s 10.1
sexy 9.6
wife 9.5
adults 9.5
sitting 9.4
expression 9.4
mature 9.3
indoor 9.1
old 9.1
aged 9
life 9
fun 9
gown 9
classroom 8.8
looking 8.8
hair 8.7
boy 8.7
standing 8.7
drinking 8.6
loving 8.6
culture 8.5
clothing 8.4
camera 8.3
leisure 8.3
holding 8.3
human 8.2
girls 8.2
business 7.9
flowers 7.8
embracing 7.8
color 7.8
bridal 7.8
model 7.8
ceremony 7.8
party 7.7
elderly 7.7
casual 7.6
hand 7.6
females 7.6
laughing 7.6
head 7.6
pair 7.6
wine 7.4
suit 7.2
celebration 7.2
day 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
window 90.8
people 90
group 84.9
family 18.3

Face analysis
AWS Rekognition

Age 19-27
Gender Female, 99.2%
Sad 99.9%
Calm 8.1%
Surprised 6.6%
Fear 6.3%
Happy 5.4%
Angry 2.2%
Disgusted 2.2%
Confused 1.1%

AWS Rekognition

Age 21-29
Gender Female, 95.7%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 6-16
Gender Male, 96.2%
Calm 85.6%
Surprised 8%
Confused 7.2%
Fear 6%
Sad 2.5%
Angry 1.2%
Happy 0.8%
Disgusted 0.7%

AWS Rekognition

Age 11-19
Gender Female, 86.3%
Fear 80.4%
Happy 32.2%
Surprised 6.9%
Angry 2.7%
Sad 2.3%
Confused 0.4%
Disgusted 0.4%
Calm 0.3%

AWS Rekognition

Age 1-7
Gender Female, 79.8%
Calm 75.3%
Sad 25.9%
Surprised 6.5%
Fear 6%
Angry 3.2%
Confused 0.7%
Happy 0.4%
Disgusted 0.2%

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Female 99.2%
Woman 99.2%
Child 98.9%
Girl 98.9%
Boy 98.4%