Human Generated Data

Title

Untitled (school, Ozarks, Arkansas)

Date

October 1935, printed later

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Robert M. Sedgwick II Fund, P1976.106

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 98.9
Architecture 98.7
Building 98.7
Hospital 98.7
Person 98.4
Person 98.1
Baby 98.1
Person 97.3
Boy 97.3
Child 97.3
Male 97.3
Person 97.2
Male 97.2
Adult 97.2
Man 97.2
School 97.1
Person 95.7
Classroom 95
Indoors 95
Room 95
Face 93.3
Head 93.3
Furniture 77.9
Table 77.7
Dining Room 57.8
Dining Table 57.8
Clinic 57.4
Desk 56
People 55.8
Photography 55.7
Portrait 55.7
Wood 55.6
Operating Theatre 55.6
Art 55.5
Painting 55.5
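
The labels above, each paired with a confidence score, are the output of Amazon Rekognition's label detection. A minimal sketch of the corresponding call, assuming the boto3 SDK and a local copy of the photograph (photo.jpg is a placeholder name, and the MinConfidence cut-off is an assumption based on the lowest scores shown):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder path, not part of this record
    image_bytes = f.read()

# Label detection returns object/scene names with confidence scores, as listed above
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))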

Clarifai
created on 2018-05-10

people 100
group 99.8
adult 99.2
group together 98.8
man 97.8
wear 97.6
woman 97.4
many 95.7
administration 95.2
leader 94
child 93.2
outfit 91.3
room 90.8
portrait 88.7
music 88.6
several 88.4
home 87.8
five 87.5
furniture 86.8
three 86.7

Imagga
created on 2023-10-06

man 40.3
male 31.3
people 31.2
kin 26
adult 25.2
person 24.6
couple 24.4
men 24
businessman 23.8
business 23.7
group 20.1
women 19.8
smiling 19.5
happy 19.4
cheerful 18.7
team 17
office 16.6
smile 16.4
happiness 15.7
indoors 14.9
sitting 14.6
two 14.4
classroom 13.5
family 13.3
school 13.3
together 13.1
senior 13.1
teacher 13.1
student 12.8
colleagues 12.6
lifestyle 12.3
meeting 12.2
education 12.1
30s 11.5
groom 11.4
child 11.4
businesspeople 11.4
standing 11.3
executive 11.2
home 11.2
mature 11.1
teamwork 11.1
portrait 11
20s 11
businesswoman 10.9
job 10.6
professional 10.6
desk 10.4
love 10.3
indoor 10
staff 9.7
talking 9.5
holding 9.1
room 9.1
color 8.9
musical instrument 8.8
worker 8.8
coworkers 8.8
25-30 years 8.8
mid adult 8.7
mother 8.6
table 8.6
four 8.6
corporate 8.6
face 8.5
friends 8.4
world 8.4
communication 8.4
wine 8.3
occupation 8.2
suit 8.2
blackboard 8.1
looking 8
interior 8
black 7.9
associates 7.9
40s 7.8
discussion 7.8
waiter 7.8
bride 7.7
youth 7.7
drinking 7.6
employee 7.6
career 7.6
togetherness 7.5
horizontal 7.5
friendship 7.5
manager 7.4
friendly 7.3
restaurant 7.3
spectator 7.2
handsome 7.1
work 7.1
day 7.1
life 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.7
man 92.3
standing 79
group 72.8
people 60.6

Color Analysis

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 97.5%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 3.2%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 18-24
Gender Male, 99.5%
Calm 88.7%
Sad 8.4%
Surprised 6.4%
Fear 6%
Confused 0.4%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 6-14
Gender Male, 99.9%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0.2%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 6-16
Gender Male, 96.1%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 14-22
Gender Female, 64.7%
Calm 59.7%
Sad 38.7%
Fear 7.9%
Surprised 7.1%
Confused 5%
Disgusted 2.7%
Angry 1.9%
Happy 1%

AWS Rekognition

Age 18-24
Gender Female, 83.2%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
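
The age ranges, gender estimates, and emotion scores above come from Amazon Rekognition face detection. A minimal sketch of how such output might be requested with boto3 (photo.jpg is a placeholder; requesting Attributes=["ALL"] is an assumption needed to get age, gender, and emotions in one response):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder path, not part of this record
    image_bytes = f.read()

# Request the full attribute set so age range, gender, and emotions are returned
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]          # e.g. {"Low": 6, "High": 14}
    gender = face["Gender"]         # e.g. {"Value": "Male", "Confidence": 97.5}
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(age, gender["Value"], top_emotion["Type"], round(top_emotion["Confidence"], 1))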

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 13
Gender Female

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 11
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
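
Unlike the numeric scores above, the Google Vision rows report each face attribute as a likelihood bucket (Very unlikely through Very likely). A minimal sketch using the google-cloud-vision client (photo.jpg is a placeholder file name):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder path, not part of this record
    image = vision.Image(content=f.read())

# Each detected face carries likelihood enums for the attributes listed above
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(vision.Likelihood(face.surprise_likelihood).name,
          vision.Likelihood(face.anger_likelihood).name,
          vision.Likelihood(face.sorrow_likelihood).name,
          vision.Likelihood(face.joy_likelihood).name,
          vision.Likelihood(face.headwear_likelihood).name,
          vision.Likelihood(face.blurred_likelihood).name)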

Feature analysis

Amazon

Person 98.9%
Baby 98.1%
Boy 97.3%
Child 97.3%
Male 97.3%
Adult 97.2%
Man 97.2%