Human Generated Data

Title

Untitled (school, Red House, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1191

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Architecture 99.6
Building 99.6
School 99.6
Person 99.4
People 99.3
Classroom 99.2
Indoors 99.2
Room 99.2
Person 99
Adult 99
Male 99
Man 99
Person 96.8
Child 96.8
Female 96.8
Girl 96.8
Person 96.4
Person 95.8
Adult 95.8
Male 95.8
Man 95.8
Face 93.8
Head 93.8
Person 91.3
Baby 91.3
Person 83.9
Person 82.3
Desk 74.5
Furniture 74.5
Table 74.5
Person 72.6
Adult 72.6
Adult 72.6
Female 72.6
Female 72.6
Bride 72.6
Wedding 72.6
Woman 72.6
Person 69.6
Person 67.8
Adult 67.8
Female 67.8
Woman 67.8
Photography 56.6
Portrait 56.6
Crowd 56.4
Person 56.2
Reading 56.1
Student 55.9
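
A minimal sketch of how label/confidence pairs like the Amazon list above could be produced, assuming the tags come from AWS Rekognition's DetectLabels API called through boto3. The local filename and the MinConfidence threshold are illustrative assumptions, not part of the museum record.

```python
import boto3

rekognition = boto3.client("rekognition")  # credentials/region taken from the environment

# Hypothetical local copy of the photograph; the museum record does not name a file.
with open("shahn_red_house_school.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest score in the list above
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```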

Clarifai
created on 2018-05-11

group 99.8
people 99.8
group together 99.4
many 98.5
man 96.5
adult 96.1
woman 96.1
administration 93.5
leader 92.9
audience 91.5
crowd 91.3
sitting 86.4
several 85.4
facial expression 84.5
music 82.6
recreation 82.3
actress 82.2
education 81.6
singer 81.3
sit 80.8

Imagga
created on 2023-10-06

people 31.2
man 30.2
buddy 28.6
group 28.2
male 27
person 26.7
classroom 24.6
adult 22.4
smiling 21.7
women 21.3
together 21
happy 20.7
room 19.8
friends 17.8
child 17.6
fun 17.2
couple 16.5
sitting 16.3
men 16.3
smile 15.7
black 15.3
lifestyle 15.2
business 14.6
indoors 14
youth 13.6
portrait 13.6
attractive 13.3
friendship 12.2
love 11.8
work 11.8
student 11.3
education 11.3
children 10.9
photographer 10.9
leisure 10.8
team 10.7
school 10.6
boy 10.4
happiness 10.2
model 10.1
teenager 10
music 10
computer 9.8
fashion 9.8
cheerful 9.7
kid 9.7
businessman 9.7
success 9.7
enjoying 9.5
spectator 9.4
casual 9.3
brass 9.3
professional 9.2
teen 9.2
wind instrument 9.1
nightlife 8.8
looking 8.8
home 8.8
face 8.5
sit 8.5
meeting 8.5
two 8.5
learning 8.4
musical instrument 8.4
teamwork 8.3
silhouette 8.3
megaphone 8.3
teacher 8.2
20s 8.2
indoor 8.2
businesswoman 8.2
handsome 8
family 8
posing 8
job 8
working 7.9
hair 7.9
singer 7.9
color 7.8
corporate 7.7
pretty 7.7
four 7.7
businesspeople 7.6
communication 7.6
horizontal 7.5
drink 7.5
city 7.5
study 7.5
style 7.4
lady 7.3
girls 7.3
device 7.2
laptop 7.2
holiday 7.2
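
A sketch of how Imagga-style tags could be retrieved, assuming Imagga's v2 tagging endpoint with HTTP Basic auth; the API key, secret, image URL, and the response shape used below are assumptions based on Imagga's public documentation, not taken from this record.

```python
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/shahn_red_house_school.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score in percent.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```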

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
indoor 88.6
people 80.5
group 60
conference room 13.9
crowd 7.7

Color Analysis

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 70.7%
Sad 63.2%
Calm 37.2%
Angry 24.1%
Surprised 7%
Fear 6.4%
Confused 3.2%
Disgusted 1.7%
Happy 0.5%

AWS Rekognition

Age 31-41
Gender Female, 78.9%
Happy 88.9%
Surprised 6.9%
Fear 6.1%
Calm 5.4%
Sad 2.4%
Disgusted 1.5%
Confused 0.8%
Angry 0.7%

AWS Rekognition

Age 6-16
Gender Female, 100%
Happy 85.4%
Surprised 14.1%
Fear 6%
Sad 2.2%
Calm 2.2%
Confused 0.7%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 18-26
Gender Male, 89.3%
Sad 96.9%
Calm 44.5%
Surprised 6.7%
Fear 6.1%
Happy 1.3%
Confused 0.7%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 16-22
Gender Female, 97.8%
Calm 98%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 1.4%
Confused 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 55.3%
Sad 32%
Angry 10%
Surprised 8.7%
Fear 7%
Confused 3.4%
Disgusted 2.2%
Happy 0.8%
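
A minimal sketch of how the per-face age range, gender, and emotion estimates above could be produced with AWS Rekognition's DetectFaces API via boto3; the filename is a hypothetical placeholder and the output formatting only approximates the listing above.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_red_house_school.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. HAPPY); capitalize for display.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```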

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
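
A sketch of how the Google Vision face attributes above (likelihood buckets rather than percentages) could be produced with the google-cloud-vision client library; the filename is a hypothetical placeholder and credentials are assumed to come from the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_red_house_school.jpg", "rb") as f:  # hypothetical local filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood values are enum buckets such as VERY_UNLIKELY or VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```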

Feature analysis

Amazon

Person 99.4%
Adult 99%
Male 99%
Man 99%
Child 96.8%
Female 96.8%
Girl 96.8%
Baby 91.3%
Bride 72.6%
Woman 72.6%

Categories

Imagga

paintings art 77.9%
people portraits 20.8%