Human Generated Data

Title

Untitled (school, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1089

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Child 98.2
Female 98.2
Girl 98.2
Person 98.2
Person 98
Child 97.7
Female 97.7
Girl 97.7
Person 97.7
Person 97.4
Face 97.2
Head 97.2
Photography 97.2
Portrait 97.2
Furniture 94.9
Table 94.9
Indoors 88.9
Architecture 86.6
Building 86.6
Classroom 86.6
Room 86.6
School 86.6
Reading 65.2
People 62.2
Desk 57.6
Dining Table 57.3
Hospital 57
Electronics 57
Dining Room 56.6
Lady 56.5
Book 56.3
Library 56.3
Publication 56.3
Computer 56.2
Laptop 56.2
Pc 56.2

Clarifai
created on 2018-05-11

people 99.9
child 98.7
adult 97.5
two 96.2
group 95.2
indoors 95.2
teacher 95.1
furniture 94.4
one 94.3
classroom 94.1
room 93.6
group together 93.4
man 93
desk 92.7
concentration 92
woman 92
boy 90.9
education 90.6
sit 89.6
administration 88.6

Imagga
created on 2023-10-07

person 27
adult 26.7
man 25.5
musical instrument 25.2
people 22.3
office 21.7
sitting 21.5
laptop 20.6
happy 20
business 20
computer 18.7
portrait 18.1
male 17.7
device 16.5
grand piano 16.1
television 16
piano 15.3
smiling 14.5
electronic instrument 14.3
percussion instrument 13.7
businesswoman 13.6
work 13.3
one 12.7
stringed instrument 12.6
attractive 12.6
professional 12.2
keyboard instrument 12.1
pretty 11.9
suit 11.7
model 11.7
billboard 11.6
lifestyle 11.6
businessman 11.5
lady 11.4
fun 11.2
corporate 11.2
hair 11.1
youth 11.1
women 11.1
indoor 10.9
smile 10.7
job 10.6
couple 10.5
boy 10.4
body 10.4
men 10.3
casual 10.2
communication 10.1
fashion 9.8
worker 9.8
interior 9.7
working 9.7
education 9.5
sit 9.5
day 9.4
happiness 9.4
senior 9.4
child 9.3
manager 9.3
executive 9.2
successful 9.1
confident 9.1
desk 8.9
looking 8.8
indoors 8.8
home 8.8
equipment 8.7
signboard 8.6
expression 8.5
adults 8.5
table 8.5
outdoor 8.4
relaxation 8.4
joy 8.3
leisure 8.3
room 8.2
clothing 8.2
student 8.1
cheerful 8.1
copy space 8.1
school 8.1
childhood 8.1
brunette 7.8
black 7.8
color 7.8
show 7.6
thinking 7.6
hand 7.6
study 7.5
phone 7.4
gymnastic apparatus 7.4
joyful 7.3
balance beam 7.3
teenager 7.3
success 7.2
sexy 7.2
kid 7.1
modern 7

Microsoft
created on 2018-05-11

black 69.7

Face analysis

AWS Rekognition

Age 9-17
Gender Male, 96.6%
Calm 97%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 2%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 11-19
Gender Female, 99.8%
Sad 99.8%
Calm 23.3%
Surprised 6.4%
Fear 6%
Confused 2.1%
Angry 0.7%
Disgusted 0.6%
Happy 0.5%

AWS Rekognition

Age 22-30
Gender Female, 90.7%
Calm 76.4%
Fear 9.9%
Confused 7.9%
Surprised 7%
Happy 3.5%
Sad 2.2%
Disgusted 1.5%
Angry 0.6%

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Child 98.2%
Female 98.2%
Girl 98.2%
Person 98.2%

Text analysis

Amazon

3