Human Generated Data

Title

Art School Drawing Class

Date

c. 1950

People

Artist: Ruth Orkin, American, 1921–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous loan, 10.2009

Machine Generated Data

Tags

Amazon
created on 2019-04-04

Person 99.2
Human 99.2
Indoors 98.7
Room 98.7
Person 98.1
Person 98.1
Person 97.9
Person 97.6
Person 97.4
Person 97.2
Person 96.9
Person 96.9
Person 96.5
Classroom 94.9
School 94.9
Person 94.8
Person 94.3
Audience 94.2
Crowd 94.2
Person 94
Person 90.7
Person 88.9
Person 83.5
Person 78.1
Speech 72.3
Person 69.2
Court 58.4
Lecture 56.2

Clarifai
created on 2018-03-23

people 99.9
adult 98.7
group 98.6
group together 97.4
many 96.5
woman 96.2
man 93.4
administration 92.7
several 92.2
war 91.1
leader 91
room 87.1
child 85.9
five 85.6
monochrome 85.3
education 84.7
wear 84.4
military 83
school 82.9
furniture 82.8

Imagga
created on 2018-03-23

man 32.2
people 25.6
person 23.1
adult 22.2
male 22.1
musical instrument 19.1
room 17
lifestyle 16.6
men 16.3
chair 16.2
professional 16
women 15.8
teacher 15.6
business 15.2
indoors 14.9
sitting 13.7
fashion 13.6
sexy 12.8
businessman 12.4
couple 12.2
stringed instrument 12.1
black 12.1
interior 11.5
happy 11.3
home 11.2
indoor 10.9
leisure 10.8
handsome 10.7
pretty 10.5
attractive 10.5
educator 10.4
laptop 10.1
dress 9.9
wind instrument 9.9
group 9.7
casual 9.3
two 9.3
elegance 9.2
technology 8.9
table 8.8
looking 8.8
together 8.8
life 8.7
work 8.6
happiness 8.6
youth 8.5
music 8.3
office 8.2
guitar 8.1
cheerful 8.1
computer 8
clothing 8
love 7.9
boy 7.8
corporate 7.7
couch 7.7
modern 7.7
elegant 7.7
seat 7.6
meeting 7.5
human 7.5
city 7.5
smiling 7.2
worker 7.2
suit 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

indoor 97.3

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Happy 45.1%
Angry 45.9%
Calm 49.5%
Disgusted 45.2%
Confused 45.2%
Sad 48.9%
Surprised 45.1%

AWS Rekognition

Age 45-65
Gender Female, 51.3%
Calm 47.2%
Sad 47.4%
Confused 45.2%
Disgusted 45.3%
Happy 45.5%
Angry 49.1%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 52.2%
Surprised 45.3%
Disgusted 46.9%
Happy 45.5%
Sad 47%
Calm 46.2%
Angry 48.6%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Surprised 45.6%
Confused 45.9%
Happy 46.1%
Sad 46.1%
Calm 45.8%
Angry 48.7%
Disgusted 46.9%

AWS Rekognition

Age 26-43
Gender Male, 53.7%
Angry 45.4%
Surprised 45.4%
Happy 45.2%
Calm 47.7%
Sad 50.9%
Confused 45.4%
Disgusted 45.1%

AWS Rekognition

Age 20-38
Gender Female, 51.5%
Sad 52%
Happy 45.1%
Calm 45.6%
Surprised 45.3%
Disgusted 45.7%
Confused 45.2%
Angry 46%

AWS Rekognition

Age 38-57
Gender Female, 50.2%
Disgusted 50%
Confused 49.5%
Calm 49.5%
Happy 49.6%
Sad 49.7%
Surprised 49.5%
Angry 49.7%

AWS Rekognition

Age 45-65
Gender Male, 53.4%
Sad 46.1%
Angry 47.1%
Happy 45.6%
Disgusted 45.2%
Surprised 45.6%
Confused 45.5%
Calm 50%

AWS Rekognition

Age 35-52
Gender Male, 52%
Calm 47.4%
Disgusted 46.2%
Angry 46.7%
Confused 45.4%
Sad 47.1%
Happy 46.7%
Surprised 45.5%

AWS Rekognition

Age 35-52
Gender Female, 51.8%
Surprised 45.1%
Calm 45.3%
Happy 45.1%
Confused 45.5%
Disgusted 45.1%
Sad 53.6%
Angry 45.3%

AWS Rekognition

Age 35-55
Gender Male, 50.3%
Angry 49.5%
Calm 49.6%
Sad 50.4%
Happy 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 38-59
Gender Male, 54.5%
Angry 45.6%
Confused 46.1%
Happy 45.2%
Sad 46.5%
Surprised 45.5%
Calm 47.7%
Disgusted 48.3%

AWS Rekognition

Age 45-63
Gender Male, 50.9%
Confused 45.1%
Angry 45.2%
Disgusted 50.8%
Sad 45.7%
Happy 45.3%
Calm 47.7%
Surprised 45.2%

Feature Analysis

Amazon

Person 99.2%