Human Generated Data

Title

Audience Reaction

Date

c. 1940 - c. 1950

People

Artist: Weegee (Arthur Fellig), American 1899 - 1968

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. George A. Violin, P1995.50

Machine Generated Data

Tags

Amazon
created on 2019-03-29

Indoors 99.7
Interior Design 99.7
Human 99.1
Crowd 99.1
Audience 99.1
Person 98.9
Person 98.8
Person 98.6
Person 98
Person 97.1
Person 95.9
Person 91.8
Person 88.8
Person 88.8
Art 84.9
Painting 84.9
People 79.7
Person 75.9
Person 75.1
Pub 60.4
Bar Counter 60.4
Face 59.9
Performer 59.8
Room 58
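
A minimal sketch of how label/score pairs like the Amazon list above can be generated with AWS Rekognition via boto3; the file name and the MaxLabels/MinConfidence settings are assumptions, not documented parameters of this dataset:

```python
import boto3

# Rekognition returns label names with confidence scores on a 0-100 scale,
# matching the "label score" pairs listed above.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,       # assumed cap
        MinConfidence=55,   # assumed threshold
    )
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```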

Clarifai
created on 2018-02-09

people 100
group 99.9
many 99.9
adult 99.3
group together 98
man 98
leader 97.6
administration 96.9
woman 96.6
sit 96.3
several 96
seat 95.2
wear 94.8
furniture 92.9
military 90.7
chair 90.3
war 89.6
crowd 88.6
recreation 87.8
child 85.8
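
A hedged sketch of the same kind of request against Clarifai's v2 REST API, which returns concept/value pairs like the list above; the API key, image URL, and model identifier are placeholders:

```python
import requests

headers = {"Authorization": "Key YOUR_API_KEY"}  # placeholder key
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

# "general-image-recognition" is assumed to be the ID of Clarifai's
# public general model; substitute the model actually used.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in [0, 1]; scale to percent to match the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```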

Imagga
created on 2018-02-09

person 24.5
people 21.2
sexy 20.9
adult 19.1
man 18.8
male 17.8
fashion 16.6
black 16.5
style 15.6
attractive 14.7
pretty 14
portrait 12.9
face 12.8
model 12.4
clothing 12.2
lady 12.2
human 12
music 10.9
dark 10.8
helmet 10.7
fun 10.5
uniform 10.4
body 10.4
men 10.3
hair 10.3
dress 9.9
brass 9.8
posing 9.8
art 9.5
elegant 9.4
lifestyle 9.4
stage 9.2
makeup 9.1
vintage 9.1
sensual 9.1
one 9
night 8.9
statue 8.7
women 8.7
love 8.7
play 8.6
sitting 8.6
expression 8.5
youth 8.5
passion 8.5
elegance 8.4
retro 8.2
dance 8.2
metal 8
world 8
couple 7.8
rock 7.8
war 7.7
device 7.7
old 7.7
drum 7.6
hot 7.5
mask 7.5
disco 7.5
equipment 7.5
playing 7.3
looking 7.2
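
A sketch of the corresponding request to Imagga's /v2/tags endpoint, which returns tag/confidence pairs like the Imagga list above; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_KEY"        # placeholder credentials
API_SECRET = "YOUR_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```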

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 99.9
sitting 95.5
group 93.3
people 92.5
old 69.9
posing 45.5
crowd 33
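
A sketch of how tag/confidence pairs like the Microsoft list above can be obtained with the Azure Computer Vision SDK for Python; the endpoint and key are placeholders, and tag_image_in_stream is one of several ways to request tags:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_REGION.api.cognitive.microsoft.com",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),           # placeholder key
)
with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)
for tag in result.tags:
    # The SDK reports confidence in [0, 1]; scale to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```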

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 94.6%
Happy 6.3%
Disgusted 0.4%
Sad 43.4%
Angry 3.1%
Calm 42.7%
Surprised 1.6%
Confused 2.6%

AWS Rekognition

Age 35-52
Gender Male, 98.5%
Confused 4.2%
Sad 11.6%
Calm 73.4%
Happy 1.6%
Surprised 2.3%
Angry 5.3%
Disgusted 1.6%

AWS Rekognition

Age 35-52
Gender Female, 51.1%
Happy 0.6%
Sad 22.9%
Confused 1.7%
Angry 1.7%
Surprised 0.9%
Disgusted 0.4%
Calm 71.8%

AWS Rekognition

Age 26-44
Gender Male, 99.7%
Sad 24.4%
Surprised 1.4%
Calm 16.7%
Confused 13.1%
Angry 36.5%
Happy 0.5%
Disgusted 7.4%

AWS Rekognition

Age 26-43
Gender Female, 95.6%
Angry 2.4%
Disgusted 1.8%
Calm 19%
Happy 1.5%
Surprised 1.6%
Confused 1.1%
Sad 72.6%

AWS Rekognition

Age 38-59
Gender Male, 98.4%
Calm 32.5%
Angry 24.2%
Surprised 2.6%
Happy 1.3%
Disgusted 2%
Confused 16.7%
Sad 20.6%

AWS Rekognition

Age 10-15
Gender Female, 66.6%
Surprised 0.6%
Angry 1.6%
Disgusted 0.5%
Confused 0.9%
Happy 0.5%
Sad 70.5%
Calm 25.4%

AWS Rekognition

Age 29-45
Gender Male, 85.2%
Angry 7.3%
Surprised 2.7%
Happy 8.7%
Confused 5.4%
Disgusted 33.8%
Sad 29.8%
Calm 12.2%
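
A minimal sketch of how age-range, gender, and per-emotion rows like the AWS Rekognition blocks above can be produced with the DetectFaces operation; Attributes=["ALL"] is required to get emotions, and the file name is a placeholder:

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # the default returns only a minimal attribute set
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence per emotion, as listed above
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```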

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 32
Gender Male
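
A hedged sketch of the equivalent call with the Azure Face SDK (whose attribute features have since been restricted), which returned a point age estimate and a gender value per face, as in the blocks above; the endpoint, key, and file name are placeholders:

```python
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://YOUR_REGION.api.cognitive.microsoft.com",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),           # placeholder key
)
with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    faces = face_client.face.detect_with_stream(
        f, return_face_attributes=["age", "gender"]
    )
for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender}")  # gender is a string-backed enum
```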

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
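
A sketch of how likelihood rows like the Google Vision blocks above can be produced with the google-cloud-vision client; it assumes GOOGLE_APPLICATION_CREDENTIALS is configured, and the file name is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # mirroring the rows listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```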

Feature analysis

Amazon

Person 98.9%
Painting 84.9%

Categories

Imagga

events parties 97.8%
people portraits 2.2%
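
A hedged sketch of how category scores like the two above can be requested from Imagga's categorizer endpoint; "personal_photos" is assumed to be the categorizer that produces "events parties" and "people portraits", and the credentials and URL are placeholders:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
    params={"image_url": "https://example.com/photo.jpg"},   # hypothetical URL
    auth=("YOUR_KEY", "YOUR_SECRET"),                        # placeholder credentials
)
for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```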