Human Generated Data

Title

Untitled (women seated holding glasses under Christmas tree)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4192

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.6
Person 99.6
Person 98.5
Person 98.4
Person 98.3
Person 96.2
Person 96.1
Poster 95
Advertisement 95
Collage 95
Person 90.7
Person 90.3
Person 87.1
Person 86.1
People 72.9
Crowd 67.6
Face 67.1
Person 66.2
Meal 65.6
Food 65.6
Photography 65.3
Photo 65.3
Photographer 62.3
Female 60.2
Girl 60.2
Indoors 59.2
Room 57.7
Paparazzi 57.1
Skin 55.3
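
The Amazon tags above pair each label with a confidence score out of 100. Below is a minimal sketch of producing this kind of output with the AWS Rekognition detect_labels API via boto3; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not the settings used to build this record.

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:        # placeholder image file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,                   # the list above stops near 55
)

# Each label carries a name and a confidence score (0-100).
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")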

Clarifai
created on 2019-06-01

group 99.7
people 99.5
group together 98.6
man 96.9
many 96.6
woman 96.6
adult 96.4
monochrome 94.9
child 93.4
several 86.1
leader 79.6
indoors 79.3
education 77.7
five 77.6
facial expression 73.9
war 73.5
sit 72.9
music 72.8
enjoyment 71.6
room 71.3
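
The Clarifai tags follow the same label-plus-confidence pattern; Clarifai's general model reports concept values between 0 and 1, shown above scaled to percentages. A minimal sketch against Clarifai's v2 REST API follows, assuming a placeholder API key and image URL; the general model ID is the commonly documented public one and should be checked against current docs.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                    # placeholder credential
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"   # public general model ID (assumption)

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concept values are 0-1 floats; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")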

Imagga
created on 2019-06-01

people 30.7
person 28.2
businessman 25.6
business 24.3
man 23.7
male 23.4
men 23.2
group 22.6
team 21.5
adult 21.3
women 19
silhouette 18.2
teamwork 15.8
human 15
worker 13.9
happy 13.8
businesswoman 13.6
room 13.1
office 12.9
success 12.9
negative 12.7
work 12.6
classroom 12.1
corporate 12
portrait 11.6
job 11.5
professional 10.9
suit 10.9
black 10.8
crowd 10.6
boss 10.5
together 10.5
modern 10.5
teacher 10.5
film 10.5
body 10.4
executive 10.1
standing 9.6
blackboard 9.5
talking 9.5
meeting 9.4
day 9.4
grunge 9.4
casual 9.3
successful 9.1
attractive 9.1
dance 9
sexy 8.8
boy 8.8
couple 8.7
class 8.7
lifestyle 8.7
sitting 8.6
smile 8.5
career 8.5
manager 8.4
fashion 8.3
nurse 8.2
indoor 8.2
music 8.1
photographic paper 8.1
symbol 8.1
creation 8
hair 7.9
happiness 7.8
education 7.8
model 7.8
product 7.7
leadership 7.7
youth 7.7
sport 7.6
relax 7.6
desk 7.5
sign 7.5
art 7.5
friendship 7.5
company 7.4
style 7.4
light 7.3
life 7.3
lady 7.3
smiling 7.2
newspaper 7.2
looking 7.2
child 7.1
kin 7.1
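
Imagga likewise returns ranked tags with 0-100 confidence scores. A minimal sketch against Imagga's v2 tags endpoint, which uses HTTP Basic authentication; the credentials and image URL are placeholders.

import requests

IMAGGA_KEY = "YOUR_API_KEY"          # placeholder credentials
IMAGGA_SECRET = "YOUR_API_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)

# Each entry nests the label under tag.en alongside a 0-100 confidence.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")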

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 97.4
person 95.3
clothing 94.6
human face 85.5
woman 83
posing 82.4
man 70.6
wedding dress 64
group 55.1
smile 51
old 45.3
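
Microsoft's tags come from the Azure Computer Vision analyze endpoint, which reports confidences between 0 and 1 (scaled to percentages above). A minimal sketch against the REST API, assuming a placeholder endpoint and key; the v3.2 version path is illustrative and postdates the 2019 run recorded above.

import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/photo.jpg"},
)

# Tag confidences are 0-1 floats; scale to match the percentages listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")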

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-43
Gender Male, 91%
Disgusted 2.1%
Sad 7.7%
Happy 65.4%
Surprised 6.8%
Angry 11.5%
Calm 3.7%
Confused 2.7%

AWS Rekognition

Age 30-47
Gender Female, 54%
Confused 45.3%
Sad 48.5%
Calm 46.2%
Happy 49%
Surprised 45.3%
Disgusted 45.2%
Angry 45.5%

AWS Rekognition

Age 26-43
Gender Male, 98.7%
Confused 5.8%
Happy 49.7%
Surprised 5.8%
Angry 5.9%
Disgusted 5%
Calm 15.5%
Sad 12.4%

AWS Rekognition

Age 26-43
Gender Female, 51%
Calm 46.2%
Surprised 45.7%
Sad 46.3%
Confused 45.8%
Disgusted 45.2%
Happy 50.3%
Angry 45.5%

AWS Rekognition

Age 26-43
Gender Female, 53.9%
Angry 45.3%
Sad 47.6%
Happy 51%
Calm 45.5%
Confused 45.2%
Disgusted 45.1%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Disgusted 45.2%
Sad 45.5%
Happy 46%
Surprised 45.6%
Angry 45.3%
Calm 52.2%
Confused 45.3%

AWS Rekognition

Age 30-47
Gender Male, 53.7%
Angry 45.6%
Sad 50.9%
Disgusted 45.4%
Surprised 45.5%
Happy 45.6%
Calm 46.5%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 98.5%
Disgusted 1.2%
Sad 29.4%
Happy 60.4%
Surprised 2.3%
Calm 1.7%
Angry 3%
Confused 2%

AWS Rekognition

Age 45-65
Gender Male, 54.8%
Angry 45.9%
Surprised 45.6%
Disgusted 45.6%
Sad 45.9%
Calm 49.3%
Happy 47%
Confused 45.6%

AWS Rekognition

Age 17-27
Gender Female, 54.7%
Confused 45.4%
Surprised 45.4%
Calm 45.6%
Sad 45.9%
Happy 52.1%
Disgusted 45.2%
Angry 45.4%
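
Each AWS Rekognition block above corresponds to one detected face: an estimated age range, a gender guess with its confidence, and a confidence score per emotion. A minimal sketch of producing such output with boto3's detect_faces call; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # placeholder image file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],              # request age, gender, and emotion estimates
)

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")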

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
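
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the block above reads "Very unlikely". A minimal sketch with the google-cloud-vision client library, assuming application default credentials and a placeholder file name.

from google.cloud import vision

client = vision.ImageAnnotatorClient()   # uses application default credentials

with open("photo.jpg", "rb") as f:       # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood fields print as enum names, e.g. VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)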

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 90.8%
text visuals 8.1%