Human Generated Data

Title

Untitled (group of women standing and seated in living room)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4196

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.2
Person 99.1
Person 98.7
Person 98.5
Apparel 98.3
Clothing 98.3
Person 97.9
Person 97.7
Person 97.6
Person 97.4
Person 97.4
Person 96.9
Person 96.1
Shorts 94.8
Person 94.8
Person 94.7
Person 93.4
Female 92
Dress 91.5
Footwear 91.3
Shoe 91.3
Person 89.8
Indoors 89
Room 89
Shoe 86.6
Person 86.1
Person 86
School 85.2
Person 80.2
Face 79.4
People 79.2
Furniture 76.7
Suit 75.5
Overcoat 75.5
Coat 75.5
Classroom 74.7
Child 72.5
Kid 72.5
Girl 69.4
Woman 67.7
Person 66.6
Portrait 61.6
Photography 61.6
Photo 61.6
Stage 61.2
Skirt 56.9
Chair 56.5
Table 55.1
Shoe 50.1
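
The Amazon tags above are label-detection output; each number is the service's confidence in percent. A minimal sketch of how comparable labels could be produced with the AWS Rekognition detect_labels API via boto3 (the file name and confidence threshold are illustrative assumptions, not part of this record):

```python
# Minimal sketch: label detection for a local image file with AWS Rekognition.
# "photo.jpg" and MinConfidence=50 are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,
    )

# Print each label with its confidence, e.g. "Person 99.5"
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```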

Clarifai
created on 2019-06-01

people 99.8
group together 97.6
adult 97.5
group 96
many 94.7
woman 93.9
man 93
education 91.4
child 88.7
school 88.3
uniform 81.6
portrait 79.6
wear 78.6
several 78.6
teacher 76.8
crowd 74.9
indoors 69.1
classroom 66.7
recreation 66.4
monochrome 64.8
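
The Clarifai concepts follow the same pattern, with confidences shown in percent. One way such concepts could be requested is a POST to Clarifai's v2 predict endpoint; the model ID, access token, and image URL below are placeholder assumptions, and account-specific details may differ:

```python
# Hedged sketch: query a Clarifai general-recognition model over its REST API.
# MODEL_ID, the access token, and the image URL are placeholder assumptions.
import requests

MODEL_ID = "general-image-recognition"  # assumed ID of Clarifai's general model
response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_PAT"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concepts come back with a 0-1 value; scale to percent, e.g. "people 99.8"
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```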

Imagga
created on 2019-06-01

people 40.1
group 33
person 31.6
men 27.5
silhouette 27.3
nurse 27.3
man 27.1
male 24.1
adult 21
team 19.7
businessman 19.4
crowd 19.2
business 18.8
couple 18.3
human 18
happy 15
women 15
art 14.5
runner 14.5
boy 13.9
fashion 13.6
professional 13.2
portrait 12.9
athlete 12.9
kin 12.8
dance 12.8
graphic 12.4
teamwork 12
sport 11.9
gymnasium 11.6
together 11.4
success 11.3
negative 11.2
party 11.2
occupation 11
family 10.7
contestant 10.5
friendship 10.3
grunge 10.2
black 10.2
design 10.1
life 9.9
suit 9.9
pretty 9.8
silhouettes 9.7
dancer 9.7
style 9.6
body 9.6
child 9.5
outline 9.5
smiling 9.4
happiness 9.4
lifestyle 9.4
youth 9.4
athletic facility 9.3
casual 9.3
dress 9
film 9
symbol 8.7
standing 8.7
love 8.7
friends 8.4
worker 8.2
girls 8.2
businesswoman 8.2
copy space 8.1
celebration 8
job 8
work 7.8
teacher 7.8
modern 7.7
walking 7.6
togetherness 7.5
fun 7.5
player 7.4
performer 7.3
active 7.3
figure 7.3
office 7.2
smile 7.1
facility 7.1
mother 7
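
The Imagga tags could be reproduced through Imagga's REST tagging endpoint, which authenticates with an API key/secret pair. A sketch assuming the image is uploaded directly with the request (credentials and file name are placeholders):

```python
# Hedged sketch: request Imagga tags for a local image.
# API_KEY, API_SECRET, and "photo.jpg" are placeholder assumptions.
import requests

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=("API_KEY", "API_SECRET"),
        files={"image": f},
    )

# Each tag carries a confidence in percent, e.g. "people 40.1"
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```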

Google
created on 2019-06-01

Photograph 97.2
Class 89.2
Snapshot 85.9
Team 83.5
Black-and-white 74.4
Photography 73.8
Room 65.7
State school 62.2
Crew 59.5
Stock photography 59.4
Uniform 52.8
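
The Google labels correspond to Cloud Vision label detection; scores are returned on a 0-1 scale and shown here as percentages. A minimal sketch using the google-cloud-vision client (the file name is an assumption, and the Image class name varies slightly between client versions):

```python
# Minimal sketch: label detection with the Google Cloud Vision client.
# "photo.jpg" is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to percent, e.g. "Photograph 97.2"
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))
```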

Microsoft
created on 2019-06-01

clothing 97.5
person 96
woman 92.1
dress 91.6
outdoor 85.2
smile 83.5
footwear 71.2
man 71.2
posing 54.9
wedding dress 50.4
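
The Microsoft tags match the shape of Azure Computer Vision image tagging, where confidences are 0-1 and rendered here as percentages. A sketch using the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and file name are placeholders):

```python
# Hedged sketch: image tagging with Azure Computer Vision.
# The endpoint URL, subscription key, and "photo.jpg" are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidences are 0-1; scale to percent, e.g. "clothing 97.5"
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```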


Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Happy 45%
Calm 47.5%
Angry 45.3%
Surprised 45.1%
Disgusted 45%
Confused 45.3%
Sad 51.7%

AWS Rekognition

Age 17-27
Gender Female, 54.1%
Disgusted 45.5%
Sad 46.3%
Happy 47%
Surprised 46.1%
Angry 45.4%
Calm 49.6%
Confused 45.2%

AWS Rekognition

Age 23-38
Gender Female, 52.9%
Confused 45.4%
Surprised 45.5%
Calm 50.8%
Sad 45.9%
Happy 46.5%
Disgusted 45.6%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Sad 48.4%
Surprised 45.4%
Happy 47.8%
Angry 45.4%
Calm 47.5%
Confused 45.3%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.6%
Happy 46.3%
Disgusted 45.8%
Angry 46%
Calm 47.6%
Sad 48.2%
Confused 45.5%
Surprised 45.6%

AWS Rekognition

Age 35-52
Gender Female, 51.8%
Angry 45.8%
Confused 45.4%
Surprised 45.4%
Happy 47.2%
Sad 45.7%
Calm 45.4%
Disgusted 50%

AWS Rekognition

Age 35-52
Gender Male, 54.2%
Sad 53.9%
Surprised 45.1%
Disgusted 45.1%
Angry 45.2%
Calm 45.6%
Happy 45.1%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Disgusted 45.4%
Sad 47.1%
Happy 49.9%
Confused 45.3%
Surprised 45.5%
Angry 45.5%
Calm 46.3%

AWS Rekognition

Age 29-45
Gender Male, 50.2%
Calm 52.1%
Surprised 45.3%
Sad 45.6%
Confused 45.4%
Disgusted 45.1%
Happy 46.3%
Angry 45.3%

AWS Rekognition

Age 26-43
Gender Female, 53%
Sad 49.2%
Angry 45.9%
Disgusted 45.6%
Surprised 45.3%
Happy 45.3%
Calm 48.3%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 52.6%
Sad 45.2%
Confused 45.3%
Disgusted 52%
Surprised 45.3%
Angry 45.3%
Happy 45.3%
Calm 46.5%

AWS Rekognition

Age 26-43
Gender Male, 53.9%
Happy 49.5%
Confused 45.4%
Disgusted 45.5%
Sad 46.7%
Calm 46.9%
Angry 45.5%
Surprised 45.5%

AWS Rekognition

Age 15-25
Gender Male, 52.6%
Happy 45.6%
Disgusted 45.4%
Angry 45.3%
Calm 51.6%
Surprised 45.7%
Confused 45.4%
Sad 46%

AWS Rekognition

Age 48-68
Gender Male, 54.4%
Angry 45.3%
Calm 51.4%
Sad 45.5%
Surprised 45.5%
Disgusted 45.7%
Happy 46.3%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Confused 45.5%
Calm 48.2%
Sad 47.4%
Surprised 45.7%
Angry 45.4%
Disgusted 45.3%
Happy 47.6%

AWS Rekognition

Age 23-38
Gender Female, 51.4%
Sad 47.6%
Calm 49.6%
Surprised 45.5%
Angry 45.5%
Disgusted 45.7%
Happy 45.7%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Surprised 49.5%
Happy 49.7%
Disgusted 49.5%
Calm 50.2%
Sad 49.6%
Confused 49.5%
Angry 49.5%
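
The per-face estimates above (age range, gender, and emotion confidences) have the shape of AWS Rekognition detect_faces output when all facial attributes are requested. A minimal sketch (the file name is an illustrative assumption):

```python
# Minimal sketch: per-face age, gender, and emotion estimates with Rekognition.
# "photo.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "HAPPY"); title-case for display
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```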

Feature analysis

Amazon

Person 99.5%
Shoe 91.3%
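
The feature-analysis entries appear to be the subset of Amazon labels that come back with localized instances (bounding boxes). A short sketch of reading those instances, assuming `response` is the result of the detect_labels call shown earlier:

```python
# Hedged sketch: pull localized instances (bounding boxes) out of a
# detect_labels response; `response` is assumed from the earlier call.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(
            f"{label['Name']} {instance['Confidence']:.1f}% "
            f"at left={box['Left']:.2f}, top={box['Top']:.2f}"
        )
```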


Text analysis

Amazon

a
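
A single detected string like this is typical text-detection output for a photograph with little legible writing. A minimal sketch using AWS Rekognition's detect_text (the file name is an illustrative assumption):

```python
# Minimal sketch: text detection with AWS Rekognition.
# "photo.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is either a LINE or a WORD with its own confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```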