Human Generated Data

Title

Untitled (portrait of nine men with seven dressed as women)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5940

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.7
Person 99.7
Person 99.4
Person 99.4
Person 99.3
Person 99.1
Person 99
Apparel 98.9
Clothing 98.9
Person 98.2
Person 97.7
Shorts 97.5
Person 95.8
Face 89.8
Outdoors 86.4
People 83.7
Nature 77.3
Person 62.6
Photo 62.2
Portrait 62.2
Photography 62.2
Furniture 62
Chair 62
Countryside 60.4
Sailor Suit 59.8
Child 59.7
Kid 59.7
Rural 55.7
Hut 55.7
Building 55.7
Shack 55.7
Crowd 55.3
Person 48.6
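
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be generated with boto3; the file name and thresholds are illustrative assumptions, not values from this record.

```python
# Minimal sketch: generating label/confidence pairs like the Amazon tags above
# with AWS Rekognition DetectLabels via boto3. The file name and thresholds are
# illustrative assumptions, not details from the original record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical local scan of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # upper bound on returned labels
    MinConfidence=45.0,  # drop low-confidence labels, similar to the cutoff seen above
)

# Print "Label confidence" pairs in the same flat format used in this record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```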

Clarifai
created on 2019-06-01

people 99.9
group together 99.7
group 99.2
adult 98.7
many 98
man 97.3
woman 96.1
several 94.6
wear 92.1
five 89
four 87.6
facial expression 87.5
athlete 86
sports equipment 85.2
boxer 84.6
outfit 83.9
child 83.4
three 83.3
music 82.2
leader 81.8

Imagga
created on 2019-06-01

kin 69.5
athlete 28.4
person 28.4
beach 27.9
ballplayer 27.9
player 27.2
people 26.8
man 24.8
contestant 21.7
male 21.3
sky 21
sport 20.6
silhouette 19.9
sunset 19.8
fun 19.4
summer 19.3
adult 19.1
active 17.6
couple 17.4
sea 16.4
men 16.3
happy 16.3
lifestyle 15.9
love 15.8
outdoors 15.7
happiness 15.7
water 14.7
exercise 14.5
ocean 14.1
friendship 14
family 13.3
child 13.3
world 13.2
outdoor 13
joy 12.5
sand 12.2
together 11.4
fitness 10.8
group 10.5
boy 10.4
women 10.3
leisure 10
vacation 9.8
life 9.5
youth 9.4
training 9.2
travel 9.1
girls 9.1
portrait 9.1
black 9
sun 8.8
body 8.8
grass 8.7
standing 8.7
run 8.7
mother 8.6
play 8.6
party 8.6
dusk 8.6
sibling 8.5
two 8.5
friends 8.4
clouds 8.4
evening 8.4
father 8.3
human 8.2
coast 8.1
recreation 8.1
smiling 8
bride 7.7
energy 7.6
healthy 7.6
parent 7.5
landscape 7.4
holding 7.4
wedding 7.4
children 7.3
sexy 7.2
holiday 7.2
dance 7.1
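
The Imagga tags above follow the same tag/confidence pattern. A hedged sketch of fetching them with the requests library is below; the endpoint path, credentials, image URL, and response layout are assumptions based on Imagga's public v2 tagging API, not details from this record.

```python
# Hedged sketch: fetching tag/confidence pairs like the Imagga tags above from
# Imagga's v2 tagging endpoint. Credentials and the image URL are placeholders;
# the response layout is an assumption based on Imagga's public v2 API.
import requests

API_KEY = "your_api_key"        # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```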

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 99.7
person 97.7
smile 94.3
clothing 93.4
man 91.3
old 86.5
group 85.4
standing 83.6
player 67.1
white 65.8
team 48.5
vintage 32.9
female 29.3
male 15.9
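
The Microsoft tags read like output of the Azure Computer Vision analyze operation. A hedged sketch of one way such tags could be retrieved over its REST interface is below; the endpoint, API version, and key are placeholders, not details taken from this record.

```python
# Hedged sketch: retrieving tag/confidence pairs like the Microsoft tags above
# from the Azure Computer Vision "analyze" REST operation. Endpoint, API version,
# and subscription key are placeholders, not details from the original record.
import requests

ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"  # placeholder endpoint
SUBSCRIPTION_KEY = "your_subscription_key"                      # placeholder credential
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"   # hypothetical image URL

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')  # API confidences are 0-1
```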

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Disgusted 1.3%
Confused 2.6%
Surprised 5%
Calm 9.6%
Happy 76.4%
Angry 2.7%
Sad 2.4%

AWS Rekognition

Age 26-43
Gender Female, 57.1%
Disgusted 3.9%
Surprised 22.1%
Angry 8.2%
Confused 5.7%
Sad 18.7%
Calm 31.7%
Happy 9.6%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Angry 45.5%
Calm 46%
Sad 46.1%
Surprised 45.7%
Disgusted 45.4%
Happy 50.8%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Male, 97.1%
Surprised 3.8%
Disgusted 1.8%
Calm 2.5%
Happy 82.3%
Sad 3%
Angry 4.6%
Confused 2%

AWS Rekognition

Age 35-52
Gender Male, 54.7%
Surprised 46.2%
Happy 48%
Disgusted 45.7%
Calm 47.8%
Sad 46.2%
Confused 45.5%
Angry 45.7%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Sad 45.4%
Happy 53.6%
Confused 45.2%
Angry 45.2%
Surprised 45.2%
Disgusted 45.1%
Calm 45.4%

AWS Rekognition

Age 23-38
Gender Male, 54.9%
Angry 45.6%
Happy 45.5%
Confused 45.4%
Sad 45.8%
Calm 52.4%
Surprised 45.3%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 51.8%
Disgusted 47.4%
Happy 49.8%
Surprised 46.1%
Sad 45.3%
Angry 45.5%
Confused 45.3%
Calm 45.6%

AWS Rekognition

Age 30-47
Gender Female, 52.6%
Disgusted 45.1%
Confused 45.1%
Angry 45.2%
Calm 45.2%
Surprised 45.2%
Happy 53.8%
Sad 45.2%
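
The per-face age ranges, gender estimates, and emotion scores above match the shape of AWS Rekognition's DetectFaces output. Below is a minimal sketch of producing similar blocks with boto3; the file name is an assumption, and the printout mirrors this record's layout rather than the raw API response.

```python
# Minimal sketch: producing per-face age range, gender, and emotion scores like
# the blocks above with AWS Rekognition DetectFaces via boto3. The file name is
# an illustrative assumption, not a detail from the original record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
    print()
```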

Feature analysis

Amazon

Person 99.7%

Categories