Human Generated Data

Title

Untitled (portrait of nine men with seven dressed as women)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5942

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Clothing 99.7
Apparel 99.7
Human 99.7
Person 99.7
Person 99.6
Person 99.6
Person 99.4
Shorts 99
Person 98.8
Person 98.6
Person 98
Person 97.7
Furniture 97.6
Chair 97.6
Face 97.2
Person 96.6
People 87.7
Person 77.3
Shirt 73.7
Portrait 71.8
Photo 71.8
Photography 71.8
Crowd 68.6
Smile 68.1
Sailor Suit 63.3
Female 60.7
Overcoat 60.2
Suit 60.2
Coat 60.2
Outdoors 57.1

Clarifai
created on 2019-06-01

people 100
group together 99.7
group 99.4
adult 99.2
many 98.7
man 98.2
woman 97.7
several 96.6
wear 95.8
five 94.9
four 92.9
facial expression 91.6
portrait 89.8
sports equipment 89.1
administration 89.1
three 89
boxer 88.1
child 87.8
leader 87.4
athlete 86.8

Imagga
created on 2019-06-01

people 29
person 25.3
silhouette 24.8
male 24.8
man 24.2
athlete 21.9
ballplayer 20.9
beach 19.4
sky 19.1
player 18.9
sport 18.8
men 18
brass 17.1
kin 16.7
summer 16.7
sunset 16.2
wind instrument 16.2
contestant 15.5
active 15.5
adult 15.2
ocean 14.1
group 13.7
sea 13.3
musical instrument 13.2
friendship 13.1
couple 13
exercise 12.7
travel 12.7
lifestyle 12.3
black 12
water 12
outdoor 11.5
fun 11.2
love 11
world 10.9
boy 10.4
women 10.3
life 10.1
girls 10
happy 10
professional 10
businessman 9.7
outdoors 9.7
together 9.6
happiness 9.4
evening 9.3
recreation 9
family 8.9
nurse 8.8
sand 8.7
dusk 8.6
business 8.5
leisure 8.3
freedom 8.2
landscape 8.2
child 8.1
activity 8.1
success 8
art 7.9
dance 7.8
play 7.7
motion 7.7
youth 7.7
clothing 7.6
two 7.6
power 7.5
friends 7.5
human 7.5
teen 7.3
tourist 7.2
pose 7.2
sun 7.2
fitness 7.2
body 7.2
coast 7.2
team 7.2
teacher 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 99.8
posing 99.5
clothing 95.3
text 92.7
man 91.2
outdoor 90.6
smile 90.3
standing 88.4
group 81.1
old 62.4
footwear 58.8
people 58.1
team 33.2

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 58.8%
Angry 6.4%
Surprised 8.6%
Disgusted 2.7%
Sad 13.4%
Calm 61.1%
Happy 4.7%
Confused 3.1%

AWS Rekognition

Age 26-43
Gender Male, 70.5%
Disgusted 3.4%
Happy 64.5%
Surprised 12.7%
Sad 4.5%
Angry 5%
Confused 2.8%
Calm 7.1%

AWS Rekognition

Age 20-38
Gender Female, 52.8%
Confused 45.4%
Surprised 45.8%
Angry 46.5%
Calm 47.6%
Disgusted 48.6%
Sad 45.6%
Happy 45.6%

AWS Rekognition

Age 35-52
Gender Male, 54.6%
Happy 46.9%
Disgusted 45.5%
Sad 45.3%
Surprised 45.7%
Angry 45.3%
Calm 50.9%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 94.9%
Calm 7.6%
Disgusted 0.8%
Sad 2.6%
Surprised 1.8%
Happy 85%
Confused 0.8%
Angry 1.4%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Disgusted 45.1%
Happy 46.6%
Surprised 45.3%
Sad 45.9%
Angry 45.3%
Confused 45.4%
Calm 51.3%

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Confused 45.3%
Surprised 45.5%
Calm 46.3%
Sad 45.7%
Happy 51.3%
Disgusted 45.3%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Male, 50%
Angry 45.3%
Calm 45.8%
Sad 45.3%
Surprised 45.6%
Disgusted 46.1%
Happy 51.8%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Male, 53.6%
Disgusted 45%
Calm 45.1%
Sad 45%
Confused 45%
Angry 45%
Surprised 45.1%
Happy 54.8%

Feature Analysis

Amazon

Person 99.7%

Categories

Imagga

people portraits 97.7%