Human Generated Data

Title

Untitled (portrait of nine men with seven dressed as women)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5943

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.7
Person 99.7
Apparel 99.5
Clothing 99.5
Person 99.4
Person 99.4
Person 99.3
Person 99.3
Person 99.1
Shorts 97.9
Person 97.5
Person 96.1
Person 95.5
Face 93.1
People 84.6
Female 77.4
Suit 76.2
Overcoat 76.2
Coat 76.2
Photography 70.9
Portrait 70.9
Photo 70.9
Advertisement 69
Poster 69
Shirt 66.8
Person 60
Crowd 59.7
Woman 59.4
Dress 55.5
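
The label and confidence pairs above (each number is a confidence score from 0 to 100) are the kind of output returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of such a call via boto3; the file name and region are placeholders, not details taken from this record.

import boto3

# Minimal sketch: send a local copy of the photograph to AWS Rekognition and
# print label/confidence pairs like the tag list above. The file name and
# region are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=55.0,  # discard labels scored below ~55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")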

Clarifai
created on 2019-06-01

people 100
group together 99.8
group 99.6
adult 99
many 98.6
man 97.8
several 97.6
woman 97.3
five 96.2
four 94.8
wear 94.8
child 92.6
offspring 91.3
athlete 91.2
three 90.7
sibling 90
facial expression 89.7
administration 88.8
sports equipment 87.9
boxer 87.8

Imagga
created on 2019-06-01

kin 64.3
beach 27.9
people 27.9
man 27.5
male 26.9
silhouette 25.6
person 24.9
sky 22.9
sunset 20.7
sea 18
ocean 17.4
summer 17.3
sport 17
water 16.7
couple 16.5
athlete 15.3
love 15
outdoors 14.9
men 14.6
adult 14.1
sand 14.1
group 13.7
active 13.2
friendship 13.1
lifestyle 13
world 12.6
ballplayer 12.3
fun 12
player 11.6
life 11.6
boy 11.3
happy 11.3
sun 11.3
travel 11.3
clouds 11
child 10.8
coast 10.8
outdoor 10.7
together 10.5
landscape 10.4
evening 10.3
shore 10.2
black 10.2
exercise 10
contestant 9.8
vacation 9.8
businessman 9.7
dusk 9.5
happiness 9.4
coastline 9.4
friends 9.4
girls 9.1
portrait 9.1
recreation 9
family 8.9
women 8.7
business 8.5
human 8.2
island 8.2
pose 8.1
horizon 8.1
romantic 8
day 7.8
play 7.7
bride 7.7
youth 7.7
seascape 7.6
guy 7.6
fashion 7.5
joy 7.5
groom 7.5
leisure 7.5
wedding 7.3
light 7.3
success 7.2
fitness 7.2
dress 7.2
kid 7.1
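
Imagga exposes similar tagging through its REST API. The hedged sketch below uses the v2 tags endpoint; the credentials and image URL are placeholders, and the response fields are assumed from Imagga's public documentation rather than from this record.

import requests

# Hedged sketch of a call to Imagga's v2 tagging endpoint. Credentials and
# image URL are placeholders; confidences are reported on a 0-100 scale.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")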

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 99.7
posing 99.6
man 93.5
clothing 93.4
outdoor 90.3
standing 86.1
smile 82
group 78.1
footwear 59
old 57.7
people 55.9
team 32.8
male 15.9
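
The Microsoft tags correspond to output from the Azure Computer Vision image-analysis API. The sketch below uses the v3.2 REST endpoint with requests; the endpoint, key, and API version are assumptions (the 2019 run likely used an earlier version), and Azure reports confidence on a 0-1 scale, so it is rescaled here to match the percentages above.

import requests

# Hedged sketch of an Azure Computer Vision "analyze" call requesting tags.
# Endpoint, key, and API version are placeholders/assumptions.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")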

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Calm 51.6%
Disgusted 45.3%
Confused 45.3%
Surprised 45.4%
Happy 45.3%
Sad 46.7%
Angry 45.5%

AWS Rekognition

Age 26-43
Gender Male, 71.2%
Angry 1.6%
Sad 2%
Surprised 3%
Happy 88%
Calm 3.6%
Disgusted 0.9%
Confused 0.9%

AWS Rekognition

Age 20-38
Gender Male, 54.5%
Happy 54.8%
Disgusted 45%
Angry 45%
Calm 45%
Surprised 45.1%
Confused 45%
Sad 45.1%

AWS Rekognition

Age 23-38
Gender Male, 54.6%
Disgusted 45.2%
Sad 46.1%
Confused 45.4%
Happy 47.2%
Angry 45.3%
Surprised 45.4%
Calm 50.3%

AWS Rekognition

Age 26-43
Gender Female, 52%
Calm 46.8%
Surprised 46.9%
Disgusted 46%
Happy 47.8%
Sad 46%
Confused 45.6%
Angry 45.9%

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Angry 45.4%
Disgusted 45.3%
Happy 47.4%
Sad 45.9%
Surprised 45.7%
Calm 50.1%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 53.8%
Happy 46.9%
Disgusted 45.9%
Angry 46.2%
Surprised 45.9%
Sad 47.4%
Calm 47.2%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Male, 96.3%
Disgusted 0.7%
Calm 5.6%
Angry 1.3%
Sad 2.6%
Happy 87.4%
Surprised 1.6%
Confused 0.8%

AWS Rekognition

Age 35-52
Gender Male, 53.3%
Disgusted 50.3%
Confused 45.2%
Surprised 45.5%
Calm 45.7%
Happy 47.8%
Angry 45.3%
Sad 45.2%
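
Each "AWS Rekognition" block above describes one detected face: an estimated age range, a gender guess with its confidence, and a score for each emotion. A minimal sketch of the underlying DetectFaces call follows; the file name and region are placeholders.

import boto3

# Minimal sketch: run face analysis and print per-face attributes in the same
# shape as the blocks above. File name and region are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")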

Feature analysis

Amazon

Person 99.7%
Poster 69%

Categories