Human Generated Data

Title

Untitled (portrait of nine men with seven dressed as women)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5936

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.7
Human 99.7
Person 99.4
Person 99.3
Person 99.2
Person 99.2
Person 99.1
Person 97.5
Clothing 97
Apparel 97
Person 96.8
Face 95.9
Shorts 92.1
People 79.3
Portrait 68.6
Photography 68.6
Photo 68.6
Smile 67.6
Sports 67
Sport 67
Sailor Suit 60.9
Chair 58.6
Furniture 58.6
Crowd 57.2
Shirt 57
Female 56.3
Sleeve 55.2
Person 45.5
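
The label/confidence pairs above come from Amazon's image-tagging service. As a minimal sketch (not the museum's documented pipeline), labels in this format could be produced with AWS Rekognition's DetectLabels API via boto3; the file name "photo.jpg" and the region are placeholders assumed for illustration.

```python
import boto3

# Placeholder region; any region with Rekognition available works.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name for a local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # a low threshold keeps weaker tags such as "Person 45.5"
)

# Print each label with its confidence score, matching the "Label score" layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```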

Clarifai
created on 2019-06-01

people 99.9
group 99.8
group together 99.3
many 97.9
adult 96.9
several 96.3
man 96
woman 93.7
child 90.5
wear 90.4
five 87.5
four 85.5
music 82.7
medical practitioner 81.1
boy 80.8
actor 80.2
education 80.1
three 79.7
facial expression 79
outfit 73.7

Imagga
created on 2019-06-01

kin 34.5
people 30.7
person 27.4
sport 24.2
man 22.8
beach 21.2
sibling 21.1
exercise 20.9
adult 20.8
male 19.8
active 19.3
fitness 19
lifestyle 18.8
fun 18.7
love 18.1
happy 17.5
silhouette 16.5
summer 16.1
happiness 15.7
sky 15.3
men 14.6
outdoors 14.2
sea 14.1
couple 13.9
player 13.7
youth 13.6
portrait 13.6
fashion 13.6
outdoor 13
body 12.8
smiling 12.3
sand 11.5
together 11.4
athlete 11.3
friendship 11.2
training 11.1
joy 10.9
leisure 10.8
sunset 10.8
activity 10.7
healthy 10.7
attractive 10.5
group 10.5
health 10.4
play 10.3
women 10.3
black 10.2
brass 10.1
energy 10.1
water 10
dance 9.9
child 9.9
run 9.6
life 9.5
action 9.3
art 9.3
contestant 9.2
human 9
practice 8.7
grass 8.7
married 8.6
model 8.5
cornet 8.5
relationship 8.4
ocean 8.3
wind instrument 8.3
freedom 8.2
teenager 8.2
vacation 8.2
style 8.2
clothing 8.1
sexy 8
family 8
posing 8
bride 7.9
holiday 7.9
standing 7.8
motion 7.7
crowd 7.7
running 7.7
wedding 7.4
competition 7.3
wellness 7.3
cheerful 7.3
girls 7.3
pose 7.2
sun 7.2
dress 7.2
world 7.2
recreation 7.2
team 7.2
romance 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 99.9
smile 97.7
clothing 96.7
person 96
old 89.5
text 88.7
man 87.9
group 86.8
standing 85.6
woman 84.8
black 71.3
white 61.6
black and white 52.6
team 41.8
vintage 35.1
female 32.3
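
As a hedged sketch, tags in this form could be requested from the Microsoft Computer Vision API using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name below are placeholders, not values from this record.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

# Placeholder file name for a local copy of the photograph.
with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is returned in [0, 1]; scale to a percentage like "posing 99.9".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```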

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 73.8%
Calm 9.2%
Disgusted 1.7%
Confused 3.7%
Surprised 4.5%
Happy 74.6%
Sad 3.7%
Angry 2.8%

AWS Rekognition

Age 35-52
Gender Male, 54.7%
Confused 45.5%
Calm 46%
Sad 47.2%
Surprised 45.9%
Angry 45.9%
Disgusted 45.7%
Happy 48.7%

AWS Rekognition

Age 26-43
Gender Male, 98.7%
Happy 74%
Confused 2.8%
Angry 6.3%
Sad 5.1%
Calm 3.8%
Surprised 5.1%
Disgusted 2.8%

AWS Rekognition

Age 17-27
Gender Male, 54.7%
Happy 45.1%
Sad 45.4%
Surprised 45.1%
Confused 45.1%
Disgusted 45.1%
Calm 54.1%
Angry 45.1%

AWS Rekognition

Age 26-43
Gender Male, 53.8%
Surprised 45.2%
Confused 45.2%
Disgusted 45.1%
Happy 53.1%
Sad 46%
Calm 45.3%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Male, 53%
Disgusted 45.4%
Confused 45.4%
Surprised 46.4%
Calm 45.9%
Happy 50.2%
Angry 45.5%
Sad 46.2%

AWS Rekognition

Age 23-38
Gender Male, 51.8%
Sad 45.2%
Confused 45.2%
Disgusted 45.9%
Surprised 45.5%
Angry 45.3%
Happy 52.3%
Calm 45.4%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Confused 45.3%
Disgusted 45.3%
Happy 51.9%
Surprised 45.3%
Calm 46.1%
Sad 45.7%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Calm 47.8%
Disgusted 45.2%
Sad 50.4%
Surprised 45.5%
Happy 45.3%
Confused 45.4%
Angry 45.4%
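
The per-face blocks above (age range, gender, and emotion scores) follow the shape of AWS Rekognition's DetectFaces output. A minimal sketch, assuming boto3 and a local copy of the photograph, of how results in this layout could be produced; the file name and region are illustrative assumptions.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion attributes
)

# One block per detected face, mirroring the listing above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()
```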

Feature analysis

Amazon

Person 99.7%

Categories