Human Generated Data

Title

Untitled (Clover basketball team, 1923)

Date

1923

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4281

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.8
Person 99.8
Person 99.7
Person 99.6
Person 99.4
Person 99
Apparel 98.7
Clothing 98.7
Person 98.3
Person 97.6
Face 96.2
Smile 87.6
People 83.1
Female 78.9
Shorts 78.4
Poster 77.5
Advertisement 77.5
Collage 77.5
Portrait 69.9
Photo 69.9
Photography 69.9
Head 65.7
Text 63.9
Girl 60.8
Shoe 60
Footwear 60
Dress 59.6
Woman 59.5
Door 58
Suit 57.4
Coat 57.4
Overcoat 57.4
Indoors 56.1
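
The Amazon tags above follow the shape of AWS Rekognition's DetectLabels output: one label name per line with a 0-100 confidence score. A minimal sketch of such a call, assuming boto3 credentials and a hypothetical local file name for the digitized print:

import boto3

# Hypothetical file name for the scan; not part of the record above.
IMAGE_PATH = "clover_basketball_team_1923.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to the tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")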

Clarifai
created on 2019-06-01

people 100
group 99.7
group together 98.7
many 98
several 97.4
adult 97.3
man 96.7
woman 95.8
five 94
child 92.9
four 92.7
wear 91.9
monochrome 90.6
education 89.5
indoors 87
facial expression 86.8
medical practitioner 86.1
portrait 83.9
uniform 81.6
three 81.2
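
The Clarifai concepts could be reproduced with a predict call against Clarifai's public general model; its v2 REST API returns concept names with values on a 0-1 scale. A sketch, assuming an API key, an assumed general-model id, and the same hypothetical file name:

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # assumed credential
MODEL_ID = "general-image-recognition"   # assumed public general-model id
IMAGE_PATH = "clover_basketball_team_1923.jpg"  # hypothetical file name

with open(IMAGE_PATH, "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Concept values are 0-1; multiply by 100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")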

Imagga
created on 2019-06-01

people 24
person 20.3
kin 19.7
sexy 19.3
sibling 19.2
fashion 18.8
portrait 18.8
adult 17.5
pretty 16.8
male 16.7
posing 15.1
white 15
happy 15
man 14.8
face 14.2
negative 13.3
sport 13.2
lady 13
human 12.7
film 12.7
child 12.7
art 12.6
attractive 12.6
statue 12.5
health 12.5
model 12.4
body 12
clothing 11.6
smile 11.4
couple 11.3
men 11.2
women 11.1
love 11
smiling 10.8
fitness 10.8
lifestyle 10.8
black 10.8
action 10.2
healthy 10.1
exercise 10
dress 9.9
sculpture 9.8
fun 9.7
style 9.6
standing 9.6
happiness 9.4
mother 9.3
head 9.2
active 9
representation 8.9
family 8.9
cute 8.6
costume 8.6
youth 8.5
photographic paper 8.3
makeup 8.2
figure 8.2
girls 8.2
look 7.9
brunette 7.8
marble 7.7
old 7.7
drawing 7.6
teenager 7.3
gorgeous 7.2
pose 7.2
decoration 7.2
architecture 7
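
The Imagga tags match the output of Imagga's v2 tagging endpoint, which returns English labels with 0-100 confidences. A sketch, assuming Imagga API credentials and a hypothetical public URL for the image:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # assumed credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # assumed credential
IMAGE_URL = "https://example.org/clover_basketball_team_1923.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each result carries an English tag and a 0-100 confidence, as listed above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")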

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 98.7
wall 97
person 85
clothing 80.7
smile 74.7
human face 70.1
old 69.9
group 63.2
team 26.4
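
The Microsoft tags resemble output from Azure Computer Vision's analyze endpoint with the Tags visual feature (the v2.0 API was current in 2019); confidences are returned on a 0-1 scale. A sketch, assuming a subscription key and a regional endpoint:

import requests

SUBSCRIPTION_KEY = "YOUR_AZURE_CV_KEY"  # assumed credential
# Assumed regional endpoint and API version.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
IMAGE_PATH = "clover_basketball_team_1923.jpg"  # hypothetical file name

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = requests.post(
    ENDPOINT,
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Tag confidences are 0-1; scale by 100 to compare with the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")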

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-59
Gender Male, 54.7%
Happy 45.2%
Confused 45.3%
Disgusted 45.1%
Sad 45.5%
Calm 53.3%
Angry 45.3%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Calm 53.7%
Disgusted 45.1%
Confused 45.2%
Surprised 45.2%
Happy 45.2%
Sad 45.5%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Confused 45.3%
Disgusted 45.1%
Happy 45.3%
Surprised 45.2%
Calm 52.7%
Sad 45.4%
Angry 46.1%

AWS Rekognition

Age 20-38
Gender Male, 50%
Surprised 45.6%
Calm 51.1%
Disgusted 45.2%
Confused 45.4%
Sad 46%
Happy 46.1%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Male, 89.2%
Surprised 4.1%
Sad 2.9%
Happy 5.4%
Angry 2.1%
Disgusted 1.1%
Confused 1.7%
Calm 82.8%

AWS Rekognition

Age 20-38
Gender Male, 54.9%
Confused 45.1%
Surprised 45%
Sad 45.3%
Angry 45%
Happy 45.1%
Calm 54.5%
Disgusted 45%
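
Each per-face block above has the shape of an AWS Rekognition DetectFaces result with all attributes requested: an estimated age range, a gender estimate with confidence, and one confidence value per emotion. A sketch of such a call, under the same boto3 and file-name assumptions as the label example:

import boto3

IMAGE_PATH = "clover_basketball_team_1923.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotions to each face record.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")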

Feature analysis

Amazon

Person 99.8%
Shoe 60%

Categories

Imagga

paintings art 90.6%
people portraits 8.4%
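
The two Imagga categories read like output from Imagga's v2 categorization endpoint; the personal_photos categorizer is assumed here because its category names resemble those listed. A sketch under the same Imagga-credential and image-URL assumptions:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # assumed credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # assumed credential
IMAGE_URL = "https://example.org/clover_basketball_team_1923.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer id
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Category names are assumed to be keyed by language code, with 0-100 confidences.
for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")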

Text analysis

Amazon

1923
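
The "1923" found here presumably corresponds to the date marked on the photograph, and it has the shape of an AWS Rekognition DetectText result. A sketch, under the same assumptions as the other Rekognition examples:

import boto3

IMAGE_PATH = "clover_basketball_team_1923.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections with confidences.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} ({detection['Confidence']:.1f})")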

Google

1923
1923