Human Generated Data

Title

Untitled (two photographs: studio portrait of woman holding small tennis racket; studio portrait of man in suit smoking cigar)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6081

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.1
Human 99.1
Apparel 98.4
Clothing 98.4
Person 98
Chair 97.7
Furniture 97.7
Suit 95.1
Overcoat 95.1
Coat 95.1
Man 62.4
Photo 60.1
Portrait 60.1
Photography 60.1
Face 60.1
Crowd 59.5

Clarifai
created on 2019-11-16

people 99.9
adult 98.6
woman 97.9
man 97.2
music 96.7
two 96.3
group 95.2
actor 94
musician 93.9
actress 93.4
movie 92.6
wear 92
portrait 91.5
singer 90.9
furniture 88.7
monochrome 88.4
dancer 87.1
outfit 86.1
theater 84.5
dancing 83.9

Imagga
created on 2019-11-16

television 30.7
person 26.9
man 26.4
adult 24.2
people 22.9
black 21
dark 20.9
telecommunication system 20.6
male 19.9
attractive 16.8
portrait 16.2
body 16
model 15.6
sexy 15.3
one 14.9
silhouette 14.9
fashion 14.3
hair 14.3
business 13.4
posing 13.3
love 12.6
suit 11.9
musical instrument 11.9
world 11.7
businessman 11.5
studio 11.4
lady 11.4
passion 11.3
spotlight 11.3
human 11.3
pretty 11.2
style 11.1
couple 10.5
office 10.4
dance 10.1
happy 10
pose 10
clothing 9.9
sport 9.9
performer 9.9
sitting 9.5
lifestyle 9.4
expression 9.4
light 9.4
face 9.2
stringed instrument 9.1
dress 9
shadow 9
boy 8.8
looking 8.8
room 8.7
motion 8.6
erotic 8.5
casual 8.5
action 8.3
sensual 8.2
sensuality 8.2
exercise 8.2
sunset 8.1
dancer 7.8
art 7.8
hands 7.8
youth 7.7
performance 7.7
outdoor 7.6
skin 7.6
energy 7.6
elegance 7.6
power 7.6
indoor 7.3
dirty 7.2
fitness 7.2
jacket 7.2
happiness 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.5
clothing 95.7
person 85.5
black and white 79.7
man 76.6
woman 54.3

Face analysis

AWS Rekognition

Age 17-29
Gender Male, 99.2%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Sad 1.1%
Angry 1.9%
Fear 0%
Calm 95.9%
Confused 0.7%

AWS Rekognition

Age 29-45
Gender Female, 54.9%
Calm 51%
Surprised 45.1%
Happy 45.1%
Sad 47.8%
Confused 45.2%
Angry 45.4%
Fear 45.3%
Disgusted 45.2%

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Chair 97.7%