Human Generated Data

Title

Untitled (studio portrait of man seated on arm of chair with cigar)

Date

c. 1905-1910, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5959

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2019-05-30

Apparel 99.6
Clothing 99.6
Human 98.8
Person 98.8
Coat 94.9
Overcoat 94.9
Suit 94.3
Face 76.9
Jacket 70.1
Man 69.2
Photo 61.8
Photography 61.8
Portrait 61.8
Female 57.6
Finger 57

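The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. The record does not say how these tags were produced; a minimal sketch of generating comparable tags with the boto3 client (the image file name and thresholds are illustrative assumptions, not part of the catalog data) might look like:

    import boto3

    # Assumed local path to the digitized photograph; not part of the record.
    with open("durette_studio_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # DetectLabels returns label names with confidence scores (0-100),
    # comparable to the "Apparel 99.6", "Clothing 99.6", ... values above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
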
Clarifai
created on 2019-05-30

people 99.6
portrait 98.4
one 97.8
adult 97.7
monochrome 94.8
man 94.8
wear 94.5
music 93.2
outfit 88.1
retro 85.2
woman 84.9
two 84.4
musician 83.9
art 80.2
leader 79.6
military 76.4
black and white 76.2
actor 75.8
profile 75.4
singer 75.2

Imagga
created on 2019-05-30

person 32.2
man 22.9
adult 22
people 20.6
clothing 19.8
athlete 18
ballplayer 17.3
player 16.2
black 16.1
male 15.6
sport 15.2
portrait 14.9
statue 13.8
adolescent 13.7
dark 13.4
mask 13.2
fashion 12.8
human 12.7
contestant 12.5
lifestyle 12.3
silhouette 11.6
juvenile 11.6
art 11.5
protection 10.9
model 10.9
outdoor 10.7
posing 10.7
expression 10.2
casual 10.2
danger 10
uniform 9.9
one 9.7
action 9.3
power 9.2
face 9.2
suit 9.2
style 8.9
light 8.8
life 8.8
protective 8.8
helmet 8.8
nuclear 8.7
women 8.7
standing 8.7
smoke 8.5
attractive 8.4
holding 8.3
exercise 8.2
dirty 8.1
music 8.1
sunset 8.1
hair 7.9
love 7.9
radioactive 7.9
radiation 7.8
destruction 7.8
chemical 7.7
elegant 7.7
gas 7.7
youth 7.7
sky 7.7
studio 7.6
symbol 7.4
dress 7.2
body 7.2

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

clothing 94
person 91.8
man 83.7
human face 80.2
black and white 67.3
white 61.3
old 60
portrait 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 11-18
Gender Male, 78.3%
Disgusted 2.3%
Angry 7.8%
Calm 63%
Confused 11.3%
Sad 8.4%
Surprised 5.5%
Happy 1.7%

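The age range, gender, and emotion percentages above correspond to fields that Amazon Rekognition's DetectFaces API returns when called with all facial attributes. A hedged sketch of such a call (image path assumed, not from the record):

    import boto3

    with open("durette_studio_portrait.jpg", "rb") as f:  # assumed path
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail,
    # matching the Age / Gender / emotion-percentage fields listed above.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
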
Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

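The Surprise / Anger / Sorrow / Joy / Headwear / Blurred ratings above follow Google Cloud Vision's likelihood scale (VERY_UNLIKELY through VERY_LIKELY) for face detection, which is why they read as phrases rather than percentages. A sketch with the google-cloud-vision client (image path assumed):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("durette_studio_portrait.jpg", "rb") as f:  # assumed path
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Each face annotation carries likelihood enums, one per attribute shown above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
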
Feature analysis

Amazon

Person 98.8%

Categories