Human Generated Data

Title

Untitled (two photographs: studio portrait of older woman posed in three-quarter turn)

Date

1925-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10321

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Person 99.2
Person 97.8
Finger 84.6
Face 74
Musician 65.8
Musical Instrument 65.8
Leisure Activities 60.8
Clothing 57.7
Undershirt 57.7
Apparel 57.7
Performer 56.3

Clarifai
created on 2019-11-16

people 99.9
portrait 99.5
monochrome 98.9
music 98.2
stage 97.6
adult 97.6
one 96.5
musician 96.1
man 95.9
art 95.3
singer 94.1
profile 93.4
light 92.5
concert 92.4
theater 92.2
jazz 90.9
performance 90.1
theatre 89.4
ballet 89.3
opera 89.2

Imagga
created on 2019-11-16

black 39.3
sculpture 30
person 29.6
male 28.4
man 25.7
bust 25.3
body 24.8
dark 24.2
people 22.3
model 21
sexy 20.9
face 19.9
portrait 19.4
art 19.4
plastic art 18.9
adult 18.8
human 18.8
attractive 17.5
figure 15.6
pose 15.4
light 14.9
nude 14.6
naked 14.5
looking 14.4
skin 14.4
one 14.2
hands 13.9
spotlight 13.4
studio 12.9
men 12.9
fashion 12.8
expression 12.8
style 12.6
erotic 12.3
column 12
head 11.8
silhouette 11.6
marble 11.4
pretty 11.2
lighting 11.2
performer 11
statue 10.8
hair 10.3
shadow 9.9
posing 9.8
lady 9.7
hand 9.7
legs 9.4
healthy 8.8
conceptual 8.8
horror 8.7
fear 8.7
device 8.5
apparatus 8.3
emotion 8.3
fitness 8.1
night 8
horn 7.9
torso 7.8
sexual 7.7
dancer 7.7
old 7.7
serious 7.6
lamp 7.6
power 7.6
passion 7.5
suit 7.5
fit 7.4
sensuality 7.3
lifestyle 7.2

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.7
monitor 97.4
concert 97
human face 96.2
man 94.1
person 92.3
microphone 86.4
screen 81.8
electronics 74.4
image 48.4
display 29

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 47-65
Gender Male, 90.2%
Calm 55.3%
Sad 15.9%
Happy 1.8%
Angry 4%
Fear 1.4%
Surprised 1.3%
Confused 5.6%
Disgusted 14.7%

AWS Rekognition

Age 51-69
Gender Male, 91%
Sad 1%
Disgusted 0%
Confused 0.3%
Surprised 0%
Happy 0%
Fear 0%
Angry 0.1%
Calm 98.4%

Microsoft Cognitive Services

Age 64
Gender Female

Microsoft Cognitive Services

Age 56
Gender Female

Feature analysis

Amazon

Person 99.2%