Human Generated Data

Title

Untitled (two photographs: studio portrait of bride and groom; studio portrait of older woman with glasses)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6091

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Apparel 99.1
Clothing 99.1
Human 98.4
Person 98.4
Person 97.8
Overcoat 97
Coat 97
Suit 97
Person 96.4
Indoors 94.8
Interior Design 94.8
Fashion 90.6
Gown 90.6
Robe 90.6
Evening Dress 90.6
Leisure Activities 83.1
Dance Pose 83.1
Tuxedo 81.6
Female 69.6
Room 64.2
Face 61.3
Portrait 60.3
Photography 60.3
Photo 60.3
Glasses 56.8
Accessory 56.8
Accessories 56.8
Performer 56.6
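
The Amazon tags above have the shape of Amazon Rekognition label-detection output (label name plus a 0-100 confidence score). A minimal sketch of how such labels could be retrieved with boto3 follows; it assumes configured AWS credentials, and the file name is hypothetical.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50% confidence or higher
)

for label in response["Labels"]:
    # Confidence is already on a 0-100 scale, matching the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```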

Clarifai
created on 2019-11-16

people 100
group 99.1
adult 99
portrait 98
wear 97.7
woman 97.2
man 97.1
two 96.5
music 96.1
actress 95.6
musician 94.1
movie 93.7
outfit 93.2
actor 92.8
leader 89.9
three 89.8
facial expression 88.8
singer 87.3
television 85.2
dress 84.7
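
The Clarifai concepts above could come from Clarifai's general prediction model. The sketch below calls the v2 REST endpoint with requests; the endpoint path, model name, and authorization header format are assumptions that should be checked against current Clarifai documentation, and the key and image URL are placeholders.

```python
# Hedged sketch of a Clarifai v2 prediction call over REST.
# Endpoint, model id, and auth header format are assumptions; verify against
# Clarifai's current documentation before use.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # hypothetical image URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports concept scores in 0-1; scale to match the 0-100 list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```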

Imagga
created on 2019-11-16

black 38.6
world 29.4
kin 28.9
portrait 28.5
person 26
model 24.1
people 21.8
attractive 21.7
adult 19.9
pretty 19.6
male 19.2
man 18.1
sexy 17.7
face 17
fashion 16.6
dark 15.9
lady 15.4
expression 12.8
hair 11.9
posing 11.5
human 11.2
style 11.1
lifestyle 10.8
happy 10.7
brunette 10.5
one 10.5
looking 10.4
body 10.4
eyes 10.3
pose 10
studio 9.9
blackboard 9.7
couple 9.6
blond 9.3
smile 9.3
alone 9.1
gorgeous 9.1
hands 8.7
love 8.7
cute 8.6
youth 8.5
business 8.5
clothing 8.3
smiling 8
sitting 7.7
child 7.7
elegant 7.7
passion 7.5
silhouette 7.4
emotion 7.4
makeup 7.3
make 7.3
dress 7.2
family 7.1
happiness 7.1
together 7
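
The Imagga tags above match the output of Imagga's tagging endpoint. A sketch using the v2 REST API with basic authentication follows; the key, secret, and image URL are placeholders, and the endpoint and response shape should be confirmed against Imagga's documentation.

```python
# Hedged sketch of Imagga's v2 tagging endpoint (verify against current docs).
import requests

API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Imagga confidences are already on a 0-100 scale.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```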

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

monitor 97.9
text 96.9
clothing 94.2
person 89.6
human face 88.8
smile 87.9
standing 87.4
dress 86.2
woman 80.2
wedding dress 78.6
man 78.2
screen 77.9
black 75.8
bride 67.7
posing 62.1
black and white 56.2
image 30.6
picture frame 13.8
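
The Microsoft tags above resemble output from Azure Computer Vision's image-tagging operation. The sketch below calls the v3.x REST endpoint with requests; the resource endpoint, key, and image URL are placeholders, and the API version and path are assumptions to confirm against Azure documentation.

```python
# Hedged sketch of Azure Computer Vision image tagging over REST
# (endpoint path and API version are assumptions; check Azure docs).
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.com/photo.jpg"                     # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure reports tag confidences in 0-1; scale to match the 0-100 list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```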

Color Analysis

Face analysis

AWS Rekognition

Age 49-67
Gender Male, 69.3%
Angry 61.5%
Surprised 0.5%
Confused 0.4%
Disgusted 1%
Sad 0.2%
Happy 0.1%
Calm 36.2%
Fear 0%

AWS Rekognition

Age 23-35
Gender Male, 54.5%
Angry 45.1%
Happy 45.9%
Disgusted 45%
Sad 45.1%
Calm 53.7%
Surprised 45.1%
Confused 45%
Fear 45%

AWS Rekognition

Age 22-34
Gender Female, 52.8%
Happy 47.3%
Disgusted 45.1%
Angry 45.2%
Fear 45.2%
Calm 50.7%
Surprised 45.3%
Sad 45.9%
Confused 45.3%
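
The three AWS Rekognition blocks above (age range, gender with confidence, and per-emotion scores) match the shape of Rekognition's face-detection response. A minimal boto3 sketch follows; AWS credentials are assumed to be configured, and the file name is hypothetical.

```python
# Minimal sketch: face attributes (age range, gender, emotions) with
# Amazon Rekognition via boto3. "photo.jpg" is a hypothetical file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```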

Microsoft Cognitive Services

Age 65
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 37
Gender Male
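
The Microsoft age and gender estimates above came from a face-detection service such as the Azure Face API. The sketch below uses the v1.0 REST detect operation with requests as an illustration only: Microsoft has since restricted access to age and gender attributes, and the endpoint, key, and image URL shown are placeholders.

```python
# Hedged sketch of an Azure Face API detect call requesting age and gender.
# Microsoft has since retired these attributes for most customers; shown only
# to illustrate where estimates like "Age 65, Gender Female" would come from.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.com/photo.jpg"                     # hypothetical

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```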

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
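
The Google Vision rows above report likelihood buckets (Very unlikely through Very likely) for each detected face. A minimal sketch with the google-cloud-vision Python client follows; it assumes application default credentials, and the file name is hypothetical.

```python
# Minimal sketch: face-detection likelihoods with the Google Cloud Vision client.
# Assumes application default credentials; "photo.jpg" is a hypothetical file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```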

Feature analysis

Amazon

Person 98.4%
Glasses 56.8%