Human Generated Data

Title

Untitled (two photographs: portrait of bride and groom, both wearing glasses; portrait of a nun seated outside brick building)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6075

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.5
Person 97.5
Person 94.9
Apparel 93.7
Clothing 93.7
Person 93.3
Art 86.6
Painting 77
Performer 69.3
Coat 67.1
Suit 67.1
Overcoat 67.1
People 65.8
Face 65.6
Photography 61.6
Photo 61.6
Portrait 60.9
Advertisement 56.8
Hat 56.4

Clarifai
created on 2019-11-16

people 100
adult 99.7
group 99.7
portrait 99.7
two 99.2
wear 98.9
woman 98.4
man 98.3
veil 98
wedding 97.7
actress 97.4
movie 96.1
three 95.8
outfit 95.7
leader 95.6
child 95.3
actor 95.2
facial expression 95.1
furniture 94.3
offspring 93.5

Imagga
created on 2019-11-16

groom 100
bride 36.8
kin 35.9
couple 31.4
love 30.8
wedding 29.4
dress 28
people 25.1
man 24.9
portrait 23.3
happiness 22.7
bouquet 22.6
happy 20.7
person 20.3
male 19.9
adult 19.4
romantic 18.7
married 17.3
marriage 17.1
face 16.3
two 16.1
wife 15.2
flowers 14.8
bridal 14.6
celebration 14.4
husband 13.6
smile 12.8
fashion 12.8
veil 12.7
romance 12.5
wed 11.8
gown 11.7
family 11.6
flower 11.5
sexy 11.2
attractive 11.2
religion 10.8
ceremony 10.7
pretty 10.5
black 10.3
youth 10.2
traditional 10
window 9.6
together 9.6
look 9.6
smiling 9.4
culture 9.4
lifestyle 9.4
relationship 9.4
head 9.2
elegance 9.2
make 9.1
looking 8.8
day 8.6
loving 8.6
luxury 8.6
joy 8.4
color 8.3
silhouette 8.3
human 8.2
one 8.2
world 8.2
lady 8.1
detail 8
hair 7.9
matrimony 7.9
art 7.8
men 7.7
old 7.7
outdoor 7.6
passion 7.5
rose 7.5
outdoors 7.5
future 7.4
joyful 7.4
makeup 7.3
room 7.3
suit 7.2
home 7.2
life 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97.2
human face 95.8
person 95.5
clothing 93.9
smile 87.5
window 87
painting 81.7
drawing 80.9
woman 80.2
black 74.2
gallery 60.4
black and white 58.6
old 58.4
sketch 54
posing 35.8
image 35.3

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-35
Gender Male, 98.9%
Disgusted 0.1%
Happy 0.3%
Surprised 0.2%
Sad 0.9%
Fear 0.1%
Calm 98.1%
Angry 0.3%
Confused 0.1%

AWS Rekognition

Age 30-46
Gender Female, 69.9%
Calm 55.3%
Fear 0.8%
Angry 1%
Disgusted 0.3%
Sad 10.9%
Happy 0.1%
Surprised 0.5%
Confused 31.1%

AWS Rekognition

Age 31-47
Gender Female, 98.6%
Happy 90.4%
Sad 0.3%
Angry 0.2%
Calm 8.3%
Fear 0.1%
Surprised 0.4%
Disgusted 0.2%
Confused 0.2%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 50
Gender Female

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%
Suit 67.1%

Categories

Imagga

paintings art 99%