Human Generated Data

Title

Untitled (studio portrait of two couples)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5833

Machine Generated Data

Tags (label confidence scores, 0-100)

Amazon
created on 2019-11-16

Clothing 100
Apparel 100
Human 99.2
Person 98.5
Robe 98.3
Fashion 98.3
Person 98.1
Person 97.5
Gown 97.5
Person 97.4
Wedding 96.7
Wedding Gown 94.9
Bride 94.9
Plant 88.7
Flower Arrangement 85.3
Flower Bouquet 85.3
Flower 85.3
Blossom 85.3
Overcoat 68.3
Suit 68.3
Coat 68.3
Evening Dress 66.3
Female 64.4
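
The Amazon labels above are the kind of output returned by the AWS Rekognition DetectLabels API. Below is a minimal boto3 sketch of how such tags could be reproduced; the file name is a placeholder, not part of this record, and AWS credentials are assumed to be configured in the environment.

import boto3

rekognition = boto3.client("rekognition")

# Placeholder file name for the scanned print.
with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap on returned labels
    MinConfidence=50.0,  # drop low-confidence tags
)

# Print "label confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")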

Clarifai
created on 2019-11-16

people 99.9
wedding 99.7
group 99.5
groom 99
bride 98.9
veil 98.6
woman 98.3
flower arrangement 98.1
adult 98.1
group together 97.7
four 97.1
actress 96.9
ceremony 96.7
dinner jacket 96.7
wear 96.2
man 94.7
two 94.2
outfit 93.8
dress 93.8
several 93.3
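
The Clarifai concepts above resemble the output of Clarifai's predict endpoint for its general image model. The rough REST sketch below uses the requests library; the API key, model identifier, and image URL are placeholders, and the endpoint path should be checked against current Clarifai documentation before use.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # placeholder model identifier

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/portrait.jpg"}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")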

Imagga
created on 2019-11-16

people 18.9
shop 18.7
bride 17.6
fashion 17.3
man 16.8
call 16
barbershop 16
window 15.8
black 15.8
person 15.6
groom 15.5
adult 14.9
door 14.1
dress 13.5
mercantile establishment 13.3
old 13.2
couple 13.1
portrait 12.9
wedding 11.9
telephone 11.9
history 11.6
urban 11.3
glass 10.9
dark 10.8
happy 10.6
male 10.6
entrance 10.6
flowers 10.4
style 10.4
love 10.3
happiness 10.2
building 10.1
one 9.7
bouquet 9.7
clothing 9.7
brass 9.5
wind instrument 9.5
wall 9.4
two 9.3
face 9.2
musical instrument 9.2
city 9.1
place of business 8.9
world 8.8
sexy 8.8
hair 8.7
architecture 8.6
marriage 8.5
color 8.3
garment 8.2
art 8.2
religion 8.1
interior 8
businessman 7.9
kin 7.8
ancient 7.8
pretty 7.7
attractive 7.7
relationship 7.5
office 7.4
historic 7.3
device 7.2
looking 7.2
robe 7.1
romantic 7.1
posing 7.1
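
The Imagga tags above correspond to its image-tagging endpoint. A rough sketch against Imagga's v2 tags API using HTTP basic auth follows; the key, secret, and image URL are placeholders, and Imagga's documentation remains the authoritative reference for the request format.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/portrait.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags come back with a confidence score and per-language labels.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")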

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.2
wedding dress 99
bride 98.4
flower 98
wedding 92.8
dress 85.6
clothing 84.9
person 79.7
suit 77.8
woman 73.8
black and white 73.1
posing 63.8
man 52.7
old 43.7
picture frame 9
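
The Microsoft tags above match the shape of Azure Computer Vision tagging output, assuming that is the service behind the "Microsoft" label here. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: substitute a real Azure endpoint and key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

with open("durette_studio_portrait.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Each tag has a name and a 0-1 confidence, printed here as a percentage.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")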

Color Analysis

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 99.9%
Disgusted 0%
Sad 0%
Calm 0.1%
Fear 0%
Happy 99.9%
Confused 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 25-39
Gender Male, 54.7%
Sad 45%
Fear 45%
Disgusted 45%
Surprised 45%
Angry 45%
Calm 46.7%
Happy 53.2%
Confused 45%

AWS Rekognition

Age 19-31
Gender Female, 99.6%
Surprised 0.3%
Fear 0.3%
Angry 0.2%
Disgusted 0.1%
Sad 0.2%
Calm 0.9%
Happy 97.7%
Confused 0.3%

AWS Rekognition

Age 23-37
Gender Male, 54.9%
Calm 48.5%
Surprised 45.1%
Angry 45.2%
Confused 45.1%
Happy 50.8%
Fear 45.1%
Sad 45.2%
Disgusted 45.1%
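
The four AWS Rekognition face records above (age range, gender, and per-emotion scores) correspond to the DetectFaces API called with full attributes. A minimal boto3 sketch follows; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")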

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 41
Gender Female

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 39
Gender Female
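
The Microsoft Cognitive Services age and gender estimates above are the kind of attributes the Azure Face API returned at the time this data was generated. A rough sketch with the azure-cognitiveservices-vision-face SDK follows; the endpoint, key, and file name are placeholders, and note that Microsoft has since restricted access to these face attributes.

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

with open("durette_studio_portrait.jpg", "rb") as f:
    faces = face_client.face.detect_with_stream(
        f,
        return_face_attributes=["age", "gender"],
    )

# One record per detected face, with estimated age and gender.
for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender}")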

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
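
The Google Vision rows above (surprise, anger, sorrow, joy, headwear, and blur, each reported as a likelihood bucket) come from its face detection feature. A minimal sketch with the google-cloud-vision client library follows; the file name is a placeholder and application credentials are assumed to be set up.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette_studio_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are buckets such as VERY_UNLIKELY or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)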

Feature analysis

Amazon

Person 98.5%
Suit 68.3%

Categories

Imagga

paintings art 93.9%
people portraits 5.8%

Text analysis

Google

MJ3TARTI A10A
MJ3TARTI
A10A
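
The Google text reading above ("MJ3TARTI A10A" and its fragments) is raw OCR output of the sort produced by Google Vision's text detection. A minimal sketch follows; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette_studio_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected string; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)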