Human Generated Data

Title

Untitled (king and queen process through group)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1473

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Clothing 99.9
Apparel 99.9
Person 99.5
Human 99.5
Person 98.3
Person 98.2
Person 98.1
Person 97.4
Person 97.2
Person 96.6
Gown 96.1
Fashion 96.1
Robe 95.1
Wedding 93.5
Person 91.2
Person 90.7
Bride 86.8
Wedding Gown 86.8
Evening Dress 79.2
Person 77.4
Person 77.3
People 75.4
Female 64.6
Person 48.4
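Each row above pairs a label with a confidence score from 0 to 100. As a minimal sketch (an assumption about a convenient parsing approach, not part of the dataset's own tooling), rows in this "label score" form can be split into (label, confidence) pairs like so:

```python
# Hypothetical helper: parse rows like "Clothing 99.9" or
# "Wedding Gown 86.8" into (label, confidence) tuples.
# rpartition splits on the LAST space, so multi-word labels survive.
def parse_tags(lines):
    tags = []
    for line in lines:
        name, _, score = line.rpartition(" ")
        tags.append((name, float(score)))
    return tags

rows = ["Clothing 99.9", "Person 99.5", "Wedding Gown 86.8"]
print(parse_tags(rows))
# [('Clothing', 99.9), ('Person', 99.5), ('Wedding Gown', 86.8)]
```

Note that the same label ("Person") appears many times above because the detector reports one score per detected instance, not one per label.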

Clarifai
created on 2019-06-01

people 98.7
man 94.3
group 92.5
woman 91.5
wedding 90.7
adult 89
group together 86
many 85.9
veil 82.4
wear 81.6
love 80.1
figurine 79.1
leader 78.1
bride 75
ceremony 72.1
victory 72
groom 71.2
music 69.3
crowd 68.5
one 66.2

Imagga
created on 2019-06-01

people 18.4
boutique 17.3
person 15.3
art 14.2
male 12.8
old 12.5
black 12
film 11.9
adult 11.7
business 11.5
man 11.3
design 11.2
negative 11.1
love 11
symbol 10.8
men 10.3
light 10
vintage 10
statue 10
businessman 9.7
cemetery 9.7
decoration 9.6
bride 9.4
historical 9.4
groom 9.3
businesswoman 9.1
drawing 8.9
crowd 8.6
happiness 8.6
marriage 8.5
grunge 8.5
cross 8.5
silhouette 8.3
human 8.2
dress 8.1
glass 8.1
religion 8.1
team 8.1
history 8
antique 8
celebration 8
women 7.9
president 7.9
couple 7.8
sculpture 7.4
white 7.4
occupation 7.3
sexy 7.2
face 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

wedding dress 92.2
bride 83.6
wedding 82.4
clothing 78.5
old 76.4
white 72
dress 70
black and white 66.4
dance 64.1
person 63.2
posing 52.3
picture frame 17.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.7%
Confused 45.4%
Surprised 45.3%
Sad 46.9%
Angry 45.5%
Happy 46%
Calm 50.7%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Disgusted 46.1%
Calm 47.1%
Sad 46.9%
Confused 46%
Angry 46.2%
Surprised 46.5%
Happy 46.3%

AWS Rekognition

Age 35-52
Gender Female, 52.1%
Confused 45.3%
Disgusted 45.7%
Calm 48%
Angry 45.3%
Sad 49.8%
Happy 45.6%
Surprised 45.3%

AWS Rekognition

Age 26-44
Gender Male, 50.9%
Disgusted 52.5%
Sad 45.4%
Happy 45.6%
Surprised 45.4%
Angry 45.3%
Calm 45.5%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Angry 45.2%
Happy 45.1%
Sad 45.1%
Disgusted 53.7%
Confused 45.1%
Calm 45.5%
Surprised 45.2%
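Each face record above lists an estimated age range, a gender guess, and a per-emotion confidence (in percent). The dominant emotion for a face is simply the one with the highest confidence; a minimal sketch of that selection, using the first face record's values as sample data:

```python
# Hedged sketch: pick the dominant emotion from a Rekognition-style
# emotion-confidence map (percentages, as in the records above).
def dominant_emotion(confidences):
    # max over the dict's keys, ranked by their confidence values
    return max(confidences, key=confidences.get)

# Values copied from the first AWS Rekognition face record above.
face = {"Confused": 45.4, "Surprised": 45.3, "Sad": 46.9,
        "Angry": 45.5, "Happy": 46.0, "Calm": 50.7, "Disgusted": 45.2}
print(dominant_emotion(face))
# Calm
```

For the records above, the confidences cluster near 45-50%, so the "dominant" emotion is only weakly preferred over the alternatives.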

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

2nbeb