Human Generated Data

Title

Untitled (bride and groom seated in backseat of car)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.432.10

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Apparel 99.7
Clothing 99.7
Plant 99.5
Human 99.3
Blossom 98.1
Flower Bouquet 98.1
Flower 98.1
Flower Arrangement 98.1
Person 97.2
Coat 89.1
Overcoat 89.1
Suit 89.1
Fashion 87.6
Robe 87.6
Face 85.2
Gown 83.3
Female 78
Wedding 76.7
Bridegroom 72.8
Wedding Gown 68.8
Photography 67.7
Photo 67.7
Portrait 67.7
Woman 60.1
Bride 55.7
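
The Amazon labels above pair each tag with a confidence score, and lower-scoring tags (e.g. Bride at 55.7) are far less reliable than the top entries. A minimal sketch of how one might filter such tags by a confidence threshold — the (tag, score) pairs are copied from the list above, and the 90.0 cutoff is an arbitrary assumption, not part of the record:

```python
# Hypothetical sketch: filtering machine-generated tags by confidence.
# The pairs below are a subset of the Amazon output above; the
# threshold of 90.0 is an assumed cutoff, not part of the record.
amazon_tags = [
    ("Apparel", 99.7), ("Clothing", 99.7), ("Plant", 99.5),
    ("Human", 99.3), ("Blossom", 98.1), ("Flower Bouquet", 98.1),
    ("Person", 97.2), ("Coat", 89.1), ("Wedding", 76.7),
    ("Bride", 55.7),
]

def high_confidence(tags, threshold=90.0):
    """Keep only tag names at or above the confidence threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(amazon_tags))
# → ['Apparel', 'Clothing', 'Plant', 'Human', 'Blossom', 'Flower Bouquet', 'Person']
```

The same filtering applies unchanged to the Clarifai, Imagga, and Microsoft lists below, since all four services report tags as name–confidence pairs.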

Clarifai
created on 2019-03-25

people 99.9
adult 99.2
two 98.9
man 98.2
portrait 98
flower arrangement 97.2
facial expression 95.4
wedding 95.1
veil 95
group 94.7
administration 94.6
one 94.6
groom 93.6
leader 93
woman 91.7
wear 91.2
vehicle 91.1
actress 88.6
furniture 87.2
ceremony 87

Imagga
created on 2019-03-25

groom 49.3
couple 39.2
man 35.6
happy 35.1
person 34.4
male 32.8
people 29.6
bride 28.8
bouquet 28.8
adult 28.5
love 27.6
happiness 27.4
wedding 26.7
smiling 25.3
director 25.1
flowers 24.3
married 23
smile 22.1
portrait 22
sitting 21.5
dress 19.9
suit 19
marriage 19
supporter 18.7
together 18.4
senior 17.8
home 17.6
businessman 16.8
men 16.3
cheerful 16.3
two 16.1
romance 16.1
women 15.8
business 15.2
indoors 14.9
mature 14.9
romantic 14.3
face 14.2
holding 14
husband 13.7
bridal 12.6
scholar 12.5
wife 12.3
table 12.1
office 12
pretty 11.9
veil 11.8
professional 11.6
family 11.6
elderly 11.5
gown 11.2
intellectual 11
horizontal 10.9
30s 10.6
tie 10.4
looking 10.4
20s 10.1
flower 10
fashion 9.8
roses 9.7
lifestyle 9.4
expression 9.4
relationship 9.4
dinner 9.3
life 9.1
attractive 9.1
executive 9
restaurant 8.8
celebration 8.8
brunette 8.7
work 8.6
future 8.4
old 8.4
black 8.3
indoor 8.2
businesswoman 8.2
meal 8.1
group 8.1
working 8
hair 7.9
mother 7.9
standing 7.8
corporate 7.7
formal 7.6
businesspeople 7.6
eating 7.6
desk 7.6
meeting 7.5
glasses 7.4
camera 7.4
lunch 7.1
handsome 7.1
student 7
look 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 97
black 83.8
white 79.9
posing 77.4
old 77.4
retro 12.5
wedding 12.3
black and white 11.3
formal 8.8

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 98.5%
Angry 2.5%
Happy 86.2%
Sad 0.7%
Confused 0.8%
Disgusted 2.4%
Surprised 6.5%
Calm 0.9%

AWS Rekognition

Age 35-52
Gender Female, 100%
Angry 1.1%
Disgusted 0.4%
Sad 0.1%
Surprised 1.5%
Happy 96.3%
Calm 0%
Confused 0.5%
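
Each AWS Rekognition face record above distributes confidence across seven emotions. A minimal sketch of reducing such a record to a single dominant label — the score dictionaries mirror the two face records above, and nothing here calls the Rekognition API:

```python
# Hypothetical sketch: picking the dominant emotion from AWS
# Rekognition-style per-face scores. Values are copied from the
# two face records above; no API call is made.
face_1 = {"Angry": 2.5, "Happy": 86.2, "Sad": 0.7, "Confused": 0.8,
          "Disgusted": 2.4, "Surprised": 6.5, "Calm": 0.9}
face_2 = {"Angry": 1.1, "Disgusted": 0.4, "Sad": 0.1, "Surprised": 1.5,
          "Happy": 96.3, "Calm": 0.0, "Confused": 0.5}

def dominant_emotion(scores):
    """Return the emotion name with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face_1))  # → Happy
print(dominant_emotion(face_2))  # → Happy
```

Both faces resolve to "Happy", which agrees with the "Joy: Very likely" verdicts Google Vision reports for the same two faces below.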

Microsoft Cognitive Services

Age 41
Gender Male

Microsoft Cognitive Services

Age 52
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%

Categories

Imagga

people portraits 95.5%
paintings art 4.2%