Human Generated Data

Title

Untitled (two women talking at wedding reception)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8405

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 100
Apparel 100
Person 99.3
Human 99.3
Person 98.2
Person 98.1
Hat 94.3
Suit 91.1
Overcoat 91.1
Coat 91.1
Robe 91.1
Fashion 91.1
Person 89.7
Gown 89.1
Wedding 84.1
Wedding Gown 74.9
Female 73.3
Bridegroom 69.7
Sleeve 69.6
Dress 66.9
Face 65.4
Tuxedo 65.4
Bride 63.4
Hand 62
Woman 61.5
Veil 61.4
Portrait 60.8
Photography 60.8
Photo 60.8
Finger 60.4
Hat 56.5
Texture 55
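
The Amazon tags above pair each label with a confidence score, which is the format returned by Amazon Rekognition's DetectLabels API. The sketch below shows how such a list could be produced with boto3; the file name and the 55% confidence floor are illustrative assumptions, since the museum's actual tagging pipeline is not documented here.

```python
# Minimal sketch: label/confidence tags via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; the file name and the 55% threshold
# are illustrative only, not the museum's documented pipeline.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score shown above is Texture 55
)

# Print "Label Confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```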

Clarifai
created on 2023-10-25

people 99.7
group 96.9
woman 96.9
adult 96.4
wear 94.1
man 93.4
two 92.5
three 91.9
monochrome 90.7
retro 89.9
coat 88.9
veil 88.7
commerce 86.4
fashion 85.9
nostalgia 83.2
actress 81.6
group together 80
leader 79
one 78
dress 77

Imagga
created on 2022-01-09

world 25.2
people 24.5
man 23.5
person 23
portrait 22.6
dress 22.6
adult 22.2
happy 21.9
male 19.8
bride 19.2
couple 18.3
women 16.6
wedding 16.5
happiness 16.4
fashion 15.8
groom 15.3
love 15
pretty 14.7
attractive 14.7
lady 14.6
smiling 14.5
clothing 12.9
face 12.8
human 12.7
cheerful 12.2
business 11.5
black 11.4
smile 11.4
home 11.2
lifestyle 10.8
holiday 10.7
group 10.5
sexy 10.4
husband 10
businessman 9.7
child 9.7
indoors 9.7
life 9.6
hair 9.5
expression 9.4
model 9.3
casual 9.3
two 9.3
grandma 9.2
indoor 9.1
together 8.8
married 8.6
bouquet 8.6
cute 8.6
sitting 8.6
youth 8.5
joy 8.3
20s 8.2
room 8.2
office 8
family 8
look 7.9
veil 7.8
boy 7.8
gown 7.8
men 7.7
hand 7.6
suit 7.6
light 7.3
year 7.3
new 7.3
teacher 7.3
businesswoman 7.3
color 7.2
body 7.2
team 7.2
kid 7.1
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 95.7
text 94.5
human face 87.7
clothing 84.9
woman 83.2
dress 74.6
black and white 63

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 85.8%
Calm 41.2%
Happy 19.3%
Disgusted 15.6%
Fear 5.4%
Surprised 5.2%
Angry 5.1%
Sad 4.2%
Confused 3.9%
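
The age range, gender estimate, and emotion scores above follow the response shape of Amazon Rekognition's DetectFaces API. A minimal sketch of how such values could be read with boto3 follows; the file name is a hypothetical stand-in and the code is illustrative rather than the museum's actual workflow.

```python
# Minimal sketch: age range, gender, and emotion scores via Amazon
# Rekognition DetectFaces. Illustrative only; the file name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions sorted from strongest to weakest, as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```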

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Hat 94.3%

Text analysis

Amazon

12156.
12156
TS3

Google

12156. 12156. NAGON-YT33A2A AMI
12156.
NAGON-YT33A2A
AMI
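
The Amazon entries in the text analysis are the kind of output returned by Amazon Rekognition's DetectText API, which reports each detected line and word as a separate string. A minimal sketch under the same assumptions as the earlier examples (hypothetical file name, illustrative only):

```python
# Minimal sketch: extracting detected text strings with Amazon Rekognition
# DetectText. Illustrative only; the file name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_wedding_reception.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to entries such as "12156." above;
# WORD detections break those lines into individual tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```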