Human Generated Data

Title

Untitled (woman leaning off bunk on circus train)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7108

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 100
Apparel 100
Human 98.9
Robe 98.8
Fashion 98.8
Gown 98.4
Wedding 96.9
Bride 96.2
Wedding Gown 96.2
Female 93.5
Person 91.2
Bridegroom 91.1
Face 84.4
Woman 82
Dress 74.7
Portrait 65.9
Photography 65.9
Photo 65.9
Evening Dress 61.9
Veil 61.2
Flower 56.5
Plant 56.5
Blossom 56.5
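
These labels have the shape of output from the AWS Rekognition DetectLabels operation. A minimal sketch of how such a list could be reproduced, assuming configured AWS credentials and a hypothetical local copy of the photograph named circus_train.jpg:

import boto3

# "circus_train.jpg" is a hypothetical filename; any local copy of the image would do.
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("circus_train.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Each entry pairs a label name with a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')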

Clarifai
created on 2023-10-15

wedding 96.8
woman 96
bride 94.7
fashion 93.9
veil 93.5
girl 92.3
people 90
dress 88.4
elegant 88.2
beautiful 87.4
monochrome 87.2
portrait 84.5
desktop 83.9
art 83.8
bone 83
sexy 82.9
young 82.9
luxury 82.2
glamour 82
bridal 80.9
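
The Clarifai concepts are consistent with its public general image recognition model. A sketch using the v2 REST API, assuming a personal access token and a public image URL (both placeholders):

import requests

# Placeholders: a Clarifai personal access token and a hypothetical public URL for the image.
PAT = "<clarifai-personal-access-token>"
IMAGE_URL = "https://example.org/circus_train.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}", "Content-Type": "application/json"},
    json={
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": IMAGE_URL}}}],
    },
)
response.raise_for_status()

# Concept values come back in the range 0-1; scale to the percent-style scores shown above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')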

Imagga
created on 2021-12-15

design 20.8
art 20.8
amulet 18
charm 15.2
people 15.1
light 13.4
curve 12.2
elegance 11.8
elegant 11.1
graphic 10.9
pattern 10.9
fashion 10.6
modern 10.5
person 10.5
wave 10.4
business 10.3
motion 10.3
style 9.6
glass 9.5
symbol 9.4
smoke 9.4
decoration 9.4
silhouette 9.1
health 9
human 9
technology 8.9
shape 8.9
medical 8.8
life 8.6
luxury 8.6
professional 8.4
black 8.4
color 8.3
clip art 8.3
man 8.1
transparent 8.1
office 8
computer 8
portrait 7.8
adult 7.8
web 7.6
form 7.4
sketch 7.4
digital 7.3
smooth 7.3
team 7.2
science 7.1
male 7.1
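
Imagga's tags resemble the output of its v2 tagging endpoint, which authenticates with an API key and secret over HTTP Basic auth. A sketch under those assumptions, with placeholder credentials and a hypothetical image URL:

import requests

# Placeholders: Imagga credentials and a public URL for the image.
API_KEY = "<imagga-api-key>"
API_SECRET = "<imagga-api-secret>"
IMAGE_URL = "https://example.org/circus_train.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each result pairs an English tag with a confidence score, as in the list above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')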

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

wedding dress 98.9
text 98.4
bride 97.4
human face 85.1
clothing 80.3
woman 76.9
person 76.1
dress 75.9
black and white 71.3
wedding 50.6
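
The Microsoft tags are consistent with the Azure Computer Vision Analyze Image endpoint. A minimal sketch, assuming a v3.2 resource endpoint and subscription key (both placeholders) and a public URL for the image:

import requests

# Placeholders: substitute a real Azure Computer Vision endpoint, key, and image URL.
ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"
KEY = "<subscription-key>"
IMAGE_URL = "https://example.org/circus_train.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidence is returned as 0-1; scale to the percent-style scores shown above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')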

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Female, 91%
Surprised 56%
Calm 20.6%
Sad 8.9%
Fear 7%
Confused 4.6%
Angry 1.3%
Happy 1.2%
Disgusted 0.4%
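
The age range, gender estimate, and emotion scores above correspond to the Rekognition DetectFaces operation with all facial attributes requested. A minimal sketch under the same assumptions as the label example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("circus_train.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions arrive unsorted; sort by confidence to reproduce the descending list above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')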

Feature analysis

Amazon

Person 91.2%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

Q
16161.
16161
16161 AA-KOA
AA-KOA
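
The detected strings above (the car number "16161" and "AA-KOA") have the shape of Rekognition DetectText output, which reports each string both as a full line and as individual words. A sketch of that call, under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("circus_train.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is typed LINE or WORD, which is why "16161 AA-KOA" appears both
# whole and broken into pieces in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])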

Google

HAGOX-YT37A2- NAMT20 16164 16161.
HAGOX-YT37A2-
NAMT20
16164
16161.