Human Generated Data

Title

Untitled (trainer leaning towards gorilla)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7106

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 97.5
Clothing 96.2
Apparel 96.2
Person 91.5
Person 85.7
Dog 84.4
Mammal 84.4
Canine 84.4
Animal 84.4
Pet 84.4
Face 67.8
Portrait 63.7
Photography 63.7
Photo 63.7
Cat 60.4
Female 59.7
Leisure Activities 58.3
Flooring 57
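
The Amazon tags above pair a label with a confidence score from 0 to 100, which is the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be generated with boto3 (the bucket and object names below are hypothetical placeholders):

    # Sketch: image labels via AWS Rekognition DetectLabels.
    # Bucket and object names are hypothetical placeholders.
    import boto3

    client = boto3.client("rekognition")

    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "steinmetz-7106.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Each label carries a name and a 0-100 confidence score,
    # matching the "tag score" pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')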

Clarifai
created on 2023-10-15

people 99
monochrome 97.4
man 96.3
adult 96.3
woman 94.9
wear 88.7
dancer 88.2
dancing 88.2
motion 87.3
bride 83.1
one 83
fashion 82.9
girl 82.5
wedding 82.5
black and white 81.5
young 81.3
health 81.1
vertical 81
two 80.8
group 80.1

Imagga
created on 2021-12-15

groom 27.1
bride 22.4
wedding 22.1
negative 21.4
people 20.6
love 18.9
film 18.8
couple 18.3
dress 18.1
cradle 15.3
portrait 14.9
married 14.4
person 13.5
happiness 13.3
marriage 13.3
happy 13.2
photographic paper 13
celebration 12.8
baby bed 12
women 11.9
adult 11.6
man 11.4
bouquet 11.3
elegance 10.9
veil 10.8
smile 10.7
hand 10.6
furniture 10.6
human 10.5
men 10.3
two 10.2
wed 9.8
romantic 9.8
family 9.8
smiling 9.4
face 9.2
child 9.1
cheerful 8.9
ceremony 8.7
light 8.7
photographic equipment 8.7
life 8.7
wife 8.5
attractive 8.4
fashion 8.3
toilet tissue 8.1
male 7.8
pretty 7.7
youth 7.7
traditional 7.5
city 7.5
tradition 7.4
sexy 7.2
black 7.2
suit 7.2
looking 7.2
romance 7.1
day 7.1
furnishing 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.8
black and white 86.3
clothing 65.3
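
The Microsoft tags follow the same name-plus-confidence shape. A comparable sketch using the Azure Computer Vision Python SDK, assuming a hypothetical endpoint, key, and local file; note that the SDK scores tags on a 0-1 scale, scaled here to percent:

    # Sketch: image tags via the Azure Computer Vision SDK.
    # Endpoint, key, and file name are hypothetical placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
    )

    with open("steinmetz-7106.jpg", "rb") as image_stream:
        result = client.tag_image_in_stream(image_stream)

    # Scale the SDK's 0-1 confidences to percent to match the list above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")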

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Male, 71.3%
Calm 86%
Sad 10.8%
Happy 1%
Confused 0.9%
Surprised 0.7%
Angry 0.4%
Disgusted 0.2%
Fear 0.1%
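
The age range, gender, and per-emotion scores above match the facial attributes AWS Rekognition's DetectFaces API returns when all attributes are requested. A sketch under the same hypothetical image location as above:

    # Sketch: face attributes via AWS Rekognition DetectFaces.
    # Attributes=["ALL"] requests AgeRange, Gender, and Emotions.
    import boto3

    client = boto3.client("rekognition")

    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "steinmetz-7106.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions are scored individually and need not sum to 100.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')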

Feature analysis

Amazon

Person 91.5%
Dog 84.4%
Cat 60.4%

Categories

Imagga

paintings art 99.8%

Captions

Text analysis

Amazon

16042
a
zhogi
74091
НАСОЛ
-ИАМТГАЗ
9AS -ИАМТГАЗ
9AS
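
The fragments above, including the mirrored film-edge markings, are the kind of raw line detections AWS Rekognition's DetectText API returns. A sketch under the same hypothetical image location:

    # Sketch: OCR via AWS Rekognition DetectText.
    # Film-edge markings on negatives often read mirrored or garbled,
    # which is consistent with fragments like "9AS" above.
    import boto3

    client = boto3.client("rekognition")

    response = client.detect_text(
        Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "steinmetz-7106.jpg"}}
    )

    # LINE detections group words; WORD detections are individual tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])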

Google

16042 HAGO A2-MAMTZA3 16042
16042
HAGO
A2-MAMTZA3
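
The Google fragments could come from Cloud Vision text detection. A sketch assuming the google-cloud-vision Python client and a hypothetical local file:

    # Sketch: OCR via the Google Cloud Vision API.
    # The file name is a hypothetical placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz-7106.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected block; the rest are tokens.
    for annotation in response.text_annotations:
        print(annotation.description)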