Human Generated Data

Title

Untitled (gorilla holding hand of trainer)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4968

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Furniture 88.7
Outdoors 79.6
Home Decor 76.3
Cat 73.9
Animal 73.9
Mammal 73.9
Pet 73.9
Nature 68.0
Person 65.5
Human 65.5
Clothing 63.9
Apparel 63.9
Face 62.6
Sculpture 55.4
Art 55.4

Clarifai
created on 2023-10-26

monochrome 98.6
portrait 94.7
people 93.8
one 92.4
art 90.3
man 89.2
old 87.5
monochromatic 84.9
black and white 84.7
sit 83.4
adult 83.4
monkey 83.3
no person 83.3
antique 81.9
classic 81.6
statue 76.1
face 75.7
fur 75.4
indoors 74.6
ancient 73.9

Imagga
created on 2022-01-23

negative 100
film 100
photographic paper 80.7
photographic equipment 53.8
hair 26.9
portrait 25.9
face 23.4
adult 22
pretty 21
person 20.5
people 20.1
sexy 19.3
attractive 18.2
body 16.8
smile 14.3
eyes 13.8
skin 13.5
lady 13
one 12.7
love 12.6
black 12.6
hand 12.2
fashion 12.1
human 12
blond 11.5
cute 11.5
looking 11.2
makeup 11
head 10.9
lifestyle 10.8
happy 10.7
health 10.4
model 10.1
healthy 10.1
man 10.1
clean 10
dress 9.9
gray 9.9
bathroom 9.9
bride 9.6
bath 9.5
smiling 9.4
old 9.1
care 9.1
wet 8.9
child 8.9
little 8.8
expression 8.5
erotic 8.5
senior 8.4
shower 8.4
alone 8.2
indoors 7.9
male 7.8
sad 7.7
nose 7.6
relaxation 7.5
mature 7.4
water 7.3
sensuality 7.3
home 7.2
women 7.1
kid 7.1
life 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
man 91
black and white 85.4
human face 77
person 68.3
clothing 66.9

Color Analysis

Feature analysis

Amazon

Cat 73.9%
Person 65.5%

Categories

Imagga

paintings art 97.2%
people portraits 2.2%

Captions

Text analysis

Amazon

16123.
ET191
a

Google

YAGON-YT3RA2- AMT2A 16123. 16123.
YAGON-YT3RA2-
AMT2A
16123.