Human Generated Data

Title

Untitled (gorilla sitting on swing; trainer crouching)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4883

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.4
Person 99.4
Clothing 99.3
Apparel 99.3
Shorts 99.3
Shoe 91.9
Footwear 91.9
Dog 89.7
Canine 89.7
Mammal 89.7
Animal 89.7
Pet 89.7
Flooring 78.9
Female 75.1
Face 66.9
Indoors 61.7
Room 58.8
Floor 58.4
Handrail 55.9
Banister 55.9
Girl 55.4

Clarifai
created on 2023-10-26

people 99.7
two 98.2
adult 98
man 97.3
monochrome 96.2
woman 93.8
dog 93.5
street 92.8
canine 90.6
one 89.9
wear 87.9
group 87.3
portrait 86.4
three 86.3
couple 82.4
actor 81.9
group together 80.6
music 78.6
window 78.5
food 76.5

Imagga
created on 2022-01-23

people 21.7
plastic bag 20.3
portrait 18.8
person 18.5
bag 17.4
dress 17.2
face 15.6
head 15.1
container 15
fashion 14.3
human 14.2
man 14.1
clothing 13.8
adult 13.6
hair 13.5
bride 13.4
one 12.7
black 12
male 11.3
pretty 11.2
statue 11.2
attractive 11.2
wedding 11
modern 10.5
sculpture 10.5
looking 10.4
women 10.3
love 10.3
model 10.1
urban 9.6
sitting 9.4
architecture 9.4
smile 9.3
city 9.1
newspaper 9.1
culture 8.5
clothes 8.4
summer 8.4
hand 8.4
color 8.3
style 8.2
happy 8.1
sexy 8
cute 7.9
couple 7.8
happiness 7.8
elegant 7.7
skin 7.6
elegance 7.6
art 7.5
groom 7.5
fun 7.5
outdoors 7.5
water 7.3
building 7.3
lifestyle 7.2
wall 7.2
negative 7.2
body 7.2
celebration 7.2
product 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96
black and white 84.6
dog 72.1
clothing 70.7
person 54.7
monochrome 50.1

Face analysis

Amazon

AWS Rekognition

Age 50-58
Gender Male, 99.1%
Calm 100%
Sad 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%
Angry 0%
Happy 0%

Feature analysis

Amazon

Person 99.4%
Shoe 91.9%
Dog 89.7%

Text analysis

Amazon

16132
a
16132.

Google

HAGON-YT3RA2- NAMT2A I6132 1613 2. 16132.
HAGON-YT3RA2-
NAMT2A
I6132
1613
2.
16132.