Human Generated Data

Title

Untitled (tiger on ball with trainer)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4751

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 90
Dog 89.3
Mammal 89.3
Canine 89.3
Pet 89.3
Animal 89.3
Musician 88.2
Musical Instrument 88.2
Drum 80.8
Percussion 80.8
People 70.1
Leisure Activities 69.2
Tiger 60.7
Wildlife 60.7
Drummer 56.5

Clarifai
created on 2023-10-27

people 99
monochrome 92
man 91.4
street 88.9
group 88.5
one 88.5
adult 88.3
picture frame 86.7
woman 86.7
two 86.6
art 85.4
transportation system 83.3
square 82.3
child 82.1
vehicle 81.8
girl 81.6
group together 80.6
illustration 80.4
print 79.7
portrait 79.1

Imagga
created on 2022-01-23

volleyball net 54.2
net 48
game equipment 33.8
equipment 29.1
sport 18.5
sky 17.8
people 14.5
building 13.4
water 13.3
outdoors 12.8
man 12.1
fire screen 11.6
travel 11.3
protective covering 11.1
adult 11
lifestyle 10.8
city 10.8
outdoor 10.7
window 10.7
screen 10.7
vacation 10.6
black 10.4
architecture 10.3
beach 10.1
silhouette 9.9
device 9.8
gate 9.7
landscape 9.7
tree 9.2
relaxation 9.2
business 9.1
summer 9
technology 8.9
interior 8.8
women 8.7
sitting 8.6
relax 8.4
modern 8.4
inside 8.3
transportation 8.1
structure 8
river 8
trees 8
airport 7.8
industry 7.7
old 7.7
trip 7.5
child 7.5
ocean 7.5
machine 7.4
covering 7.4
street 7.4
light 7.3
passenger 7.3
sunset 7.2
holiday 7.2
male 7.1
sea 7

Microsoft
created on 2022-01-23

text 91.7
person 87.3
clothing 77.9
man 73.6

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 84%
Calm 74.4%
Sad 10.2%
Angry 5.8%
Happy 5%
Confused 1.7%
Surprised 1.2%
Disgusted 0.9%
Fear 0.8%

Feature analysis

Amazon

Person 99.8%
Dog
Tiger