Human Generated Data

Title

Untitled (gorilla on swing, trainer with metal rod)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4759

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Clothing 99.4
Apparel 99.4
Dog 89.5
Canine 89.5
Pet 89.5
Mammal 89.5
Animal 89.5
Footwear 86.9
Riding Boot 79.6
Boot 79.6
Door 61.8
Flooring 59.8
Floor 55
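Each service reports flat label lists with confidence scores like the one above. As a hedged illustration only (not part of the museum's pipeline), a minimal Python sketch of how such a tag list might be filtered by a confidence threshold; the tag names and scores are copied from the Amazon list above, and the threshold value is an arbitrary assumption:

```python
# Hypothetical sketch: filtering machine-generated tags by confidence.
# Tags and scores copied from the AWS Rekognition list in this record;
# the 85.0 threshold is an arbitrary assumption for illustration.
tags = [
    ("Person", 99.7), ("Human", 99.7), ("Clothing", 99.4), ("Apparel", 99.4),
    ("Dog", 89.5), ("Canine", 89.5), ("Pet", 89.5), ("Mammal", 89.5),
    ("Animal", 89.5), ("Footwear", 86.9), ("Riding Boot", 79.6),
    ("Boot", 79.6), ("Door", 61.8), ("Flooring", 59.8), ("Floor", 55.0),
]

def confident_tags(tags, threshold=85.0):
    """Return tag names whose confidence meets the threshold, best first."""
    return [name for name, score in sorted(tags, key=lambda t: -t[1])
            if score >= threshold]

print(confident_tags(tags))
```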

Clarifai
created on 2023-10-26

dog 99.6
people 99.6
canine 97.3
monochrome 96.8
adult 96.1
man 96
two 96
one 95.2
street 94.8
portrait 94.1
pet 93
mammal 92
outerwear 92
offense 91
indoors 90.4
room 88.9
door 86
coat 85.5
group together 85.4
wear 85.4

Imagga
created on 2022-01-23

sliding door 31.3
man 30.2
door 28.8
city 22.4
people 22.3
person 21.3
urban 21
male 19.5
movable barrier 18.9
adult 17.6
cleaning implement 17.2
business 15.2
swab 14.1
interior 13.3
black 13.2
barrier 13
wall 12.9
street 12.9
portrait 12.3
men 12
passenger 11.7
window 11.7
walk 11.4
motion 11.1
inside 11
shop 10.8
fashion 10.6
women 10.3
building 9.8
happy 9.4
architecture 9.4
prison 9.1
clothing 9.1
danger 9.1
human 9
one 9
child 8.9
indoors 8.8
mall 8.8
high 8.7
concrete 8.6
walking 8.5
house 8.4
dark 8.3
indoor 8.2
dirty 8.1
worker 8
job 8
businessman 7.9
gate 7.9
travel 7.7
station 7.7
entrance 7.7
old 7.7
office 7.5
life 7.5
vintage 7.4
blur 7.4
holding 7.4
shopping 7.3
correctional institution 7.3
protection 7.3
industrial 7.3
dress 7.2
home 7.2
transportation 7.2
smile 7.1
working 7.1
happiness 7
modern 7

Microsoft
created on 2022-01-23

text 96.4
clothing 79.3
drawing 58
statue 53.4

Face analysis
AWS Rekognition

Age 21-29
Gender Female, 51.5%
Sad 89.5%
Calm 3.4%
Angry 2.6%
Fear 1.7%
Disgusted 1.4%
Confused 0.5%
Happy 0.4%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Dog 89.5%

Text analysis

Amazon

40.
160 40.
160
OHO.
16 OHO.
16
16040.
NAOON