Human Generated Data

Title

Untitled (person on horseback jumping over broken wooden fence)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5185

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 96.4
Human 96.4
Animal 92.4
Horse 92.4
Mammal 92.4
Nature 90.9
Outdoors 82.5
Clothing 75.2
Apparel 75.2
People 74.5
Person 73.6
Person 61.6
Porch 58.7
Horse 57.9
Female 56.8
Standing 55.3

Clarifai
created on 2023-10-26

people 99.8
cavalry 98.7
group together 98.3
adult 97.8
group 97.5
man 96.8
many 93.7
monochrome 93.4
mammal 92.5
vehicle 90.4
two 90.1
seated 87.6
competition 87.5
transportation system 87.1
three 85.3
winter 85.2
military 84.4
woman 83.7
home 82.1
snow 81.8

Imagga
created on 2022-01-23

sky 30.2
landscape 26.8
structure 25.1
snow 22.5
travel 16.2
gravestone 16.2
fence 16.1
old 16
memorial 16
park bench 15
rural 15
bench 14.8
picket fence 14.6
weather 14.4
outdoor 13.8
winter 13.6
shopping cart 13.4
stone 13.4
handcart 13.3
water 12.7
tree 12.4
environment 12.3
summer 12.2
clouds 11.8
road 11.7
field 11.7
barrier 11.6
man 11.4
wheeled vehicle 11.4
building 11.1
beach 11
sea 11
ocean 10.9
destruction 10.8
disaster 10.7
fog 10.6
scenic 10.5
forest 10.4
cloud 10.3
season 10.1
seat 10.1
dark 10
trees 9.8
nuclear 9.7
scene 9.5
grass 9.5
architecture 9.4
protection 9.1
vintage 9.1
danger 9.1
tourism 9.1
scenery 9
outside 8.6
industry 8.5
black 8.4
outdoors 8.3
industrial 8.2
sun 8.1
agriculture 7.9
cold 7.8
obstruction 7.7
person 7.6
city 7.5
smoke 7.4
ice 7.4
vacation 7.4
lake 7.3
countryside 7.3
people 7.3
sunset 7.2
coast 7.2
history 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.2
outdoor 97.9
black and white 94.8
horse 76.6
old 68.8
monochrome 67.2
tent 53.7
outdoor object 36

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.7%
Calm 99.2%
Surprised 0.3%
Sad 0.2%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 96.4%
Horse 92.4%

Text analysis

Amazon

11522.
11522

Google

2
IS
05 2 2. IS 22.
05
2.
22.