Human Generated Data

Title

Untitled (circus performers standing on the backs of moving horses)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American (1905-1985)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12139

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Horse 98.4
Mammal 98.4
Animal 98.4
Person 98.2
Human 98.2
Person 92.7
Leisure Activities 75.2
Horse 71.4
Circus 68.6
Crowd 66
Tree 61.8
Plant 61.8
Person 60.6
Theme Park 59.8
Amusement Park 59.8
Person 42.1
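
The object and scene tags above are the kind of output AWS Rekognition's label detection returns. A minimal sketch of how similar tags could be produced with boto3 follows; the file name, label cap, and confidence threshold are illustrative assumptions, not the collection's actual pipeline.

    # Minimal sketch: object/scene labels via AWS Rekognition (boto3).
    # The file path and thresholds are illustrative assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_circus.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=40,  # the tags above are reported down to ~42%
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")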

Clarifai
created on 2023-10-26

cavalry 100
people 99.7
carriage 99.1
seated 98.7
mammal 98.1
transportation system 97.5
group 96.9
monochrome 96.3
child 95.6
wagon 94.6
group together 94
adult 93
dog 92.6
cart 92.4
sledge 92
many 91.6
vehicle 91.3
man 91.1
woman 90.5
canine 89.4

Imagga
created on 2022-01-22

horse 24
snow 20.6
dog 20.1
carriage 19
fence 16.1
winter 15.3
old 12.5
tree 12.3
forest 11.3
canine 11.1
structure 11
brown 11
malamute 11
horse cart 10.8
weather 10.7
outdoor 10.7
fountain 10.6
rural 10.6
landscape 10.4
sled dog 10
equine 9.5
domestic animal 9.4
man 9.4
street 9.2
city 9.1
outdoors 9.1
summer 9
hound 8.9
farm 8.9
night 8.9
day 8.6
hunting dog 8.6
cold 8.6
people 8.4
sky 8.3
picket fence 8.2
trees 8
cart 7.9
stallion 7.8
park 7.6
wheeled vehicle 7.6
beach 7.6
house 7.5
vintage 7.4
speed 7.3
sun 7.2
morning 7.2
grass 7.1
sand 7.1
country 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.7
horse 93.9
outdoor 88.9
animal 58.5

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 78.4%
Sad 47.6%
Happy 31.5%
Fear 8.6%
Disgusted 3.9%
Calm 3.8%
Angry 2.1%
Surprised 1.5%
Confused 1.1%

AWS Rekognition

Age 37-45
Gender Male, 95.9%
Calm 50.8%
Happy 43.2%
Fear 2.9%
Sad 1.5%
Disgusted 0.5%
Surprised 0.4%
Angry 0.4%
Confused 0.4%
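
The two face readings above (age range, gender, and ranked emotion scores) match the shape of AWS Rekognition's face detection response. A minimal sketch under the same illustrative assumptions as the label example:

    # Minimal sketch: face attributes via AWS Rekognition detect_faces.
    # The file path is a hypothetical stand-in for the scanned photograph.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_circus.jpg", "rb") as f:
        image_bytes = f.read()

    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in faces["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")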

Feature analysis

Amazon

Horse 98.4%
Person 98.2%

Categories

Captions

Microsoft
created on 2022-01-22

a vintage photo of a person 52%
a vintage photo of some people 46.4%

Text analysis

Amazon

10
12020.
9
12020,
18080
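
The strings above are raw OCR hits; AWS Rekognition reports them through text detection as LINE and WORD entries with confidences. A minimal, illustrative sketch, again assuming a hypothetical local scan of the photograph:

    # Minimal sketch: text detection via AWS Rekognition detect_text.
    # The file path is a hypothetical stand-in for the scanned photograph.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_circus.jpg", "rb") as f:
        image_bytes = f.read()

    detections = rekognition.detect_text(Image={"Bytes": image_bytes})
    for det in detections["TextDetections"]:
        if det["Type"] == "LINE":  # WORD entries repeat the same content
            print(det["DetectedText"], f"{det['Confidence']:.1f}")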

Google

GON-YT3RA2-MAMTZAI 12020.
GON-YT3RA2-MAMTZAI
12020.