Human Generated Data

Title

Untitled (woman seated side-saddle on horse at circus)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4864

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.2
Person 99.2
Horse 96.2
Mammal 96.2
Animal 96.2
Person 95.9
Person 92.9
Person 80.4
Tent 72
Tent 62.9
Andalusian Horse 59.2
Leisure Activities 57.7
Rodeo 56.4
Art 55.6
Person 53.2
Person 49.3

Imagga
created on 2022-01-23

horse 100
vaulting horse 73.8
gymnastic apparatus 44.4
saddle 39.8
sidesaddle 39.3
animal 34.4
sports equipment 29.5
canvas tent 23.8
seat 23.6
horses 21.4
farm 21.4
grass 20.6
animals 20.4
rural 20.3
field 20.1
ride 19.3
ranch 18.8
sunset 18
equine 17.3
pasture 17.2
sky 17.2
outdoors 17.2
sun 16.1
silhouette 15.7
mare 15.7
support 15.7
stallion 15.7
riding 15.6
summer 14.8
equipment 14.8
camel 14
people 13.9
landscape 13.4
brown 13.2
rider 12.8
mammal 12.6
meadow 12.6
livestock 11.8
horseback 11.8
equestrian 11.8
wild 11.3
countryside 11
cattle 10.9
mane 10.8
outdoor 9.9
sport 9.9
person 9.8
country 9.7
sunrise 9.4
evening 9.3
park 9.1
jockey 8.9
device 8.9
man 8.7
fence 8.7
running 8.6
cowboy 8.5
head 8.4
cow 8.2
sunlight 8
sand 7.9
travel 7.7
saddle blanket 7.5
tourism 7.4
natural 7.4
morning 7.2
male 7.2
wildlife 7.1

Google
created on 2022-01-23

Horse 96.3
Working animal 89.4
Horse tack 87.9
Bridle 85.8
Horse supplies 85
Style 84.1
Rein 80.4
Wheel 77.9
Monochrome photography 77.4
Pack animal 77
Monochrome 76.6
Mane 74.4
Cart 73.7
Animal sports 73
Racing 72.8
Livestock 70.3
Event 69.6
Carriage 68.7
Mare 67.9
Art 66.3

Microsoft
created on 2022-01-23

text 99.2
horse 98.9
outdoor 92.3
animal 76.8

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 64.7%
Happy 97.1%
Fear 1.6%
Sad 0.4%
Calm 0.3%
Surprised 0.3%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.2%
Calm 43.8%
Confused 34.5%
Sad 16.7%
Fear 1.8%
Surprised 1%
Angry 1%
Disgusted 0.7%
Happy 0.4%

AWS Rekognition

Age 23-31
Gender Female, 97.6%
Happy 95.1%
Sad 2.7%
Calm 0.8%
Fear 0.5%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Angry 0.2%

AWS Rekognition

Age 23-31
Gender Male, 68.1%
Calm 73.9%
Happy 7.4%
Fear 6.4%
Sad 5.3%
Surprised 3.3%
Disgusted 1.3%
Confused 1.3%
Angry 1.1%

Feature analysis

Amazon

Person 99.2%
Horse 96.2%
Tent 72%

Captions

Microsoft

a person standing in front of a horse 89.5%
a person standing in front of a horse 88.6%
a person standing next to a horse 88.3%

Text analysis

Amazon

AE-KOA
JJS