Human Generated Data

Title

Untitled (attempts at shoeing a mule)

Date

1900s

People

Artist: American Steel & Wire Co.

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Robert H. Millett, 2.2002.3889

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Human 99.5
Person 99.5
Person 99.4
Person 99.2
Person 99
Person 98.7
Mammal 98.6
Horse 98.6
Animal 98.6
Person 98.3
Bull 90.6
Person 89.5
Person 80.6
Person 74
Person 73.7
Rodeo 73.2
Bullfighter 66.8
Person 63.2
Military 61.8
Military Uniform 61.8

Clarifai
created on 2019-11-10

people 99.8
cavalry 98.9
group together 95.9
group 95.7
mammal 94.9
adult 93.9
many 92.9
man 91.6
military 90.7
seated 88.3
transportation system 87.9
woman 87.8
war 85.4
outfit 84.4
administration 84.3
vehicle 83.2
wear 82.4
soldier 80.7
uniform 80.3
recreation 76.7

Imagga
created on 2019-11-10

cowboy 56.5
horse 49.5
pen 40.4
cattle 37.7
enclosure 32.3
farm 31.3
water buffalo 30.5
horses 27.3
ox 26.8
rural 26.5
pasture 24.9
old world buffalo 24.4
ranch 23.8
field 23.4
grass 22.9
bull 21.7
bovine 21.6
cow 21.4
livestock 20.7
animals 19.5
brown 19.2
laborer 18.4
ruminant 17.6
riding 16.6
outdoors 16.4
herd 15.7
sport 14.8
structure 14.2
mare 13.8
stallion 13.7
country 13.2
plow 13.2
wild 13.1
countryside 12.8
equine 12.6
saddle 12.6
agriculture 12.3
outdoor 12.2
cows 11.8
jockey 11.8
farming 11.4
desert 11.2
grazing 10.8
meadow 10.8
running 10.6
group 10.5
rider 9.8
ride 9.7
landscape 9.7
tool 9.6
standing 9.6
camel 9.5
dry 9.3
mountains 9.3
competition 9.2
domestic 9
rope 8.9
horseback 8.9
equestrian 8.8
safari 8.7
dirt 8.6
travel 8.5
dairy 8.4
summer 8.4
sky 8.3
speed 8.3
sand 8
corral 7.9
male 7.9
mane 7.8
black 7.8
scene 7.8
western 7.8
land 7.4
trees 7.1

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

horse 98.2
outdoor 95.5
text 92
animal 85.7
mammal 76.7
person 74.1
bull 69.4
black 69.1
camel 62.3
cattle 59
working animal 56.8
old 42.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Female, 51.6%
Confused 45.1%
Happy 45%
Surprised 45.4%
Angry 45%
Fear 45%
Calm 48.2%
Disgusted 45%
Sad 51.2%

AWS Rekognition

Age 32-48
Gender Male, 51.1%
Fear 45%
Calm 54%
Happy 45.1%
Confused 45.1%
Sad 45.1%
Surprised 45.4%
Angry 45.3%
Disgusted 45%

AWS Rekognition

Age 18-30
Gender Female, 50.2%
Surprised 50.1%
Sad 49.5%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Fear 49.6%
Confused 49.5%
Angry 49.7%

AWS Rekognition

Age 16-28
Gender Male, 50.4%
Sad 49.6%
Surprised 49.5%
Disgusted 49.5%
Angry 49.6%
Confused 49.5%
Fear 49.5%
Calm 50.4%
Happy 49.5%

AWS Rekognition

Age 30-46
Gender Male, 50.5%
Sad 50.3%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%
Fear 49.6%
Surprised 49.5%
Calm 49.6%
Happy 49.5%

AWS Rekognition

Age 27-43
Gender Male, 50.4%
Surprised 49.6%
Happy 49.5%
Calm 50.3%
Fear 49.5%
Sad 49.5%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 34-50
Gender Male, 50.3%
Confused 49.5%
Surprised 49.6%
Sad 49.6%
Fear 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Calm 50.2%

AWS Rekognition

Age 54-72
Gender Male, 52.4%
Surprised 45%
Happy 45%
Sad 51.9%
Disgusted 45.1%
Calm 47.8%
Angry 45.1%
Fear 45%
Confused 45%

AWS Rekognition

Age 45-63
Gender Male, 50%
Confused 49.6%
Surprised 49.6%
Calm 49.7%
Happy 49.5%
Sad 50%
Fear 49.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 54-72
Gender Male, 50.2%
Angry 49.5%
Happy 49.6%
Sad 49.5%
Calm 50.3%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 99.5%
Horse 98.6%

Categories