Human Generated Data

Title

Untitled (attempts at shoeing a mule)

Date

1900s

People

Artist: American Steel & Wire Co.

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Robert H. Millett, 2.2002.3894

Machine Generated Data

Tags

Values are each service's confidence score for the tag, expressed as a percentage.

Amazon
created on 2019-11-10

Human 99.7
Person 99.7
Bull 99.5
Animal 99.5
Mammal 99.5
Person 99.5
Person 99.3
Person 98.7
Person 98.7
Person 97.6
Horse 95.7
Bullfighter 86.8
Rodeo 82.4
Bullfighting 72.1
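
The Amazon tags above have the shape of output from AWS Rekognition's label-detection endpoint. A minimal sketch of reproducing such tags with the boto3 SDK follows; the region, bucket, and object key are placeholder assumptions, not details from this record.

# Sketch: Rekognition-style labels via boto3. Bucket/key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=70,  # drop low-confidence labels, as in the list above
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score, already in percent.
    print(f"{label['Name']} {label['Confidence']:.1f}")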

Clarifai
created on 2019-11-10

people 99.3
cavalry 95.5
man 93.5
group 92.9
mammal 91.9
group together 91.8
adult 90.3
wear 87.3
woman 86.1
war 82.9
military 81.8
street 81.7
child 80.5
seated 79.6
many 79.1
monochrome 77.4
recreation 77.4
transportation system 76
outfit 74.5
sepia 73.3
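
The Clarifai concepts above could come from Clarifai's general prediction model. A minimal sketch, assuming Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL are all placeholders, not values from this record.

# Sketch: Clarifai v2 prediction via REST. Key, model ID, and URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # hypothetical general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1; scale to percent to match the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")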

Imagga
created on 2019-11-10

horse 70.7
cowboy 53.4
pen 33.7
enclosure 28.3
animal 27.2
farm 26.8
horses 24.4
vaulting horse 24.1
ranch 23.4
animals 20.4
bull 19.5
riding 19.5
field 19.3
rural 18.5
cattle 18.5
grass 18.2
brown 17.7
ride 17.5
mammal 17.4
sport 17.3
laborer 17.1
saddle 16.7
plow 16.4
equine 16.3
water buffalo 15.1
cow 15.1
outdoors 14.9
rider 14.8
equestrian 14.7
gymnastic apparatus 14.4
horseback 13.8
outdoor 13.8
herd 13.8
structure 13.3
stallion 12.7
farming 12.3
country 12.3
male 12.1
tool 12
jockey 11.8
mare 11.8
western 11.6
fence 11.6
pasture 11.5
livestock 11.5
wild 11.3
old world buffalo 11.3
speed 11
people 10.6
agriculture 10.5
man 10.1
competition 10.1
countryside 10.1
mane 9.8
sports equipment 9.7
sky 9.6
dirt 9.6
outside 9.4
natural 9.4
mountains 9.3
summer 9
cows 8.9
grazing 8.8
desert 8.4
harness 8.3
landscape 8.2
meadow 8.1
sunlight 8
black 7.8
travel 7.8
tail 7.7
walking 7.6
dry 7.4
action 7.4
hat 7.4
group 7.3
land 7.2
sunset 7.2
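
The Imagga tags above match the response format of Imagga's tagging endpoint. A minimal sketch, assuming the /v2/tags REST API with placeholder credentials and image URL:

# Sketch: Imagga /v2/tags via REST. Credentials and URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Each entry pairs a confidence (already in percent) with a tag name.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")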

Google
created on 2019-11-10

(no tags returned)

Microsoft
created on 2019-11-10

horse 96.9
animal 96
text 93.4
person 91.5
cattle 90.4
bull 86.9
outdoor 86.1
mammal 82
man 79.9
working animal 75.9
black 75.4
mule 72.6
old 71.5
clothing 68.3
vintage 48.6
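
The Microsoft tags above are the kind returned by Azure's Computer Vision image-tagging API. A minimal sketch, assuming the v3.2 REST endpoint; the endpoint host, subscription key, and image URL are placeholders, and since this record dates from 2019 the service version actually used may have differed.

# Sketch: Azure Computer Vision tagging via REST. Endpoint, key, and URL are placeholders.
import requests

ENDPOINT = "https://example-region.api.cognitive.microsoft.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Raw confidences are 0-1; the list above shows them as percentages.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")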

Face analysis

AWS Rekognition

Age 31-47
Gender Male, 54.9%
Disgusted 45%
Surprised 45.8%
Sad 45.1%
Angry 45.2%
Fear 45.4%
Happy 45%
Calm 53.4%
Confused 45%

AWS Rekognition

Age 39-57
Gender Male, 54.7%
Happy 45%
Confused 45.1%
Fear 45%
Angry 46.2%
Surprised 45%
Disgusted 46.3%
Calm 50.4%
Sad 47%

AWS Rekognition

Age 26-40
Gender Male, 50.5%
Sad 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Calm 50.5%
Fear 49.5%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 49-67
Gender Male, 50.4%
Fear 49.5%
Sad 50.4%
Confused 49.5%
Angry 49.5%
Disgusted 49.5%
Calm 49.6%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 34-50
Gender Male, 50.4%
Happy 49.5%
Calm 50%
Surprised 49.9%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 49.5%
Angry 49.5%

AWS Rekognition

Age 30-46
Gender Female, 51.2%
Surprised 45.5%
Happy 45.8%
Sad 45.3%
Disgusted 45.1%
Calm 52.7%
Angry 45.4%
Fear 45.2%
Confused 45.1%

AWS Rekognition

Age 41-59
Gender Male, 50.5%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Confused 49.5%
Calm 50.4%
Angry 49.5%
Sad 49.6%

Microsoft Cognitive Services

Age 35
Gender Male
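
The per-face age ranges, gender estimates, and emotion scores above are the standard attributes returned by Rekognition's DetectFaces API. A minimal sketch with boto3; the region and image location are placeholders:

# Sketch: face attributes via Rekognition DetectFaces. Image location is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # One confidence per emotion type, as in the blocks above.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")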

Feature analysis

Amazon

Person 99.7%
Horse 95.7%

Categories

Imagga

nature landscape 67%
pets animals 31.8%