Human Generated Data

Title

Untitled (attempts at shoeing a mule)

Date

1900s

People

Artist: American Steel & Wire Co.

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Robert H. Millett, 2.2002.3895

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-11-10

Mammal 99.4
Animal 99.4
Bull 99.4
Person 97.6
Human 97.6
Horse 96.8
Person 96.6
Person 95.9
Person 94.2
Person 93.5
Person 93.3
Cattle 92.6
Ox 92.6
Person 91.7
Person 89
Person 76.1
Person 62.5
Horse 60.8
Person 56.7
Person 51.3
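
The labels above are the kind of output AWS Rekognition's label-detection endpoint returns for an image. A minimal sketch of how such a tag list could be produced with boto3, assuming placeholder values for the region and the local image file (the museum's actual pipeline is not documented here):

import boto3

# Placeholder region and file name, chosen for illustration only.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Each label carries a name and a 0-100 confidence score, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")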

Clarifai
created on 2019-11-10

people 99.7
group 97.3
cavalry 97.3
mammal 94
man 93.5
many 92.6
adult 92.5
transportation system 92.5
group together 92.2
retro 90.4
military 88.9
vintage 87.8
seated 87.1
war 85.6
vehicle 84.7
art 79.3
cattle 78.3
several 77.9
wear 77.3
old 73.8
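
Clarifai's general model returns concepts scored 0-1, which this page appears to list scaled to percentages. A minimal sketch against Clarifai's v2 predict REST API, assuming a placeholder API key, a hypothetical image URL, and the general model's id:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed id of Clarifai's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Concepts come back with a name and a 0-1 value; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")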

Imagga
created on 2019-11-10

horse 59.1
carriage 55.8
harness 47.7
horse cart 46.5
cart 40.3
animal 36.3
horses 33.1
wagon 28.8
support 28.8
device 22.2
mammal 20.9
cowboy 20
ride 19.4
wheeled vehicle 18.1
animals 17.6
farm 16.9
wild 16.5
outdoors 16.4
stallion 15.7
riding 15.6
jockey 14.8
grass 14.2
landscape 14.1
brown 14
herd 12.8
transportation 12.6
equine 12.2
travel 12
speed 11.9
mare 11.8
equestrian 11.8
field 11.7
saddle 11.7
camel 11.7
sky 11.5
outdoor 11.5
rural 11.5
action 11.1
ranch 10.9
tourism 10.7
running 10.6
country 10.5
desert 10.4
summer 10.3
sport 9.9
racing 9.8
cow 9.7
west 9.7
pasture 9.6
race 9.6
dirt 9.5
vehicle 9.5
competition 9.2
sand 8.9
group 8.9
rider 8.8
mane 8.8
western 8.7
plow 8.7
tail 8.6
male 8.5
people 8.4
old 8.4
mountains 8.3
silhouette 8.3
tourist 8.2
sunset 8.1
horseback 7.9
cattle 7.9
architecture 7.8
head 7.6
man 7.4
vacation 7.4
transport 7.3
countryside 7.3
team 7.2
day 7.1
spring 7.1
scenic 7
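
Imagga's tagging endpoint returns a similar tag/confidence list, with confidences already on a 0-100 scale. A minimal sketch, assuming placeholder API credentials and a hypothetical image URL:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # placeholder credentials
)

# Each entry has the form {"confidence": ..., "tag": {"en": ...}}.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")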

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

horse 99.1
cow 97.1
animal 95
text 92.3
outdoor 89.7
person 74.9
mule 61
mammal 59.2
people 58.5
rein 54.2
working animal 53.8
bull 51.3
halter 51
old 48
cattle 43.4
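
The Microsoft tags come from Azure Computer Vision's analyze endpoint. A minimal sketch, assuming a placeholder endpoint and key, a hypothetical image URL, and the v2.0 API version that was current when these tags were created in 2019:

import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder endpoint
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder credential

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)

# Tag confidences are 0-1; scale to percentages to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")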

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 41-59
Gender Male, 54.7%
Sad 45.4%
Angry 45.3%
Disgusted 45%
Surprised 45%
Fear 45%
Calm 54.2%
Confused 45%
Happy 45%

AWS Rekognition

Age 24-38
Gender Male, 54.9%
Surprised 45.4%
Angry 49.9%
Happy 45.1%
Disgusted 45.1%
Calm 48.7%
Fear 45.3%
Confused 45.1%
Sad 45.2%

AWS Rekognition

Age 36-54
Gender Male, 55%
Surprised 45%
Happy 45%
Disgusted 45%
Confused 45%
Fear 45%
Angry 46.1%
Calm 53.5%
Sad 45.3%

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%
Calm 50.1%
Happy 49.9%
Surprised 49.5%
Angry 49.5%
Sad 49.5%

AWS Rekognition

Age 26-42
Gender Male, 50.4%
Angry 49.6%
Happy 49.5%
Sad 49.5%
Calm 50%
Fear 49.5%
Confused 49.6%
Disgusted 49.5%
Surprised 49.8%

AWS Rekognition

Age 38-56
Gender Female, 50%
Surprised 49.6%
Sad 49.5%
Happy 49.6%
Calm 50.1%
Disgusted 49.6%
Fear 49.5%
Confused 49.6%
Angry 49.6%

AWS Rekognition

Age 23-37
Gender Male, 53.3%
Surprised 45.1%
Calm 47.2%
Angry 51.4%
Fear 45.1%
Happy 45%
Disgusted 45.2%
Sad 45.8%
Confused 45.2%

AWS Rekognition

Age 24-38
Gender Male, 50.3%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Sad 49.9%
Fear 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.9%

AWS Rekognition

Age 35-51
Gender Male, 50.4%
Happy 49.5%
Angry 49.9%
Calm 49.9%
Surprised 49.5%
Fear 49.6%
Disgusted 49.5%
Confused 49.5%
Sad 49.6%
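
Each block above is one face found by AWS Rekognition's face detection, which reports an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. A minimal sketch of reproducing that output with boto3, again assuming a placeholder region and local image file:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region
with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # types like CALM or SAD, confidences 0-100
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")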

Feature analysis

Amazon

Person 97.6%
Horse 96.8%

Categories

Imagga

pets animals 56.9%
nature landscape 42.9%