Human Generated Data

Title

Untitled (three riders on horseback, Gap of Dunloe, Ireland)

Date

c. 1905-1915

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3947

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Horse 99.4
Animal 99.4
Mammal 99.4
Horse 98.9
Person 98.4
Human 98.4
Person 96.7
Person 94.1
Horse 88
Nature 73.5
Outdoors 67.8
Hat 60.3
Apparel 60.3
Clothing 60.3
Equestrian 56.4
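
The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's detect_labels call. A minimal sketch in Python, assuming boto3, configured AWS credentials, and a local copy of the scan (the filename "gap_of_dunloe.jpg" is hypothetical, not part of the record):

import boto3

# Sketch: reproduce label tags like the Amazon list above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gap_of_dunloe.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Prints pairs such as "Horse 99.4", matching the tag format above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')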

Clarifai
created on 2019-06-01

people 99.9
group 98
many 97.4
adult 97.3
wear 96.7
outfit 94.8
cavalry 94.5
group together 93.8
dancing 91.5
man 91.3
several 90
veil 87.3
child 84.1
leader 83.7
woman 83.5
dancer 83.1
ballet dancer 82.3
administration 82.1
three 80.4
motion 78.9
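
The Clarifai concepts above are the shape of output from Clarifai's general image-recognition model. A hedged sketch against its v2 predict endpoint using the requests library; the API key, model ID, and image URL are placeholders, and the exact auth scheme varies by account:

import requests

# Sketch of a Clarifai v2 predict request; key, model ID, and image URL
# are placeholders (not from this record).
CLARIFAI_KEY = "YOUR_API_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/scan.jpg"}}}]}

resp = requests.post(url, json=payload,
                     headers={"Authorization": f"Key {CLARIFAI_KEY}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores are 0-1, so scale to match pairs such as "people 99.9".
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')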

Imagga
created on 2019-06-01

borzoi 27.4
wolfhound 22.6
hound 19.2
horse 15.6
cattle 15.2
people 15
dog 14.5
cow 14.4
art 12.8
dress 12.6
hunting dog 12.5
farm 12.5
summer 12.2
adult 12.2
animals 12
black 12
white 11.6
statue 11.5
outdoors 11.2
person 11
sea 10.9
outdoor 10.7
water 10.7
travel 10.6
bovine 10.5
group 10.5
dancer 10.4
sky 10.2
ranch 9.9
sand 9.9
herd 9.8
sun 9.7
snow 9.6
grass 9.5
beach 9.4
male 9.3
field 9.2
man 8.8
couple 8.7
performer 8.6
fountain 8.5
head 8.4
color 8.3
traditional 8.3
negative 8.2
rural 7.9
wild 7.8
happiness 7.8
standing 7.8
horses 7.8
scene 7.8
men 7.7
culture 7.7
ice 7.6
canine 7.2
meadow 7.2
family 7.1
bride 7.1
face 7.1
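
The Imagga tags above come from an auto-tagging service. A hedged sketch against Imagga's v2 tags endpoint, assuming an API key/secret pair and a publicly reachable image URL (all placeholders):

import requests

# Sketch of an Imagga v2 tagging request; key, secret, and image URL
# are placeholders.
IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/scan.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Prints pairs such as "borzoi 27.4".
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')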

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 99.5
horse 98.1
old 96.7
standing 96.1
person 92.9
window 92.8
text 91.3
outdoor 89.1
black 83.4
white 72.1
animal 69.3
group 68.9
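
The Microsoft tags above match the shape returned by the Azure Computer Vision tag operation. A hedged sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

import requests

# Sketch of an Azure Computer Vision tag request; endpoint, key, and
# image URL are placeholders.
AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
AZURE_KEY = "YOUR_KEY"

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/scan.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Azure reports confidence on a 0-1 scale; scale it to match the list above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')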

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 54.1%
Angry 45.6%
Happy 47.2%
Confused 45.8%
Calm 47.6%
Disgusted 45.8%
Sad 47.5%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Female, 53.1%
Disgusted 45.4%
Sad 46.8%
Happy 46.2%
Surprised 45.3%
Angry 45.5%
Calm 50.4%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 53.9%
Disgusted 45.1%
Sad 45.3%
Surprised 45.2%
Happy 45.2%
Angry 45.2%
Calm 53.8%
Confused 45.2%
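
The three face readings above (age range, gender, and per-emotion scores) are the fields exposed by AWS Rekognition's detect_faces call when all attributes are requested. A minimal sketch, again assuming boto3, configured credentials, and a local copy of the scan under an assumed filename:

import boto3

# Sketch: request full face attributes for each detected face.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gap_of_dunloe.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # One line per emotion, e.g. "Happy 47.2%".
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')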

Feature analysis

Amazon

Horse 99.4%
Person 98.4%

Categories

Imagga

paintings art 96.5%
text visuals 3.1%

Text analysis

Amazon

OF
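
The single detected token above ("OF") is the sort of result AWS Rekognition's detect_text call returns for text visible in an image. A minimal sketch under the same assumptions as the earlier Rekognition calls:

import boto3

# Sketch: extract any text Rekognition can read in the scan.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gap_of_dunloe.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        # Prints each detected word, e.g. "OF".
        print(detection["DetectedText"])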