Human Generated Data

Title

Untitled (outdoor portrait of family with dalmatian in front of debris-strewn yard)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3770

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Outdoors 100
Storm 100
Nature 100
Snow 100
Winter 100
Blizzard 100
Human 99.7
Person 99.7
Person 98.9
Person 98.5
Animal 98.3
Canine 98.3
Pet 98.3
Mammal 98.3
Dog 98.3
Person 94.4
Train 62.9
Vehicle 62.9
Transportation 62.9
Person 62.4
Ice 61.3
Clothing 60.5
Apparel 60.5
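
Name/confidence pairs like the Amazon list above are typically produced by the AWS Rekognition detect_labels API and then flattened for display. A minimal sketch of that flattening step, using an abbreviated sample response modeled on the tags shown (the file handling and 60% threshold are illustrative assumptions, not part of this record):

```python
# Illustrative sketch: flattening an AWS Rekognition detect_labels
# response into the name/confidence pairs listed above. A real call
# would be boto3.client("rekognition").detect_labels(
#     Image={"Bytes": image_bytes}); the sample below is abbreviated.
sample_response = {
    "Labels": [
        {"Name": "Outdoors", "Confidence": 100.0},
        {"Name": "Dog", "Confidence": 98.3},
        {"Name": "Train", "Confidence": 62.9},
        {"Name": "Clothing", "Confidence": 60.5},
    ]
}

def flatten_labels(response, min_confidence=60.0):
    """Return (name, confidence) pairs at or above the threshold,
    sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda pair: -pair[1])

for name, confidence in flatten_labels(sample_response):
    print(name, confidence)
```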

Clarifai
created on 2019-06-01

people 99.9
group 99
adult 97.6
group together 97
man 97
vehicle 93.5
many 92.9
street 92
several 91.8
administration 90.2
woman 89.2
home 89.1
war 88.5
military 88.1
two 87.5
child 87.4
four 83.5
three 83
print 82.8
wear 79.8

Imagga
created on 2019-06-01

dog 84.5
dalmatian 72.5
canine 51.9
domestic animal 49.4
snow 27.3
winter 24.7
hunting dog 24.4
setter 23.5
cold 19.8
outdoor 17.6
landscape 17.1
sporting dog 17
hound 14.9
man 13.4
outdoors 11.9
night 11.5
tree 11.5
season 10.9
black 10.8
park 10.7
trees 10.7
people 10.6
sport 10
weather 9.9
forest 9.6
men 9.4
animals 9.3
danger 9.1
vacation 9
sky 8.9
farm 8.9
snowy 8.7
water 8.7
outside 8.6
horse 8.5
hill 8.4
old 8.4
dark 8.3
ice 8.3
pet 8.3
mountain 8
sand 7.9
frozen 7.6
beach 7.6
walking 7.6
person 7.5
fun 7.5
evening 7.5
child 7.2
holiday 7.2
adult 7.1
rural 7
travel 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

snow 96.1
dog 95.4
carnivore 94.8
outdoor 93.4
animal 86.6
black and white 65.9
old 42

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Male, 52.1%
Happy 45.4%
Surprised 45.5%
Angry 45.8%
Confused 45.2%
Calm 47.9%
Sad 49.8%
Disgusted 45.4%

AWS Rekognition

Age 20-38
Gender Female, 52%
Angry 45.7%
Calm 48.7%
Sad 46.7%
Surprised 45.9%
Disgusted 45.7%
Happy 46.9%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 51.8%
Sad 45.2%
Calm 45%
Surprised 45.1%
Angry 45%
Disgusted 45%
Happy 54.5%
Confused 45%
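
Each face block above corresponds to one FaceDetail entry returned by the AWS Rekognition detect_faces API when called with Attributes=["ALL"]. A hedged sketch of reducing such an entry to the fields listed (the sample detail is illustrative, modeled on the first face above with only three emotions retained):

```python
# Illustrative: summarizing one FaceDetail entry from an AWS Rekognition
# detect_faces response into an age range, gender guess, and dominant
# emotion. A real call would be boto3.client("rekognition").detect_faces(
#     Image={"Bytes": image_bytes}, Attributes=["ALL"]).
sample_face = {
    "AgeRange": {"Low": 23, "High": 38},
    "Gender": {"Value": "Male", "Confidence": 52.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 49.8},
        {"Type": "CALM", "Confidence": 47.9},
        {"Type": "ANGRY", "Confidence": 45.8},
    ],
}

def summarize_face(detail):
    """Return the age range, gender guess, and highest-confidence
    emotion for a single FaceDetail entry."""
    top_emotion = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        "gender": detail["Gender"]["Value"],
        "emotion": top_emotion["Type"].title(),
    }

print(summarize_face(sample_face))
```

Note that the per-emotion confidences are independent scores, not a probability distribution, which is why a face can read as both "Sad 49.8%" and "Calm 47.9%" at once.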

Feature analysis

Amazon

Person 99.7%
Dog 98.3%
Train 62.9%

Captions

Microsoft

a man standing next to a horse 80.9%
a man riding a horse 57.4%
a man and a woman standing next to a horse 57.3%