Human Generated Data

Title

Untitled (man with hat photographed outside of chicken yard)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3694
Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.9
Human 98.9
Nature 95.9
Outdoors 93.2
Countryside 78.1
Building 77
Mammal 71.2
Pet 71.2
Canine 71.2
Dog 71.2
Animal 71.2
Rural 68.3
People 66.8
Hut 64.9
Vehicle 61.9
Transportation 61.9
Boat 61.9
Standing 56.4
Shack 56.4

Clarifai
created on 2019-06-01

people 99.4
man 96.7
adult 96.6
group together 95.3
group 94.3
two 94.2
monochrome 91.9
one 91.2
beach 90.4
winter 89.5
snow 86.1
home 84.1
water 82.4
vehicle 82.2
cold 80.2
street 79.9
woman 79.2
many 79.2
outdoors 78.5
several 76.6

Imagga
created on 2019-06-01

shopping cart 100
handcart 91.4
wheeled vehicle 73.8
container 42.6
snow 28.4
conveyance 25.5
winter 23
landscape 20.8
sky 20.5
cold 18.9
bench 16.9
tree 16.2
outdoor 16.1
old 15.3
weather 15.1
water 14.7
park bench 14
forest 13.9
outdoors 13.6
seat 13.5
ice 13.2
rural 13.2
scenery 12.6
frozen 12.4
river 11.6
man 11.4
travel 11.3
scene 11.3
park 10.7
snowy 10.7
trees 10.7
cool 10.7
fog 10.6
building 10.5
black 10.2
beach 9.6
frost 9.6
season 9.4
chair 9.3
field 9.2
danger 9.1
environment 9
people 8.9
person 8.8
frosty 8.8
country 8.8
scenic 8.8
architecture 8.6
serene 8.5
smoke 8.4
dark 8.4
wood 8.3
light 8
summer 7.7
grunge 7.7
structure 7.6
furniture 7.5
vintage 7.4
tourism 7.4
industrial 7.3
transportation 7.2
history 7.2
vehicle 7.1
male 7.1
day 7.1
sea 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

outdoor 97.2
old 82.8
black and white 82.5
person 79
vintage 30.7

Face analysis

Amazon

AWS Rekognition

Age 57-77
Gender Male, 51.6%
Sad 49.7%
Angry 45.8%
Disgusted 47.7%
Surprised 45.6%
Happy 45.4%
Calm 45.4%
Confused 45.5%

Feature analysis

Amazon

Person 98.9%
Dog 71.2%
Boat 61.9%

Captions

Microsoft

a vintage photo of a man 90.1%
a vintage photo of a man in a field 90%
a vintage photo of a man riding a horse 81%

Text analysis

Amazon

MAACOL