Human Generated Data

Title

Untitled (man sitting on tractor-drawn potato digging machine)

Date

c. 1930, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5751

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.7
Human 99.7
Machine 96.8
Wheel 96.8
Person 96.1
Wheel 91.2
Outdoors 90.3
Nature 88.5
Vehicle 88.1
Transportation 88.1
Wheel 83.8
Wagon 78.7
Car 78.6
Automobile 78.6
Wheel 78.1
Countryside 65.5
Tire 58.4
Carriage 57.9
Horse Cart 55.6
Spoke 55.1
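The Amazon list above pairs each tag with a confidence score on a 0–100 scale. A minimal sketch, assuming the labels are held as `(name, confidence)` tuples (the `AMAZON_LABELS` subset below is copied from the list above for illustration), of filtering Rekognition-style output down to high-confidence labels:

```python
# Subset of the Amazon label data above, as (label, confidence) pairs.
AMAZON_LABELS = [
    ("Person", 99.7), ("Human", 99.7), ("Machine", 96.8), ("Wheel", 96.8),
    ("Vehicle", 88.1), ("Wagon", 78.7), ("Car", 78.6), ("Countryside", 65.5),
    ("Tire", 58.4), ("Horse Cart", 55.6),
]

def confident_labels(labels, threshold=90.0):
    """Return label names whose confidence meets the threshold, highest first."""
    keep = [(name, conf) for name, conf in labels if conf >= threshold]
    return [name for name, _ in sorted(keep, key=lambda pair: -pair[1])]

print(confident_labels(AMAZON_LABELS))  # -> ['Person', 'Human', 'Machine', 'Wheel']
```

A threshold around 90 keeps only the labels the service is most certain of (person, machine, wheel), dropping speculative ones like "Horse Cart" (55.6).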

Clarifai
created on 2019-11-16

people 99.8
vehicle 98.1
adult 96.8
transportation system 95.6
man 95.6
cavalry 94.3
two 93
one 92.8
war 91.6
driver 91.6
group together 89.2
street 89.2
seated 87.8
group 86.6
monochrome 86.2
wagon 85.3
soldier 83.4
military 82.5
road 82.2
three 80.3

Imagga
created on 2019-11-16

bench 52.9
park bench 49.6
plow 47.6
barrow 36.7
tool 36.3
snow 35.6
seat 31.4
handcart 30.2
winter 28.9
wheeled vehicle 27.8
vehicle 27.1
landscape 26
cold 24.1
tree 23.1
park 20.6
conveyance 20.4
furniture 19.2
forest 17.4
season 16.4
trees 16
old 15.3
track 14.7
snowy 14.6
outdoor 14.5
outdoors 14.2
scene 13.8
sky 12.1
day 11.8
people 11.7
frost 11.5
frozen 11.5
rural 11.5
travel 11.3
ice 11.1
sled 11
wood 10.8
man 10.8
seasonal 10.5
weather 10.2
furnishing 9.9
road 9.9
male 9.9
stretcher 9.8
frosty 9.8
country 9.7
black 9.6
vintage 9.1
snowing 8.9
freeze 8.7
walking 8.5
path 8.5
dark 8.3
sport 8.2
countryside 8.2
branch 8.2
scenery 8.1
water 8
night 8
scenic 7.9
litter 7.8
fog 7.7
summer 7.7
lonely 7.7
walk 7.6
field 7.5
silhouette 7.4
light 7.4
sunset 7.2
recreation 7.2
cool 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

ground 99.4
outdoor 97.5
black and white 90.6
text 89.3
monochrome 80.6
tree 69.8
vehicle 65.2
wheel 60.3
land vehicle 51.8
old 42.8

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Male, 51.1%
Surprised 45%
Calm 45%
Happy 45%
Angry 45%
Fear 55%
Sad 45%
Confused 45%
Disgusted 45%
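Rekognition reports a confidence for each of the eight emotion categories rather than a single verdict; here Fear (55%) narrowly leads the otherwise uniform 45% scores. A minimal sketch, assuming the scores are stored in a dict (the values below are copied from the face analysis above), of picking the dominant emotion:

```python
# Emotion confidences from the AWS Rekognition face analysis above (percent).
EMOTIONS = {
    "Surprised": 45.0, "Calm": 45.0, "Happy": 45.0, "Angry": 45.0,
    "Fear": 55.0, "Sad": 45.0, "Confused": 45.0, "Disgusted": 45.0,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(EMOTIONS))  # -> ('Fear', 55.0)
```

Note the near-flat distribution: with seven categories tied at 45%, the "Fear" reading is only weakly preferred, which is worth keeping in mind when interpreting machine-generated emotion labels on archival photographs.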

Feature analysis

Amazon

Person 99.7%
Wheel 96.8%
Car 78.6%

Captions

Microsoft

a person riding a horse in front of a window 58%
a person riding a horse in front of a building 57.9%
a person riding a horse next to a window 54.7%