Human Generated Data

Title

Untitled (men working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12317

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Nature 99.8
Outdoors 99.5
Person 94.8
Human 94.8
Snow 90.1
Person 77.3
Person 73.5
Mammal 69.4
Horse 69.4
Animal 69.4
Ice 68.6
Countryside 64.5
Horse 64.4
Winter 63.5
Person 61.5
Sand 58.8
Person 57.3
Storm 56
Blizzard 56
Person 49.1
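
The Amazon scores above are label confidences in the 0–100 range, as returned by AWS Rekognition's `detect_labels`. A minimal sketch of filtering such output by a confidence threshold, using a few of the scores listed above (the response shape mirrors Rekognition's `{"Name": ..., "Confidence": ...}` entries; the 90-point cutoff is an arbitrary choice for illustration):

```python
# Sample labels taken from the Amazon tag list above; the dict shape
# follows Rekognition's detect_labels response ("Name"/"Confidence").
labels = [
    {"Name": "Nature", "Confidence": 99.8},
    {"Name": "Outdoors", "Confidence": 99.5},
    {"Name": "Person", "Confidence": 94.8},
    {"Name": "Snow", "Confidence": 90.1},
    {"Name": "Horse", "Confidence": 69.4},
]

def confident_labels(labels, threshold=90.0):
    """Keep only label names at or above the confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(confident_labels(labels))  # ['Nature', 'Outdoors', 'Person', 'Snow']
```

Repeated names (e.g. several `Person` entries) are separate detected instances, so deduplication may be wanted downstream.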

Clarifai
created on 2019-11-16

people 99.1
winter 98.9
snow 98.2
no person 93.7
man 92.7
road 92.4
adult 92.3
street 91.6
track 90.4
war 90.3
group 89.1
outdoors 87.3
transportation system 86.2
vehicle 85.7
city 85.7
ice 85.5
nature 84.4
many 84.1
landscape 82.7
architecture 82.4

Imagga
created on 2019-11-16

snow 52
billboard 48.2
landscape 44.7
signboard 39
structure 38.6
weather 36.1
sky 33.4
field 26
rural 22.9
country 18.5
forest 18.3
grass 17.4
horizon 17.1
farm 17
tree 17
cloud 15.5
travel 15.5
scenery 15.3
mountain 14.2
agriculture 14.1
scene 13.9
countryside 13.7
trees 13.4
land 12.9
outdoors 12.7
season 12.5
scenic 12.3
outdoor 12.2
rock 12.2
old 11.9
cold 11.2
industry 11.1
meadow 10.8
sunny 10.3
summer 10.3
winter 10.2
memorial 10
wall 10
frost 9.6
construction 9.4
natural 9.4
clouds 9.3
hay 9.3
wood 9.2
park 9.1
environment 9.1
road 9
black 9
sea 8.6
architecture 8.6
building 8.4
vintage 8.3
industrial 8.2
cemetery 8.1
autumn 7.9
outside 7.7
equipment 7.7
dirt 7.6
farming 7.6
stone 7.6
tractor 7.5
harvest 7.5
cloudy 7.5
city 7.5
tourism 7.4
mountains 7.4
ice 7.4
track 7.3
peaceful 7.3
sun 7.3
fall 7.2
open 7.2
coast 7.2
transportation 7.2
tower 7.2
day 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 93.5
television 93
window 85.8
indoor 85.4
screen 81.8
gallery 66.7
screenshot 66.5
person 60.7
sky 59.7
room 52.9
picture frame 51.2
flat 45.4
image 34.1

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Male, 54.9%
Calm 54.1%
Confused 45%
Happy 45.1%
Surprised 45.5%
Sad 45.1%
Fear 45%
Angry 45.1%
Disgusted 45%

AWS Rekognition

Age 21-33
Gender Female, 50.4%
Disgusted 49.5%
Calm 49.5%
Confused 49.5%
Sad 49.5%
Happy 49.5%
Angry 50.4%
Fear 49.5%
Surprised 49.5%
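
Rekognition reports a confidence per emotion rather than a single prediction, which is why the percentages above cluster closely. A minimal sketch of picking the dominant emotion from a FaceDetail-style list, using the scores from the first face above:

```python
# Emotion scores from the first AWS Rekognition face analysis above,
# in the shape of a FaceDetail "Emotions" list ("Type"/"Confidence").
emotions = [
    {"Type": "CALM", "Confidence": 54.1},
    {"Type": "CONFUSED", "Confidence": 45.0},
    {"Type": "HAPPY", "Confidence": 45.1},
    {"Type": "SURPRISED", "Confidence": 45.5},
    {"Type": "SAD", "Confidence": 45.1},
    {"Type": "FEAR", "Confidence": 45.0},
    {"Type": "ANGRY", "Confidence": 45.1},
    {"Type": "DISGUSTED", "Confidence": 45.0},
]

def dominant_emotion(emotions):
    """Return the emotion type with the highest confidence."""
    return max(emotions, key=lambda e: e["Confidence"])["Type"]

print(dominant_emotion(emotions))  # CALM
```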

Feature analysis

Amazon

Person 94.8%
Horse 69.4%

Captions

Microsoft

a flat screen television 63.5%
a black and white photo of a flat screen television 49%
a black and white photo of a flat screen tv 48.9%
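
Caption services return several candidates ranked by confidence; the usual practice is to keep the top-scoring one. A minimal sketch using the Microsoft captions and scores listed above (scores given as percentages):

```python
# Candidate captions and confidence scores from the Microsoft
# caption output above, as (text, score) pairs.
captions = [
    ("a flat screen television", 63.5),
    ("a black and white photo of a flat screen television", 49.0),
    ("a black and white photo of a flat screen tv", 48.9),
]

# Select the highest-confidence caption.
best_text, best_score = max(captions, key=lambda c: c[1])
print(best_text)  # a flat screen television
```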