Human Generated Data

Title

Untitled (men working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12315

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Person 99.2
Person 96.4
Soil 92.5
Outdoors 91.7
Nature 90.2
Screen 74.4
Electronics 74.4
Person 71.7
Display 65
Monitor 65
Plant 61.5
Road 61.4

Clarifai
created on 2019-11-16

winter 98.8
snow 97.3
people 92.6
landscape 89.9
no person 89.7
city 88.9
track 87.2
sky 87
industry 87
monochrome 86.2
outdoors 85.6
street 84.6
nature 84
black and white 83.6
travel 83.6
road 83.1
architecture 81.8
urban 81.6
desktop 80.3
empty 80.2

Imagga
created on 2019-11-16

landscape 44.7
sky 39.8
structure 30.4
rampart 25.2
field 25.1
rural 23.8
memorial 20.8
farm 20.5
stone 19.2
snow 19
grass 18.2
wall 18
building 17.7
mountain 16.9
country 16.7
agriculture 16.7
architecture 16.4
track 16.1
old 16
hay 15.8
travel 14.8
megalith 14.8
land 14.7
trees 14.2
tree 14
rock 13.9
weather 13.9
countryside 13.7
clouds 13.5
outdoors 13.4
summer 12.9
billboard 12.3
forest 12.2
scene 12.1
cloud 12.1
horizon 11.7
house 11.7
meadow 11.7
tourism 11.6
scenic 11.4
ancient 11.3
sunny 11.2
construction 11.1
scenery 10.8
fence 10.8
tower 10.8
england 10.5
historic 10.1
signboard 10
park 9.9
history 9.8
sun 9.7
farming 9.5
plant 9.5
industry 9.4
gravestone 9.4
castle 9.4
mountains 9.3
city 9.2
autumn 8.8
wheat 8.6
harvest 8.5
stack 8.3
environment 8.2
national 8.2
fall 8.2
road 8.1
bale 7.9
tractor 7.9
day 7.9
machinery 7.8
outside 7.7
roof 7.7
quiet 7.7
vegetation 7.7
outdoor 7.7
crop 7.5
thatch 7.4
natural 7.4
machine 7.3
brick 7.2
landmark 7.2
equipment 7.1
earth 7.1
season 7
barrier 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

window 90.9
text 89.5
vehicle 63
sky 60.9
black and white 56.2
land vehicle 50.2
old 45.2
flat 27.7
picture frame 24.9
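
The tag lists above come from independent services, so one way to read them is to look for labels that more than one service reports. A minimal sketch of that cross-referencing, using a hand-copied subset of the scores shown above (the structure and threshold are illustrative, not any service's API):

```python
from collections import defaultdict

# Hand-copied subsets of the machine-generated tags listed above.
amazon = {"person": 99.2, "soil": 92.5, "outdoors": 91.7, "nature": 90.2, "road": 61.4}
clarifai = {"snow": 97.3, "people": 92.6, "landscape": 89.9, "outdoors": 85.6,
            "nature": 84.0, "road": 83.1}
imagga = {"landscape": 44.7, "snow": 19.0, "outdoors": 13.4, "road": 8.1}

def consensus(*services, min_hits=2):
    """Return tags reported by at least `min_hits` services, with per-service scores."""
    scores = defaultdict(dict)
    for name, tags in services:
        for tag, score in tags.items():
            scores[tag][name] = score
    return {tag: by_service for tag, by_service in scores.items()
            if len(by_service) >= min_hits}

shared = consensus(("amazon", amazon), ("clarifai", clarifai), ("imagga", imagga))
print(sorted(shared))  # → ['landscape', 'nature', 'outdoors', 'road', 'snow']
```

Note that near-synonyms such as "person" and "people" are not merged by exact string matching, which is one reason cross-service agreement understates overlap.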

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 51.7%
Confused 45.4%
Surprised 45.4%
Happy 45.2%
Sad 46.5%
Disgusted 47%
Angry 46.6%
Calm 45.5%
Fear 48.5%
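
Rekognition reports a separate confidence for each emotion, so the values above need not sum to 100%. A small sketch that picks the dominant emotion from the scores shown (data hand-copied from this record, not a live API call):

```python
# Emotion confidences for the detected face, hand-copied from the
# AWS Rekognition scores above. Each value is independent.
emotions = {
    "Confused": 45.4, "Surprised": 45.4, "Happy": 45.2, "Sad": 46.5,
    "Disgusted": 47.0, "Angry": 46.6, "Calm": 45.5, "Fear": 48.5,
}

dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Fear 48.5
```

Here the scores are closely bunched (45.2-48.5), so the "dominant" emotion is a weak signal at best.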

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a flat screen television sits in front of a window 50.3%
a flat screen television 50.2%
a view of a flat screen tv 50.1%