Human Generated Data

Title

Untitled (men working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12316

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.1
Person 99.1
Outdoors 98.1
Person 97.2
Nature 97.2
Person 95.8
Soil 93.4
Truck 90.8
Transportation 90.8
Vehicle 90.8
Person 78.6
Person 73.5
Person 72.9
Countryside 72.2
Sand 69.6
Person 65.6
Field 61.9
Crowd 57.9
Electronics 57.9
Screen 57.9
Military 56.4
Agriculture 56.2
Person 48.7
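
The Amazon tags above pair a label name with a 0-100 confidence score from AWS Rekognition label detection. A minimal sketch of how such tags can be generated with the boto3 DetectLabels call follows; the file name, region, and MinConfidence threshold are illustrative assumptions, not details taken from this record.

import boto3

# A minimal sketch, not the museums' actual pipeline: the file name,
# region, and MinConfidence value below are illustrative assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("potato_field.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # assumed threshold; the record lists labels down to about 48.7
)

# Each label pairs a name with a 0-100 confidence, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")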

Clarifai
created on 2019-11-16

winter 98.6
snow 97.9
people 96.7
no person 94.5
architecture 94.2
city 93.5
street 93.2
outdoors 90.9
landscape 90.4
sky 89.6
war 89.5
road 87.9
nature 87.8
building 87.8
travel 87.1
urban 86.3
old 85.2
monochrome 84.9
industry 84.3
house 83.9
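
The Clarifai tags follow the same name-plus-confidence pattern. A minimal sketch against Clarifai's v2 REST API as it worked around 2019 follows; the API key and image URL are placeholders, and the model ID is assumed to be Clarifai's public general model.

import requests

# A minimal sketch; the API key and image URL are placeholders, and the
# model ID below is assumed to be the public "general" model of the time.
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

# Clarifai reports concept values in 0-1; the record above scales them to 0-100.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")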

Imagga
created on 2019-11-16

landscape 40.9
sky 35.2
structure 33.6
field 27.6
billboard 24
rural 22.9
memorial 21
snow 20.7
grass 19.8
farm 19.6
signboard 19.4
country 19.3
stone 19.1
tree 17
agriculture 15.8
gravestone 15.8
wall 15.8
land 15.7
countryside 15.5
cloud 15.5
travel 15.5
old 15.3
scenery 14.4
horizon 14.4
hay 14.3
mountain 14.2
outdoors 14.2
weather 14
forest 13.9
meadow 13.5
plow 12.9
summer 12.2
rock 12.2
architecture 11.7
season 11.7
machine 11.6
backhoe 11
trees 10.7
scene 10.4
construction 10.3
industry 10.3
tool 10.2
cemetery 10.1
natural 10
outdoor 9.9
park 9.9
environment 9.9
building 9.9
tractor 9.9
megalith 9.7
scenic 9.7
dirt 9.6
wheat 9.5
ancient 9.5
fence 9.5
day 9.4
desert 9.3
clouds 9.3
tourism 9.1
road 9
sun 8.9
textured 8.8
frost 8.6
sunny 8.6
device 8.5
winter 8.5
crop 8.5
horizontal 8.4
autumn 7.9
bale 7.9
black 7.8
cold 7.8
great 7.7
texture 7.6
plant 7.6
england 7.6
farming 7.6
earth 7.6
harvest 7.5
track 7.5
wood 7.5
vintage 7.4
mountains 7.4
stack 7.4
historic 7.3
power shovel 7.2
history 7.2
barrier 7.1
sand 7.1
equipment 7.1
brick 7.1
rampart 7
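
The Imagga tags can be produced with Imagga's v2 tagging endpoint. A minimal sketch follows; the API key, secret, and image URL are placeholder assumptions.

import requests

# A minimal sketch against Imagga's v2 tagging endpoint; key, secret,
# and image URL are placeholder assumptions.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
resp.raise_for_status()

# Imagga already reports confidences on a 0-100 scale, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")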

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 91.7
television 89.6
window 81.1
screen 72.4
vehicle 69.6
sky 62.7
black and white 62.3
land vehicle 54.8
room 54.2
picture frame 38
image 32.6
flat 31.1
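
The Microsoft tags match the shape of Azure Computer Vision analyze output, with 0-1 confidences scaled to 0-100. A minimal sketch follows; the region, subscription key, and image URL are placeholder assumptions.

import requests

# A minimal sketch against the Azure Computer Vision v2.0 analyze endpoint;
# region, subscription key, and image URL are placeholder assumptions.
resp = requests.post(
    "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

# Azure reports confidences in 0-1; the record above scales them to 0-100.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")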

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Male, 54.5%
Disgusted 45%
Happy 48.7%
Angry 47.1%
Calm 45.4%
Surprised 45.1%
Confused 45.1%
Sad 48.2%
Fear 45.4%

AWS Rekognition

Age 22-34
Gender Female, 50.2%
Angry 50.3%
Disgusted 49.5%
Fear 49.5%
Sad 49.6%
Happy 49.6%
Calm 49.5%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 23-37
Gender Female, 50.1%
Angry 49.5%
Sad 49.9%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Fear 49.6%
Calm 49.9%
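
The three face records above match the shape of AWS Rekognition DetectFaces output: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch follows; the file name and region are illustrative assumptions.

import boto3

# A minimal sketch of a DetectFaces call that yields age range, gender, and
# per-emotion confidences like those above; file name and region are assumed.
client = boto3.client("rekognition", region_name="us-east-1")

with open("potato_field.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required for age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")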

Feature analysis

Amazon

Person 99.1%
Truck 90.8%
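
Feature entries like Person and Truck correspond to DetectLabels results that carry bounding-box instances, i.e. labels localized to specific regions of the image. A minimal sketch follows, with the file name and region again assumed for illustration.

import boto3

# A minimal sketch: keep only labels with bounding-box instances, the
# pattern the feature entries above follow. File name and region assumed.
client = boto3.client("rekognition", region_name="us-east-1")

with open("potato_field.jpg", "rb") as f:  # hypothetical local copy of the image
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    if label["Instances"]:  # only labels tied to specific image regions
        print(f"{label['Name']} {label['Confidence']:.1f}%")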

Captions

Microsoft
created on 2019-11-16

a flat screen television 60.5%
a flat screen television on the wall 56.3%
a flat screen tv 56.2%
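
The three ranked captions match the output shape of the Azure Computer Vision describe endpoint. A minimal sketch follows; the region, subscription key, image URL, and candidate count are placeholder assumptions.

import requests

# A minimal sketch of the Azure Computer Vision v2.0 describe endpoint,
# which returns ranked captions like the three above; region, key,
# image URL, and candidate count are placeholder assumptions.
resp = requests.post(
    "https://westus.api.cognitive.microsoft.com/vision/v2.0/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")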