Human Generated Data

Title

Untitled (men and two boys working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12313

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 96.8
Person 96.8
Wheel 94.3
Machine 94.3
Person 92.9
Person 92.6
Outdoors 88.6
Person 88.3
Nature 87.1
Person 85.2
Screen 83.9
Electronics 83.9
Transportation 70.4
Monitor 70
Display 70
Military 65.8
Military Uniform 65.8
Vehicle 63.3
Car 63.3
Automobile 63.3
Army 62
Armored 62
Plant 62
Land 56.8
LCD Screen 55.2
Person 49.8
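
The Amazon values above are confidence scores (0-100) of the kind returned by the AWS Rekognition label-detection API. A minimal sketch of how such tags can be generated with boto3; the file name and region are illustrative assumptions, not taken from this record:

import boto3

# Rekognition label detection returns labels with confidence percentages.
client = boto3.client('rekognition', region_name='us-east-1')

with open('photo.jpg', 'rb') as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={'Bytes': image_bytes},
    MaxLabels=30,        # cap on the number of labels returned
    MinConfidence=40.0,  # drop low-confidence labels
)
for label in response['Labels']:
    print(label['Name'], round(label['Confidence'], 1))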

Clarifai
created on 2019-11-16

people 99.4
winter 98.8
snow 98.2
group 95.4
vehicle 93.2
no person 93.2
many 92
adult 90.1
street 87.6
road 86.5
man 86.5
transportation system 84.2
television 84
war 83.3
frost 81.8
nature 79.3
landscape 79.3
cropland 76.8
cold 76.5
military 76.4
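
The Clarifai tags are likewise confidence scores from a general image-recognition model. A hedged sketch using the clarifai-grpc Python client; the API key, model id, and image URL are placeholders:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (('authorization', 'Key YOUR_API_KEY'),)  # placeholder key

request = service_pb2.PostModelOutputsRequest(
    model_id='general-image-recognition',  # assumed id of Clarifai's general model
    inputs=[resources_pb2.Input(
        data=resources_pb2.Data(
            image=resources_pb2.Image(url='https://example.org/photo.jpg')  # placeholder URL
        )
    )],
)
response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))  # value is 0-1; scaled to match the list above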

Imagga
created on 2019-11-16

track 58.7
landscape 31.2
sky 26.9
old 20.2
structure 20.1
snow 19.8
stone 18.8
fence 17.4
mountain 16
tree 15
mountains 14.8
texture 14.6
travel 14.1
rock 13.9
ant 13.7
grunge 13.6
wall 13.5
forest 13.1
ancient 12.1
outdoors 11.9
grass 11.9
barrier 11.8
architecture 11.7
vintage 11.6
antique 11.2
field 10.9
scenery 10.8
weather 10.8
park 10.7
outdoor 10.7
trees 10.7
building 10.6
tie 10.4
billboard 10.2
countryside 10
tourism 9.9
coast 9.9
insect 9.9
country 9.7
textured 9.6
black 9
retro 9
summer 9
vacation 9
rural 8.8
scenic 8.8
brace 8.7
outside 8.6
clouds 8.5
house 8.4
color 8.3
city 8.3
signboard 8.2
horizon 8.1
water 8
obstruction 7.9
day 7.8
sea 7.8
art 7.8
cold 7.7
grungy 7.6
stones 7.6
frame 7.5
device 7.4
rough 7.3
dirty 7.2
roof 7.2
tower 7.2
arthropod 7.1
paper 7.1
gravestone 7
season 7
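
The Imagga tags can be reproduced with a plain REST call to Imagga's v2 tags endpoint. A sketch using requests; the credentials and image URL are placeholders:

import requests

response = requests.get(
    'https://api.imagga.com/v2/tags',
    params={'image_url': 'https://example.org/photo.jpg'},  # placeholder URL
    auth=('api_key', 'api_secret'),  # placeholder Basic Auth credentials
)
for tag in response.json()['result']['tags']:
    print(tag['tag']['en'], round(tag['confidence'], 1))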

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

window 98.4
television 94.2
indoor 92
text 91.2
screen 83.3
white 68.9
black and white 63.6
screenshot 58.1
vehicle 55.4
old 48
flat 43.4
picture frame 42.2
image 35.2
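
The Microsoft tags match the output shape of the Azure Computer Vision tagging endpoint. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    'https://<resource>.cognitiveservices.azure.com/',  # placeholder endpoint
    CognitiveServicesCredentials('YOUR_KEY'),           # placeholder key
)
result = client.tag_image('https://example.org/photo.jpg')  # placeholder URL
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))  # confidence is 0-1; scaled to match the list above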

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 50.5%
Angry 49.5%
Calm 49.5%
Happy 49.5%
Sad 50.4%
Fear 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 31-47
Gender Male, 50.4%
Sad 49.6%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Angry 49.5%
Happy 50%
Calm 49.9%
Disgusted 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.5%
Angry 49.5%
Sad 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Fear 49.5%
Calm 50.4%

AWS Rekognition

Age 47-65
Gender Male, 50.4%
Happy 49.7%
Fear 49.5%
Angry 49.6%
Confused 49.7%
Calm 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 49.8%
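
The four face records above (age range, gender, and eight emotion confidences) follow the shape of AWS Rekognition's face-detection output. A minimal sketch with boto3; the file name and region are illustrative assumptions:

import boto3

client = boto3.client('rekognition', region_name='us-east-1')

with open('photo.jpg', 'rb') as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_faces(Image={'Bytes': image_bytes}, Attributes=['ALL'])
for face in response['FaceDetails']:
    age = face['AgeRange']   # e.g. {'Low': 23, 'High': 37}
    gender = face['Gender']  # e.g. {'Value': 'Male', 'Confidence': 50.5}
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face['Emotions']:  # the eight emotion confidences listed above
        print(emotion['Type'].title(), f"{emotion['Confidence']:.1f}%")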

Feature analysis

Amazon

Person 96.8%
Wheel 94.3%
Car 63.3%