Human Generated Data

Title

Untitled (men and two boys working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11184

Machine Generated Data

Tags (each label is followed by a confidence score out of 100)

Amazon
created on 2019-11-16

Human 98.2
Person 98.2
Person 91.2
Person 90.1
Person 88.9
Person 88.9
Outdoors 87.5
Nature 81.4
Machine 79.1
Wheel 79.1
Screen 72.6
Electronics 72.6
Military 71.4
Military Uniform 71.1
Army 64.7
Armored 64.7
Soil 59.4
Display 57.8
Monitor 57.8
Canvas 56.9
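
These label/confidence pairs are the kind of output returned by Amazon Rekognition's DetectLabels API. As a rough illustration, a minimal boto3 sketch that produces tags in this form (the file name and thresholds are hypothetical, not taken from this record):

    import boto3

    # Minimal sketch: reproduce Rekognition-style label tags.
    # Assumes AWS credentials are already configured; "photo.jpg"
    # stands in for a local copy of the image.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Each label carries a name and a confidence score out of 100,
    # matching the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")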

Clarifai
created on 2019-11-16

people 99.5
winter 98.9
snow 98.4
group 95.7
vehicle 95
many 94.3
adult 93.5
man 92.9
war 90.1
television 89.7
no person 87.8
transportation system 87.2
street 86.1
desktop 85.7
military 84.3
soldier 83.9
collage 82.5
road 80.8
picture frame 78.7
group together 77.7
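
Clarifai's general model returns concepts scored on a 0-1 scale; the values above look like those scores converted to percentages. A sketch using the Python REST client that was current around 2019, when these tags were created (it has since been superseded by a gRPC client; the API key and image URL are placeholders):

    from clarifai.rest import ClarifaiApp

    # Sketch against the 2019-era Clarifai REST client.
    # The key and URL below are hypothetical.
    app = ClarifaiApp(api_key="YOUR_API_KEY")
    model = app.public_models.general_model
    response = model.predict_by_url("https://example.org/photo.jpg")

    # Each concept's "value" is 0-1; scale it to match the list above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))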

Imagga
created on 2019-11-16

track 34.1
landscape 28.2
sky 24.3
old 22.3
structure 20.7
padlock 18.6
snow 17.8
tree 16.2
fence 15.7
rural 15
forest 14.8
stone 14.7
texture 14.6
lock 14.5
billboard 14.3
grunge 13.6
fastener 13.1
travel 12.7
trees 12.4
vintage 12.4
barrier 12.3
antique 12.1
mountains 12
light 12
field 11.7
signboard 11.6
screen 11.3
outdoors 11.2
grass 11.1
building 10.8
retro 10.6
country 10.5
street 10.1
countryside 10
rough 10
frame 10
road 9.9
mountain 9.8
textured 9.6
wall 9.5
weathered 9.5
grungy 9.5
device 9.5
architecture 9.4
house 9.2
park 9.1
black 9
summer 9
material 8.9
scenic 8.8
ancient 8.6
empty 8.6
damaged 8.6
season 8.6
space 8.5
winter 8.5
color 8.3
cemetery 8.3
dirty 8.1
scenery 8.1
restraint 8.1
holiday 7.9
rock 7.8
art 7.8
cloud 7.7
sunny 7.7
weather 7.7
dirt 7.6
clouds 7.6
city 7.5
peaceful 7.3
obstruction 7.3
aged 7.2
border 7.2
home 7.2
farm 7.1
autumn 7
agriculture 7
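
Imagga's v2 tagging endpoint reports confidence already on a 0-100 scale, matching the scores above. A minimal sketch with the requests library (the credentials and image URL are placeholders):

    import requests

    # Sketch: tag an image via Imagga's v2 REST API using HTTP
    # basic auth. Key, secret, and URL are hypothetical.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("api_key", "api_secret"),
    )

    # Each entry has an English tag name and a 0-100 confidence.
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], item["confidence"])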

Google
created on 2019-11-16

(no tags recorded)

Microsoft
created on 2019-11-16

text 98.4
monitor 94.6
indoor 93.7
television 91.2
window 91
electronics 81.1
screen 78.9
vehicle 76.4
white 75.3
picture frame 63.7
land vehicle 62.1
display 62.1
screenshot 52.3
flat 40
computer 31
entertainment center 17.3
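
These tags are consistent with output from Azure's Computer Vision tagging endpoint, whose SDK reports confidence on a 0-1 scale. A sketch using the azure-cognitiveservices-vision-computervision package (the endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch: tag an image with Azure Computer Vision.
    # Endpoint and key below are hypothetical.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )
    result = client.tag_image("https://example.org/photo.jpg")

    # tag.confidence is 0-1; scale it to match the percentages above.
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))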

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 50.3%
Confused 49.7%
Surprised 49.5%
Happy 49.5%
Sad 49.6%
Disgusted 49.5%
Angry 49.5%
Calm 50.1%
Fear 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.4%
Happy 50.5%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 49.5%
Fear 49.5%
Calm 49.5%
Confused 49.5%
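
The two face records above match the shape of Amazon Rekognition's DetectFaces output: an estimated age range, a gender guess with confidence, and a per-emotion confidence score. A minimal boto3 sketch (the local file name is hypothetical):

    import boto3

    # Minimal sketch: face analysis via Rekognition's DetectFaces API.
    # "photo.jpg" stands in for a local copy of the image.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions
        )

    # Print each face in the same form as the records above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")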

Feature analysis

Amazon

Person 98.2%
Wheel 79.1%

Categories