Human Generated Data

Title

Untitled (men and two boys working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11170

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.1
Human 99.1
Wheel 97.1
Machine 97.1
Person 96.7
Person 96.5
Person 96
Person 94.4
Nature 86.3
Wheel 86
Outdoors 85
Person 78.4
Person 77.7
Screen 77
Electronics 77
Person 70
Display 68.6
Monitor 68.6
Mammal 64.1
Animal 64.1
Horse 64.1
Plant 60.1
LCD Screen 59.3
Wheel 54.7
Person 51.8
Person 47.1
Person 43.7
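
The label/confidence pairs above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch follows, assuming a local copy of the photograph saved as "potato_field.jpg" and an arbitrary AWS region; both are placeholders, not details taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Read the image as raw bytes (a hypothetical local file name).
with open("potato_field.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,
)

# Each label carries a name and a confidence score on a 0-100 scale,
# matching the "Person 99.1", "Wheel 97.1" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')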

Clarifai
created on 2019-11-16

people 99.4
vehicle 95.2
group 95
winter 94.3
adult 93.6
war 91.3
transportation system 90.9
group together 90.4
many 89
man 88.4
no person 88.2
snow 87
cropland 85.2
soldier 83.8
road 83.2
two 82.2
military 81.9
street 81.3
one 77.9
industry 76.6

Imagga
created on 2019-11-16

track 49.5
snow 33
landscape 32
fence 25.1
old 23
sky 21.8
texture 19.5
structure 18.9
barrier 18.4
stone 18.1
grunge 17.9
rock 17.4
vintage 16.5
weather 16.4
wall 16.1
antique 15.6
mountain 13.4
black 13.2
tree 13.2
textured 12.3
travel 12
outdoors 11.9
obstruction 11.7
frame 11.7
pattern 11.6
retro 11.5
forest 11.3
ancient 11.2
rough 10.9
winter 10.2
mountains 10.2
space 10.1
border 9.9
scenery 9.9
park 9.9
trees 9.8
grungy 9.5
field 9.2
countryside 9.1
art 9.1
material 9.1
aged 9
dirty 9
paper 8.6
frozen 8.6
season 8.6
weathered 8.5
clouds 8.5
outdoor 8.4
city 8.3
ice 8.3
building 8.3
cemetery 8.2
memorial 8.2
road 8.1
rural 7.9
empty 7.7
outside 7.7
frost 7.7
damaged 7.6
old fashioned 7.6
tie 7.5
environment 7.4
grain 7.4
light 7.4
gravestone 7.3
color 7.2
gray 7.2
roof 7.1
day 7.1
country 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 93.8
window 91.3
electronics 73
screen 71.2
vehicle 65.9
old 57.8
display 53.4
flat 52.9
land vehicle 52.8
picture frame 47.7
image 35.1
entertainment center 21.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 50.4%
Fear 49.6%
Disgusted 50%
Calm 49.5%
Happy 49.6%
Surprised 49.5%
Angry 49.6%
Sad 49.6%
Confused 49.6%

AWS Rekognition

Age 30-46
Gender Male, 50.4%
Fear 49.5%
Calm 50.1%
Sad 49.7%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Confused 49.6%

AWS Rekognition

Age 45-63
Gender Male, 50.3%
Angry 49.7%
Calm 49.6%
Surprised 49.5%
Confused 49.6%
Disgusted 49.5%
Happy 49.5%
Sad 50%
Fear 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.2%
Angry 49.5%
Confused 49.5%
Disgusted 49.6%
Happy 49.5%
Sad 50.1%
Fear 49.6%
Calm 49.6%
Surprised 49.5%
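
The age-range, gender, and emotion estimates above correspond to AWS Rekognition's DetectFaces operation with full attributes requested. The sketch below makes the same assumptions as the earlier one (a hypothetical local file "potato_field.jpg" and an arbitrary region).

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("potato_field.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, emotions, etc.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

# Print one block per detected face, in the same shape as the data above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')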

Feature analysis

Amazon

Person 99.1%
Wheel 97.1%
Horse 64.1%