Human Generated Data

Title

Untitled (men and two boys working in potato field)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12314

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Person 98
Person 97.6
Person 97.5
Person 95.6
Rail 92.4
Railway 92.4
Train Track 92.4
Transportation 92.4
Person 92.2
Nature 91.7
Wheel 89.7
Machine 89.7
Person 87.6
Outdoors 82.2
Electronics 78.9
Screen 78.9
Display 75.9
Monitor 75.9
Wheel 67.1
Person 64.9
LCD Screen 60.9
Snow 59.6
Wheel 57.1

Clarifai
created on 2019-11-16

people 99.6
man 96
group 95.9
winter 95.9
television 95.3
snow 94.7
many 94.7
adult 92.2
vehicle 91.5
war 88.6
street 87.2
transportation system 85.6
picture frame 83.9
movie 82.8
desktop 82.5
no person 81
woman 80.6
city 80.5
military 79.1
soldier 77.3

Imagga
created on 2019-11-16

grunge 57.9
vintage 49.6
old 49.5
texture 47.2
antique 41.5
retro 37.7
damaged 36.2
structure 35
frame 34.9
aged 34.4
television 33.5
material 33
wall 33
border 32.6
rough 31.9
dirty 31.6
pattern 28.7
grungy 28.5
textured 28
rusty 27.6
ancient 25.1
design 24.8
weathered 24.7
paper 23.5
space 23.3
rust 23.1
art 22.8
empty 21.5
messy 21.3
graphic 21.2
black 21
wallpaper 20.7
telecommunication system 19.9
backdrop 19
screen 18.6
text 17.5
old fashioned 17.1
surface 16.8
film 16.4
decay 16.4
stain 16.3
parchment 15.3
spot 15.3
billboard 15
scratch 14.6
decoration 14.6
paint 14.5
edge 14.4
obsolete 14.4
mottled 13.6
fracture 13.6
faded 13.6
crumpled 13.6
detailed 13.5
brown 13.2
blank 12.9
highly 12.8
designed 12.8
ragged 12.7
stains 12.6
crack 12.6
your 12.6
dark 12.5
stone 12.2
grain 12
historic 11.9
noisy 11.8
frames 11.7
noise 11.7
grime 11.7
negative 11.7
collage 11.6
concrete 11.5
dirt 11.5
signboard 11.4
backgrounds 11.4
digital 11.3
color 11.1
gray 10.8
broad 10.8
scratched 10.8
tracery 10.7
succulent 10.7
movie 10.7
layer 10.6
stained 10.6
worn 10.5
scratches 9.8
layered 9.8
mess 9.8
photographic 9.8
computer 9.8
crease 9.8
slide 9.8
close 9.7
mask 9.6
aging 9.6
camera 9.4
cement 8.7
urban 8.7
spotted 8.7
blackboard 8.6
nobody 8.6
detail 8
brass 7.9
overlay 7.9
cracked 7.8
strip 7.8
photograph 7.7
great 7.7
canvas 7.6
textures 7.6
building 7.5
container 7.5
element 7.4
memorial 7.4
entertainment 7.4
broadcasting 7.3
history 7.2
track 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

window 98.9
television 91
indoor 88.9
text 82.7
screen 75.2
vehicle 68
gallery 67
land vehicle 63.6
room 61.6
picture frame 56
old 53.9
flat 39.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 15-27
Gender Male, 50.5%
Disgusted 49.5%
Happy 49.5%
Confused 49.5%
Angry 49.5%
Fear 49.5%
Sad 50.4%
Calm 49.6%
Surprised 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50.4%
Angry 49.5%
Surprised 49.5%
Sad 49.5%
Happy 50.4%
Calm 49.5%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 35-51
Gender Male, 50.3%
Angry 49.5%
Calm 50%
Happy 49.5%
Sad 49.6%
Fear 49.6%
Surprised 49.7%
Disgusted 49.6%
Confused 49.6%

Feature analysis

Amazon

Person 99%
Wheel 89.7%
Monitor 75.9%