Human Generated Data

Title

Untitled (three photographs: two adults and two children wading in river; man sitting on large cut logs; old car driving across bridge)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13818

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Advertisement 99.9
Collage 99.9
Poster 99.9
Indoors 99.6
Interior Design 99.6
Human 98.6
Person 98.6
Home Decor 94.4
Electronics 92.7
Screen 92.7
Display 89.7
Monitor 89.7
LCD Screen 87.6
Person 81.1
Outdoors 77.7
Nature 74
Head 70.5
Silhouette 58.8
Brick 58.8
Face 57.7
Room 57
Bedroom 55.4
Person 47.9

Clarifai
created on 2019-11-16

people 99.6
collage 98.5
movie 98.1
television 97.5
picture frame 97.4
bill 95.5
man 95.2
monochrome 95.2
dirty 94.9
adult 94.7
group 93.6
war 92.1
art 91.8
street 91.6
vehicle 91.5
margin 90
old 89.6
portrait 87.9
woman 87.2
desktop 86.2

Imagga
created on 2019-11-16

piano 22.3
black 22.2
grunge 21.3
upright 21
vintage 19
web site 17.6
keyboard instrument 17.5
stringed instrument 17.4
old 16.7
art 16.4
musical instrument 15.7
percussion instrument 15.3
screen 14.8
window 14.8
frame 14.4
design 13.5
silhouette 13.2
retro 13.1
pattern 12.3
text 12.2
texture 11.8
antique 11.4
graphic 10.9
paint 10.9
border 10.8
dirty 10.8
man 10.1
people 10
water 10
paper 9.4
light 9.4
space 9.3
film 9.3
landscape 8.9
style 8.9
symbol 8.7
damaged 8.6
glass 8.6
travel 8.4
digital 8.1
material 8
windowsill 7.8
collage 7.7
grungy 7.6
poster 7.5
device 7.5
decoration 7.4
aged 7.2
office 7.2
covering 7.2
night 7.1
architecture 7
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.4
black and white 84.5
water 80.9
tree 79.1

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a person standing in front of a window 58.9%
a person sitting in front of a window 39.9%
a person in front of a window 39.8%

Text analysis

Google

VAF&NTURE