Human Generated Data

Title

Untitled (two photographs: Fourth of July float advertising Houlton, ME music equipment store; two women sitting on top of Fourth of July float)

Date

1938, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12349

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Transportation 83.1
Vehicle 83.1
Boat 83.1
Plant 77.5
Tree 77.5
Poster 71.1
Advertisement 71.1
Collage 71.1
Outdoors 70.2
Human 69
Urban 65.4
Person 63.4
Brick 59.5
Building 57.4

Clarifai
created on 2019-11-16

people 98.3
no person 96
street 93.6
monochrome 92.3
art 88.6
silhouette 86.5
group 85.7
water 84.8
travel 84.1
vehicle 82.8
architecture 82.7
home 82.2
analogue 81.1
war 79.7
bridge 78.7
city 77.5
transportation system 77
calamity 76.9
collage 76.4
one 76.4

Imagga
created on 2019-11-16

backboard 26.2
old 20.9
handcart 20.4
equipment 19.4
silhouette 19
shopping cart 16.1
wheeled vehicle 15.9
black 15.6
light 14.7
window 14.6
grunge 14.5
vintage 14.1
building 13.9
container 13.7
structure 12.9
wall 12.8
dirty 12.7
ashcan 12.2
device 12
texture 11.8
antique 11.2
art 11.1
architecture 10.9
dark 10.9
sky 10.8
design 10.7
water 10.7
night 10.7
blackboard 10.3
glass 10.1
wood 10
frame 10
city 10
travel 9.9
bin 9.7
outdoors 9.7
weathered 9.5
barrow 9.5
scenery 9
retro 9
sunset 9
trees 8.9
pattern 8.9
scene 8.7
dusk 8.6
rough 8.2
aged 8.1
digital 8.1
urban 7.9
forest 7.8
people 7.8
lonely 7.7
bench 7.7
damaged 7.6
man 7.4
street 7.4
calm 7.3
graphic 7.3
park bench 7.3
sun 7.2
negative 7.1
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

tree 99.1
text 98.7
black and white 93.7
ship 87.6
sign 24.3

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Female, 50.1%
Surprised 49.5%
Happy 49.5%
Sad 50.4%
Fear 49.5%
Calm 49.5%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%

AWS Rekognition

Age 47-65
Gender Female, 50.1%
Happy 49.5%
Angry 49.5%
Fear 49.5%
Surprised 49.5%
Calm 49.5%
Sad 50.5%
Disgusted 49.5%
Confused 49.5%

Feature analysis

Amazon

Boat 83.1%
Person 63.4%

Captions

Microsoft

a person standing in front of a window 54.3%
a person sitting in front of a window 39%
a person that is standing in front of a window 38.9%

Text analysis

Amazon

TADBELL
DECORDS
H
ST.
YICTO2 DECORDS
6HATER ST. OULTOR
TADBELL Co. H
YICTO2
6HATER
OULTOR
Co.
MDo

Google

VICTOR
4lic
SST
VICTOR RECORDS ARE THE SST TARBELL CO. 6 WATER NOULTOR 4lic
RECORDS
THE
TARBELL
6
NOULTOR
ARE
WATER
CO.