Human Generated Data

Title

Untitled (employee in loading dock for Genest bread company)

Date

c. 1938, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5756

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.2
Human 99.2
Train 91.8
Transportation 91.8
Vehicle 91.8
Warehouse 84.6
Building 84.6
Indoors 65.5
Shelf 56

Clarifai
created on 2019-11-16

no person 98.2
warehouse 96.7
people 94.9
indoors 93.8
industry 93
stock 90.2
business 90.1
grinder 89.8
storage 89.4
container 88.3
transportation system 87.9
room 85.5
crate 83.4
commerce 82.8
train 82.6
rack 82.5
shelf 82.1
monochrome 80.5
vehicle 80.1
group 79

Imagga
created on 2019-11-16

warehouse 100
gate 23.4
urban 22.7
city 22.5
architecture 21.1
building 20.7
travel 19.7
industry 19.7
transportation 18.8
interior 17.7
turnstile 17.5
modern 16.8
station 16.4
business 15.2
street 14.7
wall 14.7
industrial 14.5
steel 13.3
window 12.8
transport 12.8
corridor 12.8
metal 12.1
supermarket 11.8
glass 11.7
tunnel 11.5
perspective 11.3
old 11.2
construction 11.1
house 10.9
sky 10.8
light 10.7
entrance 10.6
movable barrier 10.5
traffic 10.5
office 10.4
train 10.4
barrier 10.3
road 9.9
rail 9.8
hall 9.7
shipping 9.7
passage 9.6
storage 9.5
empty 9.5
store 9.5
water 9.3
inside 9.2
vehicle 9
factory 8.8
public 8.7
cargo 8.7
highway 8.7
line 8.6
motion 8.6
row 8.4
place 8.4
floor 8.4
locker 8
river 8
world 8
metro 7.9
subway 7.9
underground 7.9
railway 7.8
door 7.6
brick 7.5
technology 7.4
deck 7.4
dairy 7.4
car 7.3
speed 7.3
reflection 7.3
people 7.3
structure 7.2
grocery store 7.2
room 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.5
black and white 92.1

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.5%
Sad 50.2%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%
Happy 49.5%
Surprised 49.5%
Calm 49.8%
Confused 49.5%

Feature analysis

Amazon

Person 99.2%
Train 91.8%

Captions

Microsoft

a person standing in front of a store 49.5%
a close up of a store 49.4%
a portrait of a store 49.3%

Text analysis

Amazon

SCIENCE
HOME SCIENCE
BREAD
BRBAD
HOME
NH
GENESTS
HESTER NH
RRPAD
GENEST'S
RRRAN
HESTER
NH.
BEAD
N
esdf
T3 BRBAD
T3

Google

GENESTS HOME ST BREA GENESTS CETER GENESTS sOME SCENO DREAD cESTER BRLA GENESTS NE ENE HOME SCIENCE scKE TS BREAD GENESTS OME SCONCE BREAD ESTER NH S BREAD TS BREAD HESTER NH TS BREAD
GENESTS
HOME
ST
BREA
CETER
sOME
SCENO
DREAD
cESTER
BRLA
NE
ENE
SCIENCE
scKE
TS
BREAD
OME
SCONCE
ESTER
NH
S
HESTER