Human Generated Data

Title

Paul Thek Studio Shoot, Thek Working on Tomb Effigy 8

Date

1967, printed 2010

People

Artist: Peter Hujar, American, 1934–1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.259

Copyright

© The Peter Hujar Archive LLC / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 98.4
Person 98.4
Building 96.9
Wood 96.3
Person 87.1
Architecture 81.7
Plywood 80.3
Workshop 76.5
Furniture 75.3
Pillar 74.7
Column 74.7
Clothing 70.9
Apparel 70.9
Metropolis 70.1
City 70.1
Town 70.1
Urban 70.1
Table 68.1
Machine 66.3
Couch 62.5
Symbol 60
Emblem 56.1

Clarifai
created on 2018-03-23

grinder 99.1
people 98.4
industry 97.9
production 97.3
room 96.4
adult 95.2
indoors 93.9
furniture 93.7
one 92.9
military 92.3
hospital 91.8
machine 90.9
war 90.6
employee 90.3
vehicle 90.2
two 89
weapon 87.8
wear 86.7
group 85.9
man 85.4

Imagga
created on 2018-03-23

interior 27.4
industrial 24.5
industry 23
room 22.6
home 22.3
furniture 21.6
modern 21
house 20.9
machine 19.6
steel 19.4
factory 18.5
device 18
architecture 17.2
work 16.5
table 16.2
metal 16.1
building 15.7
living 15.1
power 15.1
indoors 14.9
inside 14.7
apartment 14.4
decor 14.1
sofa 13.8
seat 13.5
plant 12.8
lamp 12.7
design 12.5
structure 12.5
floor 12.1
chair 12.1
heat 12
luxury 12
production 11.7
shop 11.6
vehicle 11.6
window 11.5
comfortable 11.4
heavy 11.4
engineering 11.4
light 11.4
urban 11.3
decoration 11
equipment 10.9
black 10.8
transportation 10.7
manufacturing 10.7
pillow 10.7
engine 10.6
pollution 10.6
old 10.4
style 10.4
construction 10.3
wall 10.3
car 10.2
energy 10.1
machinery 10
cockpit 9.9
mechanic 9.7
mechanical 9.7
technology 9.6
residential 9.6
pipe 9.4
transport 9.1
typesetting machine 9.1
refinery 8.9
cozy 8.8
man 8.7
technical 8.7
space 8.5
smoke 8.4
city 8.3
working 7.9
pillows 7.9
pipeline 7.9
waste 7.8
complex 7.8
steam 7.7
station 7.7
couch 7.7
tube 7.7
gas 7.7
repair 7.7
automobile 7.7
concrete 7.6
reflection 7.6
fashion 7.5
wood 7.5
iron 7.5
oil 7.4
indoor 7.3
business 7.3
domestic 7.2
lifestyle 7.2
worker 7.1
job 7.1
mercantile establishment 7.1

Google
created on 2018-03-23

furniture 67.8

Microsoft
created on 2018-03-23

indoor 91.9
window 80.4

Face analysis

Amazon

AWS Rekognition

Age 45-65
Gender Male, 50.8%
Sad 51.1%
Calm 46.4%
Happy 45.2%
Surprised 45.2%
Angry 45.9%
Disgusted 45.9%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Male, 97.5%
Disgusted 1.3%
Calm 86.7%
Angry 3.9%
Confused 3.2%
Surprised 2.9%
Sad 1.2%
Happy 0.8%

Feature analysis

Amazon

Person 98.4%
Couch 62.5%