Human Generated Data

Title

Self Portrait with Photographic Paraphernalia, New York

Date

1929, printed 1985

People

Artist: Edward Steichen, American, 1879-1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sidney and Shirley Singer, 2013.180.12

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Person 99.1
Human 99.1
Chair 83.9
Furniture 83.9
Sitting 76
Clothing 72.6
Apparel 72.6
Shoe 72.3
Footwear 72.3
Flooring 66.3
Finger 64.4
Electronics 61.3
Camera 61.3
Door 59.8
Banister 55.8
Handrail 55.8
Screen 55.8
LCD Screen 55.8
Monitor 55.8
Display 55.8
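
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's label-detection API. A minimal sketch of how such tags could be generated follows; the image file name, region, and the label/confidence limits are illustrative assumptions, not values recorded with this object.

# Sketch: object/scene labels with Amazon Rekognition (boto3).
# File name and thresholds are placeholders; the record above does not
# document how the original request was parameterized.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steichen_self_portrait.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the record lists 20 labels
    MinConfidence=55.0,  # lowest confidence shown above is ~55.8
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')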

Clarifai
created on 2018-02-09

people 99.2
adult 96.6
one 96.5
man 96.1
two 94.6
indoors 91.3
vehicle 90.9
wear 88.3
music 87.7
recreation 85.6
business 84.9
portrait 84.9
woman 84.6
group 83.3
outfit 81.6
airplane 79.9
room 79.2
aircraft 78.2
transportation system 77.8
airport 76.8
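
Concept scores of this form correspond to Clarifai's general prediction model. A rough sketch against the 2018-era Clarifai v2 REST predict endpoint is below; the API key, image URL, and the general-model identifier are placeholders/assumptions, and scores come back on a 0-1 scale.

# Sketch: concept predictions from the Clarifai v2 REST API (assumed 2018-era shape).
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                               # placeholder
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"           # Clarifai "general" model id (assumed)
IMAGE_URL = "https://example.org/steichen_self_portrait.jpg"    # hypothetical

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')  # scale 0-1 values to percent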

Imagga
created on 2018-02-09

passenger 41.5
architecture 25.1
building 24.2
city 18.3
prison 18
adult 16.8
window 16.6
people 16.2
person 15.2
urban 14.8
modern 14.7
business 14.6
device 14.1
house 13.4
man 12.8
corporate 12
construction 12
old 11.8
elevator 11.6
structure 11.4
office 11.3
travel 11.3
door 11
smiling 10.8
happy 10.6
guillotine 10.3
sky 10.2
male 10.1
looking 9.6
standing 9.6
happiness 9.4
institution 8.8
indoors 8.8
work 8.6
buildings 8.5
boat 8.4
portrait 8.4
pretty 8.4
town 8.3
fashion 8.3
transportation 8.1
home 8
lifestyle 7.9
women 7.9
wall 7.9
glass 7.8
bridge 7.7
outside 7.7
windows 7.7
outdoors 7.5
instrument 7.4
water 7.3
worker 7.2
steel 7.2
posing 7.1
interior 7.1
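
Tag scores like these are what Imagga's auto-tagging endpoint returns. A minimal sketch using Imagga's v2 REST API with HTTP Basic auth is below; the credentials and image URL are placeholders, not values tied to this record.

# Sketch: auto-tagging an image with the Imagga v2 REST API.
import requests

IMAGGA_API_KEY = "YOUR_KEY"        # placeholder
IMAGGA_API_SECRET = "YOUR_SECRET"  # placeholder
IMAGE_URL = "https://example.org/steichen_self_portrait.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')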

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

sitting 96.3
person 89.3
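
The two tags above follow the shape of Microsoft's Computer Vision "tag" operation. A brief sketch against the REST endpoint is below; the resource host, subscription key, API version, and image URL are placeholders, and confidences are returned on a 0-1 scale.

# Sketch: image tags from the Azure Computer Vision "tag" endpoint (REST).
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/steichen_self_portrait.jpg"    # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')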

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Male, 96.6%
Confused 19.3%
Surprised 7.6%
Angry 12.1%
Sad 18.3%
Happy 17.1%
Calm 22%
Disgusted 3.5%
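
The age range, gender, and emotion scores above follow the shape of Amazon Rekognition's face-detection response. A minimal sketch, assuming a hypothetical local copy of the photograph, is:

# Sketch: face attributes (age range, gender, emotions) via Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steichen_self_portrait.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')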

Feature analysis

Amazon

Person 99.1%
Shoe 72.3%
Camera 61.3%
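
These feature percentages match the label confidences in the tag list above; for labels such as Person and Shoe, Rekognition's label-detection response can also include per-object bounding boxes in an Instances field. A short sketch, with the same placeholder image file as before:

# Sketch: per-object instances (bounding boxes) from a detect_labels response.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steichen_self_portrait.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55.0)

for label in response["Labels"]:
    if label["Name"] in {"Person", "Shoe", "Camera"}:
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')
        for instance in label.get("Instances", []):  # bounding boxes, where provided
            box = instance["BoundingBox"]            # normalized Left/Top/Width/Height
            print(f'  box left={box["Left"]:.2f} top={box["Top"]:.2f} '
                  f'width={box["Width"]:.2f} height={box["Height"]:.2f}')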