Human Generated Data

Title

The Blue Sky-Dana Steichen, Long Island, New York

Date

1923, printed 1984

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sidney and Shirley Singer, 2013.182.1

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Clothing 95.3
Apparel 95.3
Plant 89.3
Grass 86.9
Human 84.2
People 72
Finger 71.3
Person 67.4

Clarifai
created on 2018-02-09

people 99.2
adult 98.2
one 97.7
man 97.2
portrait 96.5
monochrome 95.8
woman 95.3
outdoors 93.9
nature 93
wood 91.7
grass 91.3
girl 89.4
military 88.9
wear 86.4
summer 85.7
war 85.6
danger 84.4
tree 84
vintage 82.6
rain 81.1

Imagga
created on 2018-02-09

agave 30.4
steel 20.5
metal 20.1
device 19.2
industry 16.2
vehicle 16.1
sky 14.7
transportation 14.3
power 14.3
old 13.9
machine 13.8
construction 13.7
building 13
equipment 12.7
industrial 12.7
car 12.2
plant 12
structure 11.8
spoke 11.8
summer 11.6
technology 11.1
work 11
business 10.9
outdoor 10.7
rural 10.6
energy 10.1
light 10
vintage 9.9
support 9.9
factory 9.9
farm 9.8
outdoors 9.7
rust 9.6
tree 9.5
architecture 9.4
sun 8.8
high 8.7
mask 8.6
wheel 8.6
automobile 8.6
clouds 8.4
iron 8.4
vessel 8.4
field 8.4
countryside 8.2
truck 8.1
man 8.1
black 7.8
abandoned 7.8
labor 7.8
auto 7.6
smoke 7.4
landscape 7.4
transport 7.3
people 7.2
tower 7.2
bright 7.1
grass 7.1
travel 7

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

outdoor 97.3
person 96
plant 57.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Male, 55.8%
Disgusted 5.3%
Surprised 1.5%
Sad 13.6%
Calm 70.1%
Angry 4.9%
Happy 0.7%
Confused 3.7%

Feature analysis

Amazon

Person 67.4%

Captions