Human Generated Data

Title

Untitled (people eating breakfast by the pool)

Date

1961

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11553

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 93.3
Tarmac 93
Asphalt 93
Person 90
Road 89.6
Person 87.3
Person 85.3
Person 78.5
Path 78
Waterfront 70.7
Water 70.7
Building 67.7
Person 64.2
Architecture 63.4
Nature 63.1
Person 61.9
People 60.2
Pedestrian 57
Pier 56.4
Dock 56.4
Port 56.4
Airfield 56.4
Airport 56.4
Freeway 55.1
Person 51.1

Imagga
created on 2022-01-15

aircraft carrier 100
warship 97.4
ship 77.1
military vehicle 72.6
vessel 53.2
vehicle 48.1
city 43.3
architecture 28.9
travel 28.2
sky 26.2
cityscape 25.6
tourism 24.8
urban 24.5
craft 23.5
building 22.6
water 22
landscape 21.6
buildings 20.8
sea 20.7
town 18.6
port 16.4
skyline 16.2
boat 16.1
river 16
aerial 15.5
harbor 15.4
landmark 14.5
tower 14.3
panorama 14.3
shore 14.1
street 13.8
tourist 13.1
vacation 13.1
road 12.7
coast 12.6
old 12.5
church 12
houses 11.6
night 11.6
downtown 11.5
history 10.7
roof 10.5
summer 10.3
clouds 10.1
ocean 10
boats 9.7
scene 9.5
famous 9.3
structure 9
scenic 8.8
dock 8.8
above 8.7
beach 8.6
bridge 8.5
transportation 8.1
homes 7.9
construction 7.7
panoramic 7.7
marine 7.6
traffic 7.6
evening 7.5
center 7.4
air 7.4
light 7.4
transport 7.3
sunset 7.2
religion 7.2
day 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.1
black and white 94.8
white 76.9
black 75.6
monochrome 56
old 53
vintage 35.5

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 51.8%
Happy 38.2%
Calm 23.6%
Sad 17.4%
Angry 12%
Confused 4.3%
Fear 2.1%
Surprised 1.4%
Disgusted 1%

AWS Rekognition

Age 23-31
Gender Male, 63.6%
Disgusted 46.7%
Happy 23.7%
Calm 13.3%
Angry 5.2%
Sad 4.8%
Surprised 2.6%
Confused 2.3%
Fear 1.5%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a vintage photo of a person 91.9%
a vintage photo of a person with a racket 77.6%
a vintage photo of some people on a court 77.5%

Text analysis

Amazon

47500.

Google

47500.
47500.