Human Generated Data

Title

Untitled (woman playing catch with a small child at the beach)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10483

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 98.9
Person 98.8
Nature 98
Person 96.6
Outdoors 94.9
Person 90.1
Weather 86.7
Person 81.5
Person 81.4
Plant 79.4
Tree 79.4
Road 74.2
Person 66.6
Rural 66.3
Countryside 66.3
Shelter 66.3
Building 66.3
Asphalt 64
Tarmac 64
People 62.4
Transportation 61.7
Vehicle 61.4
Palm Tree 59.9
Arecaceae 59.9
Screen 58.4
Electronics 58.4
Monitor 58.4
Display 58.4
LCD Screen 58.4
Cloud 56.9
Cumulus 56.9
Sky 56.9
Landscape 56.2
Person 56.2
Urban 56
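
Tag lists like the one above are pairs of a label name and a confidence score. As a hedged sketch (not the museum's actual pipeline), labels in this shape can be produced with AWS Rekognition's DetectLabels API via boto3; the bucket and object names below are placeholders, and the live API call is commented out so the snippet runs on a sample response of the same shape:

```python
# Hypothetical sketch of generating label tags with AWS Rekognition.
# The real call requires AWS credentials, so it is commented out here:
# import boto3
# client = boto3.client("rekognition")
# response = client.detect_labels(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     MinConfidence=55,
# )

# Sample response mirroring the DetectLabels response shape.
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Nature", "Confidence": 98.0},
        {"Name": "Outdoors", "Confidence": 94.9},
    ]
}

def format_labels(response):
    """Render DetectLabels output as 'Name Confidence' lines,
    with confidence rounded to one decimal place, similar to
    the tag listing above."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1)}"
        for label in response["Labels"]
    ]

for line in format_labels(sample_response):
    print(line)
```

MinConfidence controls the cutoff below which Rekognition omits labels entirely, which is one plausible reason the list above bottoms out in the mid-50s.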

Imagga
created on 2022-01-09

stage 54.2
billboard 43.7
platform 43
signboard 35.4
structure 31.4
sky 31.3
cloud 27.6
night 24
landscape 22.3
clouds 20.3
water 18.7
sunset 17.1
smoke 16.7
light 16.7
travel 15.5
beach 15.2
ocean 15.1
city 14.1
sea 14.1
sunrise 14.1
factory 13.5
power 13.4
architecture 13.3
environment 13.2
scene 13
industrial 12.7
television 12.6
pollution 12.5
tourism 12.4
sun 12.1
car mirror 12
blackboard 11.6
chemical 11.6
vacation 11.5
industry 11.1
danger 10.9
dark 10.9
coast 10.8
river 10.7
steam 10.7
mirror 10.2
building 9.9
outdoor 9.9
silhouette 9.9
scenery 9.9
horizon 9.9
toxic 9.8
summer 9.6
dusk 9.5
day 9.4
lake 9.3
energy 9.3
tree 9.2
air 9.2
tower 9
chimney 8.8
sand 8.7
fog 8.7
black 8.4
evening 8.4
famous 8.4
old 8.4
street 8.3
park 8.2
history 8.1
holiday 7.9
color 7.8
wave 7.8
dawn 7.7
construction 7.7
storm 7.7
winter 7.7
coastline 7.5
boat 7.4
island 7.3
broadcasting 7.3
dirty 7.2
landmark 7.2
reflector 7.2
scenic 7

Google
created on 2022-01-09

Plant 93.9
Cloud 93.7
Sky 91.7
Drum 89.9
Black-and-white 84.5
Tree 84.2
Style 83.9
Arecales 82.3
Chair 82.3
Font 79.7
Adaptation 79.4
Musical instrument 78.6
Tints and shades 77.3
Monochrome photography 75.1
Monochrome 74.6
Art 71
Room 69.4
Palm tree 69.2
Rectangle 69.2
Event 68.6

Microsoft
created on 2022-01-09

text 99.3
black and white 85.4
old 81.1
sky 78.3
cloud 76.3
fireworks 55.6
engine 31

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 86.7%
Sad 43.1%
Calm 41%
Happy 6.8%
Fear 3.7%
Confused 2.2%
Disgusted 1.2%
Angry 1.1%
Surprised 0.9%

AWS Rekognition

Age 34-42
Gender Male, 80.1%
Angry 62.1%
Calm 11.6%
Sad 8.7%
Confused 7.4%
Fear 5%
Disgusted 3.3%
Happy 1%
Surprised 0.8%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 97.7%
Sad 1%
Surprised 0.7%
Angry 0.3%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%
Confused 0%
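
The per-face blocks above (age range, gender, and an emotion breakdown summing to roughly 100%) match the shape of Rekognition's DetectFaces response when called with Attributes=["ALL"]. As a hedged sketch with placeholder bucket and object names, and the live call commented out so the snippet runs on sample data:

```python
# Hypothetical sketch of face analysis with AWS Rekognition DetectFaces.
# The real call requires AWS credentials, so it is commented out here:
# import boto3
# client = boto3.client("rekognition")
# response = client.detect_faces(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     Attributes=["ALL"],
# )

# One sample FaceDetail mirroring the DetectFaces response shape.
sample_face = {
    "AgeRange": {"Low": 23, "High": 33},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "SAD", "Confidence": 1.0},
        {"Type": "CALM", "Confidence": 97.7},
        {"Type": "SURPRISED", "Confidence": 0.7},
    ],
}

def summarize_face(face):
    """Render one FaceDetail as lines like the listing above:
    age range, gender with confidence, then emotions sorted by
    descending confidence."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']}%",
    ]
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']}%")
    return lines

for line in summarize_face(sample_face):
    print(line)
```

Rekognition reports emotion types in upper case (CALM, SAD, ...), so the formatter capitalizes them to match the listing's style.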

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a vintage photo of a group of people on a boat 44.3%
a vintage photo of a person 44.2%
a vintage photo of some people 44.1%

Text analysis

Amazon

44587
MIII-

Google

445
87
445 87