Human Generated Data

Title

Untitled (people playing golf near ocean)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8142

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.5
Person 99.5
Water 98
Person 97.9
Person 97.7
Playground 96.9
Play Area 96.9
Person 95.6
Animal 92.8
Bird 92.8
Person 91.3
Person 88.3
Grass 83.7
Plant 83.7
Bird 79.5
Person 75.9
Play 71.9
Bird 70.8
Tarmac 67
Asphalt 67
Person 62.1
Fountain 61.7
Person 60.8
Tree 56.7

Imagga
created on 2022-01-08

water 25.3
trampoline 24.5
blackboard 22.8
skateboard 22.3
gymnastic apparatus 20.4
wheeled vehicle 19.7
fountain 18.9
board 17.3
sports equipment 17.2
equipment 17.1
vehicle 16.7
travel 16.2
road 15.3
sky 15.3
architecture 14.8
structure 14.1
pool 13.4
city 12.5
summer 12.2
athlete 12.1
building 12.1
sport 11.5
vacation 11.4
urban 10.5
line 10.3
sea 10.2
ocean 9.9
outdoor 9.9
landscape 9.7
conveyance 9.6
house 9.2
speed 9.2
park 9.1
landmark 9
wet 8.9
night 8.9
swim 8.7
swimming 8.6
holiday 8.6
runner 8.6
traffic 8.5
clouds 8.4
tourism 8.2
competition 8.2
environment 8.2
drop 8.2
transportation 8.1
light 8
people 7.8
asphalt 7.8
tennis 7.8
color 7.8
wave 7.8
cold 7.7
modern 7.7
construction 7.7
old 7.7
rain 7.5
street 7.4
exercise 7.3
fitness 7.2
tower 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 94.9
black and white 91.7
white 62.4
playground 52.4
old 41.6

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Female, 87%
Calm 82.5%
Happy 12.5%
Disgusted 1.8%
Angry 1.3%
Surprised 1.1%
Sad 0.5%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 1-7
Gender Female, 94.5%
Calm 77.5%
Surprised 15.6%
Sad 5.3%
Disgusted 0.7%
Angry 0.3%
Fear 0.2%
Confused 0.2%
Happy 0.1%

Feature analysis

Amazon

Person 99.5%
Bird 92.8%

Captions

Microsoft

a vintage photo of a crowd 82.9%
a vintage photo of a person 74.5%
a vintage photo of a street 74.4%

Text analysis

Amazon

219
.V-1h 219
.V-1h