Human Generated Data

Title

Untitled (people playing golf near ocean)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8143

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Water 99.8
Person 98.7
Human 98.7
Person 98.3
Person 96.6
Person 86
Fountain 85.1
Pool 71.9
Person 68.9
Machine 67.9
Wheel 67.9
Bird 67.6
Animal 67.6
Play Area 63.8
Playground 63.8
Airplane 61.9
Vehicle 61.9
Aircraft 61.9
Transportation 61.9
Tree 59.7
Plant 59.7
Tarmac 57.7
Asphalt 57.7
Outdoors 56.9
Swimming Pool 56.4
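The numbers beside each tag are percentage confidence scores. A minimal sketch of how such a list might be filtered to high-confidence tags, assuming the labels are available as name/score pairs (the shape AWS Rekognition's DetectLabels response reduces to); the sample data below is a hypothetical subset of the labels listed above:

```python
# Hypothetical subset of the Rekognition labels above, as
# (name, confidence-percent) pairs.
labels = [
    ("Water", 99.8), ("Person", 98.7), ("Fountain", 85.1),
    ("Pool", 71.9), ("Airplane", 61.9), ("Swimming Pool", 56.4),
]

def confident(labels, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident(labels))  # -> ['Water', 'Person', 'Fountain']
```

Lower-scored tags here (Airplane, Wheel, Bird) are the kind of noise such a threshold is meant to drop.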

Imagga
created on 2022-01-08

skateboard 63.6
wheeled vehicle 54.9
board 48.5
vehicle 43.9
road 28
conveyance 26.5
travel 22.5
landscape 19.3
sky 19.1
water 18
architecture 16.4
highway 16.4
speedway 16
intersection 16
asphalt 15.6
summer 15.4
sport 15.3
city 15
street 14.7
traffic 14.2
vacation 13.9
landmark 13.5
pool 13.4
line 12.9
transportation 12.6
tree 12.3
urban 12.2
outdoor 12.2
building 11.5
holiday 11.5
athlete 11.4
drive 11.4
speed 11
transport 11
racetrack 10.8
structure 10.5
sea 10.2
clouds 10.1
leisure 10
park 9.9
recreation 9.9
grass 9.5
day 9.4
course 9.3
runner 9.2
trampoline 9.2
exercise 9.1
double 8.8
curve 8.8
scene 8.7
swimming 8.6
hotel 8.6
empty 8.6
direction 8.6
sign 8.3
tourism 8.3
competition 8.2
trees 8
facility 8
freeway 7.9
tennis 7.8
fountain 7.8
equipment 7.8
driving 7.7
swim 7.7
track 7.7
automobile 7.7
way 7.6
gymnastic apparatus 7.5
destination 7.5
island 7.3
paint 7.2
history 7.2
sports equipment 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

black and white 91.4
water 91
text 87.9
athletic game 87.3
beach 85.7
sport 77
sky 62.9

Face analysis

Amazon

AWS Rekognition

Age 14-22
Gender Male, 79.7%
Calm 54.8%
Sad 27.6%
Disgusted 6.2%
Angry 3.9%
Confused 2.8%
Surprised 1.6%
Happy 1.6%
Fear 1.5%

Feature analysis

Amazon

Person 98.7%
Wheel 67.9%
Bird 67.6%
Airplane 61.9%

Captions

Microsoft

a group of people sitting around a baseball field 46.8%
a group of baseball players standing on top of a runway 31.3%
a group of baseball players standing on top of a tarmac 31.2%

Text analysis

Amazon

819
in 819
in