Human Generated Data

Title

Untitled (woman golfing with man next to golf cart)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7201

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Person 98.6
Sport 97.7
Sports 97.7
Golf Club 96
Golf 96
Shoe 94.7
Clothing 94.7
Footwear 94.7
Apparel 94.7
Putter 85.3
Field 72.1
Person 63.7
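
The label/confidence pairs above are the kind of output produced by an image-labeling service such as Amazon Rekognition's DetectLabels operation (the service named under Face analysis below). A minimal sketch of how similar tags could be generated, assuming boto3, configured AWS credentials, and a hypothetical local copy of the photograph named steinmetz_golf.jpg:

    # Sketch: label/confidence tags via Amazon Rekognition DetectLabels (assumed workflow).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_golf.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # cap the number of returned labels
        MinConfidence=60.0,  # drop low-confidence guesses
    )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")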

Imagga
created on 2022-01-08

brass 41.8
wind instrument 33.4
sport 27.9
bugle 26.3
person 24.9
sky 24.2
man 24.2
active 23.9
people 21.7
musical instrument 21.5
fun 20.9
grass 19.8
lifestyle 19.5
freedom 19.2
exercise 19.1
happy 18.2
happiness 18
action 17.6
jump 17.3
summer 16.7
athlete 16.6
adult 16.2
outdoor 16.1
cornet 15.8
male 15.6
field 15
joy 15
fly 14.9
air 14.7
play 14.6
jumping 14.5
leisure 14.1
player 13.8
healthy 13.2
negative 13
teenager 12.8
fitness 12.6
vacation 12.3
film 12.2
cloud 12
outdoors 11.9
competition 11.9
spring 11.8
silhouette 11.6
businessman 11.5
body 11.2
relaxation 10.9
vitality 10.4
business 10.3
life 10.3
device 10.1
park 9.9
activity 9.8
human 9.7
sun 9.7
holiday 9.3
event 9.2
training 9.2
playing 9.1
black 9
beach 9
game 8.9
symbol 8.7
ball 8.7
flying 8.5
enjoyment 8.4
golfer 8.4
sports 8.3
sax 8.3
photographic paper 8
bright 7.9
season 7.8
fight 7.7
run 7.7
motion 7.7
youth 7.7
relax 7.6
energy 7.6
vacations 7.5
free 7.5
success 7.2
meadow 7.2
sea 7.1
sand 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.5
outdoor 85.8
black and white 79.7
baseball 76.5
golf 74.3

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 99.1%
Calm 97.4%
Sad 0.6%
Angry 0.5%
Confused 0.4%
Disgusted 0.4%
Happy 0.3%
Surprised 0.2%
Fear 0.1%
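
The age range, gender, and emotion scores above follow the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch under the same assumptions as the earlier snippet (boto3, AWS credentials, hypothetical file steinmetz_golf.jpg):

    # Sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_golf.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back unordered; sort descending to match the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")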

Feature analysis

Amazon

Person 99.4%
Shoe 94.7%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 80.7%
a vintage photo of a group of people posing for a picture 79.8%
a vintage photo of a person 79.7%
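
The candidate captions above, each with a confidence score, match the output of an image-description endpoint such as Azure's Computer Vision Describe Image operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK and hypothetical endpoint and key values:

    # Sketch: candidate captions via Azure Computer Vision describe_image_in_stream.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # hypothetical
    KEY = "<subscription-key>"                                         # hypothetical

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("steinmetz_golf.jpg", "rb") as image_stream:  # hypothetical local file
        result = client.describe_image_in_stream(image_stream, max_candidates=3)

    # Confidence is returned on a 0-1 scale; convert to percent to match the list above.
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")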

Text analysis

Amazon

SE
4344
ADIRON АТОГАЯА2 STEMMENTS 4344
АТОГАЯА2
STEMMENTS
ADIRON
KODAK--2VEEA-1TW
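
The detected strings above, including what appear to be mirrored film-edge markings read literally by the OCR, can be reproduced with a text-detection endpoint such as Rekognition's DetectText. A minimal sketch under the same assumptions as the earlier Rekognition snippets:

    # Sketch: detected text via Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_golf.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE entries are whole detected lines; WORD entries are the individual tokens.
    for detection in response["TextDetections"]:
        print(detection["DetectedText"])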

Google

35. AGI9OA ATO2A9A2 STAMNIET2 43 44 Es
35.
AGI9OA
ATO2A9A2
STAMNIET2
43
44
Es