Human Generated Data

Title

Untitled (couple golfing with golf cart)

Date

1963

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8117

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 98.9
Clothing 97.5
Apparel 97.5
Vehicle 92.5
Automobile 92.5
Transportation 92.5
Car 92.5
Shorts 90.2
People 88.5
Plant 77.7
Sport 71.7
Sports 71.7
Grass 70.6
Tree 66.6
Team Sport 60.1
Team 60.1
Machine 59.5
Wheel 59.5
Golf Cart 58.4
Field 57.2
Photo 55.3
Portrait 55.3
Face 55.3
Photography 55.3
Building 55.2
Person 52.3

Imagga
created on 2022-01-15

motor vehicle 62.9
golf equipment 62.4
wheeled vehicle 49.7
sports equipment 49
swing 33.8
equipment 29.2
vehicle 27.8
mechanical device 25.9
plaything 25.7
winter 24.7
snow 24.6
tree 23.1
landscape 20.8
sky 19.8
mechanism 19.3
park 18.1
trees 17.8
outdoor 17.6
cold 16.4
forest 14.8
tricycle 14.4
rural 14.1
outdoors 13.1
person 12.7
sunset 12.6
road 11.7
field 11.7
snowy 11.7
man 11.4
building 11.3
travel 11.3
grass 11.1
summer 10.9
season 10.9
river 10.7
bench 10.7
people 10.6
sun 10.5
weather 10.4
scene 10.4
adult 10.3
day 10.2
house 10
country 9.7
seasonal 9.6
frozen 9.6
cloud 9.5
relax 9.3
street 9.2
wood 9.2
sport 9.1
old 9.1
night 8.9
conveyance 8.7
water 8.7
architecture 8.6
sitting 8.6
evening 8.4
city 8.3
countryside 8.2
vacation 8.2
light 8
holiday 7.9
urban 7.9
structure 7.8
sunny 7.7
handcart 7.7
clouds 7.6
dark 7.5
peaceful 7.3
time 7.3
peace 7.3
yellow 7.3
scenery 7.2
mountain 7.1
male 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

tree 99.5
outdoor 99.2
road 98.7
text 88.7
vehicle 88.2
land vehicle 83
black and white 75.5
car 55.8

Face analysis

Google

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.6%
Car 92.5%
Wheel 59.5%

Captions

Microsoft

a person riding on the back of a pickup truck 37.7%

Text analysis

Amazon

49248
EZGO