Human Generated Data

Title

Untitled (hunters with dog and ducks)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10400

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.7
Person 99.7
Person 97.8
Nature 97.2
Outdoors 96
Wheel 95.3
Machine 95.3
Wheel 94.9
Person 89.7
Vehicle 88.5
Transportation 88.5
Countryside 87.3
Field 77.4
Automobile 77
Car 77
Land 75.7
Rural 74.6
Farm Plow 73.1
Farm 73.1
Agriculture 67.9
Vegetation 64.4
Plant 64.4
Tractor 62.7
Buggy 56.7
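Each machine-generated tag above is a label paired with a confidence score (a percentage). As a minimal sketch, plain Python with a hypothetical `filter_tags` helper and values copied from the Amazon list above, this is how such label/score pairs can be thresholded and ranked:

```python
# (label, confidence) pairs copied from the Amazon tag list above
# (a subset, for brevity).
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Nature", 97.2),
    ("Wheel", 95.3), ("Car", 77.0), ("Tractor", 62.7), ("Buggy", 56.7),
]

def filter_tags(tags, min_confidence=90.0):
    """Keep only tags at or above the threshold, highest score first."""
    kept = [t for t in tags if t[1] >= min_confidence]
    return sorted(kept, key=lambda t: t[1], reverse=True)

print(filter_tags(tags))
# → [('Human', 99.7), ('Person', 99.7), ('Nature', 97.2), ('Wheel', 95.3)]
```

Lowering `min_confidence` admits the weaker guesses (Car, Tractor, Buggy), which is where the services begin to disagree with one another.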

Imagga
created on 2022-01-09

memorial 26
gravestone 23.6
stone 22.2
old 20.9
vehicle 18.8
structure 18.6
man 16.8
people 16.7
wheeled vehicle 16.7
newspaper 16.3
statue 16.3
male 14.9
outdoor 14.5
sculpture 14.5
vintage 14.1
art 13.8
product 13.7
transportation 12.5
architecture 12.5
tree 12.3
ancient 12.1
tricycle 12
person 11.9
fountain 11.1
dirty 10.8
sky 10.8
tool 10.8
history 10.7
dirt 10.5
outdoors 10.4
antique 10.4
outside 10.3
creation 10
water 10
religion 9.9
sand 9.8
decoration 9.6
black 9.6
building 9.5
grass 9.5
men 9.4
culture 9.4
grunge 9.4
city 9.1
transport 9.1
industrial 9.1
adult 9.1
snow 9
machine 8.8
boy 8.7
plow 8.6
power 8.4
field 8.4
summer 8.4
tractor 8.2
landscape 8.2
work 8.1
child 8.1
sitting 7.7
industry 7.7
house 7.5
cart 7.5
park 7.4
retro 7.4
speed 7.3
lawn mower 7.3
sun 7.2
rural 7
travel 7

Google
created on 2022-01-09

Wheel 98.1
Tire 97.9
Plant 94.7
Vehicle 92.7
Motor vehicle 92.5
Automotive tire 91.4
Tread 89
Tree 84.7
Car 81.7
Fender 78.7
Grass 74.6
Automotive wheel system 74.5
Wood 73.6
Monochrome 72.1
Monochrome photography 71.8
Classic 70.1
Synthetic rubber 66.3
Auto part 64.8
Machine 63.2
Soil 62.7

Microsoft
created on 2022-01-09

outdoor 98.9
tree 98.9
grass 96.4
text 96.3
wheel 79.4
black and white 74.6
tractor 73.5
vehicle 69.1
land vehicle 67.4
tire 61.1
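The four tagging services above largely overlap on vehicle-related labels. A small sketch (plain Python; each set is a hand-picked, case-normalized subset of the corresponding list above, not the full output) of finding the labels all three major services agree on:

```python
# Subsets of the label lists above, lower-cased; scores omitted.
amazon    = {"person", "wheel", "car", "tractor", "vehicle", "field", "farm"}
google    = {"wheel", "tire", "vehicle", "car", "tree", "grass", "machine"}
microsoft = {"outdoor", "tree", "grass", "wheel", "tractor", "vehicle", "tire"}

# Labels on which all three services agree.
consensus = amazon & google & microsoft
print(sorted(consensus))
# → ['vehicle', 'wheel']
```

On these subsets only "vehicle" and "wheel" survive the three-way intersection, which matches the pattern in the full lists: concrete mechanical parts score consistently, while scene-level labels (farm, countryside, outdoor) vary by provider.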

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 85.4%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 96%
Sad 37.3%
Happy 33.7%
Calm 24.1%
Surprised 2.2%
Confused 0.9%
Disgusted 0.8%
Angry 0.7%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
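AWS Rekognition reports one percentage per emotion for each detected face, as in the two blocks above. A minimal sketch (plain Python, with the percentages copied from those blocks and a hypothetical `dominant_emotion` helper) of reducing each face to its top-scoring emotion:

```python
# Emotion percentages copied from the two AWS Rekognition blocks above.
faces = [
    {"Calm": 99.7, "Sad": 0.1, "Happy": 0.1, "Surprised": 0.0,
     "Angry": 0.0, "Disgusted": 0.0, "Confused": 0.0, "Fear": 0.0},
    {"Sad": 37.3, "Happy": 33.7, "Calm": 24.1, "Surprised": 2.2,
     "Confused": 0.9, "Disgusted": 0.8, "Angry": 0.7, "Fear": 0.3},
]

def dominant_emotion(face):
    """Return the (emotion, score) pair with the highest percentage."""
    return max(face.items(), key=lambda kv: kv[1])

print([dominant_emotion(f) for f in faces])
# → [('Calm', 99.7), ('Sad', 37.3)]
```

Note how much flatter the second face's distribution is (Sad 37.3 vs. Happy 33.7 vs. Calm 24.1): the dominant label alone hides that the model is far less certain about that face than about the first.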

Feature analysis

Amazon

Person 99.7%
Wheel 95.3%
Car 77%

Captions

Microsoft

a group of people riding on the back of a truck 55.2%
a group of men riding on the back of a truck 45.3%
a group of people standing in front of a truck 45.2%

Text analysis

Amazon

42708
RODVR
RODVR COVEEIX-EIW
COVEEIX-EIW

Google

42208.
42208.