Human Generated Data

Title

Untitled (elephant walking on to train car)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8543

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.5
Human 99.5
Person 98.7
Person 96.4
Person 96.2
Person 91.8
Outdoors 81.6
People 79.7
Person 78.5
Nature 75.1
Mammal 74.3
Horse 74.3
Animal 74.3
Horse 67.2
Military 66
Military Uniform 66
Elephant 62.4
Wildlife 62.4
Army 62.3
Armored 62.3
Apparel 58.9
Clothing 58.9
Shorts 58.9
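
Labels like those above, each paired with a confidence score on a 0-100 scale, are the kind of output returned by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of such a call with boto3; the filename, region, and thresholds are illustrative assumptions, not part of the museum record.

    import boto3

    # Hypothetical local scan of the photograph; not part of the original record.
    IMAGE_PATH = "steinmetz_elephant_loading.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=50,
        )

    # Each label carries a name and a confidence score (0-100),
    # matching entries such as "Person 99.5" and "Elephant 62.4" above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")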

Imagga
created on 2022-02-05

billboard 100
signboard 86.1
structure 69.7
sky 24.2
landscape 17.8
city 17.5
architecture 17.2
travel 16.2
river 16
water 14.7
tree 14.6
dishwasher 14
black 13.8
bridge 13.8
freight car 13.7
car 13.1
urban 13.1
building 13
scene 13
cloud 12.9
winter 12.8
outdoors 11.9
old 11.8
clouds 11.8
landmark 11.7
trees 11.6
tourism 11.5
night 11.5
white goods 11.3
light 10.7
negative 10.5
wheeled vehicle 10.3
park 9.9
transportation 9.9
tower 9.8
summer 9.6
cold 9.5
film 9.4
reflection 9.2
transport 9.1
modern 9.1
environment 9
design 9
pattern 8.9
home appliance 8.9
season 8.6
cityscape 8.5
business 8.5
vehicle 8.4
dark 8.4
snow 8.2
dirty 8.1
scenery 8.1
history 8
scenic 7.9
forest 7.8
color 7.8
fog 7.7
outdoor 7.6
evening 7.5
silhouette 7.4
vintage 7.4
symbol 7.4
vacation 7.4
street 7.4
road 7.2
sunset 7.2
bright 7.1
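
Imagga offers a comparable tagging service over REST. The sketch below assumes Imagga's public /v2/tags endpoint with HTTP basic authentication and an image URL; the credentials and URL are placeholders, and the request and response field names follow Imagga's documented v2 API as best understood here.

    import requests

    # Placeholder credentials and image URL; supply real values from an Imagga account.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/steinmetz_elephant_loading.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Imagga reports each tag with an English label and a confidence score.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")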

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.6
outdoor 94.4
black and white 83.4

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 95.1%
Calm 99.9%
Happy 0%
Confused 0%
Surprised 0%
Sad 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 23-31
Gender Male, 89.3%
Happy 59.3%
Angry 13.1%
Disgusted 7.3%
Calm 6.6%
Sad 5%
Confused 4.9%
Surprised 2.3%
Fear 1.4%

AWS Rekognition

Age 25-35
Gender Female, 69.8%
Happy 87.7%
Calm 8.6%
Disgusted 1.4%
Surprised 1%
Angry 0.5%
Sad 0.4%
Confused 0.2%
Fear 0.2%
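
The age ranges, gender estimates, and emotion percentages above correspond to Amazon Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch follows, again assuming a hypothetical local scan of the print.

    import boto3

    IMAGE_PATH = "steinmetz_elephant_loading.jpg"  # hypothetical filename

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are returned with confidences; sort to list the strongest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")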

Feature analysis

Amazon

Person 99.5%
Horse 74.3%
Elephant 62.4%

Captions

Microsoft

a man standing in front of a television 27%
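
Captions like this one, along with the Microsoft tags listed earlier, can be produced by a single call to Azure Computer Vision's Analyze Image API with the Tags and Description features requested. The sketch below uses the v3.2 REST endpoint; the endpoint, key, and image URL are placeholder assumptions.

    import requests

    # Placeholder endpoint, key, and image URL for an Azure Computer Vision resource.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"
    IMAGE_URL = "https://example.org/steinmetz_elephant_loading.jpg"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()
    analysis = response.json()

    # Tags and captions both come back with confidences in the 0-1 range,
    # shown above as percentages (e.g. "text 99.6", caption at 27%).
    for tag in analysis["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
    for caption in analysis["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.0f}%")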

Text analysis

Amazon

BARNUM
BROS.
BROS. AND
AND
RINGLING
&
BA
15260
M 117
S
ЭТАЯТIИ
M 117 ЭТАЯТIИ A70A
16260.
A70A
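
Strings like these are the output of Amazon Rekognition's DetectText operation, which returns both individual WORD detections and longer combined LINE detections. A minimal boto3 sketch under the same assumptions as above:

    import boto3

    IMAGE_PATH = "steinmetz_elephant_loading.jpg"  # hypothetical filename

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # WORD detections are single tokens ("BARNUM", "RINGLING", ...);
    # LINE detections are the longer combined strings.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")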

Google

BA
RINGLING
1526
BARNUM
ЭТАЯТИ
BROS,
AND
1526 BA BARNUM RINGLING BROS, AND 15260. ЭТАЯТИ АДА
15260.
АДА
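
Results of this kind come from Google Cloud Vision's text detection feature. A minimal sketch with the google-cloud-vision client library is shown below; the filename is hypothetical and application default credentials are assumed to be configured.

    from google.cloud import vision

    IMAGE_PATH = "steinmetz_elephant_loading.jpg"  # hypothetical filename

    client = vision.ImageAnnotatorClient()  # uses application default credentials

    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text block;
    # the remaining annotations are the individual words.
    for annotation in response.text_annotations:
        print(annotation.description)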