Human Generated Data

Title

Untitled (parade of cars, seen from above)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19535

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 97.8
Person 97.8
Person 95.5
Person 94.4
Outdoors 91.3
Landscape 91.3
Nature 91.3
Metropolis 89.7
City 89.7
Town 89.7
Urban 89.7
Building 89.7
Person 86.5
Person 86.4
Scenery 82.8
Person 74.9
Car 72.6
Transportation 72.6
Automobile 72.6
Vehicle 72.6
Art 68.3
Person 68.2
People 68
Boat 65.6
Boat 60.2
Aerial View 59.8
Road 58.7
Text 58.6
Architecture 56.3

Imagga
created on 2022-03-05

freight car 56
grunge 52.8
car 44.6
nuclear weapon 41.6
old 39.7
texture 38.9
vintage 38.9
wheeled vehicle 35.6
damaged 33.4
weapon of mass destruction 33.3
antique 32.9
wall 30.8
aged 29.9
dirty 27.1
rusty 26.7
ancient 25.1
weapon 24.8
retro 24.6
border 24.4
vehicle 24.3
frame 24.1
material 24.1
structure 23.7
space 22.5
paper 22
art 20.8
empty 20.6
wallpaper 19.9
design 19.9
backdrop 19.8
stain 19.2
parchment 19.2
rough 19.1
old fashioned 19
weathered 19
grungy 19
faded 18.5
textured 18.4
decay 18.3
pattern 17.8
ragged 17.5
stains 17.5
spot 17.3
grain 16.6
grime 16.6
fracture 16.5
crack 16.5
instrument 16
mottled 15.6
crumpled 15.5
graphic 15.3
screen 15.1
historic 14.7
tracery 13.7
messy 13.5
rust 13.5
windshield 13.4
black 13.2
broad 12.8
crease 12.7
scratched 12.7
paint 12.7
succulent 12.6
dark 12.5
backgrounds 12.2
conveyance 12.1
device 11.7
concrete 11.5
worn 11.5
gray 10.8
surface 10.6
aging 10.5
blank 10.3
decoration 10
detailed 9.6
stained 9.6
text 9.6
artistic 9.6
textures 9.5
car mirror 9.4
mirror 9.1
color 8.9
stone 8.9
cracked 8.7
your 8.7
obsolete 8.6
protective covering 8.3
billboard 8.3
brown 8.1
metal 8
light 8
highly 7.9
urban 7.9
abandoned 7.8
broken 7.7
edge 7.7
rustic 7.7
sheet 7.5
computer 7.4
copy 7.1

Microsoft
created on 2022-03-05

text 99.1
vehicle 93.4
black and white 92.1
car 91
land vehicle 90.1
white 64
wheel 61.9
old 61.5
military vehicle 52

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 99.2%
Sad 90.8%
Calm 3%
Fear 2.2%
Angry 1.3%
Confused 1.1%
Disgusted 0.9%
Happy 0.5%
Surprised 0.2%

AWS Rekognition

Age 25-35
Gender Male, 98.2%
Calm 88.6%
Fear 9.8%
Sad 0.8%
Disgusted 0.3%
Angry 0.1%
Confused 0.1%
Happy 0.1%
Surprised 0.1%

Feature analysis

Amazon

Person 97.8%
Car 72.6%
Boat 65.6%

Captions

Microsoft

a vintage photo of a crowd 68.7%
a vintage photo of a person 56.6%
a vintage photo of an old building 52.8%

Text analysis

Amazon

7713
A73A
MUJI A73A
MUJI

Google

ר
ר