Human Generated Data

Title

Untitled (group of men marching down street with car in parade)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8947

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Car 99.4
Automobile 99.4
Vehicle 99.4
Transportation 99.4
Person 97.7
Human 97.7
Person 97.6
Person 95.5
Person 95
Person 94.7
Wheel 89.3
Machine 89.3
Person 88.8
Person 84.6
Person 79.3
Person 78.2
Person 78.2
Person 70.8
Person 69.1
Pedestrian 67.4
Car Wash 67.1
Person 62.8
Person 60.9
Person 60.2
Person 60.1
Antique Car 59.8
Sports Car 56.2
Person 55.3
Person 54.5
Person 48.8

Imagga
created on 2022-01-09

vehicle 55.1
motor vehicle 40.2
wheeled vehicle 32.4
military vehicle 30.2
conveyance 24.8
golf equipment 24.4
city 21.6
transportation 20.6
car 19.9
travel 18.3
sports equipment 18.3
street 16.6
architecture 16.4
sky 15.9
tourism 15.7
equipment 15
ocean 14.9
vacation 14.7
water 14.7
center 14.5
amphibian 14.5
sea 14.1
road 13.5
building 13.5
landscape 13.4
house 13.4
transport 12.8
coast 12.6
tank 12.5
tree 12.3
boat 12.1
town 12.1
tracked vehicle 11.9
old 11.8
structure 11.5
destination 11.2
construction 11.1
industry 11.1
pier 10.7
harbor 10.6
automobile 10.5
dock 9.7
port 9.6
urban 9.6
residential 9.6
home 9.6
scene 9.5
shore 9.3
outdoor 9.2
machine 9.1
tourist 9.1
summer 9
scenic 8.8
sand 8.7
auto 8.6
truck 8.5
coastline 8.5
passenger 8.4
industrial 8.2
half track 8.1
new 8.1
steamroller 8
heavy 7.6
traffic 7.6
ship 7.6
beach 7.6
cityscape 7.6
drive 7.6
park 7.4
landmark 7.2
village 7.1
river 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 95
black and white 92.6
vehicle 78.4
tree 73.4
white 67
car 57.8
house 53.2

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 95.8%
Calm 76.3%
Sad 16.2%
Angry 2.9%
Happy 1.3%
Fear 1.1%
Confused 0.8%
Disgusted 0.8%
Surprised 0.4%

Feature analysis

Amazon

Car 99.4%
Person 97.7%
Wheel 89.3%

Captions

Microsoft

a group of people standing in front of a building 85.2%
a group of people standing outside of a building 85.1%
a group of people standing next to a building 84.2%