Human Generated Data

Title

Untitled (group of men marching down residential street with car in parade)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8944

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.7
Person 99.3
Person 98.6
Person 97.4
Person 93.1
Person 92.6
Wheel 91.6
Machine 91.6
Pedestrian 90.8
Person 90.7
Person 90.3
Person 88.3
Clothing 87.8
Apparel 87.8
Shorts 87.8
Automobile 86.4
Car 86.4
Transportation 86.4
Vehicle 86.4
Person 86.2
Person 85.9
Person 76.5
Asphalt 76.4
Tarmac 76.4
Bus 73.2
Path 72
People 69.1
Face 65.7
Person 65.6
Train 62.5
Road 61.7
Crowd 60.1
Person 60
Outdoors 59.8
Photography 57.2
Photo 57.2
Building 55.9
Town 55.9
Urban 55.9
Street 55.9
City 55.9
Car 52.2
Person 47.3

Imagga
created on 2022-01-09

motor vehicle 26.3
golf equipment 25.3
barbershop 19.1
sports equipment 19.1
shop 17.2
wheeled vehicle 17.1
travel 16.2
sky 15.9
city 15.8
building 15.3
street 14.7
vehicle 14.2
house 14.2
landscape 14.1
architecture 14.1
equipment 13.8
night 13.3
old 13.2
mercantile establishment 13
stall 12
people 11.7
summer 11.6
outdoor 11.5
world 11.2
tourism 10.7
sun 10.5
scene 10.4
structure 10.3
sea 10.2
transport 10
water 10
transportation 9.9
sand 9.7
urban 9.6
outdoors 9.2
road 9
place of business 8.8
stone 8.6
chair 8.2
vacation 8.2
tourist 8.2
sunny 7.7
men 7.7
tree 7.7
center 7.5
industrial 7.3
car 7.3
holiday 7.2
history 7.2
male 7.1
day 7.1
scenic 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

building 99.3
outdoor 97.4
text 96
black and white 91.4
white 68.7
vehicle 56

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Sad 57.3%
Calm 31.7%
Happy 7%
Fear 2.3%
Confused 0.5%
Surprised 0.5%
Disgusted 0.5%
Angry 0.2%

AWS Rekognition

Age 29-39
Gender Male, 80.3%
Surprised 90.6%
Calm 4.1%
Fear 2.6%
Confused 0.9%
Disgusted 0.6%
Angry 0.6%
Happy 0.4%
Sad 0.3%

AWS Rekognition

Age 23-33
Gender Female, 63.1%
Surprised 77.3%
Fear 7.8%
Calm 6.5%
Happy 4.4%
Confused 1.5%
Sad 0.9%
Angry 0.8%
Disgusted 0.8%

AWS Rekognition

Age 13-21
Gender Male, 67.7%
Calm 83.5%
Happy 4.4%
Sad 4%
Fear 3.6%
Angry 1.5%
Surprised 1.3%
Disgusted 1.1%
Confused 0.7%

AWS Rekognition

Age 24-34
Gender Male, 97.4%
Calm 78%
Sad 8.1%
Fear 3.4%
Angry 3.2%
Confused 3.2%
Surprised 1.8%
Happy 1.5%
Disgusted 0.7%

AWS Rekognition

Age 22-30
Gender Female, 60.8%
Fear 33.1%
Surprised 20%
Angry 15.9%
Sad 9.8%
Confused 7.5%
Happy 5.7%
Calm 4.2%
Disgusted 3.9%

AWS Rekognition

Age 24-34
Gender Male, 84.2%
Calm 90%
Sad 3.7%
Disgusted 2.1%
Happy 1.5%
Surprised 1.1%
Confused 0.6%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 18-24
Gender Female, 63.2%
Confused 43.1%
Calm 22.2%
Sad 16.5%
Disgusted 6.2%
Surprised 5.7%
Happy 2.6%
Angry 2.2%
Fear 1.5%

AWS Rekognition

Age 23-33
Gender Male, 55.5%
Happy 88.4%
Fear 4.3%
Sad 3.3%
Calm 1.5%
Disgusted 0.8%
Surprised 0.7%
Angry 0.7%
Confused 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.7%
Wheel 91.6%
Car 86.4%
Bus 73.2%
Train 62.5%

Captions

Microsoft

a vintage photo of a group of people standing in front of a building 89%
a vintage photo of a group of people walking in front of a building 88.9%
a vintage photo of a group of people in front of a building 88.8%

Text analysis

Amazon

STOP
57
TEL