Human Generated Data

Title

Untitled (women marching in parade)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19537

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 98.2
Person 98.2
Person 96.1
Person 94.6
Clothing 94.1
Apparel 94.1
Person 92.4
Person 88.2
Crowd 81.5
People 80.2
Shoe 76.2
Footwear 76.2
Person 68.1
Person 65.1
Person 62.8
Outdoors 60.4
Path 57.7
Ground 57.1
Standing 56.3
Pedestrian 55.7
Festival 55.4
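
The tags above are label-detection output from Amazon Rekognition. As a minimal sketch, they could be reproduced with the boto3 SDK roughly as follows; the file name photo.jpg and the confidence threshold are assumptions, not part of this record. Repeated labels such as Person are per-instance detections, each with its own confidence.

```python
# Minimal sketch: label tags for a local image via Amazon Rekognition.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed threshold, matching the lowest scores above
    )

for label in response["Labels"]:
    # Top-level label score, e.g. "Human 98.2"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # Per-instance scores explain the repeated "Person" rows above
    for instance in label.get("Instances", []):
        print(f'  {label["Name"]} {instance["Confidence"]:.1f}')
```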

Imagga
created on 2022-03-05

aircraft carrier 36.7
warship 30.6
vehicle 27.3
ship 26.5
military vehicle 25.8
sky 25.5
landscape 20.8
road 20.8
travel 17.6
sea 17.2
vessel 17.1
sand 16.1
city 15.8
ocean 14.9
water 14.7
wheeled vehicle 14.4
coast 14.4
beach 13.8
truck 13.7
clouds 13.5
car 12.5
architecture 11.7
tourism 11.6
hill 11.2
building 11.2
mountain 11.2
stone 11.2
industry 11.1
structure 11.1
track 10.8
urban 10.5
tree 10
industrial 10
horizon 9.9
summer 9.6
mobile home 9.6
rock 9.6
cloud 9.5
old 9.1
environment 9
sign 9
transportation 9
trees 8.9
motor vehicle 8.8
trailer 8.6
black 8.4
craft 8.4
snow 8.4
shore 8.4
street 8.3
island 8.2
man 8.1
machine 8
scene 7.8
wave 7.8
factory 7.7
construction 7.7
winter 7.7
drive 7.6
housing 7.5
dark 7.5
cloudy 7.5
light 7.4
tourist 7.3
holiday 7.2
day 7.1
scenic 7
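
The Imagga tags follow the same pattern of tag plus confidence. A hedged sketch against Imagga's public v2 tagging endpoint; the credentials, file name, and exact response shape are assumptions.

```python
# Minimal sketch: image tagging via the Imagga v2 API
# (credentials are placeholders; the response shape is assumed).
import requests

API_KEY = "<your-imagga-key>"        # placeholder
API_SECRET = "<your-imagga-secret>"  # placeholder

with open("photo.jpg", "rb") as f:   # placeholder file name
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```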

Google
created on 2022-03-05 (no tags returned)

Microsoft
created on 2022-03-05

text 98.8
black and white 96.7
outdoor 90.7
drawing 86.3
white 72.9
monochrome 68.8
people 56.3
old 45.4

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 59.1%
Calm 99%
Sad 0.3%
Fear 0.3%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Confused 0%

AWS Rekognition

Age 18-24
Gender Female, 54.5%
Sad 69.1%
Calm 25%
Fear 2.4%
Happy 1.1%
Surprised 0.9%
Disgusted 0.6%
Angry 0.6%
Confused 0.2%

AWS Rekognition

Age 35-43
Gender Female, 93.5%
Calm 60.8%
Fear 16.9%
Sad 16.7%
Surprised 1.4%
Happy 1.4%
Disgusted 1.1%
Confused 0.9%
Angry 0.8%

AWS Rekognition

Age 41-49
Gender Female, 69.4%
Sad 69.1%
Fear 11.3%
Calm 8.2%
Happy 6%
Disgusted 2.1%
Surprised 1.2%
Angry 1.2%
Confused 1.1%

AWS Rekognition

Age 25-35
Gender Male, 77.9%
Calm 94.1%
Sad 3.7%
Angry 0.7%
Surprised 0.6%
Fear 0.3%
Confused 0.3%
Disgusted 0.3%
Happy 0.1%
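
Each block above is one face returned by Rekognition's DetectFaces operation with full attributes (age range, gender, and emotion confidences). A minimal sketch, assuming boto3 and a placeholder file name:

```python
# Minimal sketch: per-face age/gender/emotion blocks, as listed above.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as confidences over a fixed set; sort to match
    # the highest-first ordering used above
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```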

Feature analysis

Amazon

Person 98.2%
Shoe 76.2%

Captions

Microsoft

a vintage photo of a group of people walking down a dirt road 87.9%
a group of people walking down a dirt road 87.8%
a vintage photo of a group of people on a dirt road 86.2%
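
These captions match the output of Azure Computer Vision's Describe operation. A sketch against the REST endpoint; the endpoint, key, file name, and API version are assumptions.

```python
# Minimal sketch: image captions via Azure Computer Vision "describe"
# (v3.2 REST endpoint assumed; ENDPOINT and KEY are placeholders).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-key>"

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        params={"maxCandidates": 3},
        data=f.read(),
    )

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```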

Text analysis

Amazon

9
9 7
7
7704
MJ17
MJ17 YT3RA3 AFCA
AFCA
YT3RA3
97040
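
The strings above are raw OCR detections from Rekognition's DetectText operation, which returns both whole lines and individual words; that is why fragments such as MJ17 appear alone and again inside a longer line. A minimal sketch, with the same placeholder assumptions as above:

```python
# Minimal sketch: raw text detections, as in the list above.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; both appear in the list above
    print(detection["Type"], detection["DetectedText"])
```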

Google

YT33
A
MJ17
MJ17 YT33 A AD A
AD