Human Generated Data

Title

Untitled (formally dressed women waiting to board a bus)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7405

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Apparel 99.1
Clothing 99.1
Person 99
Human 99
Person 98.3
Person 95.3
Person 90.9
Car 90.1
Transportation 90.1
Vehicle 90.1
Automobile 90.1
Robe 81.5
Fashion 81.5
Person 78.8
Gown 77.8
Person 74.9
Wedding 74.5
People 72.1
Face 69
Bumper 67.7
Female 65.3
Photography 63.7
Photo 63.7
Person 63.7
Person 62.2
Portrait 61.8
Wedding Gown 61.7
Bridegroom 61.7
Evening Dress 59.7
Sports Car 56.9

Imagga
created on 2022-01-08

negative 46.8
film 36.1
photographic paper 24.9
grunge 18.7
old 17.4
photographic equipment 16.6
newspaper 15.1
black 14.4
vintage 14.1
space 14
sky 13.4
water 13.3
grungy 13.3
snow 13.2
landscape 12.6
design 11.8
border 11.8
dirty 11.7
product 11.6
retro 11.5
antique 11.3
texture 11.1
pattern 10.9
art 10.9
frame 10.8
outdoor 10.7
cool 10.7
winter 10.2
light 10.1
power 10.1
industrial 10
city 10
travel 9.9
trees 9.8
creation 9.7
building 9.7
damaged 9.5
cold 9.5
vessel 9.4
rough 9.1
business 9.1
paint 9
urban 8.7
scene 8.7
cloud 8.6
peaceful 8.2
man 8.2
splash 8
paper 8
river 8
text 7.9
forest 7.8
container 7.8
color 7.8
steam 7.8
edge 7.7
tree 7.7
industry 7.7
mask 7.7
weathered 7.6
ice 7.6
dark 7.5
weather 7.4
smoke 7.4
park 7.4
environment 7.4
people 7.2
decoration 7.2
material 7.1
businessman 7.1
architecture 7
season 7

Microsoft
created on 2022-01-08

water 94.9
text 93.6
outdoor 90.9
black and white 87.4

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 95.3%
Calm 65.8%
Sad 13.4%
Happy 8.6%
Fear 7.7%
Confused 1.6%
Disgusted 1.2%
Surprised 1%
Angry 0.6%

AWS Rekognition

Age 23-33
Gender Female, 91.7%
Happy 63.1%
Fear 21.1%
Surprised 4.4%
Confused 4%
Sad 2.7%
Calm 2.3%
Angry 1.2%
Disgusted 1.2%

AWS Rekognition

Age 18-26
Gender Male, 77.6%
Sad 81%
Disgusted 5.9%
Calm 4.3%
Fear 3.8%
Happy 1.6%
Angry 1.2%
Confused 1.2%
Surprised 1%

AWS Rekognition

Age 24-34
Gender Female, 96.9%
Fear 38.3%
Calm 35.6%
Sad 19%
Happy 2.6%
Surprised 1.3%
Angry 1.2%
Confused 1.1%
Disgusted 0.9%

AWS Rekognition

Age 22-30
Gender Female, 62.8%
Calm 92.5%
Sad 2.1%
Happy 1.6%
Fear 1.5%
Surprised 0.9%
Confused 0.7%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 11-19
Gender Male, 94.4%
Sad 35.3%
Calm 29.3%
Fear 13.9%
Confused 11.4%
Disgusted 4.4%
Happy 3.6%
Angry 1.3%
Surprised 0.9%

AWS Rekognition

Age 27-37
Gender Female, 74.9%
Calm 87%
Happy 6%
Sad 3.1%
Confused 1.8%
Surprised 0.8%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Car 90.1%

Captions

Microsoft

a group of people sitting on a bench 41.9%
a group of people sitting at a table 41.8%
a group of people posing for a photo 41.7%

Text analysis

Amazon

SPECIAL
15951.
by

Google

15951.
15951. SPECIAL 15951.
SPECIAL