Human Generated Data

Title

Untitled (woman reading magazine on Ringling Circus train)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11705

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 98.7
Apparel 98.7
Female 96.8
Human 96.8
Furniture 94.5
Chair 94.2
Dress 93.6
Person 90.9
Text 90.9
Woman 86.4
Reading 85.5
Face 85.4
Newspaper 79.4
Pillow 77.3
Cushion 77.3
Girl 77.3
Door 70.9
Portrait 70.2
Photography 70.2
Photo 70.2
Blonde 65.1
Teen 65.1
Child 65.1
Kid 65.1
Window 60.0
Smile 58.1
Suit 57.1
Coat 57.1
Overcoat 57.1
Dish 57
Food 57
Meal 57

Imagga
created on 2022-02-05

car 50
refrigerator 37.2
white goods 31.4
transportation 25.1
drive 24.6
home appliance 24.5
vehicle 23.6
seat 23.3
automobile 22
auto 21
transport 21
driver 20.4
interior 20.3
appliance 18.5
man 17.5
adult 17.4
device 17.2
wheel 17
driving 16.4
person 16.2
people 15
male 14.9
inside 14.7
sitting 14.6
support 13.7
modern 13.3
new 12.1
luxury 12
attractive 11.9
cockpit 11.7
portrait 11.6
smiling 11.6
black 11.4
window 11
happy 10.6
travel 10.6
business 10.3
steering 9.9
comfortable 9.5
mirror 9.5
light 9.3
face 9.2
20s 9.2
pretty 9.1
equipment 9
dashboard 8.9
machine 8.9
shower curtain 8.8
model 8.5
leather 8.5
durables 8.4
hand 8.3
fashion 8.3
motor vehicle 8.2
speed 8.2
technology 8.2
road 8.1
furnishing 7.9
women 7.9
indoors 7.9
design 7.9
work 7.8
happiness 7.8
glass 7.8
control 7.7
plane seat 7.7
child 7.6
power 7.5
holding 7.4
sexy 7.2
home 7.2
hair 7.1
smile 7.1
working 7.1
curtain 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98.7
window 92.1
black and white 88.5
person 85
clothing 74.4
human face 66.7
man 56.9
open 46.8

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 83.4%
Calm 80.7%
Sad 9.2%
Happy 4.5%
Surprised 3.6%
Confused 1%
Disgusted 0.5%
Fear 0.3%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 90.9%

Captions

Microsoft

a man holding a book in front of a window 33.5%
a man standing in front of a window 33.4%
a man standing in front of a window holding a book 28.7%

Text analysis

Amazon

MILK-BONE
DOG
DOG BISCUIT
BISCUIT
MILK
MILK BONE
BONE
16208
MURDER
16208.
MJI7
16308
ЭТАЯТIИ
MJI7 ЭТАЯТIИ ARDA
ARDA

Google

16208.
....
MILK-BONE
16208. .... 16208. MILK-BONE