Human Generated Data

Title

Untitled (street artist drawing portrait; old woman artist, young girl sitter)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15849

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.2
Human 99.2
Person 98.8
Person 95.4
Clothing 95.2
Apparel 95.2
Furniture 89.6
Female 81.6
Chair 78.7
Sitting 75.1
Table 73.9
Text 73
Shoe 69.6
Footwear 69.6
Indoors 67.8
Face 67.4
Woman 66.7
Room 65.5
Robe 61.9
Evening Dress 61.9
Fashion 61.9
Gown 61.9
Photo 60.4
Photography 60.4
Floor 59.3
Shoe 58.9
Overcoat 58.2
Coat 58.2
Suit 58.2
Advertisement 57.7
Housing 56.5
Building 56.5
Sleeve 56.2
Poster 56

Imagga
created on 2022-02-05

newspaper 46.8
barbershop 43.8
shop 39.1
product 34.9
creation 28.3
mercantile establishment 28
office 21.7
man 21.5
room 19.7
people 19.5
computer 19.5
place of business 18.7
indoors 18.4
home 18.3
business 17.6
male 17
laptop 16.9
person 16.2
sitting 14.6
art 14.2
interior 14.1
sculpture 13.1
window 13
old 12.5
working 12.4
technology 11.9
house 11.7
history 11.6
monitor 10.7
call 10.7
chair 10.6
statue 10.6
black 10.2
architecture 10.1
vintage 9.9
adult 9.9
desk 9.8
businessman 9.7
antique 9.7
table 9.4
happy 9.4
lifestyle 9.4
establishment 9.3
back 9.2
indoor 9.1
alone 9.1
screen 8.7
smiling 8.7
work 8.6
one person 8.5
senior 8.4
portrait 8.4
building 8.2
smile 7.8
notebook 7.7
men 7.7
modern 7.7
attractive 7.7
city 7.5
furniture 7.4
inside 7.4
salon 7.3
businesswoman 7.3
professional 7.1
happiness 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
person 96.5
clothing 73.7
piano 60.3
furniture 55.8
drawing 52.7

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 76.8%
Calm 99.9%
Sad 0.1%
Confused 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 33-41
Gender Male, 92.6%
Calm 97%
Surprised 1.6%
Confused 1.2%
Disgusted 0.1%
Angry 0%
Happy 0%
Sad 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Chair 78.7%
Shoe 69.6%

Captions

Microsoft

a man and a woman standing in front of a store 49.9%
a person standing in front of a store 49.8%
a person standing in front of a store 49.7%

Text analysis

Amazon

FREE
APPARELS
on
CARE on
CARE
SILK
BUTTONS
HIRTS
SOAP
CAREFULLY
DARNING
EACH
MENDING, DARNING
THE
CAREFULLY IRONED
on THE
SEWED
MENDING,
IN PURE
IRONED
LAUNDERED IN PURE SOAP
Special
BUTTONS SEWED on
ts
ERENCE
UFFS
LAUNDERED
A
50
NTIONS
50 L
O
AJ
L
AUDDY

Google

AUNDRY
Special
ON
IN
CAREFULLY
ERENCE
DARNING
EREE
AUNDRY Special CARE ON SILK APPARELS on THE UFFS LAUNDERED IN PURE SOAP CAREFULLY IRONED ERENCE MENDING, DARNING BUTTONS SEWED ON NTIONS HIRTS EREE
SILK
APPARELS
on
UFFS
LAUNDERED
IRONED
SEWED
THE
PURE
BUTTONS
HIRTS
MENDING,
NTIONS
CARE
SOAP