Human Generated Data

Title

Untitled (woman having portrait sketched on side walk, Greenwich Village, NY)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15814

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.3
Human 99.3
Person 99.1
Person 98.7
Person 98.2
Person 96.4
Clothing 96.3
Apparel 96.3
Person 96.1
Text 79.2
Face 79.1
People 70.6
Advertisement 68.6
Poster 68.6
Female 67.9
Furniture 65.8
Meal 64.3
Food 64.3
Outdoors 63.4
Hat 61.4
Sitting 59.8
Coat 59.7
Overcoat 59.7
Suit 59.7
Shorts 58.3
Nature 56.3
Plant 55.8
Tire 55.6

Imagga
created on 2022-02-05

barbershop 81.3
shop 68.2
mercantile establishment 51.8
place of business 34.5
man 21.5
old 20.9
people 18.4
establishment 17.4
statue 17.1
city 16.6
male 16.3
sculpture 15.3
vintage 14.1
ancient 13.8
newspaper 13.4
men 12.9
historic 12.8
black 12.6
architecture 11.7
history 11.6
antique 11.2
art 11.1
building 11
person 10.9
travel 10.6
product 10.3
tourism 9.9
business 9.7
monument 9.3
street 9.2
historical 8.5
kin 8.4
room 8.2
landmark 8.1
creation 8.1
urban 7.9
soldier 7.8
military 7.7
war 7.7
culture 7.7
hairdresser 7.3
tourist 7.2
aged 7.2
dirty 7.2
adult 7.2
women 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 100
text 98.4
clothing 97.6
man 90.7
black and white 76.8
people 67
musical instrument 65.4

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 97.9%
Calm 98.1%
Surprised 1.1%
Disgusted 0.2%
Angry 0.2%
Sad 0.1%
Confused 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 29-39
Gender Female, 72.7%
Happy 74.8%
Surprised 13.3%
Calm 6.5%
Fear 2.9%
Angry 0.9%
Disgusted 0.8%
Sad 0.6%
Confused 0.4%

AWS Rekognition

Age 26-36
Gender Male, 94.9%
Calm 99.7%
Confused 0.1%
Happy 0.1%
Surprised 0%
Sad 0%
Disgusted 0%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people sitting in front of a building 93.5%
a group of people sitting in front of a window 85.3%
a group of people sitting and standing in front of a building 85.2%

Text analysis

Amazon

R
RCOM
TOMAP RCOM
SHIR
TOMAP
lives SHIR
lives

Google

SHO
JE SHO TAPROO
TAPROO
JE