Human Generated Data

Title

Untitled (street portrait artists at work, New York City)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15887

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.6
Person 99.6
Person 98.8
Person 97.8
Person 97.5
Person 97.5
Person 97.2
Apparel 96.8
Clothing 96.8
Person 92.7
Footwear 87.2
Shoe 87.2
Shoe 80.7
Advertisement 78.3
Poster 77.6
Crowd 77.5
Person 76.2
People 74.4
Person 74.3
Face 73.5
Costume 72.4
Female 71
Person 69.8
Text 69.5
Person 69.4
Shoe 68.2
Coat 68
Suit 68
Overcoat 68
Photography 63.4
Photo 63.4
Indoors 63.2
Art 62.7
Person 62.5
Person 62
Pedestrian 58.9
Floor 58.3
Collage 58.1
Drawing 57.9
Shorts 57.6
Paper 56.5
Brochure 56.5
Flyer 56.5
Girl 56.1
Person 43.9
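
The scores above are Rekognition label-detection confidences in percent; repeated names (e.g. Person) likely correspond to separate detected instances of the same label. A minimal sketch of how such tags can be generated with the AWS SDK for Python (boto3); the file name and region are placeholder assumptions, not part of this record:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "photo.jpg" and the region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=40.0,  # drop labels scored below 40%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```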

Imagga
created on 2022-02-05

people 24.5
man 20.8
business 20
shop 19.6
barbershop 19.4
men 18.9
male 18.4
newspaper 16.3
person 14.8
interior 14.1
room 14.1
old 13.9
office 13.8
mercantile establishment 13.6
architecture 13.3
businessman 13.2
product 12.8
city 12.5
building 12
adult 11.9
house 11.7
urban 11.3
drawing 11.3
group 11.3
grunge 11.1
history 10.7
life 10.3
wall 10.3
finance 10.1
symbol 10.1
creation 10
professional 9.9
team 9.8
human 9.7
work 9.5
women 9.5
window 9.4
place of business 9.2
historic 9.2
silhouette 9.1
black 9
musical instrument 8.9
home 8.8
art 8.6
historical 8.5
design 8.4
wind instrument 8.4
inside 8.3
vintage 8.3
chair 8.1
teacher 7.9
brass 7.8
portrait 7.8
crowd 7.7
two 7.6
sign 7.5
back 7.3
indoor 7.3
graphic 7.3
new 7.3
success 7.2
worker 7.2
sax 7.2
family 7.1
working 7.1
indoors 7
modern 7
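
Imagga's scores are likewise confidence values in percent. A hedged sketch of a call to Imagga's v2 tagging endpoint over REST; the credentials and file name are placeholders:

```python
# Minimal sketch: image tagging via the Imagga v2 REST API.
# API key, secret, and file name are placeholder assumptions.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```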

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.4
window 91.2
person 85.6
clothing 77.7
group 68.2
people 56.8
black and white 54.3

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 82.7%
Calm 93%
Sad 3.9%
Confused 1.4%
Happy 0.6%
Disgusted 0.5%
Angry 0.2%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Male, 93.8%
Calm 99%
Confused 0.3%
Sad 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 54.4%
Calm 76.4%
Angry 8.4%
Sad 4%
Disgusted 2.9%
Confused 2.8%
Fear 2.3%
Surprised 2.1%
Happy 1%

AWS Rekognition

Age 53-61
Gender Female, 54.1%
Calm 97%
Sad 1.5%
Happy 0.6%
Surprised 0.4%
Angry 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 59.4%
Sad 37.2%
Calm 31.1%
Confused 9.8%
Fear 9.8%
Surprised 5%
Happy 3%
Disgusted 2.6%
Angry 1.5%

AWS Rekognition

Age 30-40
Gender Female, 71.1%
Calm 98.2%
Happy 0.6%
Sad 0.4%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
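
The age ranges, gender calls, and emotion percentages above are per-face outputs of Rekognition's face analysis. A minimal boto3 sketch that prints the same fields; the file name and region are placeholders:

```python
# Minimal sketch: face analysis with Amazon Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with per-emotion confidence scores.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```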

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
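
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above read "Very unlikely" or "Likely". A sketch using the google-cloud-vision client; the file name is a placeholder:

```python
# Minimal sketch: face detection with the Google Cloud Vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Attributes come back as Likelihood enum values, not scores.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```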

Feature analysis

Amazon

Person 99.6%
Shoe 87.2%

Captions

Microsoft

a group of people standing in front of a window 77.9%
a group of people standing in front of a store window 70.5%
a group of people in front of a window 70.4%
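
The Microsoft tags listed earlier and the captions above (with their confidences) can both be requested in a single call to the Azure Computer Vision analyze endpoint. A hedged REST sketch; the endpoint, key, and file name are placeholders:

```python
# Minimal sketch: tags and captions via Azure Computer Vision (REST v3.2).
# Endpoint, subscription key, and file name are placeholder assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                   # placeholder

with open("photo.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

analysis = response.json()
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')        # 0-1 -> %
for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```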

Text analysis

Amazon

YOUR
YOUR PORTRAIT
PORTRAIT
OR
MACDOUGAL
OR CARICATURE
CARICATURE
HAND
MACDOUGAL HAND PARK
PARK
R
A
ND R
Amo
ND
16 A R
16
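
The fragments above are raw OCR output: Rekognition's DetectText returns both whole lines and individual words, including low-confidence partial reads. A minimal boto3 sketch:

```python
# Minimal sketch: text detection with Amazon Rekognition DetectText.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is either "LINE" or "WORD"; both kinds appear in the raw output.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')
```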

Google

OR
YIININI
MACDOUGAL PARK HAND YOUR PORTRAIT OR CAPICATURE Amo YIININI
PARK
HAND
YOUR
CAPICATURE
MACDOUGAL
PORTRAIT
Amo
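
Google's list mixes a full-text annotation (the long concatenated line) with individual words; misreads such as "CAPICATURE" are the OCR output itself, not transcription errors. A sketch using the same client library as above; the file name is a placeholder:

```python
# Minimal sketch: OCR with Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are words.
for annotation in response.text_annotations:
    print(annotation.description)
```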