Human Generated Data

Title

Untitled (passersby watching street artist at work, Greenwich Village, NY)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15824

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.4%
Human 99.4%
Person 98.9%
Person 98.7%
Person 97.7%
Person 94.1%
Apparel 93.4%
Clothing 93.4%
Footwear 93.4%
Shoe 93.4%
Person 92.6%
Shoe 91.2%
Person 71%
Text 68.1%
Coat 67.9%
Poster 65.5%
Advertisement 65.5%
Face 65.1%
Photo 64.8%
Photography 64.8%
Portrait 64.4%
Wood 56.1%
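
Label-and-confidence pairs in this shape are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags could be reproduced, assuming boto3 credentials are configured and using a placeholder S3 bucket and key (not from this record):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MinConfidence=50,  # the list above bottoms out in the mid-50s
    )

    for label in response["Labels"]:
        # e.g. "Person 99.4%", "Shoe 93.4%"
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')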

Imagga
created on 2022-02-05

old 23%
newspaper 19.8%
architecture 19.5%
building 18.5%
shop 17.7%
vintage 16.5%
ancient 16.4%
wall 16.2%
art 15.8%
barbershop 15.7%
product 15.4%
sculpture 14.9%
window 14.8%
city 14.1%
mercantile establishment 14%
historic 13.7%
religion 13.4%
antique 12.1%
creation 12%
painter 11.6%
tourism 11.5%
travel 11.3%
texture 11.1%
history 10.7%
decoration 10.6%
statue 10.5%
culture 10.2%
black 10.2%
grunge 10.2%
man 10.1%
vacation 9.8%
design 9.6%
historical 9.4%
religious 9.4%
monument 9.3%
place of business 9.3%
house 9.2%
mask 9.2%
dirty 9%
detail 8.8%
structure 8.8%
urban 8.7%
door 8.7%
retro 8.2%
aged 8.1%
device 8%
portrait 7.8%
church 7.4%
landmark 7.2%
home 7.2%
colorful 7.2%
interior 7.1%
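
The Imagga tags have the shape returned by Imagga's public v2 tagging endpoint. A minimal sketch, assuming placeholder API credentials and a placeholder image URL:

    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder
    API_SECRET = "YOUR_API_SECRET"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),  # HTTP basic auth
    )
    response.raise_for_status()

    for tag in response.json()["result"]["tags"]:
        # e.g. "old 23%", "newspaper 19.8%"
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')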

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8%
person 98.1%
clothing 90.7%
black and white 79.9%
newspaper 76.5%
posing 72.3%
drawing 71.7%
group 71.5%
man 71.5%
people 60%
clothes 23%
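
The Microsoft tags match the output of Azure Computer Vision's tag operation, which reports confidence in the 0-1 range (scaled to percent in the listing above). A minimal sketch against the v3.2 REST API, with placeholder endpoint, key, and image URL:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/photo.jpg"},
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')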

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 76%
Calm 60.6%
Fear 19.6%
Surprised 7.6%
Angry 5.3%
Sad 2.1%
Disgusted 1.9%
Happy 1.8%
Confused 1.2%

AWS Rekognition

Age 34-42
Gender Male, 97.7%
Calm 97.3%
Fear 1.9%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Happy 0%

AWS Rekognition

Age 33-41
Gender Female, 99.5%
Calm 99.3%
Surprised 0.4%
Fear 0.1%
Sad 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 38-46
Gender Female, 99.9%
Sad 62%
Calm 19.4%
Happy 12.2%
Disgusted 1.6%
Angry 1.4%
Fear 1.3%
Confused 1.3%
Surprised 0.9%

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Confused 47.8%
Sad 27%
Calm 15.9%
Fear 4.1%
Surprised 2%
Angry 1.4%
Disgusted 1%
Happy 0.8%

AWS Rekognition

Age 24-34
Gender Female, 61.3%
Calm 95.7%
Sad 3.9%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Fear 0.1%
Confused 0%
Disgusted 0%
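
Each block above (an age range, a gender call, and ranked emotions) mirrors one FaceDetails entry from AWS Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, again with a placeholder bucket and key:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]   # e.g. {"Low": 33, "High": 41}
        gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 76.0}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unordered; sort to match the listings above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')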

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
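
The Google Vision blocks report the per-face likelihood fields from the Cloud Vision face detection API, one block per detected face. A minimal sketch using the google-cloud-vision client, assuming application default credentials and a placeholder image URI:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image(
        source=vision.ImageSource(image_uri="https://example.org/photo.jpg")
    )

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihoods are enum values such as VERY_UNLIKELY or VERY_LIKELY
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)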

Feature analysis

Amazon

Person 99.4%
Shoe 93.4%
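
These feature-analysis entries look like the subset of DetectLabels results that carry object instances, i.e. labels localized with bounding boxes rather than whole-image tags. A sketch of filtering for them, with the same placeholder inputs as above:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
    )

    for label in response["Labels"]:
        if label.get("Instances"):  # keep only labels tied to located objects
            print(f'{label["Name"]} {label["Confidence"]:.1f}%')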

Captions

Microsoft

a group of people posing for a photo 86.6%
a group of people posing for the camera 86.5%
a group of people posing for a picture 86.4%
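
Ranked alternative captions like the three above are what Azure Computer Vision's describe operation returns when more than one candidate is requested. A minimal sketch, with the same placeholder endpoint and key as before:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/photo.jpg"},
    )
    response.raise_for_status()

    for caption in response.json()["description"]["captions"]:
        # e.g. "a group of people posing for a photo 86.6%"
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')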