Human Generated Data

Title

Untitled (children playing with ribbons, New York City)

Date

1940s

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15878.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.8
Human 98.8
Person 98.6
Leisure Activities 95
Violin 92.6
Viola 92.6
Fiddle 92.6
Musical Instrument 92.6
Person 91.3
Shoe 90.4
Apparel 90.4
Clothing 90.4
Footwear 90.4
Shoe 67.6
Vehicle 62.6
Bicycle 62.6
Bike 62.6
Transportation 62.6
Cello 61.8
Advertisement 60.8
Poster 60.1
Female 58.4
Door 57.9
Collage 57.7
Musician 56.7

Imagga
created on 2022-02-05

violin 48.2
bowed stringed instrument 46.4
stringed instrument 39.8
musical instrument 32.6
shop 20.9
man 20.1
barbershop 20
adult 19.6
people 19
door 17.9
window 17.8
male 16.3
person 16.1
building 15.7
chair 15.1
women 13.4
architecture 13.3
urban 13.1
men 12.9
fashion 12.8
house 12.5
interior 12.4
indoors 12.3
business 12.1
sliding door 12.1
mercantile establishment 12.1
inside 12
home 12
modern 11.9
portrait 11.6
lifestyle 11.6
room 11.4
black 11.4
indoor 10.9
light 10
city 10
happy 9.4
attractive 9.1
holding 9.1
dress 9
office 9
one 9
smiling 8.7
sitting 8.6
wall 8.5
casual 8.5
pretty 8.4
hand 8.3
training 8.3
human 8.2
alone 8.2
style 8.2
music 8.1
lady 8.1
place of business 8
handsome 8
face 7.8
entrance 7.7
old 7.7
buy 7.5
wood 7.5
silhouette 7.4
movable barrier 7.4
device 7.3
shopping 7.3
cheerful 7.3
exercise 7.3
sexy 7.2
looking 7.2
body 7.2
life 7.2
smile 7.1
happiness 7
child 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

footwear 94.3
street 93.7
black and white 92.3
text 89.9
clothing 89.5
person 82.5
woman 62

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 98.4%
Calm 95.6%
Confused 2.8%
Sad 1.1%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 83.8%
Sad 54.1%
Happy 13.3%
Confused 8.4%
Calm 7.8%
Disgusted 7.7%
Fear 5.1%
Angry 2.5%
Surprised 1.2%

AWS Rekognition

Age 11-19
Gender Female, 85.2%
Calm 60.6%
Sad 25.4%
Happy 8.4%
Disgusted 1.7%
Confused 1.4%
Angry 1.2%
Fear 1.1%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Shoe 90.4%
Bicycle 62.6%

Captions

Microsoft

a woman standing in front of a window 92.7%
a woman sitting in front of a window 86.7%
a woman standing next to a window 86.6%

Text analysis

Amazon

ICE
ICE CREAM
CREAM
TERFIELD
HORTON'S
H

Google

HORTON'S
TERFIELD HORTON'S TCE CREAM
TERFIELD
CREAM
TCE