Human Generated Data

Title

Untitled (boy having portrait drawn on sidewalk)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15708

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 100
Apparel 100
Shorts 100
Person 99.4
Human 99.4
Person 99.2
Person 98.1
Hat 94
Shoe 80.8
Footwear 80.8
Person 80.7
Transportation 79.5
Vehicle 79.5
Automobile 79.5
Car 79.5
Person 77.3
Female 72.2
Military 72.1
Military Uniform 72.1
People 69.4
Suit 68.9
Coat 68.9
Overcoat 68.9
Officer 66.2
Person 64.9
Sun Hat 62.8
Girl 55.2
Shoe 54.6
Person 41.7
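
The label/confidence pairs above are the kind of output returned by AWS Rekognition label detection. A minimal sketch of how such tags could be reproduced with boto3; the local filename and the MaxLabels/MinConfidence values are illustrative assumptions, not part of this record:

import boto3

rekognition = boto3.client("rekognition")

# Assumed local copy of the photograph; the source image itself is not part of this record.
with open("tiers_sidewalk_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # return up to 50 labels
    MinConfidence=40.0,  # drop labels below 40% confidence
)

# Print "Label confidence" pairs in the same form as the listing above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))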

Imagga
created on 2022-02-05

newspaper 35.4
clothing 29
military uniform 28.7
product 27.3
uniform 24.9
creation 23.3
person 21.4
covering 19.6
consumer goods 18.9
art 17.8
people 16.2
man 16.1
fashion 15.1
black 14.4
portrait 14.2
male 13.5
adult 13
face 12.8
statue 12.5
sexy 12
style 11.9
brassiere 11.8
dress 11.7
model 11.7
doll 10.6
costume 10.5
body 9.6
woman's clothing 9.5
undergarment 9.5
sculpture 9.4
makeup 9.1
old 9.1
posing 8.9
mask 8.8
garment 8.8
decoration 8.7
marble 8.7
hair 8.7
standing 8.7
party 8.6
traditional 8.3
vintage 8.3
human 8.2
plaything 8.2
look 7.9
antique 7.8
ancient 7.8
carnival 7.8
luxury 7.7
modern 7.7
pretty 7.7
culture 7.7
festival 7.6
health 7.6
hand 7.6
historical 7.5
city 7.5
astronaut 7.5
room 7.4
stylish 7.2
lifestyle 7.2
celebration 7.2
home 7.2
history 7.2
women 7.1
commodity 7.1
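
Imagga's tag listing has the same tag/confidence shape. A minimal sketch of a request against Imagga's v2 tagging endpoint using the requests library; the credentials and image URL are placeholder assumptions:

import requests

api_key = "<imagga-api-key>"        # placeholder credentials
api_secret = "<imagga-api-secret>"
image_url = "https://example.org/tiers_sidewalk_portrait.jpg"  # placeholder image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)

# Each entry carries a confidence score and the tag text in English.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))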

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.2
person 96.3
posing 89
clothing 86.7
black and white 60.3
old 45.9

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Male, 99.2%
Surprised 70.7%
Calm 14.8%
Happy 8.5%
Disgusted 1.9%
Confused 1.6%
Sad 1.1%
Angry 1%
Fear 0.5%

AWS Rekognition

Age 18-24
Gender Male, 94.3%
Calm 39.2%
Sad 36%
Happy 10.9%
Angry 7.8%
Fear 1.9%
Surprised 1.9%
Disgusted 1.3%
Confused 1%
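
The age ranges, gender calls, and emotion percentages above follow the schema of AWS Rekognition face detection. A minimal sketch with boto3, assuming a hypothetical local copy of the photograph:

import boto3

rekognition = boto3.client("rekognition")

with open("tiers_sidewalk_portrait.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates for each detected face.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")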

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
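
The likelihood ratings above correspond to Google Cloud Vision face annotations. A minimal sketch with the google-cloud-vision client; the filename is a placeholder assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("tiers_sidewalk_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum names (e.g. VERY_UNLIKELY, POSSIBLE) map to the wording in the listing above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)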

Feature analysis

Amazon

Person 99.4%
Hat 94%
Shoe 80.8%
Car 79.5%

Captions

Microsoft

a group of people posing for a photo 92.8%
a group of people posing for the camera 92.7%
an old photo of a group of people posing for the camera 91.7%
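
The three ranked captions match the output of Azure Computer Vision's describe-image operation, the service behind the Microsoft entries. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholder assumptions:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder credentials and image location.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Request up to three candidate captions for a remote image.
description = client.describe_image(
    "https://example.org/tiers_sidewalk_portrait.jpg",
    max_candidates=3,
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")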

Text analysis

Amazon

CLEANERS
WEAVERS
STORED
TAILORS
SERVICE
FURRIERS
ros
ER
GASMENTS STORED
ЭЛЬКО TAILORS FURRIERS
4NOV SERVICE
I ros
GASMENTS
I
ЭЛЬКО
4NOV
WASH
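
The detected strings above, including the garbled readings of the painted signage, are typical word- and line-level OCR output from AWS Rekognition text detection. A minimal sketch with boto3, again assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("tiers_sidewalk_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# WORD detections are individual tokens; LINE detections group words that share a baseline.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))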

Google

TAILORS URRIERS 4NOA SCE CLEANERS WEAVERS GAEMINTS STORED
TAILORS
URRIERS
4NOA
STORED
CLEANERS
GAEMINTS
SCE
WEAVERS