Human Generated Data

Title

Untitled (boy having portrait drawn on sidewalk)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15709

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.3
Human 99.3
Apparel 98.1
Clothing 98.1
Hat 96.9
Person 96.7
Text 95.3
Person 93
Advertisement 88.2
Poster 88.2
Person 74
Newspaper 70.3
Paper 70
Brochure 66.3
Flyer 66.3
Page 63.2
Furniture 61.2
Reading 59.3
Collage 59.2
Photography 56.8
Photo 56.8
Sun Hat 55.9
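
These label/confidence pairs are the kind of output Amazon Rekognition's DetectLabels API returns. The record does not document the museum's exact pipeline, but a minimal sketch of a call that yields data in this shape, assuming a local copy of the photograph (image.jpg) and configured AWS credentials, might look like this:

    # Minimal sketch: not the museum's actual pipeline; assumes AWS
    # credentials are configured and the photograph is saved as image.jpg.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("image.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=55,
        )

    # Each label carries a name and a confidence score, matching pairs
    # such as "Person 99.3" and "Hat 96.9" above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))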

Imagga
created on 2022-02-05

man 39
people 32.4
male 31.3
person 30
work 28.3
medical 26.5
professional 25.6
adult 23.4
doctor 21.6
hospital 21.6
business 20.6
job 20.3
surgeon 19.7
businessman 19.4
worker 19.2
happy 18.8
office 18.8
medicine 18.5
patient 18.5
smiling 18.1
working 16.8
computer 16
looking 16
health 16
nurse 15.6
coat 14.9
corporate 14.6
portrait 14.2
face 14.2
senior 14.1
men 13.7
laboratory 13.5
exam 13.4
businesswoman 12.7
executive 12.3
human 12
lab coat 11.9
occupation 11.9
clinic 11.9
bow tie 11.9
laptop 11.8
chemistry 11.6
care 11.5
illness 11.4
indoors 11.4
businesspeople 11.4
mature 11.2
happiness 11
student 10.9
lifestyle 10.8
scientist 10.8
team 10.8
hairdresser 10.5
necktie 10.4
clothing 10.3
room 10.2
smile 10
modern 9.8
mid adult 9.6
black 9.6
serious 9.5
table 9.5
biology 9.5
sitting 9.4
casual 9.3
teamwork 9.3
hand 9.1
technology 8.9
science 8.9
surgery 8.8
together 8.8
lab 8.7
test 8.7
education 8.7
research 8.6
talking 8.6
attractive 8.4
manager 8.4
phone 8.3
school 8.2
indoor 8.2
cheerful 8.1
garment 8.1
group 8.1
success 8
employee 8
equipment 8
hair 7.9
instrument 7.9
standing 7.8
uniform 7.8
chemical 7.7
using 7.7
profession 7.7
two 7.6
violin 7.6
mobile 7.5
meeting 7.5
study 7.5
holding 7.4
specialist 7.4
suit 7.4
successful 7.3
teenager 7.3
women 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.5
person 97.4
clothing 91.1
black and white 89.3
drawing 84
boy 75.1
sketch 60.2

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 65.1%
Calm 93%
Sad 5.1%
Happy 0.8%
Surprised 0.3%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 63.4%
Calm 79.6%
Happy 11%
Surprised 5.2%
Fear 1.6%
Angry 0.9%
Disgusted 0.6%
Sad 0.6%
Confused 0.5%
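
Each AWS Rekognition entry above reports an estimated age range, a gender guess with its confidence, and a distribution over emotions for one detected face. Under the same assumptions as the labeling sketch, a DetectFaces call returning fields of this shape might look like this:

    # Minimal sketch: same assumptions as the DetectLabels example.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("image.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back as type/confidence pairs, e.g. Calm 93%, Sad 5.1%.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")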

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.3%
Hat 96.9%

Captions

Microsoft

an old photo of a boy 64.5%
old photo of a boy 59.4%
a photo of a boy 56.7%

Text analysis

Amazon

JEWELRY
HAND
HAND MADE
MADE
EY
SAM
SAM KRAMER
KRAMER
RESTAURANT
OP
INNERSITY
boar
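
Word fragments like these ("HAND", "MADE", "HAND MADE") are typical of Amazon Rekognition's DetectText output, which returns both whole lines and the individual words inside them. Under the same assumptions as the earlier sketches:

    # Minimal sketch: same assumptions as the DetectLabels example.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("image.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Type is either LINE or WORD, which is why both "HAND MADE" and its
    # separate words "HAND" and "MADE" appear in the results above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))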

Google

MADE
JAN
JAN KRAMER HAND MADE JEWELRY RESTRA ashing
HAND
JEWELRY
RESTRA
ashing
KRAMER