Human Generated Data

Title

Untitled (baby on barber chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16385

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Furniture 99.9
Chair 97.9
Human 92.3
Person 92.3
Couch 67.3
Shelf 57.4

Imagga
created on 2022-02-11

laptop 46.1
computer 42.8
person 39.5
office 38.4
business 37.7
people 31.2
adult 29.5
corporate 29.2
desk 28.7
working 28.3
businesswoman 28.2
professional 27.7
job 27.4
negative 26.2
work 25.9
man 24.9
happy 23.8
attractive 22.4
worker 22.3
smile 22.1
sitting 21.5
male 21.3
businessman 21.2
executive 21.2
notebook 20.9
technology 20.8
film 20.7
communication 20.2
portrait 20.1
smiling 19.5
casual 18.6
home 18.3
looking 16.8
manager 16.8
wireless 16.2
businesspeople 16.1
photographic paper 16
face 15.6
confident 15.5
cheerful 15.4
chair 15.3
secretary 13.8
successful 13.7
indoor 13.7
suit 13.5
women 13.4
modern 13.3
men 12.9
success 12.9
expression 12.8
blond 12.5
table 12.4
indoors 12.3
lady 12.2
student 12
pretty 11.9
lifestyle 11.6
device 11.4
career 11.4
photographic equipment 11.2
hair 11.1
cute 10.8
education 10.4
support 10.3
alone 10
clothing 10
studio 9.9
book 9.8
handsome 9.8
employee 9.8
businessperson 9.7
one 9.7
paper 9.6
sofa 9.6
happiness 9.4
model 9.3
occupation 9.2
fashion 9
teacher 9
typing 8.8
busy 8.7
workplace 8.6
meeting 8.5
keyboard 8.4
scholar 8.4
call 8.4
old 8.4
glasses 8.3
phone 8.3
document 7.7
corporation 7.7
stress 7.7
boss 7.7
senior 7.5
company 7.4
mature 7.4
floor 7.4
20s 7.3
attire 7.3
furniture 7.2
black 7.2
team 7.2
bright 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

laptop 99.2
text 98
sitting 97.5
person 90.3
human face 70.8
book 69.3
clothing 62.9
black and white 54.5

Face analysis

Amazon

Google

AWS Rekognition

Age 11-19
Gender Male, 80.9%
Surprised 58.3%
Happy 39.3%
Calm 0.8%
Disgusted 0.4%
Confused 0.4%
Fear 0.3%
Angry 0.3%
Sad 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 92.3%

Captions

Microsoft

a person sitting in front of a laptop 91.7%
a person sitting in front of a laptop computer 90.3%
a person sitting at a desk looking at a laptop 90.2%

Text analysis

Amazon

3
18
abeam
KODAK-
KODAK- 2.4
Shores
Classics Shores
2.4
Classics
and

Google

nbean