Human Generated Data

Title

Untitled (girl sitting on couch)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17060

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98.7
Person 98.7
Clothing 90.6
Apparel 90.6
Table Lamp 87.8
Furniture 86.5
Bed 84
Lamp 81.5
Lampshade 72.4
Portrait 58.2
Photo 58.2
Photography 58.2
Face 58.2
Indoors 55.3

Imagga
created on 2022-02-26

person 35
adult 33.2
portrait 27.8
people 27.3
happy 26.9
blond 26.9
smiling 23.9
smile 22.8
attractive 22.4
home 22.3
grandma 21.7
pretty 21.7
hair 20.6
face 19.9
business 18.8
cheerful 18.7
lady 18.7
casual 18.6
indoors 18.4
looking 18.4
computer 17.7
women 17.4
laptop 16.4
cute 15.8
sitting 15.5
work 14.9
mature 14.9
holding 14.9
sexy 14.5
lifestyle 14.5
human 14.2
one 14.2
senior 14.1
office 13.8
happiness 13.3
room 13.1
worker 12.8
gorgeous 12.7
model 12.4
male 12.2
man 12.1
black 11.7
working 11.5
clothing 11.3
eyes 11.2
indoor 11
businesswoman 10.9
executive 10.6
dress 9.9
fashion 9.8
interior 9.7
professional 9.7
domestic 9.6
elderly 9.6
desk 9.4
manager 9.3
house 9.2
alone 9.1
old 9.1
fun 9
couple 8.7
bow tie 8.6
corporate 8.6
skin 8.5
modern 8.4
suit 8.4
color 8.3
single 8.2
businessman 7.9
child 7.9
mother 7.8
expression 7.7
clothes 7.5
leisure 7.5
technology 7.4
camera 7.4
confident 7.3
student 7.2
together 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 96.4
indoor 95
text 91.4
human face 87.5
statue 86
bed 79.7
black and white 70.9
clothing 69.6

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 98.8%
Surprised 50.1%
Happy 23.4%
Disgusted 11.6%
Calm 6.4%
Angry 4.1%
Fear 2.6%
Sad 1.1%
Confused 0.6%

Feature analysis

Amazon

Person 98.7%
Bed 84%
Lamp 81.5%

Captions

Microsoft

a person sitting on a bed 55.5%
a person sitting on a bed 51.4%
a person sitting on a bed 43.4%

Text analysis

Amazon

21
MJ17--YT37A°2- -X

Google

MJI7-- YT3RA°2 - -XAGON 21
MJI7--
21
-XAGON
YT3RA°2
-