Human Generated Data

Title

Untitled (man with baby on bed)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17494

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99
Human 98.3
Person 98.3
Person 94.8
Face 91.7
Bedroom 91
Room 91
Indoors 91
Bed 90.4
Couch 85.7
Interior Design 83.4
Home Decor 79.1
Clothing 70.9
Apparel 70.9
Portrait 69.7
Photo 69.7
Photography 69.7
Baby 68.7
Flooring 63.2
Finger 62.8
Head 61.7
Female 60.4
Newborn 59.2
Art 57.1
Nature 56

Imagga
created on 2022-02-26

man 34.3
person 32.9
people 29.6
couple 27.9
adult 27.5
senior 27.2
home 27.1
male 27.1
grandfather 21.4
portrait 20.7
happy 20.7
indoors 20.2
love 18.2
sitting 18
room 17.7
smiling 15.9
mature 15.8
face 15.6
family 15.1
women 15
teacher 15
patient 14.6
elderly 14.4
happiness 14.1
men 13.7
lifestyle 13.7
retired 13.6
old 13.2
husband 12.9
indoor 12.8
retirement 12.5
grandma 12.3
smile 12.1
mother 11.9
together 11.4
bed 11.4
togetherness 11.3
hair 11.1
70s 10.8
educator 10.7
bow tie 10.6
cheerful 10.6
pretty 10.5
fun 10.5
professional 10.4
sexy 10.4
clothing 10.2
two 10.2
health 9.7
two people 9.7
medical 9.7
older 9.7
black 9.6
married 9.6
loving 9.5
wife 9.5
enjoying 9.5
hospital 9.2
attractive 9.1
care 9.1
romantic 8.9
affectionate 8.7
table 8.7
groom 8.6
talking 8.6
lying 8.5
relationship 8.4
necktie 8.4
relaxation 8.4
fashion 8.3
inside 8.3
joyful 8.3
romance 8
body 8
living room 7.8
grandmother 7.8
lover 7.8
couch 7.7
bride 7.7
sofa 7.7
casual 7.6
resting 7.6
nurse 7.6
human 7.5
vintage 7.4
holding 7.4
hand blower 7.4
occupation 7.3
lady 7.3
sensual 7.3
sensuality 7.3
office 7.2
looking 7.2
chair 7.2
bright 7.1
to 7.1
interior 7.1
look 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.7
drawing 92.9
person 92.2
human face 83.9
sketch 69.7

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Surprised 65.3%
Happy 20.1%
Fear 4.7%
Calm 2.8%
Angry 2.2%
Confused 2%
Sad 1.6%
Disgusted 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Bed 90.4%

Captions

Microsoft

a man and a woman looking at the camera 41.8%
a man and a woman standing in a room 41.7%

Text analysis

Amazon

20
50
YТ3А

Google

20
20