Human Generated Data

Title

Untitled (portrait of two girls seated in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16324

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Apparel 99.9
Clothing 99.9
Person 99.4
Human 99.4
Person 97.8
Dress 92.6
Bonnet 88
Hat 88
Costume 84.7
Home Decor 81.6
Face 75.3
Female 68.8
Door 68.5
Indoors 66.6
Portrait 64.8
Photography 64.8
Photo 64.8
Girl 63.9
People 62.2
Bed 59.8
Furniture 59.8
Room 59.3

Imagga
created on 2022-02-11

groom 63.5
grandma 58.5
people 31.2
couple 29.6
adult 28.6
love 28.4
portrait 27.8
happiness 27.4
bride 27
person 26.4
dress 25.3
wedding 23.9
happy 23.2
married 23
man 20.8
family 19.6
fashion 18.8
smiling 18.8
senior 18.7
home 18.3
male 17.1
cheerful 17.1
face 17
two 16.9
wife 16.1
lifestyle 15.9
bouquet 15.1
kin 14.9
mother 14.6
hair 14.3
smile 14.3
human 14.2
women 14.2
pretty 14
child 13.4
romantic 13.4
marriage 13.3
sitting 12.9
cute 12.2
lady 12.2
sexy 12
attractive 11.9
old 11.8
model 11.7
husband 11.5
elderly 11.5
health 11.1
day 11
clothing 11
veil 10.8
retired 10.7
loving 10.5
celebration 10.4
casual 10.2
house 10
joy 10
care 9.9
life 9.8
tenderness 9.7
retirement 9.6
healthy 9.4
room 9.4
holiday 9.3
girls 9.1
aged 9
gown 8.8
kiss 8.8
indoors 8.8
together 8.8
innocence 8.7
elegance 8.4
holding 8.3
fun 8.2
one 8.2
style 8.2
grandfather 8.1
new 8.1
romance 8
looking 8
kid 8
interior 8
wed 7.9
flowers 7.8
pensioner 7.8
hug 7.7
summer 7.7
parent 7.7
flower 7.7
expression 7.7
youth 7.7
skin 7.6
pair 7.6
outdoors 7.5
pose 7.2

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 98.5
black and white 86.4
person 81.6
human face 71.4
clothing 67
posing 46.6

Face analysis

AWS Rekognition

Age 16-24
Gender Male, 94.1%
Happy 38.4%
Surprised 27%
Sad 18.9%
Calm 5.5%
Fear 3.4%
Confused 3.1%
Disgusted 2.5%
Angry 1.3%

AWS Rekognition

Age 23-33
Gender Female, 96.1%
Calm 30.9%
Surprised 26.6%
Sad 21%
Fear 8%
Happy 5.8%
Confused 3.7%
Disgusted 2.2%
Angry 1.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a person sitting in front of a window 60.1%
a man and a woman posing for a photo 43%
a man and woman posing for a photo 35.5%

Text analysis

Amazon

KODAK-A-EITW

Google

MJI7-- YT 3RA°2 - - XAGON
MJI7--
YT
XAGON
-
3RA°2