Human Generated Data

Title

Untitled (two girls with two dogs)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16356

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Clothing 99.8
Apparel 99.8
Person 99
Human 99
Face 92.5
Person 90.1
Female 89.9
Chair 89.5
Furniture 89.5
Costume 83.4
Animal 79.8
Mammal 78.8
Canine 77.9
Pet 77
Woman 76.4
Dress 75.8
Photography 74.2
Photo 74.2
Portrait 74.2
People 70.3
Girl 68.9
Hat 67.6
Food 67.4
Meal 67.4
Gown 62.2
Fashion 62.2
Robe 62.2
Door 61.8
Table 60.2
Indoors 58.1
Pants 57.1
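Each tag above is paired with a confidence score on a 0-100 scale, as AWS Rekognition label detection returns. As a minimal sketch (using a few values copied from the list above, with an arbitrary threshold), such tags could be filtered down to only high-confidence labels:

```python
# Sample of the machine-generated tags above: (label, confidence score).
labels = [
    ("Clothing", 99.8),
    ("Person", 99.0),
    ("Face", 92.5),
    ("Canine", 77.9),
    ("Pants", 57.1),
]

def high_confidence(tags, threshold=90.0):
    """Keep only tags whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(labels))  # ['Clothing', 'Person', 'Face']
```

The 90.0 cutoff is a hypothetical choice; a cataloguer might tune it per service, since each provider calibrates its scores differently.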

Imagga
created on 2022-02-11

grandma 53
senior 40.3
people 31.2
couple 30.5
person 29.3
happy 28.2
elderly 27.8
man 27.5
adult 27.1
portrait 25.9
kin 25
retired 24.2
mature 24.2
retirement 23
smiling 22.4
love 22.1
male 22
old 20.2
smile 19.9
lifestyle 18.8
together 18.4
home 18.3
sitting 18
lady 17
hair 16.6
women 16.6
pensioner 16.1
happiness 15.7
husband 15.6
face 14.9
wedding 14.7
bride 14.5
married 14.4
grandfather 14.3
wife 14.2
groom 14.2
indoors 14.1
pretty 14
health 13.9
looking 13.6
human 13.5
family 13.3
generator 13.2
aged 12.7
holding 12.4
two 11.9
dress 11.7
older 11.6
care 11.5
attractive 11.2
mother 11
grandmother 10.8
loving 10.5
outdoors 10.4
marriage 10.4
day 10.2
gray 9.9
romantic 9.8
cheerful 9.8
fun 9.7
patient 9.6
enjoying 9.5
men 9.4
casual 9.3
70s 8.8
veil 8.8
healthy 8.8
gown 8.8
two people 8.7
affection 8.7
holiday 8.6
age 8.6
females 8.5
relaxation 8.4
blond 8.3
camera 8.3
leisure 8.3
fashion 8.3
negative 8.2
relaxing 8.2
romance 8
child 7.9
medical 7.9
citizen 7.9
look 7.9
seniors 7.9
talking 7.6
hand 7.6
laughing 7.6
togetherness 7.6
joy 7.5
relaxed 7.5
indoor 7.3
girls 7.3
parent 7.2
nurse 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

wedding dress 95.9
text 90.8
window 89.6
bride 87.9
person 87.4
dress 76.1
clothing 65

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 92.4%
Surprised 53.4%
Calm 42.1%
Happy 3.1%
Fear 0.5%
Angry 0.3%
Disgusted 0.3%
Sad 0.2%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Female, 99.7%
Calm 98%
Surprised 0.9%
Happy 0.4%
Sad 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
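Rekognition reports a percentage for every candidate emotion rather than a single verdict. A small sketch (values copied from the second face result above) shows how the dominant emotion could be picked from such scores:

```python
# Emotion percentages for the second detected face, as listed above.
emotions = {
    "Calm": 98.0, "Surprised": 0.9, "Happy": 0.4, "Sad": 0.3,
    "Angry": 0.1, "Confused": 0.1, "Disgusted": 0.1, "Fear": 0.1,
}

# The dominant emotion is simply the highest-scoring key.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```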

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a man and a woman sitting in front of a window 78.1%
a man and woman sitting next to a window 76.1%
a person sitting in front of a window 76%

Text analysis

Amazon

YACOX
VITRAL YACOX
VITRAL