Human Generated Data

Title

Untitled (two girls at window)

Date

1958, printed later

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.201

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Apparel 99.2
Clothing 99.2
Person 98.7
Human 98.7
Person 98.6
Person 98.2
Female 88
Flooring 87.6
Gown 76.9
Fashion 76.9
Evening Dress 76.9
Robe 76.9
Dress 74.8
Leisure Activities 74.4
Woman 72.8
Face 69
Dance Pose 67.4
Photo 66.5
Photography 66.5
Portrait 65.8
Dance 63.7
Girl 60.6
Undershirt 59.1
Plant 58.8
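The label/score pairs above are the shape of output returned by a label-detection service such as Amazon Rekognition's DetectLabels, where each label carries a confidence percentage. A minimal parsing sketch, assuming a DetectLabels-style response dict; the sample data below is illustrative only, not the actual API result for this photograph:

```python
# Illustrative sketch: extracting (name, confidence) pairs from an
# Amazon Rekognition DetectLabels-style response. The sample_response
# dict is hypothetical; a real call would go through boto3, e.g.
# boto3.client("rekognition").detect_labels(Image={"Bytes": ...}).
sample_response = {
    "Labels": [
        {"Name": "Apparel", "Confidence": 99.2},
        {"Name": "Person", "Confidence": 98.7},
        {"Name": "Female", "Confidence": 88.0},
    ]
}

def top_labels(response, min_confidence=50.0):
    """Return (name, confidence) pairs at or above a confidence
    threshold, sorted by descending confidence."""
    pairs = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

for name, conf in top_labels(sample_response):
    print(f"{name} {conf:.1f}")
```

Filtering by a confidence floor mirrors how such tag listings are usually trimmed before display: low-confidence labels (here, anything under 50) are dropped rather than shown.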

Imagga
created on 2022-01-23

hairdresser 75.7
salon 34.2
chair 34
man 33.6
barber chair 31.8
groom 31.2
people 30.7
adult 28.5
person 27
male 26.2
couple 26.1
seat 23
happy 21.9
happiness 21.2
home 19.9
family 19.6
bride 19.2
love 18.9
smiling 18.8
indoors 17.6
men 17.2
wedding 16.6
dress 16.3
room 15.7
two 15.2
lifestyle 15.2
women 15
smile 15
furniture 14.6
portrait 14.2
together 14
nurse 13.4
work 13.3
cheerful 13
mother 12.6
medical 11.5
relationship 11.2
sitting 11.2
professional 11.1
day 11
married 10.6
office 10.4
patient 10.3
worker 10.2
communication 10.1
indoor 10
care 9.9
loving 9.5
husband 9.5
wife 9.5
back 9.2
romance 8.9
romantic 8.9
kiss 8.8
celebration 8.8
table 8.7
hospital 8.6
business 8.5
togetherness 8.5
bouquet 8.5
casual 8.5
relaxation 8.4
child 8.3
outdoors 8.2
life 8.2
interior 8
medicine 7.9
affection 7.7
boyfriend 7.7
modern 7.7
girlfriend 7.7
pretty 7.7
attractive 7.7
youth 7.7
health 7.6
daughter 7.6
talking 7.6
marriage 7.6
bed 7.6
adults 7.6
house 7.5
technology 7.4
phone 7.4
lady 7.3
new 7.3
laptop 7.3
computer 7.2
face 7.1
job 7.1
businessman 7.1
father 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.9
toddler 90.2
clothing 88.9
person 87
dress 85
baby 77.6
child 74.1
girl 68.9
black and white 61.9

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 97.1%
Confused 60.6%
Sad 21.6%
Calm 15.3%
Angry 0.7%
Surprised 0.7%
Happy 0.5%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 23-33
Gender Male, 99.2%
Fear 67.4%
Angry 12.2%
Calm 9.8%
Happy 6.4%
Sad 2.2%
Surprised 1%
Disgusted 0.6%
Confused 0.3%
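Per-face blocks like the two above, each with an age range and a ranked list of emotion confidences, are the shape of a face-detection response such as Amazon Rekognition's DetectFaces. A minimal sketch of pulling the dominant emotion for each detected face; the sample_response dict is hypothetical, loosely modeled on the figures above, not the actual API result for this photograph:

```python
# Illustrative sketch: summarizing an Amazon Rekognition
# DetectFaces-style response as (age range, dominant emotion) pairs.
# The sample data is hypothetical; a real call would use boto3's
# detect_faces with Attributes=["ALL"] to include age and emotions.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 2, "High": 8},
            "Emotions": [
                {"Type": "CONFUSED", "Confidence": 60.6},
                {"Type": "SAD", "Confidence": 21.6},
                {"Type": "CALM", "Confidence": 15.3},
            ],
        },
        {
            "AgeRange": {"Low": 23, "High": 33},
            "Emotions": [
                {"Type": "FEAR", "Confidence": 67.4},
                {"Type": "ANGRY", "Confidence": 12.2},
            ],
        },
    ]
}

def dominant_emotions(response):
    """For each face, return (age-range string, top emotion type)."""
    summary = []
    for face in response["FaceDetails"]:
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        age = face["AgeRange"]
        summary.append((f'{age["Low"]}-{age["High"]}', top["Type"]))
    return summary

print(dominant_emotions(sample_response))
```

Note that the emotion confidences are a distribution over candidate emotions for one face, so taking the maximum per face is the usual way to report a single dominant emotion, as the listings above do implicitly by ordering.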

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a person sitting on a bench 82.2%
a person sitting on a table 82.1%
a person sitting at a table 82%