Human Generated Data

Title

Mother and Child, Sanfond

Date

1992

People

Artist: Eric Breitenbach, American, born 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Beinecke Fund, 2.2002.240

Copyright

© Eric Breitenbach

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Face 98.3
Human 98.3
Person 98
Person 97.2
Smile 84.8
Apparel 82.6
Shorts 82.6
Clothing 82.6
Photography 71
Portrait 71
Photo 71
People 67.4
Child 67.2
Kid 67.2
Female 63.8
Building 63
Finger 61.7
Man 61.1
Girl 60.5
Urban 60
Sleeve 58.6
Furniture 55.1

Imagga
created on 2022-01-08

child 59.2
brother 43.6
man 28.9
people 27.9
person 27.8
male 27.5
sibling 27.4
love 26
parent 25.1
portrait 24.6
attractive 24.5
lifestyle 23.8
couple 23.5
home 23.1
mother 22.6
happy 22.6
adult 21.4
boy 20.9
body 19.2
juvenile 18.7
face 17
sexy 16.9
cute 16.5
kid 16
looking 15.2
smiling 15.2
father 15.2
family 15.1
baby 15
leisure 14.9
indoor 14.6
casual 14.4
skin 14.4
childhood 14.3
smile 14.3
relaxation 14.2
indoors 14.1
dad 13.9
fun 13.5
happiness 13.3
little 13.2
sitting 12.9
two 12.7
healthy 12.6
model 12.4
care 12.3
human 12
pretty 11.9
women 11.9
handsome 11.6
youth 11.1
girlfriend 10.6
husband 10.5
bath 10.4
health 10.4
relationship 10.3
relaxing 10
hand 9.9
wet 9.8
fashion 9.8
cheerful 9.8
black 9.7
boyfriend 9.6
30s 9.6
hair 9.5
lying 9.4
expression 9.4
rest 9.3
relax 9.3
20s 9.2
life 9.2
domestic 9.1
laptop 9.1
children 9.1
world 9.1
son 9.1
bathtub 9
lady 8.9
together 8.8
brunette 8.7
bride 8.6
loving 8.6
bed 8.5
togetherness 8.5
vessel 8.5
room 8.4
horizontal 8.4
house 8.4
clean 8.4
adorable 8.3
girls 8.2
sensuality 8.2
innocent 8.1
computer 8
water 8
interior 8
day 7.8
education 7.8
muscular 7.6
serious 7.6
living 7.6
one 7.5
floor 7.4
holding 7.4
make 7.3
spa 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.4
human face 98.3
toddler 97.9
baby 97.1
person 96
child 95
sitting 94.8
clothing 94.6
smile 93.6
black and white 90
young 87.2
boy 76
monochrome 61
posing 50.8
picture frame 9.1

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 100%
Happy 94.1%
Calm 4.3%
Disgusted 0.6%
Confused 0.2%
Surprised 0.2%
Sad 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 1-7
Gender Male, 99.7%
Calm 72.4%
Confused 24.6%
Sad 1.4%
Angry 0.7%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 0
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%

Captions

Microsoft

a young boy sitting in front of a window 55.8%
a young boy sitting in front of a window posing for the camera 55.7%
a young boy sitting next to a window 51.4%

Text analysis

Amazon

3/15
1992
+
child,
Ein
Ein Ruth 1992
MoThen + child, Sanford
MoThen
Sanford
Ruth

Google

Nothen
child,
315 Nothen + child, Samfondd 1992
+
1992
315
Samfondd