Human Generated Data

Title

Rehabilitation client, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3065

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.1
Human 99.1
Person 97.4
Clothing 93.4
Apparel 93.4
Face 91.1
Smile 87.6
People 77.5
Photo 68.5
Photography 68.5
Portrait 67
Boy 65.1
Child 62.6
Kid 62.6
Girl 58.9
Female 58.9
Door 56.3
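Each machine-generated tag line above pairs a label with a confidence score expressed as a percentage. As an illustrative sketch (not the museum's actual ingestion pipeline), such lines can be parsed by treating the final whitespace-separated token as the score, since the label itself may contain spaces:

```python
def parse_tag(line: str) -> tuple[str, float]:
    """Split a 'label confidence' line into (label, score).

    The label can contain spaces (e.g. 'two people 10.7'),
    so only the last token is taken as the numeric score.
    """
    label, score = line.rsplit(None, 1)
    return label, float(score)

# Example lines transcribed from the Amazon tag list above.
tags = [parse_tag(line) for line in ["Person 99.1", "Clothing 93.4", "Door 56.3"]]
```

Using `rsplit` with a maximum of one split keeps multi-word labels intact while isolating the trailing confidence value.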

Imagga
created on 2021-12-15

parent 65.8
mother 64.8
family 48
child 41
happy 40.7
father 39.4
love 37.1
male 37
dad 36.7
people 32.9
together 32.4
happiness 32.1
smiling 31.8
man 31.6
portrait 30.4
couple 29.6
daughter 27.4
adult 27.2
home 25.5
smile 24.2
brother 24.2
sibling 23.5
lifestyle 22.4
fun 21
kid 20.4
casual 20.3
son 20.1
kin 19.7
loving 19.1
grandma 18.4
boy 18.3
husband 18.1
togetherness 17.9
senior 17.8
children 17.3
cheerful 17.1
person 16.8
two 16.1
joy 15.9
cute 15.8
wife 15.2
indoors 14.9
old 14.6
couch 14.5
elderly 14.4
face 14.2
playing 13.7
hug 13.6
outdoors 13.4
baby 13.2
grandfather 13
youth 12.8
women 12.7
attractive 12.6
affectionate 12.6
holding 12.4
little 12.4
park 12.4
laughing 12.3
relationship 12.2
mature 12.1
sitting 12
room 11.9
interior 11.5
kids 11.3
relaxed 11.3
relaxing 10.9
aged 10.9
parenthood 10.7
hugging 10.7
two people 10.7
married 10.5
enjoying 10.4
life 10.3
day 10.2
adorable 10.2
indoor 10
girls 10
childhood 9.9
handsome 9.8
grandmother 9.8
parents 9.8
retired 9.7
boys 9.7
affection 9.7
30s 9.6
expression 9.4
embracing 8.8
bonding 8.8
men 8.6
color 8.3
20s 8.2
domestic 8.1
blond 8.1
romantic 8
cuddling 7.9
bond 7.9
pensioner 7.8
60s 7.8
play 7.8
outside 7.7
generation 7.7
outdoor 7.6
healthy 7.6
horizontal 7.5
house 7.5
enjoyment 7.5
leisure 7.5
juvenile 7.5
lady 7.3
group 7.3
dress 7.2
adolescent 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

human face 98.3
person 98.2
clothing 93
baby 92.6
toddler 92.3
smile 90.1
text 89.5
child 82.8
white 73.6
boy 71.8
black 71.1
posing 44.5

Face analysis

AWS Rekognition

Age 34-50
Gender Male, 81.6%
Calm 75.8%
Angry 9.8%
Sad 8.1%
Fear 2.8%
Surprised 1.4%
Confused 1%
Happy 0.9%
Disgusted 0.3%

AWS Rekognition

Age 4-12
Gender Male, 85.4%
Sad 57.1%
Calm 41.6%
Confused 0.5%
Happy 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
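Each Rekognition face block above reports a distribution of emotion confidences summing to roughly 100%. A minimal sketch (hypothetical helper, not the actual Rekognition response format) for picking the dominant emotion from such a distribution:

```python
# Emotion scores transcribed from the second face block above (Age 4-12).
emotions = {
    "Sad": 57.1, "Calm": 41.6, "Confused": 0.5, "Happy": 0.3,
    "Angry": 0.2, "Surprised": 0.2, "Disgusted": 0.2, "Fear": 0.1,
}

def dominant(scores: dict[str, float]) -> str:
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

# dominant(emotions) → "Sad"
```

For this face the top two emotions (Sad at 57.1%, Calm at 41.6%) are close, so a real application might report both rather than a single label.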

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a girl posing for a photo 93.6%
a boy and a girl posing for a photo 70.9%
a boy posing for a photo 70.8%