Human Generated Data

Title

Untitled (three children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16783

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.1
Human 99.1
Apparel 99.1
Clothing 99.1
Dress 98.3
Person 97.5
Person 96.1
Female 80.7
Suit 75.7
Coat 75.7
Overcoat 75.7
People 75.2
Face 72.3
Girl 71.8
Shirt 71.3
Child 70.7
Kid 70.7
Furniture 66.1
Chair 66.1
Photo 65.8
Photography 65.8
Portrait 65.8
Baby 65.2
Person 64.1
Toy 62.5
Finger 61.6
Pants 55.8
Bed 55.7
Hair 55.3

Imagga
created on 2022-02-26

man 37.6
negative 35.2
people 33.5
male 32.8
person 29.8
happy 28.8
film 27.8
couple 27
senior 24.4
portrait 22.6
photographic paper 21.4
adult 19.9
smiling 18.8
family 18.7
elderly 18.2
love 18.1
happiness 17.2
child 17.1
together 15.8
patient 15.4
medical 15
professional 14.9
husband 14.6
men 14.6
photographic equipment 14.3
indoors 14.1
home 13.6
women 13.4
wife 13.3
mature 13
smile 12.8
casual 12.7
retirement 12.5
world 12.3
lifestyle 12.3
sitting 12
old 11.8
health 11.8
groom 11.8
married 11.5
hospital 11.5
looking 11.2
grandfather 10.8
care 10.7
retired 10.7
affection 10.6
bride 10.5
sibling 10.5
group 10.5
marriage 10.4
doctor 10.3
kin 10.3
life 10.2
teacher 10
handsome 9.8
pretty 9.8
attractive 9.8
older 9.7
look 9.6
enjoying 9.5
work 9.4
wedding 9.2
mother 9.1
hand 9.1
human 9
team 9
cheerful 8.9
lady 8.9
to 8.8
job 8.8
face 8.5
friends 8.5
holding 8.2
bow tie 8.2
worker 8.1
suit 8.1
romantic 8
nurse 8
celebration 8
clothing 8
room 8
businessman 7.9
medicine 7.9
60s 7.8
father 7.8
party 7.7
mask 7.7
clinic 7.7
illness 7.6
relationship 7.5
friendship 7.5
fun 7.5
leisure 7.5
camera 7.4
sexy 7.2
hair 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.4
human face 96.7
text 91
clothing 82
smile 76.7
baby 75.6
toddler 67.2

Face analysis


AWS Rekognition

Age 6-14
Gender Female, 95%
Happy 85.5%
Surprised 13.2%
Calm 0.4%
Sad 0.3%
Fear 0.2%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 23-31
Gender Female, 58.1%
Calm 89.5%
Surprised 2.5%
Fear 2.4%
Happy 2.2%
Sad 1.6%
Disgusted 0.9%
Confused 0.5%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people standing in front of a box 56.3%
a group of people sitting at a table 55.4%
a group of people sitting in a box 42.1%

Text analysis

Amazon

DE