Human Generated Data

Title

Untitled (two girls and baby combing hair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17433

Machine Generated Data

Tags

Each tag below is listed with the generating service's confidence score for that label.

Amazon
created on 2022-02-26

Human 99.7
Person 99.7
Person 98
Person 97.9
Clothing 89.3
Apparel 89.3
Person 87.9
Person 83.8
Face 79.2
People 76.1
Person 73.9
Child 73.7
Kid 73.7
Indoors 72.4
Female 70.2
Door 69.5
Furniture 67.9
Girl 65.6
Baby 65.5
Room 62.9
Sitting 59.2
Floor 57.3
Hair 56.3
Play 56.3
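
The Amazon tags above are the kind of label/confidence pairs returned by Amazon Rekognition's label detection. A minimal sketch of how such labels could be requested with boto3 follows; the local file name and the AWS region are assumed placeholders, not values recorded on this page.

    # Sketch: request Rekognition-style labels for a local copy of the image.
    # Assumes boto3 with working AWS credentials; the file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_two_girls_and_baby.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,  # the lowest confidence listed above is roughly 56
    )

    # Print label/confidence pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")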

Imagga
created on 2022-02-26

man 34.9
people 34
male 27.7
person 26.1
adult 23.7
professional 21.1
happy 20.7
smiling 20.2
hospital 17.9
portrait 17.5
patient 16.7
indoors 16.7
office 16.6
work 16.5
bow tie 16.3
doctor 16
lifestyle 15.9
medical 15.9
shop 15.7
smile 15.7
sunglasses 15.2
senior 15
business 14.6
looking 14.4
home 14.3
sitting 13.7
women 13.4
worker 13.4
necktie 13
occupation 12.8
two 12.7
spectacles 12.7
health 12.5
job 12.4
working 12.4
medicine 12.3
specialist 12.3
barbershop 12.2
desk 12
happiness 11.7
face 11.4
couple 11.3
computer 11.2
men 11.2
laptop 10.9
cheerful 10.6
salon 10.4
technology 10.4
mercantile establishment 10.2
nurse 10.2
child 10
modern 9.8
human 9.7
coat 9.6
education 9.5
togetherness 9.4
day 9.4
clothing 9.1
student 9.1
black 9
family 8.9
group 8.9
equipment 8.9
case 8.8
interior 8.8
garment 8.8
two people 8.7
hair 8.7
love 8.7
casual 8.5
pretty 8.4
attractive 8.4
mature 8.4
old 8.4
optical instrument 8.3
holding 8.3
care 8.2
team 8.1
handsome 8
together 7.9
room 7.8
color 7.8
corporate 7.7
profession 7.7
illness 7.6
horizontal 7.5
enjoyment 7.5
leisure 7.5
one 7.5
treatment 7.3
alone 7.3
lady 7.3
cute 7.2
science 7.1
businessman 7.1
clinic 7
chair 7
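
The Imagga tags resemble responses from Imagga's image-tagging REST API. The sketch below shows one plausible way to request them; the /v2/tags endpoint, the key/secret placeholders, and the image URL are assumptions based on Imagga's public v2 API, not values taken from this record.

    # Hedged sketch: request Imagga-style tags for an image URL.
    # All credentials and URLs below are placeholders.
    import requests

    API_KEY = "your_imagga_api_key"        # placeholder
    API_SECRET = "your_imagga_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/untitled.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Print tag/confidence pairs in the same shape as the list above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")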

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

window 93.1
person 91.6
human face 91.4
toddler 88.2
clothing 88
text 85.5
black and white 85.1
baby 79.6

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Female, 98.2%
Calm 95%
Surprised 3.3%
Happy 1%
Fear 0.3%
Disgusted 0.2%
Sad 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 18-24
Gender Male, 81.9%
Calm 67.9%
Surprised 15.3%
Happy 12%
Fear 2.1%
Sad 1.1%
Disgusted 0.6%
Angry 0.5%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99%
Sad 99.1%
Calm 0.6%
Happy 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%
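
The age ranges, gender estimates, and emotion scores above match the face-attribute output of AWS Rekognition. A minimal sketch, reusing the same placeholder image file as in the label example, is:

    # Sketch: request Rekognition-style face attributes (age range, gender, emotions).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_two_girls_and_baby.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")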

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people sitting in front of a window 81.9%
a group of people in front of a window 81.8%
a group of people sitting at a table in front of a window 77.2%
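
The Microsoft captions are consistent with the image-description output of Azure's Computer Vision service. The sketch below is a hedged reconstruction against the public v3.2 REST API; the endpoint, key, and image URL are placeholders, and the exact API version used to generate the captions above is not recorded here.

    # Hedged sketch: request Azure-style image captions via the v3.2 REST API.
    import requests

    ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
    KEY = "your_azure_key"                                          # placeholder
    IMAGE_URL = "https://example.org/untitled.jpg"                  # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        params={"maxCandidates": 3},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    resp.raise_for_status()

    # Captions come back with confidences in [0, 1]; scale to match the list above.
    for caption in resp.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}")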

Text analysis

Amazon

46
ACT

Google

46
46
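
The strings detected by Amazon ("46", "ACT") are typical OCR output from Rekognition's text detection. A sketch, under the same placeholder-image assumption as the earlier examples, is:

    # Sketch: request Rekognition-style text detections.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_two_girls_and_baby.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Rekognition returns both LINE and WORD items; keep whole lines only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])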