Human Generated Data

Title

Untitled (two girls and baby combing hair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17432

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 98.1
Clothing 98.1
Human 98
Person 98
Person 92.5
Person 91.2
Person 90.6
Female 74.4
Dance 71.7
Costume 69.3
Face 67.6
Person 67.5
Girl 65.3
Door 65.1
Sunglasses 63.7
Accessory 63.7
Accessories 63.7
Photo 61.5
Photography 61.5
Portrait 60.9
Child 59.6
Kid 59.6
Dance Pose 57.7
Leisure Activities 57.7
People 57.5
Floor 56.3
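
The Amazon tags above are label/confidence pairs of the kind AWS Rekognition's DetectLabels operation returns. As a minimal sketch, assuming AWS credentials are configured and the photograph is available locally under the hypothetical filename 17432.jpg, output in the same "Name score" form could be produced with boto3:

import boto3

rekognition = boto3.client("rekognition")

with open("17432.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence is set just below the lowest tag listed above (Floor 56.3).
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')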

Imagga
created on 2022-02-26

man 27.5
toilet tissue 25
people 24.5
person 22.9
male 20.6
tissue 19.9
adult 19.6
helmet 19.4
medical 19.4
football helmet 18.5
equipment 17.2
doctor 16.9
patient 16.8
medicine 16.7
work 15.7
professional 15.6
health 15.3
men 14.6
hospital 14.2
technology 14.1
device 13.5
science 13.3
worker 13.3
room 13.1
human 12.7
paper 12.7
mask 12.5
headdress 12.2
clothing 11.6
holding 11.5
working 11.5
nurse 11.4
home 10.4
portrait 10.3
black 10.2
care 9.9
instrument 9.8
profession 9.6
women 9.5
specialist 9.5
lifestyle 9.4
occupation 9.2
modern 9.1
clinic 9
team 9
interior 8.8
indoors 8.8
happy 8.8
lab 8.7
happiness 8.6
face 8.5
negative 8.5
house 8.4
treatment 8.3
dress 8.1
film 8.1
love 7.9
scientist 7.8
surgeon 7.8
chemistry 7.7
laboratory 7.7
illness 7.6
research 7.6
wedding 7.4
glass 7.3
cheerful 7.3
indoor 7.3
suit 7.2
looking 7.2
covering 7.2
hair 7.1
smile 7.1
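
The Imagga tags carry the same tag/confidence structure as Imagga's image-tagging API. A minimal sketch against the v2 tagging endpoint, assuming HTTP basic authentication and placeholder values for the API key, secret, and image URL:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
image_url = "https://example.org/17432.jpg"  # hypothetical location of the photograph

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag name with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')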

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.9
black and white 78.5
clothing 71.2
toddler 63.8
baby 63.3
human face 59
person 58.4
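
The Microsoft tags match the shape of Azure Computer Vision's Analyze Image output, with confidences scaled from the API's 0-1 range up to 0-100. A minimal sketch against the v3.2 REST endpoint, with placeholder endpoint, subscription key, and image URL:

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "your_subscription_key"                                     # placeholder
image_url = "https://example.org/17432.jpg"                       # hypothetical

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": image_url},
)
resp.raise_for_status()

# Rescale 0-1 confidences to the 0-100 form used in the listing above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')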

Face analysis

Amazon

AWS Rekognition (face 1)

Age 25-35
Gender Female, 71.3%
Surprised 88.3%
Calm 7.9%
Fear 2%
Happy 1%
Sad 0.3%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition (face 2)

Age 23-31
Gender Female, 75.8%
Calm 99.8%
Fear 0.1%
Sad 0%
Confused 0%
Surprised 0%
Disgusted 0%
Happy 0%
Angry 0%

AWS Rekognition (face 3)

Age 22-30
Gender Male, 99.2%
Calm 91.6%
Happy 5.8%
Sad 1.6%
Fear 0.4%
Surprised 0.3%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
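
Each face block above combines an age range, a gender estimate, and an emotion distribution, which is what AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch, assuming configured AWS credentials and the same hypothetical local file as before:

import boto3

rekognition = boto3.client("rekognition")

with open("17432.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence to match the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
    print()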

Feature analysis

Amazon

Person 98%
Sunglasses 63.7%
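
The feature-analysis entries (Person, Sunglasses) are the subset of Rekognition labels that also carry bounding-box instances, i.e. localized objects rather than scene-level tags alone. A minimal sketch that filters a DetectLabels response down to those labels, under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition")

with open("17432.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Only labels with Instances come with bounding boxes in the image.
    for instance in label["Instances"]:
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at '
              f'left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
              f'{box["Width"]:.2f}x{box["Height"]:.2f}')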

Captions

Microsoft

a man holding a stuffed animal 50.2%
a man is holding a stuffed animal 50.1%
a man sitting in front of a window 50%
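
The three caption candidates resemble Azure Computer Vision's image-description output, which can return several ranked captions per image. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, again with placeholder endpoint, key, and image URL:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "your_subscription_key"                                     # placeholder
image_url = "https://example.org/17432.jpg"                       # hypothetical

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Ask for up to three caption candidates, matching the three listed above.
description = client.describe_image(image_url, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")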

Text analysis

Amazon

45
ras
KODVK-EVEELA
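
The detected strings ("45", "ras", "KODVK-EVEELA", apparently an OCR reading of a Kodak film-edge marking) are the kind of output AWS Rekognition's DetectText operation produces. A minimal sketch, under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition")

with open("17432.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections are whole strings; WORD detections are their components.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')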