Human Generated Data

Title

Untitled (two girls with dolls)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17805

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.1
Human 99.1
Person 98.5
Apparel 97.2
Clothing 97.2
Dress 95.2
Female 91.1
Furniture 90
Leisure Activities 87.5
Girl 75.4
Woman 72.6
Musical Instrument 66.2
Piano 66.2
Indoors 64.5
People 61.5
Portrait 61.2
Photography 61.2
Face 61.2
Photo 61.2
Art 60.3
Drawing 60.1
Floor 58.4
Room 58
Kid 57.9
Child 57.9
Chair 56.4

Imagga
created on 2022-02-26

people 27.9
group 25.8
person 22.9
man 20.9
human 20.2
team 18.8
male 18.4
men 17.2
adult 17
art 16.9
body 16
silhouette 15.7
black 15.6
business 15.2
fashion 15.1
style 14.1
portrait 13.6
women 12.6
drawing 12.1
dress 11.7
sport 11.7
active 10.8
businessman 10.6
sketch 10.5
standing 10.4
walking 10.4
render 10.4
motion 10.3
dance 10.2
design 10.1
together 9.6
crowd 9.6
lifestyle 9.4
3d 9.3
planner 9.1
suit 9
posing 8.9
urban 8.7
silhouettes 8.7
boy 8.7
anatomy 8.7
corporate 8.6
sitting 8.6
attractive 8.4
fitness 8.1
symbol 8.1
activity 8.1
success 8
science 8
skeleton 7.8
party 7.7
dancer 7.7
grunge 7.7
performance 7.7
child 7.6
health 7.6
film 7.6
power 7.6
happy 7.5
teamwork 7.4
light 7.4
graphic 7.3
work 7.3
professional 7.2
performer 7.2
family 7.1
worker 7.1
brass 7.1
negative 7.1
medical 7.1
indoors 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.1
clothing 88.2
dress 81.3
person 80.6
drawing 77.6
footwear 75.4
woman 56.9
old 42.1
posing 38.3

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 97.9%
Calm 88.5%
Surprised 5.8%
Sad 2.4%
Happy 0.8%
Fear 0.8%
Confused 0.6%
Disgusted 0.5%
Angry 0.5%

AWS Rekognition

Age 41-49
Gender Female, 97.6%
Sad 84.6%
Happy 6.1%
Calm 5.9%
Confused 1.9%
Disgusted 0.6%
Angry 0.4%
Surprised 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people posing for a photo 80.1%
a group of people posing for the camera 80%
a group of people posing for a picture 79.9%

Text analysis

Amazon

KODAKVH5W

Google

AGO>
MITYT3RA°2-- AGO>
MITYT3RA°2--