Human Generated Data

Title

In the Studio

Date

20th century

People

Artist: Raphael Soyer, American 1899 - 1987

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, M11050

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 99.6
Human 99.6
Person 91.4
Art 79.2
Worker 77.7
Person 73
Hairdresser 65.8
Hair 63
Apparel 62.6
Clothing 62.6
Photo 58.7
Photography 58.7
Painting 57
Advertisement 56.6
Flooring 55.1

Clarifai
created on 2019-10-29

people 100
group 98.8
adult 98.6
child 97.2
two 97.1
woman 97
wear 95.8
man 95.5
furniture 95
room 91.7
print 91.6
offspring 91.4
administration 91.2
actress 89.1
home 88.8
family 88.5
three 87.1
portrait 87.1
four 86.5
sit 86.1

Imagga
created on 2019-10-29

groom 59.4
dress 28.9
people 25.1
person 19.4
clothing 18.5
bride 18.4
couple 17.4
fashion 17.3
happy 16.9
love 16.6
wedding 16.6
happiness 16.5
portrait 15.5
man 15.5
adult 15
male 14.3
old 13.9
lady 13
women 12.6
men 12
wall 12
attractive 11.9
two 11.9
child 11.8
kimono 11.7
robe 11.3
celebration 11.2
hair 11.1
church 11.1
vintage 10.7
bouquet 10.4
gown 10.2
human 9.7
married 9.6
ancient 9.5
garment 9.5
culture 9.4
architecture 9.4
clothes 9.4
smile 9.3
elegance 9.2
holding 9.1
kin 9
religion 9
one 9
mother 8.9
interior 8.8
together 8.8
building 8.7
husband 8.6
model 8.6
marriage 8.5
face 8.5
females 8.5
pretty 8.4
tradition 8.3
black 8.1
umbrella 8
family 8
smiling 8
lifestyle 7.9
daughter 7.8
veil 7.8
ceremony 7.8
youth 7.7
window 7.6
wife 7.6
life 7.6
joy 7.5
emotion 7.4
retro 7.4
alone 7.3
girls 7.3
world 7.2
looking 7.2
cute 7.2
holiday 7.2
art 7.2
history 7.2
posing 7.1
day 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99
clothing 95.8
person 94.9
woman 79.8
drawing 74.9
dress 63.2
sketch 56.9
gallery 53.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 9-19
Gender Female, 52%
Calm 47.7%
Disgusted 45.2%
Angry 45.1%
Surprised 45.1%
Fear 45.3%
Happy 45.7%
Sad 50.9%
Confused 45.1%

AWS Rekognition

Age 22-34
Gender Male, 52%
Fear 46.2%
Happy 45.7%
Disgusted 45.3%
Angry 45.8%
Calm 49.4%
Confused 45.2%
Surprised 45.2%
Sad 47.2%

Feature analysis

Amazon

Person 99.6%

Categories

Captions

Microsoft
created on 2019-10-29

an old photo of a girl 57.2%
a group of people in a room 57.1%
an old photo of a person 57%

Text analysis

Amazon

RAONEL
rea
RAONEL Soyee
Soyee

Google

RADNEL Soves reaphael S
RADNEL
Soves
reaphael
S