Human Generated Data

Title

[Figurehead in the Peabody Essex Museum, Salem, Massachusetts]

Date

1950-1952

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.336.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 98.8
Human 98.8
Clothing 91.3
Apparel 91.3
Face 90.8
Female 83.3
Floor 70.9
Photography 69.4
Photo 69.4
Portrait 69.4
Furniture 68.4
Sleeve 66.7
Woman 65.7
Indoors 63.6
Flooring 62.4
Sitting 59.3
Girl 57.6
Suit 56.5
Coat 56.5
Overcoat 56.5
Door 55.3
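
The Amazon tags above pair each detected label with a confidence score (0–100). As a rough illustration, a list in this shape can be produced with the AWS Rekognition DetectLabels API via boto3; the sketch below assumes a hypothetical local copy of the photograph (figurehead_photo.jpg) and a MinConfidence cutoff of 55, neither of which comes from the record itself.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local file name; the record does not include an image path.
    with open("figurehead_photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the listed tags bottom out near 55
        )

    # Print "Label score" pairs in the same form as the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")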

Clarifai
created on 2019-05-29

people 100
adult 99.4
one 99.3
portrait 96.2
wear 95.4
two 94.4
man 94.1
woman 93
room 89
administration 87.7
indoors 86.9
child 86.9
military 80.8
group 79.2
group together 78
reclining 76.3
furniture 74.2
interaction 73.1
offense 72.1
retro 72

Imagga
created on 2019-05-29

sill 37.3
wall 35.6
structural member 28.5
old 26.5
support 22.1
device 20.7
house 18.4
door 18
architecture 17.2
building 16
pretty 14.7
lady 13.8
water 12.7
attractive 12.6
window 12.2
texture 11.8
people 11.7
brick 11.7
adult 11.7
ancient 11.2
person 10.5
sexy 10.4
home 10.4
hair 10.3
grunge 10.2
lifestyle 10.1
model 10.1
relaxation 10.1
aged 10
dirty 9.9
decay 9.6
body 9.6
skin 9.3
stucco 9.3
face 9.2
city 9.1
vintage 9.1
bathroom 9.1
portrait 9.1
detail 8.9
vessel 8.7
cute 8.6
construction 8.6
fashion 8.3
alone 8.2
sensuality 8.2
room 8
concrete 7.7
health 7.6
head 7.6
happy 7.5
care 7.4
man 7.4
light 7.4
paint 7.2
dress 7.2
black 7.2
bathtub 7.2
stairs 7.1
travel 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

human face 93.7
person 91.2
clothing 85.7
black and white 79.3
white 77.9
standing 75.2
posing 64

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 94.1%
Angry 18.2%
Calm 8.3%
Confused 4.6%
Disgusted 7.6%
Happy 2.3%
Sad 57.1%
Surprised 1.9%
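
The age range, gender, and per-emotion percentages above are the kind of fields returned by the AWS Rekognition DetectFaces API when full attributes are requested. A minimal sketch of reading those fields with boto3, again assuming a hypothetical local image file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("figurehead_photo.jpg", "rb") as f:  # assumed local file, not part of the record
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")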

Feature analysis

Amazon

Person 98.8%

Categories

Imagga

paintings art 97.9%
text visuals 1%