Human Generated Data

Title

[Painting by unidentified artist]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1002.16

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Art 95.8
Person 91
Human 91
Painting 88.7
Face 58

Clarifai
created on 2019-11-16

people 99.2
portrait 97.3
one 96.7
adult 96.5
art 91.9
wear 91.8
painting 88.3
woman 86.6
museum 85.6
man 83.4
indoors 82.6
music 82
no person 81.1
light 79.9
shadow 79
picture frame 78.8
retro 78.2
child 76.2
window 75.7
room 75.4

Imagga
created on 2019-11-16

elevator 54.5
lifting device 43.7
device 38.2
black 23.2
man 22.8
one 22.4
person 22.3
male 22
portrait 18.8
dark 18.4
currency 17
money 17
call 15.1
people 15.1
adult 14.9
covering 14.9
mask 14.8
dollar 13.9
human 13.5
business 13.4
bill 13.3
suit 11.9
cash 11.9
finance 11.8
attractive 11.2
expression 11.1
close 10.8
sculpture 10.7
face 10.7
night 10.7
vintage 9.9
fashion 9.8
disguise 9.8
looking 9.6
banking 9.2
alone 9.1
bank 9
financial 8.9
posing 8.9
art 8.8
office 8.8
body 8.8
eyes 8.6
model 8.6
single 8.2
statue 8.1
wealth 8.1
history 8
sexy 8
light 8
hundred 7.7
culture 7.7
old 7.7
exchange 7.6
serious 7.6
clothing 7.6
attire 7.5
telephone 7.5
lifestyle 7.2
paper 7.2
handsome 7.1
love 7.1
figure 7.1
businessman 7.1

Google
created on 2019-11-16

Black 95.6
Picture frame 86
Room 71.4
Portrait 71.3
Photography 67.8
Darkness 65.8
Visual arts 62.5
Square 58.9
Art 58.1
Tints and shades 51.7

Microsoft
created on 2019-11-16

human face 99.4
person 97.1
painting 96.2
man 94.3
clothing 84.1
drawing 77.6
art 71.4
text 66
glasses 56.4
picture frame 9.4

Color Analysis

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 97%
Surprised 0.4%
Sad 3%
Disgusted 0.2%
Calm 95.1%
Fear 0.1%
Angry 0.6%
Happy 0.2%
Confused 0.4%

Microsoft Cognitive Services

Age 25
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91%