Human-Generated Data

Title

[Man with guitar]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1004.114

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine-Generated Data

Tags

Amazon
created on 2023-10-23

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Guitar 96.6%
Musical Instrument 96.6%
Sitting 96.3%
Furniture 92.8%
Face 89.9%
Head 89.9%
Car 76.9%
Transportation 76.9%
Vehicle 76.9%
Chair 61.6%
Electrical Device 57.5%
Microphone 57.5%
Guitarist 57.3%
Leisure Activities 57.3%
Music 57.3%
Musician 57.3%
Performer 57.3%
Dining Table 57%
Table 57%
Architecture 55.8%
Building 55.8%
Indoors 55.8%
Living Room 55.8%
Room 55.8%
Dining Room 55.7%

Clarifai
created on 2023-10-15

people 99.7%
monochrome 98.4%
portrait 98.3%
one 95.6%
man 95.6%
street 95.1%
adult 94.9%
chair 94.2%
seat 93%
furniture 89.6%
sit 89%
boy 88.2%
child 88.1%
music 87.7%
analogue 84.2%
window 84.2%
vintage 84%
art 82.3%
wear 82%
musician 79.3%

Imagga
created on 2019-02-03

television 31.5%
man 22.8%
male 18.4%
telecommunication system 18%
person 17.4%
people 16.2%
room 14.7%
barbershop 13.9%
chair 13.9%
black 13.8%
adult 13.8%
light 12%
sitting 12%
home 12%
shop 11.9%
hair 11.9%
dark 11.7%
lifestyle 11.6%
one 11.2%
working 9.7%
portrait 9.7%
equipment 9.6%
blond 9.5%
skin 9.3%
indoors 8.8%
happy 8.8%
mercantile establishment 8.7%
smiling 8.7%
business 8.5%
attractive 8.4%
house 8.4%
worker 8.2%
device 7.9%
work 7.8%
smile 7.8%
happiness 7.8%
face 7.8%
model 7.8%
pretty 7.7%
computer 7.6%
furniture 7.6%
fashion 7.5%
relaxation 7.5%
human 7.5%
back 7.3%
water 7.3%
sexy 7.2%
dress 7.2%
looking 7.2%
body 7.2%
night 7.1%

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

window 98.1%
sitting 97.6%
old 93.5%
black 71.5%
vintage 27.6%
black and white 27.6%
person 25.8%
chair 8.4%
man 5.1%

Face analysis

Amazon

AWS Rekognition

Age 52-60
Gender Male, 95.9%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Car 76.9%

Categories

Imagga

paintings art 98.3%