Human Generated Data

Title

[Julia Feininger and Charles Ross at Mills College, Oakland, California]

Date

1936-1937

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.111.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-11

Face 100
Head 100
Photography 100
Portrait 100
Clothing 99.8
Coat 99.8
Person 99.4
Adult 99.4
Female 99.4
Woman 99.4
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Happy 89
Laughing 62.1
Formal Wear 57.8
Suit 57.8
Smile 57.5
Body Part 56.7
Finger 56.7
Hand 56.7
Jacket 55.7
Lady 55.6

Clarifai
created on 2018-08-23

people 99.8
adult 98.7
one 98.2
portrait 98
man 97.7
wear 96.7
retro 94.9
leader 93.9
outfit 92.3
military 92.1
administration 90.4
war 89.8
two 89.2
uniform 88
facial expression 87.7
music 86.3
musician 84.8
veil 83.3
lid 82
profile 80.9

Imagga
created on 2018-08-23

graffito 100
decoration 64.7
painter 22.9
black 19.2
portrait 16.8
model 15.6
man 15.5
money 15.3
sexy 15.3
currency 15.3
body 15.2
close 14.3
one 14.2
dollar 13.9
male 13.5
person 13.1
banking 12.9
face 12.8
hair 12.7
newspaper 12.6
art 12.4
human 12
cash 11.9
old 11.8
people 11.7
financial 11.6
vintage 11.6
grunge 11.1
business 10.9
adult 10.3
skin 10.2
finance 10.1
head 10.1
product 9.9
fashion 9.8
attractive 9.8
ancient 9.5
bill 9.5
pretty 9.1
sensual 9.1
bank 9
closeup 8.8
hands 8.7
lifestyle 8.7
paper 8.6
wall 8.6
antique 7.8
men 7.7
creation 7.7
hand 7.7
dark 7.5
economy 7.4
symbol 7.4
sensuality 7.3
dirty 7.2
wealth 7.2
wet 7.2

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

person 98.9
man 96.2
black 72
old 57.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 72.1%
Happy 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Calm 0%
Angry 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 33-41
Gender Male, 70.6%
Sad 98.5%
Calm 18.8%
Surprised 14%
Fear 6.4%
Angry 4.6%
Happy 4.3%
Disgusted 2.5%
Confused 2.1%

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Female 99.4%
Woman 99.4%
Male 99.2%
Man 99.2%

Captions