Human Generated Data

Title

[Julia Feininger and unidentified couple]

Date

Unknown

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.28

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Person 97.9
Human 97.9
Person 97.5
Clothing 95.6
Apparel 95.6
Face 95.1
Outdoors 72.4
Nature 69.5
Photo 68.2
Portrait 68.2
Photography 68.2
Meal 67.1
Food 67.1
Brick 66.9
Text 60.7
Female 59.6
Bridegroom 57.4
Wedding 57.4
Man 57.3
Shirt 56.1
Head 55.4

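The label/score pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. The record itself does not say how the call was made, so the following is only a minimal sketch of how similar tags could be generated with boto3; the filename and thresholds are placeholder assumptions, not part of the museum data.

```python
# Hypothetical sketch: retrieving label/confidence pairs like those listed above
# from Amazon Rekognition via boto3. The image path and thresholds are assumed.
import boto3

rekognition = boto3.client("rekognition")

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumed cap, not stated in the record
    MinConfidence=55.0,  # roughly matches the lowest score shown above
)

for label in response["Labels"]:
    # Prints pairs such as "Person 97.9", mirroring the list above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```
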
Clarifai
created on 2023-10-15

people 100
adult 99.2
portrait 99
two 98.9
man 98.5
leader 95.5
administration 95.4
group 94.3
wear 94
one 92.9
three 91.8
group together 89.2
military 85.9
scientist 85.6
facial hair 83.1
golfer 81.3
lid 79.9
music 79.6
four 79.3
war 77.3

Imagga
created on 2021-12-13

man 35.6
person 29.2
male 24.1
people 22.8
musical instrument 22.8
adult 17.8
old 17.4
stringed instrument 15.9
portrait 15.5
couple 13.9
wind instrument 13.3
senior 13.1
men 12.9
gun 12.8
fashion 12.8
war 12.6
sax 12.5
weapon 12.4
uniform 12
world 11.1
clothing 11
head 10.9
soldier 10.7
face 10.6
military 10.6
hair 10.3
love 10.2
two 10.2
danger 10
outdoor 9.9
holding 9.9
beard 9.9
human 9.7
outdoors 9.7
banjo 9.4
lifestyle 9.4
sport 9.2
attractive 9.1
dirty 9
religion 9
work 8.6
statue 8.6
model 8.5
wall 8.5
mature 8.4
brass 8.2
protection 8.2
mother 8.1
family 8
looking 8
smile 7.8
mask 7.7
fan 7.4
aged 7.2
trombone 7.2
rifle 7.2
women 7.1
businessman 7.1

Google
created on 2021-12-13

Microsoft
created on 2021-12-13

person 96.2
drawing 88.6
clothing 88.1
man 81.5
text 79.3
old 70.8
sketch 69.8
human face 64.8
black and white 50.9
bowed instrument 10.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 87.9%
Happy 86.2%
Calm 10.8%
Sad 1.1%
Surprised 0.6%
Confused 0.6%
Fear 0.4%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 21-33
Gender Male, 80.6%
Calm 65.6%
Happy 27.2%
Sad 3.6%
Surprised 1.3%
Angry 0.9%
Confused 0.7%
Fear 0.4%
Disgusted 0.3%

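The two face blocks above (age range, gender, and ranked emotion scores) match the structure of Amazon Rekognition's DetectFaces response. As a hedged illustration only, a comparable report could be produced as sketched below; the filename is a placeholder and the exact parameters used for this record are not documented.

```python
# Hypothetical sketch: per-face age range, gender, and emotion scores with
# Amazon Rekognition's DetectFaces. The image path is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; rank them by confidence as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```
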
Feature analysis

Amazon

Person 97.9%

Captions

Microsoft
created on 2021-12-13

an old photo of a person 91.8%
an old photo of a person 91.7%
old photo of a person 90.3%
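
The captions above, each with a confidence score, resemble the output of Azure Computer Vision's image description ("describe") operation. The record does not state how they were generated, so the sketch below is only an assumption of one way to obtain similar captions with the Azure SDK; the endpoint, key, and image URL are placeholders.

```python
# Hypothetical sketch: generating ranked captions with Azure Computer Vision.
# Endpoint, subscription key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
)

analysis = client.describe_image(
    "https://example.org/feininger_photo.jpg",  # placeholder image URL
    max_candidates=3,
)

for caption in analysis.captions:
    # Prints lines such as "an old photo of a person 91.8%".
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```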