Human Generated Data

Title

[Julia and Tomas Feininger]

Date

1945-1946

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.592.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Person 99.1
Human 99.1
Person 97.4
Food 81.7
Meal 81.7
Dish 77.4
Home Decor 73.2
Finger 70.1
Clothing 64.9
Apparel 64.9
Dessert 64.2
People 62.8
Person 62.6
Table 60.7
Furniture 60.7
Cake 60.6
Indoors 60
Cream 59.7
Creme 59.7
Icing 59.7
Room 58.8

Clarifai
created on 2019-11-20

people 99.9
adult 98.9
group 97.3
two 96.5
group together 96.4
man 95.1
woman 95
wear 94.2
administration 92
three 90.2
several 89.5
one 89.1
leader 87.3
four 85.8
many 81.7
portrait 81.2
facial expression 80.6
furniture 79.3
outfit 78.9
actor 77.7

Imagga
created on 2019-11-20

man 22.9
people 22.3
person 22.2
male 19.1
statue 18.5
world 16.8
adult 16.1
sculpture 15.6
human 15
old 13.9
ancient 13.8
black 13.2
art 13.1
men 12.9
portrait 12.3
marble 11.6
room 11.4
home 11.2
girls 10.9
worker 10.7
device 10.5
body 10.4
antique 10.4
hair 10.3
face 9.9
religion 9.9
couple 9.6
historic 9.2
vintage 9.1
professional 8.8
work 8.6
culture 8.5
stone 8.4
monument 8.4
dark 8.3
occupation 8.2
tool 8.2
care 8.2
happy 8.1
history 8
family 8
lifestyle 7.9
building 7.9
love 7.9
architecture 7.8
uniform 7.7
health 7.6
nurse 7.6
hand 7.6
head 7.6
historical 7.5
city 7.5
patient 7.5
hospital 7.4
bathroom 7.4
mother 7.3
sexy 7.2
interior 7.1
medical 7.1
happiness 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

wall 99.6
person 99.5
human face 95.9
clothing 94.9
man 91.9
text 87.1
black and white 56.5

Color Analysis

Face analysis

AWS Rekognition

Age 27-43
Gender Female, 50.4%
Disgusted 0.5%
Angry 1.1%
Happy 53.2%
Confused 1.2%
Fear 0.5%
Calm 34%
Sad 8.5%
Surprised 0.9%

Microsoft Cognitive Services

Age 1
Gender Female

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

interior objects 88.7%
paintings art 10.8%