Human Generated Data

Title

[Unidentified woman, possibly in Feininger residence]

Date

c. 1930

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.674.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Person 99.6
Human 99.6
Meal 98.1
Food 98.1
Dish 94
Home Decor 92.4
Clothing 91.1
Apparel 91.1
Furniture 85.1
Chair 79.1
Table 74.8
Text 73.9
Linen 68.1
Face 62.5
Dining Table 61.4
Bowl 61.4
Shorts 60.4
Female 57.9
Vacation 57.5
Suit 57.2
Coat 57.2
Overcoat 57.2
Photography 55.7
Photo 55.7

Imagga
created on 2022-07-01

man 31.6
male 25.6
person 23.5
people 22.3
adult 20.3
negative 19.6
sport 19.2
winter 17
snow 16.6
film 15.7
active 15.3
outdoors 14.2
action 13
cold 12.9
lifestyle 12.3
equipment 12.3
photographic paper 12.1
men 12
hockey stick 11.6
shoe 11
portrait 11
exercise 10.9
city 10.8
dance 10.5
legs 10.4
black 10.2
clothing 10.1
leisure 10
stick 9.8
posing 9.8
urban 9.6
boy 9.6
motion 9.4
fashion 9
dress 9
fitness 9
cool 8.9
guy 8.9
businessman 8.8
puck 8.7
play 8.6
performance 8.6
wall 8.6
ice 8.5
travel 8.4
street 8.3
photographic equipment 8.2
fun 8.2
worker 8.1
mountain 8
body 8
graffito 8
women 7.9
love 7.9
couple 7.8
disk 7.8
season 7.8
run 7.7
jeans 7.6
casual 7.6
foot 7.6
decoration 7.6
walking 7.6
elegance 7.6
happy 7.5
human 7.5
one 7.5
skateboard 7.4
dancer 7.4
vacation 7.4
teenager 7.3
professional 7.3
athlete 7.3
pose 7.2
music 7.2
sports equipment 7.2
looking 7.2

Google
created on 2022-07-01

Microsoft
created on 2022-07-01

outdoor 98.8
person 96.8
man 96.3
drawing 95.6
sketch 88.7
clothing 88.3
text 85.5
black and white 82.9
food 63.3
footwear 60.5
jumping 60.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 99.1%
Sad 88.2%
Calm 35.6%
Confused 14.2%
Fear 7.6%
Surprised 6.6%
Disgusted 2.5%
Angry 1.1%
Happy 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 99.5%

Text analysis

Google

30²