Human Generated Data

Title

[Reflection in shop window]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1009.252

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Boy 98.6
Male 98.6
Person 98.6
Teen 98.6
Photography 94.1
Outdoors 94
Nature 89
Face 86.5
Head 86.5
Person 86.2
Window 70.9
Indoors 69.9
Portrait 69
Weather 67.2
Furniture 60.3
Smoke 57
Terminal 56.8
Badminton 56.3
Sport 56.3
Electronics 56.3
Screen 56.3
Person 56.3
Door 55.6
People 55.6
Computer Hardware 55.2
Hardware 55.2
Monitor 55.2

Clarifai
created on 2023-10-15

people 99
monochrome 98
woman 94.6
indoors 93.6
window 93.6
technology 92.1
adult 91.2
screen 90.1
man 87.7
wear 86.4
internet 86.2
two 84.7
one 82.3
virtual 80.5
computer 80.1
interaction 78.1
display 77.8
group 74.9
future 74.2
music 73.7

Imagga
created on 2019-02-01

negative 36
film 29.9
photographic paper 21.6
blackboard 21
wall 18.8
house 18.4
barbershop 17.9
people 17.8
window 17.7
shop 17.6
newspaper 17.4
building 16
room 15.8
man 15.4
person 14.5
photographic equipment 14.4
product 13.3
old 12.5
architecture 12.5
portrait 12.3
male 12.1
human 12
home 12
adult 11.9
mercantile establishment 11.8
business 11.5
black 11.4
creation 10.8
interior 10.6
businessman 10.6
glass 10.1
design 9.6
classroom 9.5
happy 9.4
teacher 9.2
modern 9.1
one 9
light 8.7
windows 8.6
space 8.5
city 8.3
place of business 7.9
high 7.8
education 7.8
art 7.5
vintage 7.4
worker 7.3
dress 7.2
body 7.2
hair 7.1
women 7.1

Google
created on 2019-02-01

Microsoft
created on 2019-02-01

Color Analysis

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 99.1%
Calm 72.5%
Confused 13.8%
Surprised 7.8%
Fear 6.2%
Sad 5.7%
Disgusted 1.5%
Angry 1.3%
Happy 0.4%

Microsoft Cognitive Services

Age 22
Gender Female

Feature analysis

Amazon

Boy 98.6%
Male 98.6%
Person 98.6%
Teen 98.6%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2019-02-01

a person sitting in front of a window 42.8%

Text analysis

Amazon

29
DRESSES
RACK
THRIFT RACK
99
THRIFT

Google

29%
DRESSES 29%
DRESSES