Human Generated Data

Title

[Shop window and reflection]

Date

Before 1945

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.572.9

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Human 94.2
Person 94.2
Vehicle 82.1
Automobile 82.1
Car 82.1
Transportation 82.1
Clothing 77.7
Apparel 77.7
Airplane 74.9
Aircraft 74.9
Art 73.3
Sculpture 65
Face 64.3
Furniture 61.2
Drawing 61.1
Text 60.5
Person 48
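
Label lists like the Amazon block above often contain the same tag twice at different confidences (Person at 94.2 and again at 48) and overlapping parent/child tags (Vehicle, Automobile, Car). A minimal Python sketch, using a hand-copied subset of the tags above as hypothetical data, that keeps each label once at its highest confidence and drops anything below a cutoff:

```python
# Hypothetical subset of the Amazon Rekognition labels listed above.
labels = [
    ("Human", 94.2), ("Person", 94.2), ("Vehicle", 82.1),
    ("Automobile", 82.1), ("Car", 82.1), ("Art", 73.3),
    ("Sculpture", 65.0), ("Text", 60.5), ("Person", 48.0),
]

def dedupe_labels(labels, min_confidence=60.0):
    """Keep each label once at its highest confidence, above a cutoff."""
    best = {}
    for name, conf in labels:
        if conf >= min_confidence and conf > best.get(name, 0.0):
            best[name] = conf
    # Highest-confidence labels first.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(dedupe_labels(labels))
```

With the default cutoff of 60, the low-confidence duplicate Person (48.0) is discarded while Person (94.2) survives.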

Clarifai
created on 2019-11-20

people 99.4
furniture 94.9
group 94.7
administration 93.4
room 93.3
no person 92.4
vehicle 91
adult 90.2
indoors 89.7
many 89.2
chair 89
military 87.7
war 85.7
transportation system 85.3
leader 84.7
man 79.5
aircraft 78.7
seat 78
home 77
luxury 76.4

Imagga
created on 2019-11-20

interior 35.4
room 35.4
table 30.9
architecture 21.2
house 20.9
glass 20.3
furniture 20
decor 17.7
modern 17.5
chair 17.1
home 16.7
people 16.7
window 15.6
hall 15.1
hospital 14.5
dining 13.3
city 13.3
indoors 13.2
luxury 12.9
design 12.4
men 12
dinner 11.8
restaurant 11.5
scene 11.2
building 11.2
blackboard 11.2
floor 11.1
decoration 10.9
kitchen 10.9
elegance 10.9
style 10.4
business 10.3
man 10.1
indoor 10
light 10
barbershop 10
travel 9.9
station 9.7
urban 9.6
apartment 9.6
residential 9.6
day 9.4
inside 9.2
classroom 9.2
wood 9.2
life 9
seat 8.7
sketch 8.7
expensive 8.6
party 8.6
shop 8.4
wall 8.4
worker 8.4
negative 8.3
work 8.2
construction 7.7
old 7.7
comfortable 7.6
hotel 7.6
counter 7.6
person 7.3
structure 7.2
tile 7.2
center 7.2
family 7.1
working 7.1

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

indoor 92.6
white 76.2
drawing 68.9
old 65.7
furniture 57.8
sketch 57.5
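
Four services tagged this image independently, and tags that several of them agree on (furniture, vehicle, indoor/indoors, drawing/sketch) are generally more trustworthy than any single score. A small Python sketch, using hand-copied subsets of the tag lists above as assumed input, that counts cross-service agreement:

```python
from collections import Counter

# Hand-copied subsets of the tag lists above, lower-cased for comparison.
services = {
    "amazon":    {"person", "vehicle", "furniture", "drawing", "text"},
    "clarifai":  {"people", "furniture", "vehicle", "chair", "indoors"},
    "imagga":    {"furniture", "chair", "window", "indoors", "sketch"},
    "microsoft": {"indoor", "drawing", "furniture", "sketch", "old"},
}

def agreement(services):
    """Count how many services produced each tag; keep tags seen twice or more."""
    counts = Counter(tag for tags in services.values() for tag in tags)
    return {tag: n for tag, n in counts.items() if n > 1}

print(agreement(services))
```

Note that near-synonyms ("person" vs. "people", "indoor" vs. "indoors") count separately here; real aggregation would need a normalization step first.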

Face analysis

Amazon

AWS Rekognition

Age 14-26
Gender Male, 54.6%
Fear 45.5%
Angry 45.3%
Surprised 45.4%
Happy 45.1%
Confused 45.2%
Disgusted 45.1%
Calm 52.6%
Sad 45.9%
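
Rekognition reports one confidence per emotion rather than a single label, so the usual reading is to take the highest-scoring emotion (here Calm, 52.6%). A minimal sketch over the scores above:

```python
# Emotion confidences copied from the AWS Rekognition block above.
emotions = {
    "Fear": 45.5, "Angry": 45.3, "Surprised": 45.4, "Happy": 45.1,
    "Confused": 45.2, "Disgusted": 45.1, "Calm": 52.6, "Sad": 45.9,
}

def dominant_emotion(emotions):
    """Return the emotion with the highest confidence score."""
    return max(emotions, key=emotions.get)

print(dominant_emotion(emotions))  # Calm
```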

Feature analysis

Amazon

Person 94.2%
Car 82.1%
Airplane 74.9%

Text analysis

Amazon

Zm
E