Human Generated Data

Title

[Shop window]

Date

1950s?

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.329.14

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Apparel 96.9
Clothing 96.9
Person 93.1
Human 93.1
Person 92.8
Shorts 80.9
People 75
Person 74.6
Person 71
Furniture 66.5
Chair 66.5
Door 61.8
Face 58.8
Tire 55.1
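
The repeated "Person" rows above are most likely separate per-instance detections (Rekognition reports a confidence for each detected instance of a label). A minimal sketch in plain Python, with the data transcribed from the list above, of collapsing them to one score per label by keeping the highest confidence:

```python
# Amazon tag list transcribed from the record above (label, confidence %).
tags = [
    ("Apparel", 96.9), ("Clothing", 96.9), ("Person", 93.1), ("Human", 93.1),
    ("Person", 92.8), ("Shorts", 80.9), ("People", 75.0), ("Person", 74.6),
    ("Person", 71.0), ("Furniture", 66.5), ("Chair", 66.5), ("Door", 61.8),
    ("Face", 58.8), ("Tire", 55.1),
]

def collapse(tag_list):
    """Keep the highest confidence seen for each distinct label."""
    best = {}
    for label, conf in tag_list:
        best[label] = max(conf, best.get(label, 0.0))
    return best

print(collapse(tags)["Person"])  # → 93.1, the strongest of the four detections
```

The four "Person" entries collapse to the single strongest detection (93.1), leaving eleven distinct labels.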

Clarifai
created on 2019-05-29

people 99.7
adult 97.2
group together 96.3
man 94.6
one 94.5
wear 94
two 94
administration 93.7
group 93
monochrome 92.5
woman 90.1
many 89.7
three 89.4
vehicle 88.6
several 88.5
war 84.3
four 83.7
military 83.3
furniture 82.3
child 81.5

Imagga
created on 2019-05-29

case 68.5
shop 32.3
window 21.4
mercantile establishment 21.3
old 16.7
building 15.4
people 15
place of business 14.2
city 14.1
door 13.9
wall 13.7
toyshop 13.3
architecture 13.3
detail 12.9
china cabinet 11.7
vintage 11.6
cabinet 11.5
light 11.4
urban 11.3
glass 10.9
black 10.8
furniture 10.5
man 10.1
refrigerator 10
decoration 9.9
modern 9.8
interior 9.7
ancient 9.5
person 9.4
industry 9.4
house 9.2
industrial 9.1
worker 8.9
grunge 8.5
business 8.5
design 8.4
retro 8.2
white goods 8.1
home appliance 7.9
art 7.8
construction 7.7
texture 7.6
power 7.5
fashion 7.5
equipment 7.5
barbershop 7.3
historic 7.3
dirty 7.2
color 7.2
structure 7.2
bright 7.1
adult 7.1
steel 7.1
work 7.1
machine 7.1
establishment 7
life 7

Google
created on 2019-05-29

Photograph 95.9
White 95.9
Snapshot 87.6
Black-and-white 82.7
Window 77.2
Monochrome 75.8
Photography 70.6
Display case 61.4
Glass 58.6

Microsoft
created on 2019-05-29

human face 93.5
black and white 90.2
person 86.4
window 84.8
clothing 82
mirror 70.9
store 40.9
cabinet 29.7

Face analysis

Amazon

AWS Rekognition

Age 14-23
Gender Female, 73.7%
Angry 1.5%
Calm 77.9%
Confused 6.3%
Disgusted 1.1%
Happy 1.9%
Sad 6.5%
Surprised 4.7%

AWS Rekognition

Age 26-43
Gender Female, 99.8%
Angry 3%
Calm 58.6%
Confused 7.9%
Disgusted 2.3%
Happy 6.3%
Sad 6.1%
Surprised 15.7%

AWS Rekognition

Age 19-36
Gender Female, 53.7%
Angry 45.5%
Calm 50.3%
Confused 46%
Disgusted 45.3%
Happy 45.9%
Sad 46.5%
Surprised 45.6%

Feature analysis

Amazon

Person 93.1%

Categories

Imagga

interior objects 99.7%

Text analysis

Amazon

53500