Human Generated Data

Title

[Store window]

Date

1950s?

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.329.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.4
Human 99.4
Nature 99.3
Person 95.6
Smoke 90.9
Fog 89.6
Smog 81.1
Outdoors 80
Urban 76.9
Building 76.9
Metropolis 76.9
Town 76.9
City 76.9
Weather 69.8
Silhouette 57.7
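
The Amazon tag list above pairs each label with a confidence score. A minimal sketch of filtering such output by a confidence threshold, with the scores transcribed from the tags shown (the 75.0 cutoff is an arbitrary illustration, not part of the record; the duplicate lower-confidence "Person" entry is omitted since a dict keeps one score per label):

```python
# Label → confidence, transcribed from the Amazon tag list above.
amazon_tags = {
    "Person": 99.4, "Human": 99.4, "Nature": 99.3, "Smoke": 90.9,
    "Fog": 89.6, "Smog": 81.1, "Outdoors": 80.0, "Urban": 76.9,
    "Building": 76.9, "Metropolis": 76.9, "Town": 76.9, "City": 76.9,
    "Weather": 69.8, "Silhouette": 57.7,
}

def confident_labels(tags, threshold=75.0):
    """Return labels at or above the threshold, highest confidence first."""
    return sorted(
        (label for label, score in tags.items() if score >= threshold),
        key=lambda label: -tags[label],
    )

print(confident_labels(amazon_tags))
```

With the 75.0 cutoff, "Weather" and "Silhouette" drop out and twelve labels remain.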

Clarifai
created on 2019-05-29

people 99.8
adult 97.2
group 95.1
man 95
watercraft 93.7
wear 92
vehicle 91.8
home 91.7
family 90.7
room 90
furniture 88.8
building 88.5
one 86.5
woman 85.2
two 85.1
house 84.4
indoors 84.2
group together 83.9
window 80.9
military 80.2

Imagga
created on 2019-05-29

negative 31.9
film 26
barbershop 21.6
shop 20.9
old 18.8
mercantile establishment 17.6
man 17.5
photographic paper 17.2
people 16.7
building 15.1
work 14.9
person 14.4
architecture 14.1
male 12.8
house 11.7
place of business 11.4
photographic equipment 11.4
glass 10.9
working 10.6
modern 10.5
looking 10.4
home 10.4
business 10.3
sky 10.2
vintage 9.9
adult 9.9
technology 9.6
happiness 9.4
senior 9.4
grunge 9.4
two 9.3
space 9.3
newspaper 9.2
blackboard 9
black 9
outdoors 9
worker 8.9
color 8.9
job 8.8
couple 8.7
wall 8.7
light 8.7
antique 8.7
office 8.5
structure 8.3
city 8.3
professional 7.9
love 7.9
design 7.9
art 7.8
roof 7.6
grungy 7.6
pattern 7.5
frame 7.5
holding 7.4
retro 7.4
smiling 7.2
dirty 7.2
portrait 7.1
science 7.1
businessman 7.1
travel 7

Google
created on 2019-05-29

White 96.9
Photograph 96.2
Snapshot 84.7
Black-and-white 80.7
Photography 70.6
Monochrome 70.2
Room 65.7
Architecture 65.5
Window 65
House 59.3
Style 51

Microsoft
created on 2019-05-29

fog 95.6
man 94.8
clothing 89.9
person 87.2
human face 84.6
old 79.1
black and white 72.9
house 56.4

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Female, 60.3%
Happy 3.5%
Disgusted 59%
Surprised 2.1%
Confused 1.9%
Sad 3.4%
Angry 5.6%
Calm 24.5%

AWS Rekognition

Age 20-38
Gender Female, 97.8%
Disgusted 4.6%
Sad 9.1%
Confused 21.6%
Calm 26.5%
Happy 8.7%
Angry 4.5%
Surprised 25%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a man standing in front of a window 76.7%
an old man standing in front of a window 75.5%
a man that is standing in front of a window 72.2%

Text analysis

Amazon

Aep