Human Generated Data

Title

[Store window]

Date

1950s?

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.329.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 99.5
Person 99.5
Nature 99.4
Person 99.1
Outdoors 96.8
Snow 85.1
Person 81.2
Person 79.1
Weather 71
Winter 69.6
Ice 66.6
Fog 62.8
Smoke 57.5
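
The Amazon tags above are label-detection output from AWS Rekognition (name and confidence score per label). As a point of reference, here is a minimal Python sketch of how comparable name/confidence pairs could be generated with boto3, assuming AWS credentials are configured; the image file name is hypothetical:

```python
# Minimal sketch: Rekognition-style label tags for a local image.
# Assumes configured AWS credentials; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("store_window.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # illustrative threshold
)

# Print "Label confidence" pairs, e.g. "Human 99.5"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```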

Clarifai
created on 2019-05-29

people 99.8
adult 97.6
man 96.2
group 96
home 94
family 93
woman 92.1
furniture 91.9
vehicle 91.4
room 90.8
war 86
window 85.6
administration 84.9
indoors 84.8
one 83.3
monochrome 83.1
house 82.5
watercraft 81.3
offense 80.8
many 80.6

Imagga
created on 2019-05-29

man 28.2
people 26.2
person 24.9
home 24.7
newspaper 24
couple 22.6
barbershop 22
room 21.7
male 21.5
shop 20.7
happy 19.4
adult 19.3
smiling 18.8
happiness 18.8
building 17
product 16.1
two 16.1
school 16.1
mercantile establishment 16
groom 15.6
portrait 15.5
house 15
sitting 14.6
family 14.2
office 14
lifestyle 13.7
smile 13.5
creation 12.5
interior 12.4
indoors 12.3
love 11.8
wife 11.4
senior 11.2
men 11.2
old 11.1
place of business 10.6
together 10.5
husband 10.5
hospital 10.4
back 10.1
dress 9.9
holding 9.9
businessman 9.7
structure 9.7
looking 9.6
work 9.4
horizontal 9.2
business 9.1
outdoors 9
classroom 8.9
working 8.8
bride 8.6
day 8.6
togetherness 8.5
relationship 8.4
clinic 8.4
relaxation 8.4
mother 8.3
wedding 8.3
vintage 8.3
fun 8.2
care 8.2
cheerful 8.1
new 8.1
life 8.1
standing 7.8
face 7.8
professional 7.8
world 7.7
outdoor 7.6
health 7.6
worker 7.4
black 7.2
nurse 7.1
women 7.1
patient 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

window 84.7
fog 84.7
clothing 84.1
house 78.7
person 75.7
black and white 54.2
building 53.6
old 53.2
posing 47.3
store 39.9

Color Analysis

Face analysis

AWS Rekognition

Age 45-65
Gender Female, 96.8%
Surprised 2%
Disgusted 76.8%
Angry 2.1%
Sad 2.4%
Happy 4.8%
Calm 8.8%
Confused 3.2%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Disgusted 45.3%
Angry 45.8%
Surprised 45.4%
Happy 46%
Calm 51.5%
Confused 45.3%
Sad 45.6%
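
The two blocks above are AWS Rekognition face-analysis results: an estimated age range, a gender estimate with confidence, and per-emotion confidences. A minimal sketch of how such output could be produced with boto3, again using a hypothetical file name:

```python
# Minimal sketch: Rekognition face analysis (age range, gender, emotions).
# Assumes configured AWS credentials; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("store_window.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required for age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```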

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
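
The Google Vision block above reports per-face likelihood categories (e.g. "Very unlikely") rather than percentages. A minimal sketch using the google-cloud-vision client, assuming its v2+ interface and a hypothetical file name:

```python
# Minimal sketch: Google Cloud Vision face detection, which reports the
# likelihood categories listed above. The file name is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("store_window.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```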

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

NTURE
NTURE SOBLA
SOBLA
LLE
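
The fragments above are consistent with line-level text detection, such as AWS Rekognition's detect_text operation applied to the signage in the photograph. A minimal sketch, with a hypothetical file name:

```python
# Minimal sketch: Rekognition text detection, a plausible source of the
# short text fragments listed above. The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("store_window.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only line-level detections, mirroring the short fragments above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```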