Human Generated Data

Title

[Reflection in shop window]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1009.232

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Window 99.1
Adult 96.5
Female 96.5
Person 96.5
Woman 96.5
Face 93.7
Head 93.7
Person 92
Baby 92
Adult 90.1
Female 90.1
Person 90.1
Woman 90.1
Person 86.1
Tub 83.5
Outdoors 82.8
Pool 74.6
Water 74.6
Bathing 74.1
Person 73.6
Nature 71.3
Porthole 57
Snow 56.2
Bathtub 55.6
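
The Amazon tags above are label/confidence pairs of the kind a detection service such as AWS Rekognition returns. A minimal sketch of filtering such a response by a confidence cutoff, assuming a hand-made stand-in dict shaped like a `detect_labels` result (the values below merely echo a few tags from the list, not a real API call):

```python
# Hypothetical stand-in for an AWS Rekognition detect_labels response;
# the names and confidences echo tags listed above.
response = {
    "Labels": [
        {"Name": "Window", "Confidence": 99.1},
        {"Name": "Person", "Confidence": 96.5},
        {"Name": "Snow", "Confidence": 56.2},
    ]
}

def tags_above(response, min_confidence):
    """Return (name, confidence) pairs at or above a cutoff,
    sorted from most to least confident."""
    labels = [(l["Name"], l["Confidence"]) for l in response["Labels"]]
    return sorted(
        (t for t in labels if t[1] >= min_confidence),
        key=lambda t: -t[1],
    )

print(tags_above(response, 90))  # drops the low-confidence "Snow" tag
```

A cutoff like this is one way a catalog might decide which machine tags to display; tags near 50% (e.g. Snow, Bathtub) are little better than guesses.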

Clarifai
created on 2023-10-15

people 99.8
monochrome 99.5
street 97.2
group 95.5
man 93.9
window 93.7
adult 93.6
woman 92.6
child 92.1
train 87.7
family 87.1
two 86
vintage 85
interaction 84.3
administration 84.1
war 83.8
one 83
group together 82.9
city 82.5
art 82.3

Imagga
created on 2019-02-01

case 40.6
refrigerator 40
white goods 33.2
furniture 31.5
china cabinet 28.9
cabinet 25.9
home appliance 24.9
window 20.6
furnishing 20.4
interior 19.4
home 19.1
appliance 18.4
house 18.4
wall 17.9
architecture 15.6
old 14.6
design 14.6
decoration 14.5
vintage 13.2
glass 12.7
building 12.1
room 12
light 11.3
blackboard 11.2
people 11.1
retro 10.6
business 10.3
grunge 10.2
modern 9.8
family 9.8
black 9.6
bath 9.5
bathroom 9.1
texture 9
digital 8.9
pattern 8.9
incubator 8.9
indoors 8.8
man 8.7
ancient 8.6
child 8.5
equipment 8.5
frame 8.4
durables 8.3
inside 8.3
technology 8.2
dirty 8.1
urban 7.9
art 7.8
antique 7.8
space 7.7
luxury 7.7
clean 7.5
style 7.4
smile 7.1
adult 7.1
apparatus 7.1

Google
created on 2019-02-01

Microsoft
created on 2019-02-01

window 92.7
old 41.7
picture frame 8.4
museum 8.4
black and white 8.1
art 5.6
bus 2.4

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 27-37
Gender Female, 100%
Calm 96.4%
Surprised 6.8%
Fear 6%
Sad 2.4%
Angry 0.5%
Disgusted 0.3%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 29-39
Gender Female, 99.9%
Calm 98.1%
Surprised 6.4%
Fear 5.9%
Sad 2.4%
Angry 0.2%
Happy 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 91.9%
Surprised 6.8%
Fear 6.1%
Sad 3.2%
Happy 1.1%
Confused 0.9%
Angry 0.9%
Disgusted 0.7%

AWS Rekognition

Age 28-38
Gender Female, 100%
Happy 62.3%
Calm 35.5%
Surprised 6.5%
Fear 5.9%
Sad 2.3%
Confused 0.4%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 14-22
Gender Female, 84.5%
Calm 96.8%
Surprised 6.3%
Fear 6%
Sad 2.4%
Confused 1.3%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
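
Each AWS Rekognition face record above lists every emotion with a confidence score; the reported mood is simply the highest-scoring entry. A sketch of extracting that dominant emotion, assuming a hand-made record shaped like one `detect_faces` face detail (values echo the first face above, not a real API response):

```python
# Hypothetical stand-in for one AWS Rekognition face detail;
# values echo the first face analysis listed above.
face = {
    "AgeRange": {"Low": 27, "High": 37},
    "Gender": {"Value": "Female", "Confidence": 100.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 96.4},
        {"Type": "SURPRISED", "Confidence": 6.8},
        {"Type": "FEAR", "Confidence": 6.0},
        {"Type": "SAD", "Confidence": 2.4},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) of the highest-scoring emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 96.4)
```

Note that the per-face confidences need not sum to 100, so the runner-up scores (Surprised, Fear) are not probabilities of alternative moods, just independent scores.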

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Adult 96.5%
Female 96.5%
Person 96.5%
Woman 96.5%
Baby 92%

Categories

Text analysis

Amazon

JAPON
ANS242 JAPON
are
ANS242