Human Generated Data

Title

[Reflection in shop window]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1009.255

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Person 97.5
Photography 97.1
Person 96.7
Person 95
Adult 95
Female 95
Woman 95
Indoors 92.8
Machine 92
Wheel 92
Person 91.5
Adult 91.5
Adult 91.5
Female 91.5
Female 91.5
Woman 91.5
Bride 91.5
Wedding 91.5
Face 91.4
Head 91.4
Person 90
Adult 90
Adult 90
Female 90
Female 90
Woman 90
Bride 90
Computer Hardware 75
Electronics 75
Hardware 75
Monitor 75
Screen 75
Person 72.1
Person 70.7
Person 68.9
Portrait 67.9
Urban 64.2
Dressing Room 57.6
Room 57.6
Restaurant 57.4
Shop 57.2
Window 56.3
Art 55.9
Collage 55.9
Clothing 55.5
Swimwear 55.5
City 55.4
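
The Amazon labels above pair each tag with a confidence score, in the style of Amazon Rekognition's DetectLabels API. A minimal sketch of how comparable tags could be produced with boto3 follows; the bucket name, object key, and region are hypothetical, and configured AWS credentials are assumed:

    import boto3

    # Rekognition client; the region is an assumption, adjust to your setup.
    client = boto3.client("rekognition", region_name="us-east-1")

    # DetectLabels returns label names with confidence percentages,
    # matching the "Label 97.5" style of the tag list above.
    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.1009.255.jpg"}},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score in the list above is about 55
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")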

Clarifai
created on 2023-10-15

people 99.9
monochrome 99.7
group 98.8
man 95
group together 95
many 94.6
woman 93.9
several 93.2
adult 92.7
leader 91.2
two 91.1
street 90.9
child 90.6
commerce 90.4
movie 86.9
administration 86.1
interaction 81.6
three 81
four 80.9
family 80.2

Imagga
created on 2019-02-01

newspaper 37.8
product 29
creation 22.5
people 19.5
kin 18.9
shop 18.8
man 18.8
stall 17.3
negative 16.4
black 16.2
window 15.1
film 15
barbershop 14.7
person 13.5
couple 13.1
mercantile establishment 12.7
grandma 12.6
old 12.5
groom 12.1
counter 11.9
night 11.5
happy 11.3
love 11
portrait 11
family 10.7
male 10.6
bride 10.5
business 10.3
photographic paper 10.1
vintage 9.9
home 9.6
smiling 9.4
happiness 9.4
adult 9.3
glass 9.3
mother 9.2
dark 9.2
religion 9
hair 8.7
antique 8.6
ancient 8.6
building 8.6
smile 8.5
place of business 8.5
art 8.4
senior 8.4
house 8.3
hand 8.3
light 8
day 7.8
money 7.6
marriage 7.6
human 7.5
wedding 7.3
girls 7.3
decoration 7.2
dress 7.2
history 7.1
face 7.1
interior 7.1
architecture 7

Google
created on 2019-02-01

(no tags recorded)

Microsoft
created on 2019-02-01

(no tags recorded)

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 98.6%
Calm 92%
Surprised 6.5%
Fear 6%
Sad 4.2%
Confused 1.1%
Angry 0.8%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 16-24
Gender Male, 86.5%
Calm 97.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 2.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 18-24
Gender Female, 88.5%
Calm 99%
Surprised 6.3%
Fear 6%
Sad 2.2%
Happy 0.2%
Disgusted 0.1%
Angry 0%
Confused 0%

AWS Rekognition

Age 23-33
Gender Female, 60.4%
Calm 68.3%
Happy 22.7%
Surprised 6.9%
Fear 6.7%
Sad 3.4%
Angry 1.3%
Disgusted 0.8%
Confused 0.4%
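
The four AWS Rekognition entries above are per-face results of the kind returned by the DetectFaces API with all attributes enabled: an age range, a gender estimate with confidence, and a ranked list of emotions. A minimal sketch, assuming the same hypothetical bucket and key as above:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    # Attributes=["ALL"] requests AgeRange, Gender, Emotions, and other facial attributes.
    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.1009.255.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back with confidences; sort descending to mirror the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")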

Microsoft Cognitive Services

Age 32
Gender Female

Feature analysis

Amazon

Person 97.5%
Adult 95%
Female 95%
Woman 95%
Wheel 92%
Bride 91.5%
Monitor 75%

Categories

Imagga

paintings art 46.3%
interior objects 38.1%
food drinks 12.8%
text visuals 2.1%

Text analysis

Amazon

-
I
nitsperd
times
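
The strings above are fragments of text detected within the photograph itself, left as recorded since the lettering in a reflected shop window reads poorly. Output of this shape is what Amazon Rekognition's DetectText API returns; a minimal sketch under the same hypothetical setup as above:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    response = client.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.1009.255.jpg"}}
    )

    # LINE detections give whole strings; WORD detections give individual tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])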