Human Generated Data

Title

[Hats and mannequins in shop window, New York]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1006.71

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-24

Person 97.6
Person 96.2
Person 96.2
Person 95.8
Baby 95.8
Face 95.2
Head 95.2
Person 93.3
Baby 93.3
Shop 93.2
Person 92.7
Person 92.1
Baby 92.1
Person 90.7
Adult 90.7
Male 90.7
Man 90.7
Window Display 81.8
Animal 74.4
Bird 74.4
Person 67
Indoors 65.1
Bathroom 57.2
Room 57.2
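The label/confidence pairs above match the shape of AWS Rekognition DetectLabels output (the same label can appear once per detected instance). As a minimal illustrative sketch, not the museum's actual pipeline: `LABELS` is transcribed from a few rows of the list above, and `top_labels` is a hypothetical helper for collapsing duplicates and filtering by confidence.

```python
# Hypothetical sketch: collapse duplicate label names (keeping the best score)
# and keep only labels at or above a confidence threshold.
LABELS = [
    ("Person", 97.6), ("Person", 96.2), ("Baby", 95.8), ("Face", 95.2),
    ("Shop", 93.2), ("Window Display", 81.8), ("Bird", 74.4),
    ("Person", 67.0), ("Indoors", 65.1), ("Bathroom", 57.2),
]

def top_labels(labels, min_conf=80.0):
    """Return unique label names at or above min_conf, best score first."""
    best = {}
    for name, conf in labels:
        best[name] = max(conf, best.get(name, 0.0))
    kept = [(n, c) for n, c in best.items() if c >= min_conf]
    return [n for n, _ in sorted(kept, key=lambda x: -x[1])]

print(top_labels(LABELS))
# → ['Person', 'Baby', 'Face', 'Shop', 'Window Display']
```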

Clarifai
created on 2023-10-14

monochrome 99.3
people 96.8
vintage 95.7
art 94
street 94
window 91.8
no person 90.1
light 89.1
old 88.7
retro 87.2
sepia 86.6
man 86.5
black and white 84.8
analogue 84.2
color 82.8
room 82.8
family 82.4
collage 82.2
coffee 81.3
decay 81

Imagga
created on 2019-02-03

interior 41.6
furniture 40.1
kitchen 37.3
espresso maker 36.9
table 31.1
room 30.2
home 28.7
coffee maker 28.7
kitchen appliance 28.5
luxury 24
modern 23.8
house 23.4
decor 22.1
decoration 21.7
glass 21
home appliance 20.5
pot 20.5
restaurant 20
vessel 20
coffeepot 19.9
cabinet 18.6
lamp 17.7
container 17.1
design 16.9
stove 16.8
light 16.7
inside 16.6
style 16.3
appliance 16.3
wood 15.8
furnishing 15.6
dining 15.2
indoors 14.9
architecture 14.8
window 14.8
chair 14.8
food 14.7
sink 14.6
elegant 14.6
counter 13.6
coffee 13.1
cooking 13.1
dinner 12.9
washbasin 12.9
wall 12.8
cooking utensil 12.4
oven 11.8
wooden 11.4
medicine chest 11.3
basin 11.3
floor 11.2
building 11.1
cook 11
indoor 10.9
drink 10.9
vase 10.8
apartment 10.5
living 10.4
plate 10.2
seat 10.2
case 10
reflection 9.7
stainless 9.7
hotel 9.5
lifestyle 9.4
nobody 9.3
black 9
faucet 8.9
steel 8.8
tablecloth 8.8
comfortable 8.6
empty 8.6
mirror 8.6
estate 8.5
cup 8.4
toaster 8.4
elegance 8.4
clean 8.3
kitchen utensil 8.2
new 8.1
brown 8.1
real 7.6
rest 7.4
close 7.4
sconce 7.4
service 7.4
bar 7.4
domestic 7.2
bracket 7.1

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.2%
Calm 96.9%
Surprised 7.1%
Fear 6%
Sad 2.3%
Angry 0.5%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 16-24
Gender Female, 99.6%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Calm 98.7%
Surprised 6.6%
Fear 5.9%
Sad 2.3%
Angry 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 16-22
Gender Female, 66.5%
Calm 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.4%
Confused 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 11-19
Gender Female, 100%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 18-26
Gender Female, 100%
Calm 85.8%
Surprised 6.7%
Fear 6.6%
Sad 4.8%
Confused 1.8%
Angry 1.5%
Happy 1.3%
Disgusted 1%

AWS Rekognition

Age 10-18
Gender Female, 58.8%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 30-40
Gender Female, 99.6%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 23-33
Gender Female, 99.8%
Calm 96.7%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Confused 1.8%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
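Each AWS Rekognition block above reports every emotion with its own confidence, so the percentages need not sum to 100%. A minimal sketch (illustrative helper, not part of the Rekognition API) of picking the dominant emotion from one such block, using the first face's values transcribed from above:

```python
# Hypothetical sketch: pick the highest-confidence emotion from one face record.
FACE_1 = {"Calm": 96.9, "Surprised": 7.1, "Fear": 6.0, "Sad": 2.3,
          "Angry": 0.5, "Confused": 0.2, "Happy": 0.1, "Disgusted": 0.1}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    name = max(emotions, key=emotions.get)
    return name, emotions[name]

print(dominant_emotion(FACE_1))  # → ('Calm', 96.9)
```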

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
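Unlike the percentage scores above, Google Vision face detection reports bucketed likelihoods. A sketch of comparing those buckets, assuming the ordering of the API's Likelihood enum; the `at_least` helper is illustrative, not part of the Vision API:

```python
# Hypothetical sketch: ordinal comparison of Google Vision likelihood buckets.
LIKELIHOODS = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, threshold="Likely"):
    """True if a bucketed likelihood meets or exceeds the threshold bucket."""
    return LIKELIHOODS.index(value) >= LIKELIHOODS.index(threshold)

print(at_least("Very likely"))    # → True  (e.g. Headwear on two faces above)
print(at_least("Very unlikely"))  # → False
```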

Feature analysis

Amazon

Person 97.6%
Baby 95.8%
Adult 90.7%
Male 90.7%
Man 90.7%
Bird 74.4%

Categories

Imagga

interior objects 99%

Text analysis

Amazon

SALE

Google

SALE
SALE
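OCR services often return the same string more than once (for example, a full-text annotation plus per-word detections), which may be why "SALE" appears twice under Google. A sketch of an order-preserving, case-insensitive dedupe; the helper is illustrative:

```python
# Hypothetical sketch: drop repeated OCR detections while preserving order.
def dedupe(detections):
    """Return detections with case-insensitive duplicates removed."""
    seen = set()
    out = []
    for text in detections:
        key = text.strip().upper()
        if key not in seen:
            seen.add(key)
            out.append(text)
    return out

print(dedupe(["SALE", "SALE"]))  # → ['SALE']
```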