Human Generated Data

Title

[Woman standing before female mannequins in shop window, New England]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1005.155

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-23

Photography 99.4
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.5
Adult 98.5
Person 98.5
Bride 98.5
Female 98.5
Female 98.5
Wedding 98.5
Woman 98.5
Adult 96.4
Male 96.4
Man 96.4
Person 96.4
Adult 94.4
Male 94.4
Man 94.4
Person 94.4
Person 87.2
Face 85.7
Head 85.7
Portrait 80.3
Door 78.4
Clothing 75
Shirt 75
Adult 69
Person 69
Female 69
Woman 69
Computer Hardware 62.6
Electronics 62.6
Hardware 62.6
Monitor 62.6
Screen 62.6
Barbershop 60.6
Indoors 60.6
Shop 60.6
Window 60.2
Outdoors 58
Hairdresser 57.3
Transportation 56.9
Vehicle 56.9
Architecture 56.3
Building 56.3
Shelter 56.3
Urban 56.1
Home Decor 55.7
Cinema 55.6
Mirror 55.5
Coat 55.3

Clarifai
created on 2023-10-15

people 99.4
monochrome 98
street 97.7
window 96.6
man 95
vehicle window 95
child 93.7
woman 91.9
adult 88.4
train 87.9
wait 87.2
sit 87.2
family 87.1
indoors 85.9
rain 85.1
boy 82.8
administration 82.2
locomotive 81.2
two 81.1
portrait 79.1

Imagga
created on 2019-02-03

barbershop 100
shop 100
mercantile establishment 93.6
place of business 62.4
establishment 31.2
man 27.5
people 20.6
adult 16.2
male 14.9
smiling 14.5
person 14.4
lifestyle 13
business 12.7
happy 12.5
couple 12.2
indoors 11.4
happiness 11
window 10.5
black 10.2
work 10.2
smile 10
old 9.7
portrait 9.7
technology 9.6
looking 9.6
home 9.6
love 9.5
sitting 9.4
men 9.4
room 9.1
family 8.9
patient 8.9
working 8.8
hospital 8.7
office 8.7
glass 8.6
health 8.3
restaurant 8.2
worker 8.2
interior 8
medical 7.9
women 7.9
face 7.8
casual 7.6
illness 7.6
senior 7.5
equipment 7.4
holding 7.4
back 7.3
job 7.1
modern 7

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

person 98.1
man 95.4
window 89.2
street 89.2
black and white 72.7
monochrome 18.5
bus 16.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 28-38
Gender Female, 99.7%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%

AWS Rekognition

Age 4-12
Gender Female, 100%
Calm 90.1%
Surprised 6.4%
Sad 6%
Fear 5.9%
Happy 0.7%
Confused 0.5%
Angry 0.5%
Disgusted 0.2%

AWS Rekognition

Age 18-24
Gender Female, 100%
Calm 96.8%
Surprised 7.1%
Fear 6%
Sad 2.3%
Angry 0.3%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Bride 98.5%
Female 98.5%
Woman 98.5%

Categories

Imagga

interior objects 99.6%

Text analysis

Amazon

August
Coals
and
the
and Suils
Deduct
Suils
Tall Coals
Deduct 20%
The
Tall
20%
August Sale
a The
= the las
guss
a
=
Sale
las