Human Generated Data

Title

[Lux and Jeanne Feininger in front of MoMA, New York]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1006.217

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-24

City 100
Urban 99.3
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Person 98.9
Nature 97.8
Outdoors 97.8
Weather 97.8
Clothing 97.3
Coat 97.3
Architecture 97
Building 97
Office Building 97
Face 92.2
Head 92.2
Car 78.1
Transportation 78.1
Vehicle 78.1
Metropolis 77.2
Jacket 74.8
Smoke 74.2
Shelter 73.8
House 68.7
Housing 68.7
Staircase 68.7
Fog 64.8
Smog 62.8
People 57.7
Condo 57.7
Snow 57.5
Handrail 57.4
Window 57
Road 56.6
Street 56.6
High Rise 56.6
Awning 56.6
Canopy 56.6
Monastery 56.6
Brick 56.6
Neighborhood 56.5
Photography 56.4
Cinema 55.8
Blazer 55.6
Winter 55.2

Clarifai
created on 2023-10-15

people 99.8
monochrome 99.7
street 98.2
adult 97.6
man 97.5
woman 97.2
portrait 95.9
two 94.9
child 94.2
window 94.1
group 93.1
group together 91
architecture 86.9
one 86.7
administration 86.4
wear 86.3
city 86.1
transportation system 85.7
retro 84.9
girl 84

Imagga
created on 2019-02-03

grand piano 44.8
piano 35.9
building 33.6
architecture 31.1
window 28.8
office 26.8
percussion instrument 26.5
stringed instrument 26.5
keyboard instrument 26.4
business 20.6
musical instrument 18.8
urban 18.4
cinema 18.1
glass 17.9
city 17.5
modern 16.8
structure 15.9
people 15.6
theater 14.7
room 13.8
male 12.1
black 12
finance 11.8
house 11.7
interior 11.5
man 11.4
indoors 11.4
home 11.2
old 11.1
silhouette 10.8
light 10.7
construction 10.3
day 10.2
lifestyle 10.1
indoor 10
framework 9.7
businessman 9.7
groom 9.6
buildings 9.5
corporate 9.5
travel 9.2
life 8.9
meeting 8.5
television 8.5
adult 8.4
reflection 8.4
exterior 8.3
hall 8.2
laptop 8.2
person 8.1
computer 8
hair 7.9
love 7.9
sky 7.7
roof 7.6
supporting structure 7.5
inside 7.4
art 7.3
success 7.2
metal 7.2
color 7.2
transportation 7.2
financial 7.1
women 7.1
wall 7.1
steel 7.1
work 7.1

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 98%
Happy 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Female, 100%
Happy 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.2%
Angry 0.1%
Confused 0%
Disgusted 0%

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Car 78.1%

Text analysis

Amazon

HARTLEY
FEININGER
ART
N

Google

FEININCER HARTLEY
FEININCER
HARTLEY