Human Generated Data

Title

[Lux and Jeanne Feininger in front of MoMA, New York]

Date

1940s–1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1006.216

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-24

Clothing 100
Coat 100
Photography 99.9
Architecture 99.8
Building 99.8
Outdoors 99.8
Shelter 99.8
City 99.7
Road 99.3
Street 99.3
Urban 99.3
Person 99.2
Person 99.1
Person 96.5
Person 95.9
Adult 95.9
Male 95.9
Man 95.9
People 90.4
Person 89.5
Jacket 88.6
Formal Wear 87.6
Suit 87.6
Face 87.2
Head 87.2
Person 86.9
Person 81.4
Car 80.5
Transportation 80.5
Vehicle 80.5
Portrait 78.1
Indoors 72.6
Restaurant 72.6
Person 72.1
Nature 69.8
Neighborhood 68.9
Alcohol 57.4
Beverage 57.4
Smoke 57.2
Metropolis 56.9
Diner 56.9
Food 56.9
Beer 56.5
Hat 56.5
House 56.5
Housing 56.5
Staircase 56.5
Monastery 56.1
Electronics 56
Phone 56
Awning 55.6
Canopy 55.6
Bus Stop 55.6
Alley 55.6
Snow 55.5
Overcoat 55.4
Camera 55.4
Walking 55.2
Blazer 55.1
Photographer 55
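
The label/score pairs above match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how a comparable tag list could be produced with boto3 follows; the S3 bucket, object key, and the 55% confidence floor (inferred from the lowest score in the list) are assumptions, not details from the record.

import boto3

# Hypothetical S3 location for the digitized photograph; not part of the record.
BUCKET = "museum-images"
KEY = "BRLF.1006.216.jpg"

client = boto3.client("rekognition")

# MinConfidence=55 is an assumption inferred from the lowest score listed above.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MinConfidence=55,
)

# Emit "Label score" pairs in the same shape as the tag list.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")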

Clarifai
created on 2023-10-15

people 99.9
monochrome 99.7
adult 98.4
street 98.2
group together 98
man 97.1
woman 96.7
group 96.3
transportation system 95
vehicle 94.9
two 94.2
child 90.9
three 88.1
retro 87.1
portrait 87
several 86
vintage 84.5
four 84.4
one 81.3
boy 79.2
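
Clarifai's general model returns concepts scored 0–1, which the list above shows scaled to percentages. Below is a hedged sketch against Clarifai's public v2 predict REST endpoint; the model ID, access token, and image URL are placeholders, and the endpoint shape is an assumption based on Clarifai's published API rather than anything stated in the record.

import requests

# Placeholders: the record does not say which Clarifai model or credentials were used.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"  # assumed general model
IMAGE_URL = "https://example.org/BRLF.1006.216.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 value; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")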

Imagga
created on 2019-02-03

building 29.7
shop 22.8
structure 22.4
architecture 20.3
mercantile establishment 17
city 15.8
barbershop 15.7
man 14.8
people 14.5
black 14.4
restaurant 14.1
urban 14
street 12.9
house 11.7
office 11.7
working 11.5
place of business 11.4
old 11.1
billboard 11.1
cinema 11.1
business 10.9
transportation 10.8
light 10.7
car 10.6
glass 10.3
sky 10.2
public house 10
male 9.9
travel 9.9
wheeled vehicle 9.7
wagon 9.6
smoke 9.3
theater 9.2
vintage 9.1
passenger 8.9
buildings 8.5
room 8.4
vehicle 8.4
television 8.4
person 8.2
night 8
home 8
day 7.8
adult 7.8
power 7.6
signboard 7.4
road 7.2
looking 7.2
job 7.1
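
Imagga's tagging endpoint returns a comparable tag/confidence list. A sketch using its v2 REST API over HTTP basic auth; the credentials and image URL are placeholders, not details from the record.

import requests

# Placeholder credentials; Imagga authenticates with an API key/secret pair.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/BRLF.1006.216.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry holds a confidence and a localized tag; "en" selects English.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")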

Microsoft
created on 2019-02-03

window 93
old 82.7
posing 37.2
black and white 37.2
street 13.2
monochrome 7.3
person 7.2
music 5.5
people 5.4

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 99.4%
Happy 94.4%
Surprised 6.5%
Fear 6.1%
Calm 2.5%
Sad 2.3%
Angry 0.5%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Calm 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.5%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 74.9%
Calm 90.8%
Surprised 6.5%
Fear 6.1%
Sad 2.8%
Happy 2.4%
Angry 2%
Confused 0.9%
Disgusted 0.7%

AWS Rekognition

Age 12-20
Gender Male, 73.3%
Calm 88.7%
Surprised 6.6%
Fear 6.1%
Sad 3.6%
Angry 3.5%
Confused 1.7%
Happy 0.7%
Disgusted 0.6%
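
The four blocks above have the shape of Rekognition's DetectFaces output with all attributes enabled: an estimated age range, a gender guess with confidence, and a ranked emotion list per detected face. A minimal sketch follows, reusing the assumed S3 placeholder from the label example.

import boto3

client = boto3.client("rekognition")

# Attributes=["ALL"] is required for age, gender, and emotion estimates.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "BRLF.1006.216.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the ranked lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")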

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely
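
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why the two blocks above run from "Very unlikely" to "Very likely". A sketch with the google-cloud-vision client; the local file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path; the record does not say how the image was supplied.
with open("BRLF.1006.216.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)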

Feature analysis

Amazon

Person 99.2%
Adult 95.9%
Male 95.9%
Man 95.9%
Car 80.5%

Categories

Imagga

interior objects 39.1%
paintings art 28.4%
text visuals 23.5%
food drinks 7.1%

Text analysis

Amazon

FEININGER
HARTLEY

Google

FEININCER HARTLEY
FEININCER
HARTLEY
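
The OCR output above (note Google's "FEININCER" misreading of the FEININGER signage) matches the shape of Rekognition's DetectText and Google Vision's text detection results. A sketch of the Rekognition side, again with the assumed S3 placeholder.

import boto3

client = boto3.client("rekognition")

response = client.detect_text(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "BRLF.1006.216.jpg"}}
)

# LINE detections are whole lines; WORD detections are individual tokens,
# which is why the Google list above repeats the words separately.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])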