Human Generated Data

Title

[Charles Kuhn and Julia Feininger, Busch-Reisinger Museum Library, Adolphus Busch Hall, Harvard University, Cambridge, Massachusetts]

Date

1949-1950

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.596.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Person 99.3
Human 99.3
Person 95.3
Clothing 95.3
Apparel 95.3
Person 84.9
Fashion 74.6
Robe 74.6
Evening Dress 74.6
Gown 74.6
Nature 74.6
Outdoors 73.1
Female 63.5
Corridor 57.8
Crypt 55.7
Woman 55.1

Clarifai
created on 2019-11-20

people 99.8
adult 98.9
one 96.9
indoors 96.6
group 95.7
man 95.4
woman 95.1
room 94.9
wear 93.6
two 93.3
furniture 92.3
vehicle 91.3
group together 88.7
monochrome 87.6
veil 86
administration 82.9
home 82.5
transportation system 81.6
leader 79.4
war 78

Imagga
created on 2019-11-20

architecture 31
building 24.4
city 22.4
old 18.1
travel 17.6
tourism 16.5
street 15.6
urban 14.9
window 13.7
nurse 12
modern 11.9
transport 11.9
arch 11.6
business 11.5
interior 11.5
ancient 11.2
people 11.2
stretcher 11.1
stone 11
transportation 10.8
town 10.2
sky 10.2
house 10.1
door 9.6
construction 9.4
device 9.1
history 8.9
litter 8.9
patient 8.6
work 8.6
church 8.3
conveyance 8.3
historic 8.2
industrial 8.2
to 8
indoors 7.9
steps 7.8
scene 7.8
wall 7.7
world 7.5
passenger 7.4
light 7.3
speed 7.3
design 7.3
stairs 7.2
home 7.2
religion 7.2
room 7.1
negative 7
film 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

indoor 93.5
black and white 73.1
text 72.7
clothing 65.6
person 60.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 41-59
Gender Male, 50.3%
Calm 46.5%
Disgusted 45%
Surprised 46.4%
Angry 45.3%
Fear 45%
Confused 45.4%
Sad 45.4%
Happy 51%

AWS Rekognition

Age 4-14
Gender Male, 50.1%
Disgusted 45.1%
Sad 46.3%
Angry 51%
Confused 46.3%
Fear 45.4%
Happy 45.1%
Surprised 45.2%
Calm 45.5%

AWS Rekognition

Age 50-68
Gender Female, 52%
Angry 45.1%
Confused 45.1%
Fear 45.1%
Surprised 45.1%
Disgusted 45%
Calm 46%
Sad 46.2%
Happy 52.4%

Feature analysis

Amazon

Person 99.3%

Categories