Human Generated Data

Title

[Connecticut]

Date

late 1930s-1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.118.30

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-12

Architecture 99.6
Building 99.6
Countryside 99.6
Hut 99.6
Nature 99.6
Outdoors 99.6
Rural 99.6
Adult 98.4
Female 98.4
Person 98.4
Woman 98.4
Shelter 98.4
Person 97.8
Adult 97.7
Person 97.7
Male 97.7
Man 97.7
Person 97.7
Person 97.5
Person 97.5
Grass 97.1
Plant 97.1
Chair 96.3
Furniture 96.3
Indoors 95.6
Restaurant 95.6
Adult 95.3
Person 95.3
Male 95.3
Man 95.3
Chair 94.4
Person 94
Adult 93.9
Person 93.9
Male 93.9
Man 93.9
Person 93.3
Person 93.1
Chair 86.3
Housing 82.8
Chair 80
Chair 79.8
Person 78.6
Person 78.4
Face 77.2
Head 77.2
Chair 74.9
Shack 72.4
Chair 71.1
House 69.9
Dining Table 66.6
Table 66.6
Cafeteria 57.6
Clothing 57
Hat 57
Yard 56.8
People 56.8
Cafe 56.5
Neighborhood 55.4
Lawn 55.2
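
The Amazon labels above pair each tag with a percentage confidence score. The record does not document how they were produced beyond naming the service and date, but a minimal sketch, assuming the boto3 SDK and a hypothetical local copy of the image file, of requesting comparable labels from AWS Rekognition would look like this:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical filename; the record does not include an image path.
with open("brlf_118_30.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # the lowest confidence listed above is 55.2
    )

# Print "Label confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

Rekognition returns labels with percentage confidences, which is consistent with the descending order of the list above.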

Clarifai
created on 2018-08-23

people 100
group together 99.5
group 98.9
furniture 98.9
adult 98.7
seat 97.9
many 97.8
several 97.1
monochrome 95.1
sit 94.8
man 94.7
chair 94.6
child 91.4
woman 90.9
four 90.1
bench 89.6
administration 88.7
one 88.5
vehicle 87.9
leader 87.8

Imagga
created on 2018-08-23

chair 74.4
seat 64.1
bench 29.8
furniture 26.2
park bench 20.4
shopping cart 18.9
empty 16.3
old 16
park 15.6
tree 15.4
handcart 15.4
rocking chair 14.3
grass 14.2
support 14.1
summer 13.5
relax 13.5
room 13.4
sky 13.4
house 13.4
outdoors 12.7
trees 12.4
table 12.2
outdoor 12.2
barber chair 12.1
wheeled vehicle 12.1
sitting 12
wood 11.7
upright 11.2
landscape 11.2
barbershop 11.1
furnishing 11.1
device 10.9
rural 10.6
travel 10.6
scene 10.4
day 10.2
water 10
building 10
winter 9.4
shop 9.4
snow 9.3
relaxation 9.2
piano 9.2
sun 8.9
container 8.7
sea 8.6
architecture 8.6
outside 8.6
lawn 8.5
field 8.4
man 8.1
lifestyle 7.9
classroom 7.9
autumn 7.9
urban 7.9
keyboard instrument 7.9
forest 7.8
sunny 7.7
death 7.7
grunge 7.7
fence 7.6
style 7.4
musical instrument 7.4
light 7.3
percussion instrument 7.3
road 7.2
stringed instrument 7.2
holiday 7.2
scenic 7

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

old 46.9
chair 43
furniture 23.3
dining table 7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 91.5%
Calm 83.5%
Fear 7.5%
Surprised 7%
Sad 5.9%
Disgusted 1.2%
Angry 1%
Confused 1%
Happy 0.5%

AWS Rekognition

Age 21-29
Gender Male, 64.7%
Calm 88.9%
Surprised 7%
Fear 6.2%
Happy 5.3%
Sad 2.4%
Disgusted 1.8%
Confused 0.5%
Angry 0.4%

AWS Rekognition

Age 16-24
Gender Female, 62.6%
Calm 80.7%
Sad 8.8%
Fear 7.1%
Surprised 6.4%
Happy 2.1%
Angry 1.3%
Disgusted 1.1%
Confused 1%

AWS Rekognition

Age 6-12
Gender Female, 88.6%
Surprised 68.1%
Calm 39.9%
Sad 7.2%
Disgusted 6.7%
Fear 6.5%
Confused 1.2%
Angry 0.7%
Happy 0.7%

AWS Rekognition

Age 22-30
Gender Male, 87.1%
Calm 71.1%
Happy 18%
Surprised 7.6%
Fear 6.4%
Sad 2.8%
Disgusted 2.6%
Confused 1.6%
Angry 0.8%
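
Each face block above reports an estimated age range, a gender guess with its confidence, and per-emotion scores. A minimal sketch, assuming boto3 and the same hypothetical image file, of requesting this face data from AWS Rekognition:

import boto3

rekognition = boto3.client("rekognition")

with open("brlf_118_30.jpg", "rb") as image_file:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Sort emotions by descending confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")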

Feature analysis

Amazon

Adult 98.4%
Female 98.4%
Person 98.4%
Woman 98.4%
Male 97.7%
Man 97.7%
Chair 96.3%