Human Generated Data

Title

[Sun through clouds]

Date

1945-1955

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.489.22

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Furniture 99.8
Human 96.4
Person 96.4
Text 85.5
Shoe 83.8
Apparel 83.8
Clothing 83.8
Footwear 83.8
Chair 82.4
Sitting 79.7
Flooring 67.9
Armchair 59
Reading 57.1
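
The Amazon tags above match the response shape of AWS Rekognition's DetectLabels API, which returns label names with confidence percentages. A minimal sketch of how such a tag list could be rendered from a Rekognition-style response; the sample response dict here is illustrative, not the actual API output for this photograph:

```python
# Sketch: formatting a Rekognition-style DetectLabels response into
# "Name Confidence" lines like the tag list above.
# The sample response below is illustrative, not real API output.

def format_labels(response):
    """Render each detected label as 'Name Confidence', rounded to one decimal."""
    lines = []
    for label in response.get("Labels", []):
        conf = round(label["Confidence"], 1)
        # Drop a trailing .0 the way the listing above does (e.g. "Armchair 59")
        conf_str = str(int(conf)) if conf == int(conf) else str(conf)
        lines.append(f"{label['Name']} {conf_str}")
    return lines

sample = {
    "Labels": [
        {"Name": "Furniture", "Confidence": 99.83},
        {"Name": "Person", "Confidence": 96.42},
        {"Name": "Armchair", "Confidence": 59.01},
    ]
}
for line in format_labels(sample):
    print(line)
```

In a live setting the response would come from `boto3.client("rekognition").detect_labels(Image=...)`; only the formatting step is shown here.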

Clarifai
created on 2021-04-05

people 99.7
monochrome 98.7
chair 98.5
one 97.5
furniture 97.1
man 95.8
adult 95.8
sit 94
seat 93.2
portrait 92.7
woman 90.8
indoors 87.4
reclining 87.3
book series 86.4
art 84.3
girl 84
wear 83.7
child 83.2
desk 82.2
education 82.1

Imagga
created on 2021-04-05

seat 100
folding chair 100
chair 100
furniture 78.8
furnishing 35.3
sitting 31.8
home 23.9
rocking chair 19.7
people 19.5
lifestyle 18.8
adult 18.7
person 17.6
relax 16.8
relaxation 15.9
indoors 15.8
armchair 15.7
room 15.5
relaxing 15.5
man 15.5
male 14.2
sit 14.2
leisure 14.1
laptop 13.8
casual 13.6
happy 13.2
computer 12.9
outdoors 12.7
interior 12.4
senior 12.2
technology 11.9
smiling 11.6
wooden 11.4
summer 10.9
chairs 10.8
retired 10.7
elderly 10.5
one 10.4
business 10.3
day 10.2
rest 10.1
alone 10
indoor 10
wood 10
portrait 9.7
work 9.4
happiness 9.4
mature 9.3
beach 9.3
smile 9.3
pretty 9.1
health 9
vacation 9
table 9
looking 8.8
attractive 8.4
old 8.4
cheerful 8.1
lady 8.1
office 8
support 8
working 7.9
medical 7.9
women 7.9
face 7.8
color 7.8
nobody 7.8
retirement 7.7
comfortable 7.6
resting 7.6
bed 7.6
relaxed 7.5
resort 7.5
domestic 7.2
blond 7.2
recreation 7.2
holiday 7.2
hair 7.1
grass 7.1
worker 7.1
modern 7

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

furniture 97.9
person 90.9
table 83.6
black and white 76.9
seat 55.4
chair 46.4

Face analysis


AWS Rekognition

Age 23-35
Gender Female, 76.1%
Happy 56.8%
Calm 41.6%
Sad 0.6%
Surprised 0.4%
Angry 0.3%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
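
The age range, gender, and emotion scores above follow the FaceDetail structure returned by AWS Rekognition's DetectFaces API (typically called with `Attributes=["ALL"]`). A minimal sketch of rendering such a FaceDetail into the lines shown; the sample detail is illustrative, not the actual API output for this photograph:

```python
# Sketch: rendering an AWS Rekognition DetectFaces FaceDetail into lines
# like the face-analysis listing above. Sample data is illustrative only.

def format_face(detail):
    """Render age range, gender, and emotions sorted by descending confidence."""
    lines = [f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}"]
    gender = detail["Gender"]
    lines.append(f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%")
    # Rekognition reports emotion types in upper case (e.g. "HAPPY")
    for emotion in sorted(detail["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emotion['Type'].capitalize()} {round(emotion['Confidence'], 1)}%")
    return lines

sample_face = {
    "AgeRange": {"Low": 23, "High": 35},
    "Gender": {"Value": "Female", "Confidence": 76.12},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 56.81},
        {"Type": "CALM", "Confidence": 41.63},
        {"Type": "SAD", "Confidence": 0.62},
    ],
}
for line in format_face(sample_face):
    print(line)
```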

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.4%
Shoe 83.8%
Chair 82.4%

Captions

Microsoft

a person sitting on a chair 82.4%
a person sitting in a chair 82.3%
a person sitting on a chair 82.2%