Human Generated Data

Title

[Lyonel Feininger in Deep]

Date

1932

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.355.8

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Furniture 98.8
Person 97.1
Human 97.1
Apparel 92.1
Clothing 92.1
Chair 86.9
Face 81.6
Sitting 77.3
Couch 75.6
Photo 63.4
Portrait 63.4
Photography 63.4
Shoe 55.4
Footwear 55.4

Clarifai
created on 2019-05-29

people 99.9
one 99.3
adult 98.9
man 98.4
wear 97.1
two 96.9
war 92.8
military 92.7
administration 89.5
reclining 87.2
soldier 86.2
woman 86.2
portrait 85.9
blanket 84.2
group 84.2
vehicle 83.8
transportation system 83.2
furniture 80.7
child 80.2
basket 79

Imagga
created on 2019-05-29

person 27.9
man 26.2
adult 25.5
sitting 24
people 24
male 23.9
lifestyle 23.1
child 19.9
happy 18.2
attractive 17.5
sofa 15.9
holding 14.9
smiling 14.5
cute 14.3
portrait 14.2
outdoors 14.2
wicker 13.8
fashion 13.6
home 13.6
couch 13.5
women 13.4
love 13.4
couple 13.1
lady 13
human 12.7
room 12.3
face 12.1
clothing 12
pretty 11.9
chair 11.5
one 11.2
casual 11
work 10.9
armchair 10.8
cheerful 10.6
old 10.4
looking 10.4
furniture 10.2
happiness 10.2
model 10.1
relax 10.1
relaxation 10
relaxing 10
park 9.9
family 9.8
kid 9.7
together 9.6
sexy 9.6
expression 9.4
senior 9.4
product 9.3
guy 9.3
summer 9
seat 8.8
indoors 8.8
hair 8.7
boy 8.7
youth 8.5
two 8.5
black 8.4
elegance 8.4
joy 8.4
grandfather 8.3
fun 8.2
bench 8.2
dress 8.1
body 8
look 7.9
husband 7.8
wall 7.7
trainer 7.6
loving 7.6
laptop 7.6
leisure 7.5
20s 7.3
rest 7.3
sensuality 7.3
musical instrument 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

clothing 95
person 91.1
black and white 86.6
man 80.4
human face 60.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 97.7%
Sad 6%
Surprised 1%
Confused 6.2%
Calm 81%
Happy 1.6%
Angry 3.6%
Disgusted 0.6%

AWS Rekognition

Age 26-43
Gender Male, 74.3%
Angry 4.1%
Calm 80%
Sad 3.4%
Disgusted 6.3%
Surprised 2.7%
Happy 1.6%
Confused 1.9%

Feature analysis

Amazon

Person
Person 97.1%

Captions

Clarifai
created by general-english-image-caption-blip on 2025-05-15

a photograph of a man in a suit and tie sitting on a wicker -100%

Google Gemini

created by gemini-2.0-flash on 2025-05-12

The black and white photograph features a man relaxing on a wicker beach chair, set against a backdrop of sand. The man is dressed in a suit jacket, a sweater or vest, and a tie. He is seated with one leg stretched out on the chair and the other slightly bent. His head is tilted downwards, and he appears to be either reading or in deep thought.

The wicker beach chair, constructed with a handle on the side, is the focal point of the image. The background is composed of sand, and the textures create shadows that enhance the depth of the scene. The high-contrast, high-resolution image quality emphasizes the details of the man's clothing, the chair's wicker pattern, and the sand's texture. The overall tone suggests a moment of tranquility.

created by gemini-2.0-flash-lite on 2025-05-12

Here's a description of the image:

The image is a black and white photograph showing a man relaxing on a wicker beach chair. The man is wearing a suit jacket, a sweater, and trousers. He's lying back with his legs stretched out. The beach chair is positioned on a sandy surface, likely a beach or dunes, as indicated by the textured sand in the background. The man appears to be at ease, and the image has a relaxed, leisurely feel.