Human Generated Data

Title

Untitled (Dr. Herman M. Juergens driving car and smoking cigar)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.483.6

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Clothing 98.9
Hat 98.9
Apparel 98.9
Human 94.3
Person 94.3
Finger 88.7
Aircraft 77.9
Airplane 77.9
Transportation 77.9
Vehicle 77.9
Window 60.1
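
A record like the Amazon list above pairs each label with a confidence score, so a common first step is to filter out low-confidence labels. The sketch below does this in plain Python using the values copied from this record; the helper name and the 90% threshold are illustrative choices, not part of any AWS SDK.

```python
# Label/confidence pairs copied from the Amazon (Rekognition) tag list above.
tags = [
    ("Clothing", 98.9), ("Hat", 98.9), ("Apparel", 98.9),
    ("Human", 94.3), ("Person", 94.3), ("Finger", 88.7),
    ("Aircraft", 77.9), ("Airplane", 77.9),
    ("Transportation", 77.9), ("Vehicle", 77.9), ("Window", 60.1),
]

def confident_tags(pairs, threshold=90.0):
    """Keep only labels whose confidence meets the threshold (a hypothetical helper)."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))  # -> ['Clothing', 'Hat', 'Apparel', 'Human', 'Person']
```

At this threshold the aircraft-related labels drop out, which matches the human-generated classification of the image as a photograph of a man driving a car.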

Clarifai
created on 2019-08-09

people 99.2
one 97.7
adult 97.5
man 96.8
vehicle 95.4
transportation system 91.7
woman 90.1
wear 89.5
indoors 88.7
military 88.2
two 86.3
aircraft 85.9
lid 85.8
portrait 84.6
industry 84.3
music 83.3
technology 81.2
veil 80.3
retro 79.8
war 79.5

Imagga
created on 2019-08-09

car 61.1
vehicle 39.4
car mirror 36.4
automobile 35.4
mirror 34.6
driver 31.1
transportation 30.5
auto 29.7
cockpit 27.6
windshield 26.7
drive 25.5
transport 23.7
reflector 22.8
device 22.4
driving 21.3
person 21.2
man 20.8
screen 19.8
seat 19.8
road 19
sitting 18.9
people 16.2
wheel 16
smiling 15.9
male 15.6
speed 15.6
traffic 15.2
protective covering 14.9
inside 14.7
happy 14.4
adult 14.2
travel 14.1
support 12.9
business 12.8
luxury 12
modern 11.9
attractive 11.9
fast 11.2
aviator 11.2
portrait 11
black 10.8
hand 10.6
highway 10.6
engine 10.6
new 10.5
covering 10.5
sports 10.2
smile 10
holding 9.9
mechanism 9.8
motor 9.8
looking 9.6
windshield wiper 9.1
headrest 9.1
outdoors 9
technology 8.9
mechanical device 8.6
face 8.5
professional 8.4
power 8.4
safety 8.3
handsome 8
work 7.8
happiness 7.8
oxygen mask 7.8
elegant 7.7
side 7.5
street 7.4
light 7.4
occupation 7.3
window 7.3
cheerful 7.3
plane seat 7.3
worker 7.1
day 7.1

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

black and white 95.1
text 63.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Male, 89.7%
Confused 3.4%
Disgusted 0.6%
Angry 9.6%
Happy 2.6%
Sad 30%
Calm 43.6%
Fear 6.2%
Surprised 4.1%
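
The emotion scores above are independent confidences rather than a single prediction, so a record like this is often summarized by its highest-scoring emotion. The sketch below does that with the values copied from this record; the function name is illustrative and not part of the AWS SDK.

```python
# Emotion/confidence values copied from the AWS Rekognition face analysis above.
emotions = {
    "Confused": 3.4,
    "Disgusted": 0.6,
    "Angry": 9.6,
    "Happy": 2.6,
    "Sad": 30.0,
    "Calm": 43.6,
    "Fear": 6.2,
    "Surprised": 4.1,
}

def top_emotion(scores):
    """Return the (label, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(top_emotion(emotions))  # -> ('Calm', 43.6)
```

Note the scores need not sum to 100, so the top label ("Calm" at 43.6%) is a relative ranking, not a probability.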

Feature analysis

Amazon

Hat
Person
Airplane
Hat 98.9%

Categories

Imagga

paintings art 92.6%
food drinks 3.5%
interior objects 2.7%

Captions

Microsoft
created on 2019-08-09

a man sitting in a car 35.7%
a man sitting on top of a car 31.4%
a man sitting on a car 27.2%

Google Gemini

Created by gemini-2.0-flash on 2025-05-12

Here is a description of the image:

This black and white image shows an older man driving a car. He's wearing a light-colored suit and a fedora. He has glasses on and is holding the steering wheel with both hands. The dashboard of the car is visible, with a speedometer that goes up to 120. There's also a tag hanging from something on the dashboard. The overall style of the image gives it a vintage feel.

Created by gemini-2.0-flash-lite on 2025-05-12

Certainly! Here's a description of the image:

Overall Impression:

The image is a black and white, inverted photograph showing a man seated in the driver's seat of a car.

Detailed Description:

  • Subject: The man is elderly, wearing a white suit, a white fedora hat, and glasses. He is holding the steering wheel with both hands, and his gaze is directed forward.
  • Environment: The interior of the car is partially visible. We can see the dashboard, the windshield, and what appears to be the side window. The car's interior elements are relatively plain.
  • Composition: The composition focuses on the man and his interaction with the car. The lighting is such that the contrast is high.
  • Atmosphere: The inverted colors create a somewhat surreal or unusual feeling. The focus is on the man's expression and the details of his posture.

Let me know if you would like a more detailed description!

Text analysis

Amazon

5o