Stop Praising FSD Blindly! Tesla Crashes into Wall on Familiar Route, Ex-Uber Executive Injured, Car Totaled


Jikai ijikai.com, March 18th — A seasoned autonomous-driving expert has unexpectedly "stumbled" in the very domain he knows best. Raffi Krikorian, currently Chief Technology Officer of Mozilla and formerly head of Uber's autonomous driving division, recently described a harrowing experience in a lengthy post: his Tesla Model X was involved in a severe accident while the Full Self-Driving (FSD) feature was engaged, and the vehicle was totaled on the spot.


Accident Reconstruction: A Familiar Road Traveled Hundreds of Times, An Unpredictable Loss of Control

According to Krikorian's account, the accident occurred on a residential street in the Bay Area. The road was second nature to him; he had driven it back and forth hundreds of times before.

The accident happened on a Sunday while he was driving his son to a Boy Scouts event. FSD had been performing smoothly until the vehicle entered a curve, at which point the system suddenly malfunctioned:

  • Sudden Abnormal Noise and Vibration: The steering wheel began to shake violently without warning.
  • Abnormal Deceleration: The vehicle decelerated as it shook.
  • Ineffective Intervention: Despite Krikorian's quick, professional reaction of immediately reaching to take back control of the steering wheel, the vehicle had already lost control and crashed directly into a roadside concrete wall.

Damage and Injuries: Vehicle Totaled, Driver Suffers Concussion

The force of the impact totaled the Model X on the spot. Krikorian himself suffered a concussion in the violent collision, with headaches and neck stiffness that lasted several days. Fortunately, his son, seated in the back, was not injured.

Notably, Krikorian was not a "careless" driver. He stated that he always followed Tesla's requirements, keeping his hands on the steering wheel and remaining vigilant. He was also cautious in adopting FSD, first familiarizing himself with it on simple highway conditions before gradually enabling it on regular roads. Even so, in the end he was not spared.


Legal Dilemma: Expert Takes the Blame, System “Exonerated”

Although this appears to be an accident caused by a suspected failure of the system’s algorithm or actuators, the legal determination yielded a rather frustrating result for the parties involved.

Insurance Determination Result: Full responsibility for the accident was assigned to Raffi Krikorian.

In-depth Analysis:

Under current law, Tesla's FSD (Full Self-Driving) system is classified as a Level 2 (partial driving automation) driver-assistance system. This means:

  1. Responsibility Attribution: No matter how human-like the system behaves, the driver is always the party primarily responsible for driving.
  2. Monitoring Obligation: The driver must monitor road conditions at all times and bears liability for any erroneous actions by the system.

This accident has once again sparked industry discussion about the “autonomous driving trust trap”: when a system performs perfectly 99% of the time, even professionals can unconsciously lower their guard, and the remaining 1% often comes at a fatal cost.
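The "trust trap" described above is partly just arithmetic: even a small per-trip failure rate compounds relentlessly over hundreds of familiar drives. The sketch below illustrates this with a purely hypothetical per-trip failure probability (the 1% figure is illustrative, not a measured Tesla statistic, and the trips are assumed independent):

```python
def prob_at_least_one_failure(p: float, n: int) -> float:
    """Probability of at least one failure in n independent trips,
    each with per-trip failure probability p (illustrative model only)."""
    return 1 - (1 - p) ** n

# Even a hypothetical 1% per-trip failure rate compounds quickly:
for n in (10, 100, 500):
    print(n, round(prob_at_least_one_failure(0.01, n), 3))
# 10  -> 0.096
# 100 -> 0.634
# 500 -> 0.993
```

In other words, under this toy model a driver who has safely completed hundreds of trips on the same road is almost guaranteed to eventually encounter the rare failure case, precisely when their vigilance is at its lowest.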


Commentator’s Observation: Stop Blindly “Deifying” Tesla FSD

The most thought-provoking aspect of this accident is that the victim was not an ordinary car owner clueless about technology, but a top industry expert who once led Uber’s autonomous driving division and currently serves as the CTO of Mozilla.

For a long time, Tesla FSD has been touted by some supporters in online discourse as the “god of driving,” even seen as “full self-driving” surpassing human drivers. However, reality has poured cold water on this notion:

  • The "Takeover Trap" is Ubiquitous: When a system behaves like an experienced driver 99% of the time, human vigilance inevitably declines. Even with his hands on the wheel, Krikorian had only a second or two to react to the sudden abnormal noise and vibration, and he could not prevent the vehicle from crashing into the concrete wall.
  • Disconnect Between Marketing Rhetoric and Reality: Despite being named "Full Self-Driving," legally and technically it remains only a Level 2 driver-assistance system. Tesla collects a hefty fee for the option while its service terms transfer all accident liability to the driver.
  • Experts Can Also Be "Assimilated" by Habit: As Krikorian stated, he gradually formed habits and lowered his guard over extended use. This slowly accumulated, "boiling-frog" trust is precisely the biggest safety hazard of today's driver-assistance systems.

Conclusion:

We should not deny technological progress, but we must not develop a blind, quasi-religious worship of it. Until laws and regulations for true L3 or even L4 autonomous driving are in place, entrusting one's life entirely to an algorithm is a high-stakes gamble with costly consequences.

rocky TT