Tesla Owner Nearly Collides With Train While in Full Self-Driving Mode; How Safe Is FSD?
(Photo: Wikimedia/Alexander-93)

A Tesla owner found himself in a dangerous situation after trusting his car's Full Self-Driving mode: he had to intervene because the car did not slow down while approaching a passing train.

Tesla Owner Claims Full Self-Driving Mode Didn't Work

Earlier this month, Craig Doty II of Ohio was driving at night when his dashcam footage captured his Tesla rapidly approaching a passing train with no sign of slowing down. He did not identify the model of the car, but he said it was in Full Self-Driving (FSD) mode at the time and did not slow down even though the train was crossing the road.

In the video, the driver is forced to intervene, swerving off the road through the railroad crossing sign and coming to a stop only feet from the passing train.

Tesla has been hit with several lawsuits from owners claiming that their vehicles crashed because the FSD or Autopilot system failed to stop for another car or drove into an object, in some cases killing the driver.

When asked why he kept using the FSD system after the first near collision, he said he trusted it to work correctly because he had not experienced any other problems; until then, FSD had generally performed as expected.

Doty compared the experience to adaptive cruise control: through repeated use, you come to trust the system to work as intended. When you close in on a slower car ahead, you assume the system will slow down, and when it doesn't, you have to take control.

"This complacency can build up over time due to the system usually performing as expected, making incidents like this particularly concerning," Doty said.

Regarding FSD mode's safety, Tesla warns customers not to use the system in low light or poor weather conditions such as rain, snow, direct sunlight, or fog, which can "significantly degrade performance." Such conditions interfere with Tesla's sensors, which include ultrasonic sensors that detect nearby objects by bouncing high-frequency sound waves off them.

In addition, the vehicle uses 360-degree cameras and radar, which emits radio waves to detect the presence of other cars.

These systems work together to gather information about the environment, including traffic, road conditions, and nearby objects. In low visibility, however, they cannot accurately identify the surrounding conditions.
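To make the idea concrete, here is a minimal, purely illustrative Python sketch of confidence-weighted sensor fusion. Tesla's actual perception software is proprietary, so every name, weight, and threshold below is an invented assumption; the sketch only shows how combining readings from several sensors can fail to produce a trustworthy answer when visibility degrades some of them.

```python
from dataclasses import dataclass

# Hypothetical illustration only. None of these names, weights, or
# thresholds come from Tesla; they are invented for this sketch.

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "ultrasonic"
    distance_m: float  # estimated distance to the object, in meters
    confidence: float  # sensor's own confidence in the reading, 0 to 1

# Assumed degradation factors in low visibility (rain, fog, snow, glare).
# Cameras suffer most; radar is largely unaffected by weather.
LOW_VISIBILITY_PENALTY = {"camera": 0.3, "ultrasonic": 0.7, "radar": 1.0}

def fuse_distance(detections: list[Detection], low_visibility: bool) -> float | None:
    """Return a confidence-weighted average distance, or None when the
    combined signal is too weak to act on (the failure mode the article
    describes, where the system cannot reliably judge its surroundings)."""
    weighted_sum = 0.0
    total_weight = 0.0
    for d in detections:
        weight = d.confidence
        if low_visibility:
            weight *= LOW_VISIBILITY_PENALTY[d.sensor]
        weighted_sum += d.distance_m * weight
        total_weight += weight
    if total_weight < 0.5:  # arbitrary cutoff: too little reliable signal
        return None
    return weighted_sum / total_weight

if __name__ == "__main__":
    readings = [
        Detection("camera", 120.0, 0.9),
        Detection("radar", 115.0, 0.8),
        Detection("ultrasonic", 118.0, 0.6),
    ]
    print(fuse_distance(readings, low_visibility=False))  # clear night: ~117 m
    print(fuse_distance(readings, low_visibility=True))   # fog: camera nearly ignored
```

In this toy version, fog or glare shrinks the camera's vote to almost nothing, leaving the estimate resting on fewer sensors, which is one simplified way to picture why manufacturers warn that bad conditions degrade driver-assistance performance.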

Additionally, Tesla's owner's manual advises drivers to keep their hands on the steering wheel at all times and to "be mindful of road conditions and surrounding traffic, pay attention to pedestrians and cyclists, and always be prepared to take immediate action."

ALSO READ: Tesla's Cybertruck Problems Could Be Linked to Gas and Brake Pedal Issues From Model S, Former Employee Claims

Previous Complaints About FSD Mode's Safety

Hans Von Ohain's family said that on May 16, 2022, he was using his 2021 Tesla Model 3's Autopilot system when the car abruptly veered off the road to the right. However, Erik Rossiter, a passenger, stated that the driver was under the influence of alcohol at the time of the crash.

Von Ohain tried to regain control of the car but could not, and he died when the vehicle struck a tree and caught fire. The autopsy report revealed that his blood alcohol level was three times the legal limit at the time of his death.

In 2019, another man was killed in Florida when a semi-truck pulled onto the road and his Tesla Model 3's Autopilot failed to brake, sending the car under the trailer and killing him instantly.

In October of last year, Tesla won its first lawsuit over claims that the Autopilot system caused a Los Angeles man's death when his Model 3 veered off the highway and into a palm tree before bursting into flames.

RELATED ARTICLE: Technical Glitch or Ghosts: What Has This TikToker's Tesla Detected in an Abandoned Graveyard? [Fact Check]

Check out more news and information on Tesla in Science Times.