Tesla Model S Malfunction Caused Eight-Car Crash, But Company Remains Silent Despite US Data Showing FSD Failure

According to data released by the federal government, the Tesla Model S that caused an eight-car crash in San Francisco in November had the automaker's driver-assist software engaged at the time of the incident. The data shows that the car moved into the far-left lane and braked abruptly, slowing to 7 mph on the highway, according to a CNN report.

The driver of the Tesla claimed that the vehicle's "full self-driving" software unexpectedly activated and caused the pileup. The National Highway Traffic Safety Administration has sent a special crash investigation team to examine the incident, which took place shortly after Tesla CEO Elon Musk announced that the "full self-driving" driver-assist system was available to anyone in North America who requested and paid for the feature.

Tesla has not commented on the incident and does not engage with professional news media. The company has previously stated that it is proud of its Autopilot software's performance and its impact on reducing traffic collisions. However, traffic safety experts have questioned the accuracy of Tesla's findings, which show fewer crashes when driver-assist technologies are active, since those technologies are generally used on highways, where crashes are less common.

Tesla's Silent Treatment

Bryan Reimer, an autonomous vehicle researcher with the Massachusetts Institute of Technology's AgeLab, said that the revelation that driver-assist technology was engaged raises questions about when NHTSA will act on its investigation and what fate holds for Tesla's driver-assist features. He also questioned how many more crashes there will be before NHTSA releases its findings. It remains to be seen whether any Tesla driver-assist features will be recalled and what that would mean for the automaker's future. Elon Musk, CEO of Tesla, had previously said that the company would be "worth zero" if it did not deliver "full self-driving."

The recent release of a video showing a Tesla driver struggling to maintain control of the vehicle while using the Full Self-Driving (FSD) software on snowy roads in Detroit raises concerns about the readiness of Tesla's FSD software for use on public roads. Although Tesla allows people to test an incomplete version of the software on public roads, the video demonstrates that the software cannot safely handle slippery road conditions.

It is also concerning that Tesla has made the beta version of the FSD software available to anyone willing to pay the $15,000 fee. The video shows that the software is unable to handle difficult weather conditions: the vehicle has trouble coming to a complete stop, causing the wheels to lock, and the software cannot accurately identify the road, with the vehicle seen veering into the curb, potentially because the cameras are unable to differentiate between the road and the sidewalk.

According to US government data, a Tesla Model S malfunction involving the automaker's driver-assist technology caused a pileup in San Francisco. (Photo: Tesla)

Slippery Self-Driving Technology

It is important to note that the road conditions in the video are not extreme; they are typical for a city like Detroit in the winter. According to the driver in the video, the FSD software's neural nets have not been trained to handle snowy conditions.

In summary, despite Tesla's marketing claims, the company still has a lot of work to do before its vehicles can truly drive themselves in ordinary weather conditions. The video also highlights the risks of using the FSD software on public roads without proper supervision. This is in line with last year's report from Science Times, which noted similar issues with the FSD software.

Some issues are expected while the software is still in testing. For example, in recent tests conducted by The Dawn Project, an organization opposed to the use of FSD, the software reportedly failed to recognize school bus stop signs. However, it is important to consider the potential for bias in these results, as The Dawn Project is actively working to prohibit the use of FSD.

Check out more news and information on Tesla Malfunction in Science Times.
