Tesla 'Full Self-Driving' Malfunction on Snowy Road in Detroit Captured in Video Footage

Tesla is currently allowing people to test an incomplete version of its "Full Self-Driving" software on public roads. However, a recently released video suggests the software cannot yet handle slippery road conditions safely. The video, posted by the YouTube channel Detroit Tesla, shows a driver struggling to maintain control of the vehicle while using the FSD software on snowy roads in Detroit.

This is concerning because Tesla recently made the beta version of the FSD software available to anyone willing to pay the $15,000 fee. While the software may be able to handle driving in California's favorable weather, it is not yet equipped for the difficult weather conditions found in other areas.

Slipping on Snowy Road

The video demonstrates that the FSD software is not yet ready for use: the vehicle drives at high speed on snowy roads and struggles to come to a complete stop, causing the wheels to lock. The software also fails to accurately identify the road. The vehicle is seen veering into the curb, possibly because its cameras cannot differentiate between the road and the sidewalk.

It's important to note that the road conditions in the video are not extreme; they are typical for a city like Detroit in the winter. The driver in the video comments that the FSD software has not been trained to handle snowy conditions, saying he doesn't think Tesla has trained its neural networks for snow at all.

In short, despite Tesla's marketing claims, the company still has a lot of work to do before its vehicles can truly drive themselves in ordinary winter weather. The video also illustrates the risks of using the FSD software on public roads without proper supervision.

Prior Malfunction

A Science Times report from last year shed light on the same issue. Some problems are to be expected while Tesla's FSD software is still in testing. In recent tests conducted by The Dawn Project, an organization opposed to the use of FSD, the software reportedly failed to recognize school bus stop signs. Because these tests were carried out by a group actively working to prohibit FSD, the potential for bias in the results should be kept in mind.

The tests also used an internal camera to show that the operator was not accelerating the vehicle, along with the interface screen, which indicated that Autosteer was activated. It is unclear whether Full Self-Driving was also engaged during the tests. According to a report by The Drive, some Tesla owners have said that their vehicles stop too soon when approaching stop signs.

A Tesla owner who goes by the username @cowcumber on Twitter investigated the issue and found that it only occurred on certain exit ramps. Upon further examination, @cowcumber discovered that the stop signs at exit ramps were 60% larger than those in residential areas. Tesla has not commented on this issue and has previously requested that people not share videos of its FSD Beta-equipped cars driving over child-sized dummies.

Check out more news and information on Tesla in Science Times.
