Lawrence Ng

Tesla Halts Full Self-Driving Beta Rollout; Autopilot Unlikely Cause of Singapore Tesla Crash

Tesla pulled the latest version of its Full Self-Driving (FSD) beta software less than a day after its release, following users' complaints about false collision warnings and other issues. The rollback comes as the company faces regulatory scrutiny over FSD, its semi-autonomous driving technology.

Credit: Tesla

Tesla Chief Executive Officer Elon Musk tweeted that software issues influenced the decision to take down FSD Beta 10.3.


"Seeing some issues with 10.3, so rolling back to 10.2 temporarily. Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta," wrote Musk.


The U.S. government recently opened a probe into Tesla's Autopilot system after a series of crashes involving Tesla cars.

In Singapore, dashcam footage of a Tesla accident has been making its rounds on the internet. In the video, posted on Reddit by user u/Legocraze_Z, the driver of a red Tesla Model 3 turns right into Shan Road but fails to stop for an oncoming vehicle. According to ROADS.sg, the oncoming car had the right of way and struck the Tesla mid-turn. The accident occurred along Balestier Road on 22 October 2021.


The incident sparked an online debate about what caused the crash. On Reddit, one user suggested that the Tesla's Autopilot system may have played a part.


"Honestly I don't think auto driving works in SEA country, maybe highway still ok," the user wrote.


Another commenter suggested the opposite: that the Tesla driver should have been using Autopilot.


A third user pointed out that some Tesla drivers have engaged Autopilot and stopped paying attention to the road entirely, which has resulted in fatalities.


Currently, all Tesla cars are only approved for Level 2 autonomous driving. This includes the Model 3, the only model available in Singapore. Level 2, or partial driving automation, means the vehicle can control both its speed and steering at the same time, but the driver must still monitor the surroundings. By contrast, Level 3, or conditional driving automation, does not require the driver to monitor the environment at all times, though the driver must be ready to take over when the situation calls for it.

Credit: Tesla

According to Tesla, the Model 3's Autopilot features let it steer, accelerate and brake automatically within its lane. Tesla also lists Full Self-Driving capabilities slated for future use, such as navigating the car from highway on-ramp to off-ramp, parallel and reverse parking on its own, and changing lanes automatically. Drivers will also be able to summon the car from one place to another. On its website, Tesla makes clear that the "current Autopilot features require active driver supervision and do not make the vehicle autonomous".


The accident in the video did not happen on a highway, so it is unlikely that the Autopilot system was active and caused the crash.


Could other safety features, like the Tesla's Forward Collision Warning, have prevented the accident? Unfortunately, that feature only warns drivers of possible collisions with slower-moving or stationary vehicles ahead. Judging by the speed of the oncoming car, it is possible the Tesla driver was unaware of it and received no warning in time.


Do you think this accident could have been prevented by technology? Let us know your thoughts on our Facebook page.

 

Written by Sophia Lopez

As technology advances and has a greater impact on our lives than ever before, being informed is the only way to keep up. Through our product reviews and news articles, we want to aid our readers in doing so. All of our reviews are carefully written, offer unique insights and critiques, and provide trustworthy recommendations. Our news stories are sourced from reliable outlets, fact-checked by our team, and presented with the help of AI to make them easier for our readers to comprehend. If you notice any errors in our product reviews or news stories, please email us at editorial@tech360.tv. Your input will be important in ensuring that our articles are accurate for all of our readers.
