Tesla's full self-driving beta

The public is taking unreasonable, potentially deadly risks with Tesla’s Full Self-Driving beta test.  For those who don’t know, the beta is a rollout of software that allows select users to engage Autopilot on non-highway streets.

Customers in Tesla’s Early Access Program will receive the update, effectively giving them access to the autonomous Autopilot system on city streets.  As The Verge stated, “the early access program is used as a testing platform to help iron out software bugs.”

Iron out software bugs.  On city streets.  With real people who never consented to this science experiment.  And who is running the experiment?  Ordinary, untrained Tesla customers.  If that doesn’t sound like a science project gone mad, watch this video.

And if you still don’t think this is crazy, check out this video, in which a beta-test car stops in the middle of an intersection, prompting the car behind it to honk.

Look, we are not haters of technology.  And to be frank, the self-driving car is a wonderful development and a complete game changer.  But rolling out a software beta test on non-consenting drivers, and using random, untrained customers to run the experiment, is simply reckless.

As Zero Hedge reported in March, in an article titled “Attention NHTSA: Second Tesla In A Week Has Plowed Through Storefront In Coachella Valley,” it is only a matter of time before a Tesla in Autopilot mode crashes and kills someone.  Tesla’s Autopilot does not prevent accidents.  Show me high-tech software and I will show you a bug that will infect it.

Brian W. Kernighan put it perfectly: “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”

Tesla’s full self-driving beta test?  It’s a mad science experiment gone wrong.



1 COMMENT

  1. So your conclusion to “software always has bugs or can be broken with intent” is: don’t use software, then?
    And besides: have you never had a car in front of you stop in an intersection seemingly without cause?

    I have, multiple times. Human drivers are distracted, don’t know the way, or are confused about which lane they should be in.

    If I remove a stop sign from an intersection it will likely cause a collision; that doesn’t mean human drivers should not be allowed, just because I can make a driver make a mistake if I want to.

    If you don’t want such things, the bar for a driver’s license should also be raised substantially, especially in the US. And new drivers shouldn’t be allowed to drive on streets where nobody consented to driving along with somebody who doesn’t drive as well as they do …
