Tesla's full self-driving beta

The public is taking unreasonable risks with Tesla's Full Self-Driving beta test, and those risks could be deadly.  For those who don't know, the Full Self-Driving beta is a software rollout that lets select users engage Autopilot on non-highway streets.

Customers in Tesla's Early Access Program will receive the update, which effectively gives these users access to the autonomous Autopilot system on city streets.  As The Verge stated, "the early access program is used as a testing platform to help iron out software bugs."

Iron out software bugs.  On city streets.  With real people who never consented to this science experiment.  And the people running the experiment?  Ordinary, untrained Tesla customers.  If that doesn't sound like a science project gone mad, watch this video.

And if you still don't think this is crazy, check out this video, where a beta-test car stops in the middle of an intersection, causing the car behind it to honk its horn.

Look, we are not haters of technology.  To be frank, self-driving cars are a wonderful technology and a complete game changer.  But rolling out a software beta test on non-consenting drivers, and using random untrained customers to run the experiment, is just reckless.

As Zero Hedge reported in March in an article titled "Attention NHTSA: Second Tesla In A Week Has Plowed Through Storefront In Coachella Valley," it is only a matter of time before a Tesla in Autopilot mode crashes and kills someone.  Tesla's Autopilot does not prevent accidents.  Show me high-tech software and I will show you a bug that will infect it.

Brian W. Kernighan put it perfectly: "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

Tesla's Full Self-Driving beta test?  It's a mad science experiment gone wrong.