BOSTON (CBS) – One auto-safety expert is calling for more government oversight of so-called autonomous vehicles after a Tesla operator was apparently seen sleeping behind the wheel on the Massachusetts Turnpike over the weekend.
“What we’re following is a very poor model. One that is destined to fail,” said Sean Kane of Safety Research and Strategies, Inc.
Tesla claims that its vehicles have full self-driving hardware, but Kane argues that Tesla and other automakers have created a false sense of security because their cars are not fully autonomous.
“If you have a vehicle that is stopped in the road, or a very significant speed differential between the vehicle you’re driving or operating and the vehicle ahead of you, the systems aren’t capable of detecting them quickly enough and making the driver alert and bringing the vehicle to a stop,” Kane said.
Last week, a National Transportation Safety Board report determined that a design flaw in Tesla’s Autopilot system, combined with driver inattention, caused a Model S electric car to slam into a firetruck parked along a California freeway.
The report also raises questions about the effectiveness of Tesla’s system, which was in operation before two fatal crashes in Florida and Silicon Valley.
Kane says the Tesla system allows drivers to set unsafe conditions with the Autopilot feature, such as following other moving cars at dangerously close distances.
“Those are things that are usually on the Federal government’s watch list in terms of regulating. Why are we building systems that will induce problems and cause crashes?” Kane said.
Autonomous cars and driver-assist systems are largely unregulated by the Federal government. Massachusetts lawmakers drafted a bill regulating them last year but tabled it.
Kane says the lack of oversight of incidents like the one on the Pike is a recipe for disaster.
“If you build a design that gives you the opportunity to disengage your attention span, and at the same time you need to be engaged, what do you expect is going to happen?” said Kane.
In a statement, Tesla said the system reminds drivers to keep their hands on the wheel.
“Many of these videos appear to be dangerous pranks or hoaxes,” a Tesla spokesperson said. “Our driver-monitoring system repeatedly reminds drivers to remain engaged and prohibits the use of Autopilot when warnings are ignored. At highway speeds, drivers typically receive warnings every 30 seconds or less if their hands aren’t detected on the wheel. Tesla owners have driven billions of miles using Autopilot, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot experience fewer accidents than those operating without assistance.”