The Collision of Reality with the Fantasy of Self-Driving Cars
The fantasy of self-driving cars is now a reality. Automakers from Audi to Tesla are designing or manufacturing cars that drive themselves, and some are already on the market. U.S. Transportation Secretary Anthony Foxx recently said, “Automated vehicles are coming. Ready or not, they are coming.”
An obvious consumer appeal of self-driving cars is that drivers can put their cars on autopilot, sit back and relax. Watch a movie. Read a book. Take a nap. The societal appeal is the prospect of safer roadways: statistics show that some 95% of all traffic collisions are caused by driver error. Still, the appeal of the technology, and the fervor of investors to profit from it, must be tempered by great caution until the technology is perfected.
The recent fatal crash of a self-driving Tesla brought the issue of hope versus safety into sharp focus. The tragedy called attention to design and system flaws that must be fully corrected before this technology is unleashed on our public roads.
Tesla Fatal Crash
On May 7, 2016, a Tesla on Autopilot failed to detect and avoid a left-turning tractor-trailer on a Florida highway. In mid-afternoon, the car drove under the 18-wheel truck. The top of the Tesla was torn off, yet the car kept going, ran off the road, and smashed through two fences before finally plowing into a utility pole. The Tesla operator was killed; the truck driver was uninjured. The accident made international news.
The National Highway Traffic Safety Administration (NHTSA) is still investigating the accident. The initial theory is that the Autopilot system failed to detect the white-sided trailer against the bright sky. One known fact is that the Tesla was on Autopilot at the time of the accident, and preliminary reports indicated the driver was watching a Harry Potter movie. Whatever the cause, the current state of the technology is unacceptably flawed.
Tesla CEO Elon Musk acknowledged that the system has not been perfected and still requires drivers to keep their hands on the steering wheel and remain alert. Musk said a likely contributing factor in the Florida accident was that the white color of the truck against a “brightly lit sky” led the Tesla’s radar system to identify the truck as an overhead sign, so the Autopilot brakes were not applied. That is a problem.
Musk reiterated that customers who buy the Autopilot Tesla model know that it is in “public beta phase” and agree to keep their hands on the wheel. He said that drivers understand that they are “to maintain control and responsibility” for their vehicle.
In a new development, on July 26, 2016, Mobileye, a company that had partnered with Tesla to develop computer chips for the Autopilot program, announced it was terminating the partnership. The reason for the parting of ways was not announced, but sources “close to Mobileye” have speculated that the fatal Florida accident was a factor in the decision.
Self-Driving Cars: Not Ready for “Prime Time”?
Even before the fatal accident, Volvo, a manufacturer working on a similar type of vehicle, criticized Tesla for selling automobiles with the Autopilot feature. A Volvo engineer commented, “It gives you the impression that it's doing more than it is.” General Motors (GM) is developing a similar function it calls Super Cruise, and says that, unlike Tesla, it will not release a car with Super Cruise until it is absolutely sure the system is ready.
Some specific problems with self-driving cars that have not yet been resolved include:
- Confusion and conflict among sensors: Self-driving vehicles rely on radar as well as cameras and other sensors. When sensors conflict, accidents happen. A preliminary investigation of the Florida Tesla accident indicates that one sensor detected the obstacle, the semi-truck, but interpreted it as an overhead road sign, so the car did not apply the brakes.
- Road conditions: Unpredictable road conditions interfere with the sensors, particularly when lanes are unmarked or there is construction. In February 2016, a Google self-driving car struck a public bus in Mountain View, CA while trying to navigate around sandbags. The car apparently detected the bus but misinterpreted its signals and expected the bus to yield.
- Weather conditions: Rain is a particular problem for self-driving cars: it shortens the range at which sensors can detect objects and obstructs cameras.
- Hackers: If hackers interfere with a vehicle's software, many things can go wrong. In addition, simple laser pointers can trick a self-driving vehicle into detecting obstacles that are not really there.
- Human error: Self-driving cars must also contend with mistakes by humans, both those riding in the self-driving cars and those driving other cars on the road.
These problems should give everyone pause. Manufacturers, government officials, potential purchasers and the entire road-using public must be vigilant in putting safety above convenience or profit. We cannot allow untested, unregulated, dangerous vehicles on our roadways until the systems are perfected.
If you or someone you love was injured or killed in an accident with a self-driving vehicle, contact our car accident attorneys at the Blumenshine Law Group at (312) 766-1000. We offer a free conversation and case evaluation.