Self-Driving Cars: Are We There Yet?

Apr 27, 2018 | Car Accidents

Someday, and perhaps sooner than we realize, the software in our cars may become so sophisticated, flawless, and easy to use that the morning commute to work will be truly stress-free, accident-free, and even hands-free. But recent reports of setbacks for cars that “drive themselves,” to one degree or another, indicate that motorists still have a long way to go before they can turn the wheel over to a truly autonomous driving system.

A series of accidents involving cars that steer themselves, including three fatal crashes, has triggered National Transportation Safety Board investigations and prompted companies developing semi-autonomous cars to suspend testing and ponder additional safety features. The mishaps have also raised questions about whether the current technology can compensate for and overcome the real possibility of human error, even in a largely automated process — or if technology might even compound the problem.

For decades, the notion of a driverless car seemed as fantastic as the flying bubble-car George Jetson used to ferry his family through the skies in a popular 1960s animated TV series. But the emergence of “smart” vehicles that can monitor many of the car’s functions, and even help control steering, acceleration, and braking, has opened up a range of possibilities, from Cadillac’s Super Cruise (which lets a driver take hands off the wheel, but not eyes off the road) to Tesla’s Autopilot, which automates steering and certain responses but still requires driver monitoring, to tests of fully automated vehicles that need no human driver at all.

The manufacturers of the new generation of self-steering cars claim that, when used properly, the vehicles are as safe as or safer than traditional manual operation. Tesla, for example, claims that vehicles equipped with Autopilot are involved in fatal accidents at roughly one-quarter the rate of other vehicles on the road. But the systems currently available for everyday use all depend on keeping a live driver in the loop, ready to take over if something goes wrong. And human error appears to have played a significant role in at least some of the self-driving accidents.

— In 2016, the first fatal crash of a self-driving car claimed the life of Joshua Brown, whose Tesla Model S collided with a tractor-trailer that was crossing the highway in front of it. Regulators cleared Tesla of responsibility for the crash, finding that Brown had ignored multiple automatic warnings, set his cruise control to 74 mph two minutes before the crash, and failed to brake in the seconds leading up to the collision, suggesting he wasn’t monitoring traffic.

— In January 2018, another Tesla Model S slammed into a firetruck that was parked on a freeway after responding to an emergency in Culver City, California. The Tesla was reportedly traveling 65 miles per hour on Autopilot, but no one was injured. The crash is still under investigation.

— On March 18, 2018, a Volvo XC90 testing Uber’s autonomous system struck and killed a pedestrian in Tempe, Arizona — the first pedestrian fatality involving a self-driving car. An interior video indicates that the car’s “safety driver,” who is supposed to take the wheel as needed, wasn’t watching the road for several seconds before the collision, but other factors may have contributed to the accident, which occurred when the pedestrian, who was walking her bike, stepped out of night shadows into the car’s lane.

— Four days later, a Tesla Model X on Autopilot crashed into a highway barrier in Mountain View, California, killing its driver, Walter Huang. The company has claimed that Huang received “several visual and one hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”

In response to reports of drivers not paying enough attention to cars that steer themselves, manufacturers have sought better ways to monitor what the drivers are doing and to warn them that they should be monitoring, too. They have added cameras and facial-recognition software to determine whether drivers (or, more accurately, “backup drivers”) have their eyes on the road. But even frequent scoldings may not be enough in a vehicular environment brimming with distractions, from cell phones to fancy navigation and audio systems.

As the self-driving systems improve, they are expected to become less reliant on human intervention in possible collision scenarios; if a safety driver ignores warnings to take the wheel, for example, the vehicle may simply decelerate, find its way to the curb or shoulder, and turn itself off. But given that driving conditions can change drastically in just a few seconds, even that process may take too long. And turning over critical decisions to a robotic pilot presents a series of ethical and legal quandaries.

Statistically, smarter cars are also safer, with collision avoidance and self-parking features that can compensate, to some extent, for driver carelessness. At the very least, the vast amount of sensor data now available in newer models provides more information about how a crash occurred — and has the potential to reduce legal disputes over who’s at fault. At the same time, the new technology is altering the terms of the question, from “who’s at fault” to “what’s at fault.” As motorists give up control of their cars, liability issues may revolve around imperfect autonomous systems rather than driver negligence.

Can such a system be trusted to make the “right” calls in every traffic situation? Legal analysts have proposed hypothetical scenarios in which a self-driving car can veer away from a possible head-on collision, but only at the risk of hitting pedestrians. Would the system’s decision-making prioritize the driver’s safety over everyone else’s? How will insurance companies deal with claims that the manufacturer, not the driver, is responsible for a pileup on I-70?

With all these imponderables, it’s clear that the brave new world of self-driving cars is still sorting itself out. There may be clear skies ahead for George Jetson, but back here on earth, drivers still have to keep their eyes on the road.

If you or a loved one is involved in a car accident, contact the offices of Frank Azar Car & Truck Accident Lawyers. For the past thirty years, Frank Azar and his team of auto accident lawyers have successfully represented thousands of injured clients, fighting to get them the compensation they deserve. Call us for a free consultation or contact us here.

Free case evaluation. Contact us now!