Tesla and the legal risk of Autopilot
Tesla has been cool for years now. Ever since the first car rolled off the assembly line, people have been clamoring to get their hands on one, and those with enough money for the pricey early models had to tolerate months-long waitlists. One reason for the excitement is how technologically cutting-edge the vehicles are: not only are the cars fully electric, but a host of snazzy features have captured the country’s attention and curiosity.
One excellent example is the feature known as “Autopilot”.
As the name implies, once a driver turns on the Autopilot feature, the
vehicle will essentially control itself. Drivers often take their hands
off the steering wheel and marvel as the vehicle moves itself, steering,
braking and accelerating, all on its own. Though the feature is undoubtedly
neat, many questions have arisen in recent weeks about what it
may mean for the company from a legal perspective, following the
recent death of a driver who had the Autopilot feature engaged.
According to news reports, the deadly crash occurred on May 7th in Florida. The car was heading down a highway with Autopilot turned on when a white tractor trailer crossed the road in the path of the Tesla. Neither the driver nor the Autopilot system detected the tractor trailer against a very bright sky, so the brake was never applied. The car drove straight into the side of the trailer, slicing off the top half of the vehicle. The driver died immediately.
The family has since hired a well-known personal injury attorney to represent them, a sign to many that a personal injury lawsuit is around the corner. If such a suit were filed, legal experts say Tesla would face several areas of potential liability. For its part, Tesla argues that Autopilot is only meant to assist drivers, not do the driving for them. Tesla even warns drivers that they must keep both hands on the wheel and actively scan traffic, an attempt to make clear that drivers are ultimately responsible for what happens in their car when the system is engaged.
Despite these warnings, lawyers will likely argue that drivers were not truly aware of the risks they took by using Autopilot. For one thing, the name Autopilot implies that the program is sophisticated and capable of responding to dangers, potentially lulling drivers into a false sense of security. If courts find that drivers were, or should have been, aware of this risk, Tesla may not be liable. If courts instead hold Tesla to a higher standard, reasoning that ordinary drivers would trust in the integrity of the system, then Tesla may be on the hook.
A final way in which Tesla may find itself legally responsible for Autopilot-related accidents is through a products liability claim. If plaintiffs are able to successfully argue that the Autopilot system was rushed into production too quickly or not subjected to sufficiently careful scrutiny, that may be enough to create liability for Tesla. After all, it is Tesla’s job to take reasonable steps to protect consumers from its product, and if its efforts fall short of what other reasonable companies would do in similar circumstances, Autopilot could prove to be a costly mistake for the automaker.