February 10, 2022, Kitchener, Ontario
Posted by: Robert Deutschmann, Personal Injury Lawyer
We’ve written about autonomous vehicles frequently in the past and have raised questions about liability as well. It’s a very serious issue that will require serious contemplation, accountability, and changes to legislation.
We love the capabilities of AI – the lane change assist, lane drift warnings, backup assist, parking assist, and emergency braking are all useful and can make driving safer. How far are we willing to go with it though?
When a human driver is at fault in a death, they are generally held accountable. What happens when it’s the AI of the autonomous vehicle that causes the crash or allows it to happen? There are many places to lay the blame.
Someone or some group had to have programmed the AI and given it a base set of values. For example, is the life of the pedestrian crossing the road worth less than the lives of the three individuals in the car, or of a bystander on the sidewalk who might be killed if evasive manoeuvres were taken? Should the manufacturer of the vehicle be held liable, or the creators of the failsafe programming, or the insurer or owner of the vehicle?
A recent case before the California courts may begin to provide some guidance and precedent here. California prosecutors recently filed two counts of vehicular manslaughter against a driver who ran a red light, hitting another car and killing two people. The driver was in a Tesla Model S and had the car in Autopilot mode at the time. The families of the victims are suing Tesla.
This is believed to be the first felony case tried in the USA against a driver involved in a fatal crash while using a ‘partial’ automation mode. The arguments will turn on whether the driver was paying attention at the time of the crash. He has pleaded not guilty.
Under the National Highway Traffic Safety Administration’s classification system, the Model S in partial automation mode is considered a Level 2 vehicle. This is defined as a vehicle that steers, brakes and accelerates autonomously but still requires the driver to be ready to take over the driving task at any time. For reference, Level 0 has no autonomous driving features and Level 5 is fully automated. No Level 5 vehicles are currently available to the public for commercial purchase.
Critics note that companies like Tesla, GM and Nissan are confusing customers with branded terms like ProPILOT and Autopilot, misleading them into believing that the technology is far more capable than it is. Many consumers, eager to adopt the new tech and free themselves of the task of driving, are ready to turn the wheel over to the AI and let it drive.
The state of California is now reviewing some of the marketing terms used by auto manufacturers in light of their misleading and irresponsible connotations.
The outcome of the pending court case will be a useful reference point for the rest of America, and for Canada as well. Our laws have a long way to go to catch up to the technology.
If you or a loved one are seriously injured in any car accident you should contact our highly experienced personal injury lawyers today.