Rules of the road
Following the law can get self-driving cars into accidents
Developers of driverless cars scrupulously program them to obey all traffic laws. You might think that would be a good thing, but research has found that driverless cars are racking up accidents at a rate twice that of cars with human drivers. The reason? Unlike humans, the autonomous cars obey traffic laws without exception.
According to a recent Bloomberg report, a University of Michigan Transportation Research Institute study of self-driving vehicle accidents found that the driverless vehicles were never at fault. They were typically involved in minor fender benders in which an inattentive human motorist rear-ended the autonomous vehicle.
Even the safest human drivers sometimes “fudge” the posted speed limit to keep up with the flow of traffic. And who among us always comes to a complete stop at stop signs? Autonomous vehicles that ignore these human behaviors can catch other motorists by surprise, perhaps triggering a human-induced accident.
If driverless vehicles eventually make up a significant share of cars on the road, the question arises: Should engineers program them to break traffic laws once in a while to adapt to human driving patterns?
“It’s a constant debate inside our group,” Raj Rajkumar, co-director of the General Motors Collaborative Research Lab at Carnegie Mellon University, told Bloomberg. “We have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you.”
Google researchers, at the forefront of autonomous vehicle science, have been working to make their driverless vehicles behave more like assertive yet law-abiding human drivers, according to Bloomberg. For example, they have programmed their cars to inch forward at four-way stops to signal they’re the next to go.
Disruption of the status quo is normal anytime a new technology emerges, and driverless cars will be no exception—particularly when it comes to who’s at fault in an accident. Scott Gant, an antitrust lawyer writing in Forbes, said car companies such as Volvo and Mercedes-Benz have already announced they will “accept full liability” for accidents caused by their driverless vehicles (when they become available).
Yet California’s long-awaited preliminary rules for autonomous vehicles, issued last month, “hold motorists responsible for obeying traffic laws, regardless of whether they are at the wheel,” according to The Wall Street Journal.
Gant believes the arrival of driverless cars will eventually lead to safer roads, but the need for automobile insurance will likely never disappear: Software glitches may replace driver error as the major cause of accidents.
Googly eyes?
As school districts around the country embrace classroom technology, many are looking to inexpensive, cloud-based solutions such as Google’s Chromebook and its free suite of apps called Google Apps for Education. But in a complaint filed last month with the Federal Trade Commission, privacy watchdog Electronic Frontier Foundation accused Google of “collecting, maintaining, using, and sharing student personal information” in violation of the K-12 Student Privacy Pledge, to which Google is a signatory.
The “sync” feature on Google’s Chrome browser lets the company collect and store records of every student internet search or website visit outside the Apps for Education suite. Such tracking ability isn’t restricted to the Chromebook device. It could follow a student wherever he uses his school Chrome account—even at a home computer. —M.C.