
When robots and humans take turns at the wheel

According to a State Farm survey, many drivers look forward to texting and taking care of other business in their semiautonomous cars.
(Mel Melcon / Los Angeles Times)

Until recently, there was no question about who was responsible for an automobile’s operation: the driver. One hundred percent.

When driverless cars without a steering wheel or brake pedal start hitting the highway, your only role will be ordering the car where to go.

For the record:

2:56 p.m. June 2, 2019: An earlier version of this story spelled the name of a Volvo executive as Jonas Nillson. His last name is Nilsson. It also said Tesla Autopilot changes were rolled out Sunday; the changes began rolling out Wednesday and were announced Sept. 11.

Between now and then — about five years by automakers’ estimates — the relationship between drivers and their cars will enter uncharted and potentially hazardous territory. Robot-like features will take over an increasing share of the driving duties — but not all of them.


Humans and robots will share the wheel, and it’s uncertain how well people will adapt to this in-between state — whether they’ll remain appropriately vigilant or leave everything to the machine, possibly at their own peril.

More than a third of respondents to a recent State Farm survey said that if a semiautonomous car took over part of the driving duties, they’d eat, read, text, take pictures and access the Internet while driving. That would not be safe.

“There’s something we used to call split responsibility,” said Hod Lipson, director of Columbia University’s Creative Machines Lab. “If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.”


It’s an issue that Tesla has wrestled with ever since the May death of a Model S driver using Autopilot, the company’s popular driver-assist feature. Autopilot users are instructed to keep their hands on the wheel and to stay alert, but many — lulled by a false sense of security — have ignored those warnings. Tesla last week started rolling out improvements to the software that it says will make the feature safer.

Automakers say most customers don’t know yet what to make of driverless cars, but many want new vehicles equipped to take over some aspects of driving. The companies are happy to oblige: More excitement brings more people into the showroom, and more options mean higher revenue and profit.

Supporters, including federal transportation officials, believe that these cars will prove safer too, though there’s plenty of statistical analysis yet to be done.


No matter what, there will still be spectacular crashes, and the more often humans let their attention drift, the more crashes and bad publicity there will be.

That’s the reality now for the world’s roadways. New vehicles will be something in between: part traditional automobile, part robot, with the robot increasingly picking up the driving duties.

The in-between period could last awhile. Raj Nair, Ford Motor Co.’s chief technology officer, estimates that only 20% of new vehicle sales in 2030 will be completely driverless cars.

The pace of evolution in driver-assist technology varies among automakers. Tesla, General Motors and Mercedes-Benz are taking an aggressive approach.

Tesla’s Autopilot is the most advanced semiautonomous system currently available. Mercedes and Audi offer semiautonomous features that go well beyond adaptive cruise control. The 2017 Cadillac CT6 will have an Autopilot-like set of features called Super Cruise, in which the car will steer, change lanes and pass other vehicles, all with little driver effort. The same goes for Mercedes’ E-Class.


In carmaker lingo, there are six levels that generally describe a vehicle’s driverless capability, from zero to five.

Level 0 is no driver-assist technology at all. Level 1 covers old-fashioned stuff like traditional cruise control. At Level 2, where most driver-assist technologies stand now, the driver is expected to pay full attention. With Level 3, the robot drives most of the time, but not all the time. Level 4 is driverless on most roads, and Level 5 is driverless anywhere.
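
One rough way to picture that hierarchy is as an enumeration that pairs each level with whether a person still has to stay alert. The Python sketch below is only an illustration built from the descriptions above; the level names and the attention rule are paraphrases, not official SAE or automaker definitions.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The six driving-automation levels, paraphrased from the
    descriptions above; not official SAE wording."""
    NO_ASSIST = 0      # no driver-assist technology at all
    BASIC_ASSIST = 1   # old-fashioned aids such as traditional cruise control
    PARTIAL = 2        # the car helps, but the driver must pay full attention
    CONDITIONAL = 3    # the robot drives most of the time, but not all of it
    HIGH = 4           # driverless on most roads
    FULL = 5           # driverless anywhere

def human_must_stay_alert(level: AutomationLevel) -> bool:
    """Through Level 3, a person still has to be ready to take the wheel."""
    return level <= AutomationLevel.CONDITIONAL

for level in AutomationLevel:
    print(f"Level {level.value} ({level.name}): driver must stay alert -> "
          f"{human_must_stay_alert(level)}")
```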

Ford Motor Co. plans driverless cars by 2021 but will skip Level 3. Google, an early leader in autonomous vehicle technology, and Volvo, where safety is leveraged as a marketing tool, also say they plan to skip Level 3 and go straight to fully autonomous.

“From a technical perspective, there are really only two levels,” said Jonas Nilsson, an autonomous-driving executive at Volvo Car Group. “Whether the driver is responsible or not.”

Looming over the issue is the fatal Tesla crash, in which a driver collided with a truck while cruising on Autopilot. The truck driver told police he heard a “Harry Potter” movie soundtrack playing in the crumpled car after the crash.

Elon Musk, Tesla’s chief executive, said it was the first known fatality in more than 130 million miles during which Autopilot was activated.


Musk has pledged to push ahead with autonomous features. This month he said it “would be morally wrong to withhold functionality that improves safety simply in order to avoid criticism or fear of being embroiled in lawsuits.”

Any subsequent tragedies involving driver-assist technologies are likely to draw more media attention and continue the debate on auto safety.

“When there are crashes, there will be a post-crash minefield of recrimination,” said Bryant Walker Smith, assistant professor of law at the University of South Carolina. Smith noted that “on the day the Tesla driver was killed, 100 other people were killed and it did not make the front page.”

In 2015, more than 35,000 people were killed in traffic in the United States, an average of 96 people a day. Distracted driving accounted for 3,477 of those deaths, up 8.8% in a year.

Researchers in the psychology department at the University of Utah are studying whether semiautomated driving technology will make things better or worse.

During the experiments, people are put in semiautonomous driving simulators to measure their reaction times when something goes wrong. When subjects were distracted, average reaction time in the simulator almost doubled, researcher Kelly Funkhouser said.


The longer the subjects remained “cognitively disengaged,” the worse their reaction times got. Some, in fact, fell asleep.

Funkhouser’s next experiments will look at different types of alert systems that are intended to keep drivers engaged.

Cadillac recently announced that its Super Cruise feature will monitor drivers’ eyeballs and send warnings when it detects a lack of attention.
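
Systems like this typically follow a simple pattern: a polling loop tracks how long the driver’s gaze has been off the road and escalates warnings as the lapse grows. The Python sketch below illustrates that general pattern only; the sensor callback, thresholds and alert names are assumptions made for illustration, not Cadillac’s actual design.

```python
import time

# Hypothetical thresholds; the real system's values are not public.
EYES_OFF_WARNING_S = 2.0    # assumed: first warning after ~2 seconds off the road
EYES_OFF_ESCALATE_S = 5.0   # assumed: stronger intervention after ~5 seconds

def monitor_attention(eyes_on_road, issue_alert, poll_interval_s=0.1):
    """Poll a gaze estimate and escalate warnings the longer attention lapses.

    eyes_on_road:  hypothetical callable returning True while the driver's
                   gaze is judged to be on the road.
    issue_alert:   hypothetical callable taking "warn" or "escalate".
    """
    eyes_off_since = None
    while True:
        if eyes_on_road():
            eyes_off_since = None           # attention restored, reset the clock
        else:
            now = time.monotonic()
            if eyes_off_since is None:
                eyes_off_since = now        # start timing the lapse
            elapsed = now - eyes_off_since
            if elapsed >= EYES_OFF_ESCALATE_S:
                issue_alert("escalate")     # e.g., louder alarm, begin slowing the car
            elif elapsed >= EYES_OFF_WARNING_S:
                issue_alert("warn")         # e.g., a light on the steering wheel
        time.sleep(poll_interval_s)
```

A production system would also have to debounce repeated alerts and decide what happens when the driver never responds; those details are omitted here.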

Thus far, there are few regulations aimed specifically at semiautonomous vehicles.

On Tuesday, the Department of Transportation and the National Highway Traffic Safety Administration issued loose guidelines for driverless-vehicle development, applying some of them to cars with driver-assist features.

The agencies made clear in the document that they retain the authority to recall cars equipped with semiautonomous technology that are deemed unsafe.



@russ1mitchell


