Why self-driving cars may not be in your future

Preface. This is a summary of four articles about why truly automated vehicles are unlikely to ever happen. They would also be a huge waste of energy: people would drive more miles, abandon mass transit (which is far more energy-efficient), and increase congestion. Researchers have found that people given self-driving cars drive 76% more miles and stop using bicycles, mass transit, and services like Lyft and Uber.

Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation” (2015, Springer) and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report.

Mervis, J. December 15, 2017. Not so fast. We can’t even agree on what autonomous vehicles are, much less how they will affect our lives. Science.

[There is a great deal in this article about what an automated vehicle (AV) could do at each of the 6 levels, 0 through 5, but since I think AVs can’t possibly be developed above level 1 (driver in full control but the car assists with acceleration, braking, or steering), and the articles below discuss the levels anyway, I didn’t include it.]

First of all, human drivers aren’t as unsafe as they’re made out to be.  A fatal crash now occurs once every 3.3 million hours of vehicle travel, and it will be hard for an automated system to beat that. The public will be much less accepting of crashes caused by software glitches or malfunctioning hardware rather than human error. “Society now tolerates a significant amount of human error on our roads,” Pratt told a congressional panel earlier this year. “We are, after all, only human.”
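To put that statistic in perspective, here is a quick back-of-envelope conversion to miles (my arithmetic, not the article’s; the 30 mph figure is an assumed blend of city and highway speeds):

```python
# One fatal crash per 3.3 million hours of driving, expressed in miles.
hours_per_fatal_crash = 3.3e6
avg_speed_mph = 30  # assumption: blended average of city and highway driving
miles_per_fatal_crash = hours_per_fatal_crash * avg_speed_mph
print(f"{miles_per_fatal_crash:,.0f} miles per fatal crash")  # ~99,000,000
```

Roughly one fatal crash per hundred million vehicle-miles is the bar automation has to clear, and demonstrating that it clears the bar would itself require an enormous number of test miles.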

While developers amass data on the sensors and algorithms that allow cars to drive themselves, research on the social, economic, and environmental effects of AVs is sparse. Truly autonomous driving is still decades away, according to most transportation experts.

In the dystopian view, driverless cars add to many of the world’s woes. Freed from driving, people rely more heavily on cars—increasing congestion, energy consumption, and pollution. A more productive commute induces people to move farther from their jobs, exacerbating urban sprawl. At the same time, unexpected software glitches lead to repeated recalls, triggering massive travel disruptions. Wealthier consumers buy their own AVs, eschewing fleet vehicles that come with annoying fellow commuters, dirty back seats, and logistical hassles. A new metric of inequality emerges as the world is divided into AV haves and have-nots.

Companies have good reason for painting the rosiest scenario for their technology, says Steven Shladover, a transportation engineer at the California Partners for Advanced Transportation Technology program in Richmond. “Nobody wants to appear to be lagging behind the technology of a competitor because it could hurt sales, their ability to recruit top talent, or even affect their stock price,” he says.

As a result, it’s easy for the public to overestimate the capabilities of existing technology. In a fatal crash involving a Tesla Model S and a semitrailer in May 2016, the driver was using what Tesla describes as the car’s “autopilot” features—essentially an advanced cruise control system that can adjust the car’s speed to sync with other vehicles and keep the car within its lane. That fits the definition of a level-two vehicle, which means the driver is still in charge. But he wasn’t able to react in time when the car failed to detect the semi.

Shladover believes AV companies need to be much clearer about the “operational design” of their vehicles—in other words, the specific set of conditions under which the cars can function without a driver’s assistance. “But most of the time they won’t say, or they don’t even know themselves,” he says.

But progress will likely be anything but steady. Level three, for example, signifies that the car can drive itself under some conditions and will notify drivers when a potential problem arises in enough time, say 15 seconds, to allow the human to regain control. But many engineers believe that such a smooth hand-off is all but impossible because of myriad real-life scenarios, and because humans aren’t very good at refocusing quickly once their minds are elsewhere. So many companies say they plan to skip level three and go directly to level four—vehicles that operate without any human intervention.

Even a level-four car, however, will operate autonomously only under certain conditions, say in good weather during the day, or on a road with controlled access.

Rural communities might need government subsidies to give residents of a sparsely populated area the same access to AVs that their urban neighbors enjoy. And advocates for mass transit, bicycling, and carpooling are likely to demand that AV fleets enhance, rather than compete against, these sustainable forms of transportation.

Pavlus, John. July 18, 2016. What NASA Could Teach Tesla about Autopilot’s Limits. Scientific American.

Decades of research have warned about the limits of human attention in automated cockpits

After a Tesla Model S in Autopilot mode crashed into a truck and killed its driver, the safety of self-driving cars has been questioned, because the failure was twofold: the Autopilot system didn’t see the truck coming, and the driver didn’t notice it either, so neither applied the brakes.

Who knows the dangers better than NASA, which has studied cockpit automation for decades (in airplanes, the space shuttle, and now cars)? NASA describes how connected a person is to a decision-making process as being “in the loop”: driving a car yourself means you Observe, Orient, Decide, and Act (OODA). But if your car is on autopilot and you can still intervene to brake or steer, you are merely “ON the loop”.
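As an illustration (my own toy sketch, not NASA’s or the article’s code; every name in it is hypothetical), the difference between the two loops can be written out:

```python
def human_ooda(sees_truck: bool) -> str:
    """IN the loop: the human runs every step -- Observe, Orient, Decide,
    Act -- so attention is built into the task itself."""
    return "brake" if sees_truck else "cruise"

def autopilot_ooda(radar_sees_truck: bool) -> str:
    """The automation runs the same loop from its own sensors."""
    return "brake" if radar_sees_truck else "cruise"

# ON the loop: the machine drives and the human merely supervises, which
# means catching the machine's mistakes. If the sensors miss the truck AND
# the human isn't watching, nobody brakes.
action = autopilot_ooda(radar_sees_truck=False)  # autopilot misses the truck
human_is_watching = False                        # attention has drifted
if human_is_watching and action == "cruise":
    action = "brake"                             # override, if noticed in time
print(action)  # -> "cruise": neither loop participant applied the brakes
```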

Airplanes fly on automation with pilots observing, but this is very different from a car. If something goes wrong, the pilot has many minutes to react; the plane is 8 miles up in the air.

But in a car, you have just ONE SECOND. That requires a faster reflex reaction time than a test pilot’s; there’s almost no margin for error. This means you might as well be driving manually, since you still have to pay full attention while the car is on autopilot, not sit in the back seat reading a book.
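Some rough numbers (my arithmetic, not the article’s) show what that one-second window means at highway speed:

```python
# Distance a car covers during a one-second takeover window.
speed_mph = 65
feet_per_second = speed_mph * 5280 / 3600  # convert mph to feet per second
print(f"{feet_per_second:.0f} feet per second")  # ~95 feet
# Before a distracted driver even begins to correct, the car has moved
# roughly 95 feet -- five or six car lengths.
```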

Tesla tries to get around this by having the autopilot check that the driver’s hands are on the wheel, and triggering visual and audible alerts if they are not.
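The escalation logic is presumably something of this shape. The sketch below is my own hypothetical illustration, not Tesla’s code; the sensor class, thresholds, and polling rate are all invented:

```python
import time

class TorqueSensor:
    """Stand-in for a steering-wheel torque sensor (hypothetical stub)."""
    def hands_detected(self) -> bool:
        return False  # pretend the driver's hands are off the wheel

def monitor_driver(sensor: TorqueSensor,
                   visual_after_s: float = 15.0,
                   audible_after_s: float = 30.0) -> None:
    """Escalate from a visual to an audible alert the longer hands stay off."""
    hands_off_since = None
    while True:
        if sensor.hands_detected():
            hands_off_since = None                 # driver engaged: reset
        else:
            now = time.monotonic()
            if hands_off_since is None:
                hands_off_since = now              # start the hands-off clock
            elapsed = now - hands_off_since
            if elapsed >= audible_after_s:
                print("ALERT: audible chime")      # escalate to sound
            elif elapsed >= visual_after_s:
                print("warning: dashboard flash")  # begin with a visual cue
        time.sleep(0.5)                            # poll twice a second
```

Note that such a scheme measures hand position, not attention: as the NASA findings below suggest, a driver can keep a hand on the wheel while their mind is entirely elsewhere.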

But NASA has found this doesn’t work, because the better the autopilot is, the less attention the driver pays to what’s going on. It is tiring and boring to monitor a process that performs well for long stretches, a phenomenon dubbed the “vigilance decrement” as far back as 1948, when experiments showed that vigilance drops off after just 15 minutes.

So the better the system, the more likely we are to stop paying attention. But no one would want to buy a self-driving car that they might as well be driving themselves. The whole point is that the dangerous things we already do now, like changing the radio, eating, and talking on the phone, would be less dangerous in autopilot mode.

These findings expose a contradiction in systems like Tesla’s Autopilot. The better they work, the more they may encourage us to zone out—but in order to ensure their safe operation they require continuous attention. Even if Joshua Brown was not watching Harry Potter behind the wheel, his own psychology may still have conspired against him.

Tesla’s plan assumes that automation advances will eventually get around this problem.

Transportation experts have defined six levels of automation.

What the car does, what the driver does, and our take on the prospects at each level:

| Level | What the car does | What the driver does | Prospects |
|-------|-------------------|----------------------|-----------|
| 0 | Nothing | Everything | Older fleet |
| 1 | Accelerates, brakes, OR steers | Everything, with some assistance | Present fleet |
| 2 | Accelerates, brakes, AND steers | Remains in control; monitors and reacts to conditions | Now in testing |
| 3 | Assumes full control within narrow parameters, such as freeway driving but not merges or exits | Must be capable of regaining control within 10-15 seconds | Might never be developed |
| 4 | Everything, but only under certain conditions (e.g., specific locations, speeds, weather, time of day) | Nothing under those conditions, but everything at other times | Where the industry wants to be |
| 5 | Everything: goes everywhere, any time, under all conditions | Nothing, and unable to assume control | Never |

Computers do not deal well with anything unexpected, with sudden and unforeseen events. Self-driving cars can obey the rules of the road, but they cannot anticipate how other drivers will behave. Without super-accurate GPS, automation relies on seeing the lines painted on the pavement to keep in its lane, but snow, rain, and fog can obscure them. Self-driving cars also rely on special, highly detailed maps of intersections, on-ramps, stop signs, and so on; very few roads are mapped to this degree or kept updated for construction, detours, conversions to roundabouts, new stop lights, and the like. The cars don’t detect potholes, puddles, or oil spots well and can be confounded by the shadows of overpasses. And if a collision is unavoidable, does the car run over the child, or swerve into a light pole and potentially kill the driver? (Boudette 2016).
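To make the lane-marking fragility concrete, here is an illustrative sketch (my example, not drawn from any of the articles) of the classic camera-based lane-finding approach. It works by edge contrast, so anything that washes the contrast out, such as snow over the paint, leaves it with nothing to follow:

```python
import cv2
import numpy as np

def find_lane_lines(frame: np.ndarray):
    """Classic pipeline: grayscale -> edge detection -> line extraction."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # lane paint shows up as strong edges
    return cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                           minLineLength=100, maxLineGap=10)

# A uniform gray frame stands in for a road whose markings are hidden by
# snow, rain, or worn paint: with no edges, the detector finds nothing.
snow_covered_road = np.full((480, 640, 3), 180, dtype=np.uint8)
print(find_lane_lines(snow_covered_road))  # -> None: no lane to follow
```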

Excerpts from John Markoff. January 17, 2016. For Now, Self-Driving Cars Still Need Humans. New York Times.

Self-driving cars will require human supervision. On many occasions, the cars will tell their human drivers, “Here, you take the wheel,” when they encounter complex driving situations or emergencies.  In the automotive industry, this is referred to as the hand-off problem, and automotive engineers say there is no easy solution to make a driver who may be distracted by texting, reading email or watching a movie perk up and retake control of the car in the fraction of a second that is required in an emergency. The danger is that by inducing human drivers to pay even less attention to driving, the safety technology may be creating new hazards. The ability to know if the driver is ready, and if you’re giving them enough notice to hand off, is a really tricky question.

The Tesla performed well in freeway driving, but on city streets and country roads, Autopilot performance could be described as hair-raising. The car, which uses only a camera to track the roadway by identifying lane markers, did not follow curves smoothly or slow down when approaching turns. On a 220-mile drive from Palo Alto, Calif., to Lake Tahoe, Dr. Sebastian Thrun said he had to intervene more than a dozen times.

Like the Tesla, the new autonomous Nissan models will require human oversight, and even their most advanced models aren’t autonomous in snow, heavy rain, or some nighttime driving.

You could propose various fixes, but none of them get around the one second a driver has to react. That is not fixable.

References

Boudette, N. June 4, 2016. 5 Things That Give Self-Driving Cars Headaches. New York Times.
