Elon Musk’s problematic plan for “full self-driving” Teslas

Tesla is one of several car companies introducing increasingly autonomous features into cars meant to be operated by human drivers. | Toru Hanai/Bloomberg via Getty Images

Concerns about features like Autopilot aren’t going unnoticed in Washington.

More advanced autonomous driving is coming to cities across America. Last week, Tesla CEO Elon Musk promised that Tesla owners with good driving records would soon be able to request access to the beta version of the carmaker’s Full Self-Driving feature, which expands the company’s AI-powered navigation software from highways into urban environments. But some regulators think Tesla should pause the rollout of more autonomous features until its current safety issues are fixed.

“Basic safety issues have to be addressed before they’re then expanding it to other city streets and other areas,” Jennifer Homendy, the head of the National Transportation Safety Board (NTSB), told the Wall Street Journal after Elon Musk tweeted about the update. It’s worth pointing out that, much to Tesla owners’ frustration, Musk has promised a wider rollout of Full Self-Driving for years, repeatedly pushing back the date the feature would be available to anyone willing to pay to upgrade from Autopilot, Tesla’s standard driver assistance technology.

Regulators are also not happy with how Tesla has been releasing its autonomous driving technology. In August, the Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) announced that it would investigate safety problems in Autopilot, Tesla’s advanced driver assistance technology. Sens. Ed Markey (D-MA) and Richard Blumenthal (D-CT) have also accused the company of misrepresenting the quality of its Autopilot and Full Self-Driving tech and urged the Federal Trade Commission (FTC) to open an investigation.

In response to regulators’ concerns that the company doesn’t ensure drivers using autonomous features are paying enough attention to the road, Musk said in a tweet last week that drivers who want the latest update would have to turn on a driver behavior tracking tool that Tesla uses to calculate insurance rates. That feature will tell Tesla owners in real time how well they’re driving, and “what actions are needed to be rated ‘good driver,’” Musk said. Only drivers who maintain a good driving record for at least a week will be able to use the new version of Full Self-Driving.

Musk’s latest announcement about Full Self-Driving comes just a few months after both the NTSB and NHTSA investigated whether Autopilot was at play in a Tesla crash in Texas that killed two people in April. The NTSB later found that, despite initial suspicions that the driver’s seat was empty, the driver was indeed sitting in the front seat before the crash. In May, the California Highway Patrol arrested a man who brought a Tesla onto public roads while sitting in the back seat. Later that month, the state’s Department of Motor Vehicles announced that it would review whether the automaker is misleading customers about the state of its Full Self-Driving technology.

These events highlight the dangerous, ongoing confusion over Tesla’s autonomous driving capabilities and how people are using them. All new Tesla vehicles come with the sensors and cameras the company says it needs to deliver autonomous driving features, including its latest Full Self-Driving capability for city driving, though the technology is not quite the same as the more elaborate setups you might see in self-driving cars from companies like Waymo. While Autopilot currently comes standard on all Tesla vehicles, drivers can buy the Full Self-Driving capability as a software upgrade. Back in July, the research firm New Street Research estimated that about 360,000 users had paid for Full Self-Driving, which is available for a $10,000 flat fee or as a $199 monthly subscription.

There even seems to be some confusion between Musk and Tesla over what the self-driving features can do. Documents published in May through a public records request showed Tesla officials acknowledging that Elon Musk had been overpromising the autonomous abilities of Tesla cars. Musk had previously said he was “extremely confident” that Tesla cars would reach full autonomy by the end of this year, and he’s made similar statements over the past five years. In recent weeks, Musk has expressed less confidence in the tech and has acknowledged that drivers can become overconfident in Tesla’s semi-autonomous abilities.

Ongoing concerns about Tesla highlight how lawmakers and regulators are struggling to keep up with self-driving technology that’s showing up in cars that aren’t quite fully autonomous. While states make their own rules for the testing of self-driving vehicles, federal standards for commercially available vehicles are set by the NHTSA. The agency can also exempt a certain number of vehicles from these standards for the purpose of testing self-driving cars.

But there’s still ongoing debate about how the government should approach the increasingly autonomous features popping up in our everyday cars. Some members of Congress have been pushing the Transportation Department to do more, and through newly proposed legislation, lawmakers are seeking to broaden the agency’s role in evaluating the safety and efficacy of new features, like pedestrian avoidance and driver monitoring. In May, Rep. Bobby Rush (D-IL) proposed legislation that would require the agency to study crash avoidance tech, following up on legislation reintroduced this year that would require companies with advanced driver assistance tech to verify that drivers are actually paying attention.

But as long as car companies, like Tesla, continue to push out new, ever-more-autonomous features — without clear regulatory standards — people will be driving in a potentially dangerous gray zone.

Self-driving car tech, briefly explained

While fully autonomous cars that don’t need a human driver behind the wheel are still in development, plenty of semi-autonomous features are already available in the vehicles that are on the road. These tools use different types of sensors to observe what’s happening on the road, and then employ sophisticated computing power to make decisions for the vehicle.

The transition to fully autonomous vehicles isn’t happening all at once. It’s happening gradually as individual features that require the driver to do less and less get rolled out. The NHTSA sorts autonomy into six levels, where Level 0 has no autonomous features and Level 5 is fully autonomous and doesn’t require a driver.

“Right now, the automation systems that are on the road from companies such as Tesla, Mercedes, GM, and Volvo, are Level 2, meaning the car controls steering and speed on a well-marked highway, but a driver still has to supervise,” explained Vox’s Emily Stewart in 2019. “By comparison, a Honda vehicle equipped with its ‘Sensing’ suite of technologies, including adaptive cruise control, lane keeping assistance, and emergency braking detection, is a Level 1.”

Sorting out and enforcing the dividing lines between these various levels of autonomy has proven complicated, and the ambiguity can give people a false sense of security about these cars’ capabilities. Tesla’s Autopilot feature, in particular, has been a source of confusion.

Autopilot allows the car to operate itself within a given lane, combining a cruise control feature and an auto-steering feature. In the recently published documents that showed the gap between what Elon Musk has said in public about Autopilot’s capabilities and what the technology can actually do, the California Department of Motor Vehicles said that “Tesla is currently at Level 2.” Since at least 2016, Musk has been saying that every new Tesla could drive itself, a claim he’s repeated many times. Tesla officials have said privately that what Musk says about Autopilot and full self-driving capabilities for Tesla’s vehicles does not “match engineering reality.” (Waymo, which is owned by Google’s parent company Alphabet, dropped the term “self-driving” earlier this year and committed to using “more deliberate language” in its marketing.)

Autopilot currently requires drivers to pay attention and keep their hands on the steering wheel. But drivers can end up over-relying on the tech, and it appears some have figured out ways to circumvent Tesla’s related safety features. In addition to the many videos showing people riding alone in the back seat of Tesla vehicles, some people have been caught asleep at the wheel, presumably with Autopilot engaged. There is also a growing list of Autopilot-related crashes. The same week that Musk announced the expansion of Full Self-Driving, a California woman who was reportedly using Autopilot was arrested on DUI charges after passing out in a moving vehicle.

At the same time, Tesla continues to beef up Autopilot’s autonomous capabilities — for example, by adding automatic lane changing, or with the latest update that enables Full Self-Driving to work in cities. But it’s not clear that Autopilot or Full Self-Driving is entirely safe. As of March, the NHTSA was investigating 23 crashes that may have involved Tesla Autopilot. Tesla, which dissolved its PR department last year, did not reply to Recode’s request for comment in May.

Federal agencies like the NHTSA are supposed to be taking the lead on setting standards for evaluating autonomous features. However, in April, Sens. Richard Blumenthal (D-CT) and Ed Markey (D-MA) urged the agency to “develop recommendations for improving automated driving and driver assistance systems” and “implement policy changes that stop these preventable deaths from occurring.” They’re not alone; other members of Congress have also been thinking about creating new rules, like expanding the number of self-driving exemptions the NHTSA can give.

Even car manufacturers have signed on to the idea that the NHTSA could do more. The Alliance for Automotive Innovation, a trade group that represents carmakers like Ford and General Motors, says that forward collision warnings, automatic braking, and lane assistance tech need to be evaluated by regulators and included in the NHTSA’s new car rating system.

Lawmakers want murky standards improved

Lawmakers, safety advocates, and even representatives of the industry are demanding more discerning federal standards to govern autonomous features, including crash avoidance features and driver assistance tools built into cars that are already on the road. These critics are specifically calling for more research from the Transportation Department, a task they say is important even before fully self-driving cars are on the road.

“Before we get to autonomous technology that can do everything that people can do, there’s a real opportunity to introduce lifesaving technology into vehicles that people will still be driving,” Jason Levine, the executive director of the Center for Auto Safety, a nonprofit focused on vehicle safety, said in May.

The NHTSA has created testing protocols for some features, like collision warnings and automatic emergency braking. It has also requested public comment on what autonomous vehicle safety rules should be. But the agency has yet to create any national standards for how well crash avoidance and driver assistance features ought to perform, according to Ensar Becic, a highway safety investigator at the NTSB.

Still, more cars are being equipped with increasingly autonomous features. As automakers debut ever more advanced driver assistance and safety features and inch toward self-driving capability, the NHTSA has recommended more of these tools. But there’s also growing concern that the agency isn’t providing enough information about how well these tools should actually work.

“Manufacturers are out there advertising their different versions of this technology, without any true sense of oversight,” Levine added.

Lawmakers now think the NHTSA and the Transportation Department as a whole should have a role in more stringently evaluating this tech. Last month, Sens. Markey, Blumenthal, and Amy Klobuchar (D-MN) reintroduced the Stay Aware for Everyone Act, which would require the Department of Transportation to look at how driver assistance tools, like Tesla’s Autopilot, are impacting driver disengagement and distraction, and would mandate that companies institute driver monitoring tools to make sure drivers are paying attention to the road.

“With NHTSA often slow to act and auto manufacturers rushing to put new autonomous features in cars, this bill and other congressional action that puts public and driver safety first is necessary,” Blumenthal told Recode in May. He’s also urging President Joe Biden to fill the vacancy for NHTSA administrator to “ensure our country’s top auto safety agency has the leadership needed as this new technology rapidly advances.”

Others also want a better system for regulating how well these autonomous features perform. The legislation Rush, the Democratic representative from Illinois, introduced last week with his Republican co-sponsor Larry Bucshon (R-IN) would order Transportation Secretary Pete Buttigieg to commission a study on the safety of crash avoidance features and how well these systems identify pedestrians and cyclists with different skin tones. The bill, called the Crash Avoidance System Evaluation Act, comes after research from the Georgia Institute of Technology found that people with darker skin tones are less accurately detected by technology that could be used in self-driving cars.

“We certainly do not want to unleash vehicles on our nation’s streets and highways that can’t guarantee all Americans, all pedestrians, all bicyclists that they are protected equally,” Rush told Recode this past spring. “I am concerned … the technology can’t guarantee that I have the same protection against being harmed by a self-driving vehicle as someone who has a darker skin tone or a lighter skin tone.” Rush’s proposal, Levine added, would force the agency to make this key type of safety information public.

In February, the NTSB chair wrote to the NHTSA urging the agency to develop performance standards for collision avoidance features, like vehicle detection and emergency braking.

“We know that creating new motor vehicle safety standards or revising old ones to bring up to date is very time-consuming and very resource-intensive,” Will Wallace, the manager for safety policy at Consumer Reports, said in May. “This is an agency that is chronically underfunded. The agency doesn’t have anywhere near the resources that it needs to protect the public effectively. It’s incumbent on Congress to give the agency what it really needs.”

The lack of detailed requirements for these kinds of autonomous tools puts the US behind other parts of the world, including new car rating systems in Japan, Australia, and Europe. The US’s new car assessment program doesn’t rate these advanced technologies, explained Becic of the NTSB.

Neither automatic braking nor lane assistance is designed to allow a car to operate without a driver’s full attention. And, again, the public availability of fully autonomous cars is still years away; some think that moment may never arrive. Still, these features set a foundation for what regulating roads full of self-driving vehicles could eventually involve. Figuring out how to regulate autonomous car features is important not just for cars that already offer them — it’s key to building a future where the roads are safe for everyone.

In the meantime, Tesla appears ready to plow ahead and introduce the newest version of Full Self-Driving in American cities. We don’t know how many drivers will ultimately use this tool, but as more and more Tesla vehicles edge closer to autonomous navigation, Elon Musk seems to be daring regulators to act.


Update, September 20, 2021: The story has been updated to include information about Tesla’s Full Self-Driving technology and regulators’ concerns.

Clarification, May 12, 2021: The story has been updated to include the information that, following publication, the NTSB said that its preliminary research found that Autopilot’s Autosteer function couldn’t be used during a test in the crash location and that it had not made conclusions about the crash. The story has also been updated to note that the man who operated a Tesla without someone in the driver’s seat was arrested.
