
Secret Tesla configuration lets select drivers use Autopilot, FSD without ‘nag’ to take the wheel


A Tesla Model Y is seen on a Tesla car lot on May 31, 2023 in Austin, Texas. (Brandon Bell | Getty Images)
A security researcher who uses the handle “@GreentheOnly” has discovered a secret setting in Tesla vehicles that, when enabled by the company, allows a driver to use Tesla’s advanced driver assistance systems, marketed as Autopilot and Full Self-Driving, without keeping their hands on the steering wheel for extended periods.

When this mode is enabled on a Tesla vehicle, it eliminates what owners of the cars refer to as the “nag.” The researcher has nicknamed the feature “Elon mode,” but that is not the company’s internal nomenclature for it, he said.

Tesla does not offer a self-driving vehicle today. CEO Elon Musk has promised to deliver a self-driving car since at least 2016, and said a Tesla would be able to complete a demo drive across the United States without human intervention by the end of 2017.

Instead, Tesla driver assistance systems require a human driver to remain attentive and ready to brake or steer at any moment.

Typically, when a Tesla driver is using Autopilot or FSD (or their variations), a visual symbol blinks on the car’s touchscreen at frequent intervals, prompting the driver to apply resistance to the steering wheel. If the driver does not grasp the wheel, the nag escalates to a beeping noise. If the driver still does not apply torque to the steering wheel at that point, the vehicle can temporarily disable the use of Autopilot for up to several weeks.
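For readers who want a concrete picture, the escalation described above can be modeled as a simple state machine. The sketch below is purely illustrative, not Tesla’s actual implementation: the state names and timing thresholds are hypothetical, and the `nag_enabled` flag stands in for the hidden setting the researcher described.

```python
from enum import Enum, auto
from typing import Optional

class NagState(Enum):
    VISUAL_PROMPT = auto()      # flashing symbol on the touchscreen
    AUDIBLE_ALERT = auto()      # escalation to a beeping noise
    AUTOPILOT_LOCKOUT = auto()  # temporary disablement of Autopilot

# Hypothetical thresholds; the real timing values are not public.
VISUAL_TIMEOUT_S = 15.0   # hands-off seconds before the beep starts
AUDIBLE_TIMEOUT_S = 30.0  # hands-off seconds before lockout

def escalate(seconds_without_torque: float,
             nag_enabled: bool = True) -> Optional[NagState]:
    """Return the escalation stage for a given hands-off duration.

    With the nag disabled (as "Elon mode" reportedly does),
    no prompt is ever issued.
    """
    if not nag_enabled:
        return None
    if seconds_without_torque < VISUAL_TIMEOUT_S:
        return NagState.VISUAL_PROMPT
    if seconds_without_torque < AUDIBLE_TIMEOUT_S:
        return NagState.AUDIBLE_ALERT
    return NagState.AUTOPILOT_LOCKOUT

# Example: after 20 hands-off seconds the driver would hear beeping,
# unless the nag has been switched off entirely.
assert escalate(20.0) is NagState.AUDIBLE_ALERT
assert escalate(20.0, nag_enabled=False) is None
```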

In a tweet last December, Musk said he would remove the “nag” for at least some Tesla owners in January. That plan never came to fruition. In April 2023, referring to the nags, Musk tweeted, “We are gradually reducing it, proportionate to improved safety.”

The security researcher who revealed “Elon mode,” and whose identity is known to both Tesla and CNBC, asked to remain pseudonymous, citing privacy concerns.

He has tested features of Tesla’s vehicles for years and owns a Tesla Model X. He has also consistently reported bugs to the company and, as previously reported, has earned tens of thousands of dollars filing successful Tesla bug bounties.

The “white hat hacker” said in an interview via direct message on Tuesday that “unless you work at Tesla, or otherwise have access to relevant databases at the company,” there’s no way to know how many cars have “Elon mode” available today.

In February, Tesla issued a voluntary recall in the U.S. for 362,758 of its vehicles, warning that its Full Self-Driving Beta system may cause crashes. (It was the second such recall.) Tesla delivered an over-the-air software update to address the issues.

The FSD Beta system at that time could cause crashes, the safety recall report said, by allowing affected vehicles to: “Act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

GreentheOnly said he expects future recalls related to issues with FSD Beta and how well the system automatically stops for “traffic-control devices” like traffic lights and stop signs.

According to the most recent available data from the National Highway Traffic Safety Administration, Tesla has reported 19 incidents to the agency that resulted in at least one fatality, and where the company’s driver assistance systems were in use within 30 seconds of the collision.

In total, Tesla has reported 21 incidents to NHTSA that resulted in fatalities and in which the cars were equipped with its driver assistance systems.

Tesla did not immediately respond to a request for comment.