California Law Tries To Force Tesla To Rename ‘FSD’ Product But It May Not Work

California recently passed a law that is obviously aimed at forcing Tesla to stop using the name "Full Self-Driving" to describe the expensive software add-on it sells for its cars, which does not, at this time, provide self-driving, full or otherwise. The ostensible reason for this is to avoid customer confusion and the potential danger that could come from people thinking they have a self-driving car when they don't. But while it's clear that the public (and legislators) get confused about that, it's less clear that Tesla customers do, or that Tesla can't change its language slightly to comply with these rules.

The important elements of the California rule demand the following:

  • 24011.5. (a) A dealer or manufacturer shall not sell any new passenger vehicle that is equipped with any partial driving automation feature, [Defined as SAE Level 2] or provide any software update or other vehicle upgrade that adds any partial driving automation feature, without, at the time of delivering or upgrading the vehicle, providing the buyer or owner with a distinct notice that provides the name of the feature and clearly describes the functions and limitations of the feature.
  • (b) A manufacturer or dealer shall not name any partial driving automation feature, or describe any partial driving automation feature in marketing materials, using language that implies or would otherwise lead a reasonable person to believe, that the feature allows the vehicle to function as an autonomous vehicle, as defined in Section 38750, or otherwise has functionality not actually included in the feature. A violation of this subdivision shall be considered a misleading advertisement for the purposes of Section 11713.

In other words, "don't call it self-driving if it's driver-assist." There is not much question that members of the public have gotten the two confused. That confusion began with the mistake of calling self-driving and driver-assist two different "levels" of the same technology, which most industry insiders say they definitely are not. There has also been confusion over Tesla's "Autopilot" name, since many of the public mistakenly believe that an airplane autopilot takes over the full flying task rather than just keeping the plane flying straight and level. (Tesla's Autopilot also does only part of the driving task, but it's a great deal more sophisticated than an airplane autopilot.)

While the public gets confused from time to time, it's less clear that people who have actually bought and turned on Tesla's "FSD" system in its so-called "beta" state are confused about that. It isn't really a beta; it's not even remotely close to what would be considered "alpha" on the product-quality scale. Rather, it's a prototype of a hoped-for self-driving system that Tesla sells and lets customers access early. California's issue is that it is called FSD even though it's not ready, yet Tesla readily admits it's not ready. When you buy it with a Tesla car, the language is quite explicit. When you attempt to enable the prototype, it's even more explicit, and you must agree that you understand the system needs constant supervision and might do, in Tesla's words, "the wrong thing at the worst time." It seems that Tesla does comply with part (a) in its communication with customers.

Part (b) is stronger, and just says "don't name a driver-assist tool in a way that might make a reasonable person think it can be an autonomous vehicle." Tesla sells, under the umbrella term "Full Self-Driving package," a set of features:

  1. Enhanced driver-assist features for its Autopilot, such as navigating on highways and automatic lane change
  2. Auto-parking, Park Assist and Summoning the car in the parking lot — currently disabled in newer cars which do not have ultrasonic sensors, but promised to be restored some day
  3. A version of Autopilot for city streets, called "Autosteer on city streets," which provides early access to the prototype self-driving product, modified so it can be used only under supervision as a driver-assist tool
  4. The promise that, if and when the self-driving product actually works in the future and can do autonomous driving, the customer will get it free.

It's not entirely clear that Tesla's language even promises #4 today. A lot of customers have been eager to try #3 and bought the package just for that, though they all hope to get the real future product. Initially only a subset of customers could get #3, and you had to pass a fairly poorly designed safe-driving test to qualify. More recently, all buyers get the early-access driver-assist tool.

What you get in “Autosteer on city streets” is definitely driver assist. First of all, it’s not very good. Compared to the standards of self-driving systems it’s atrocious and will have some major problem on a large fraction of the drives you take with it. Earlier this year I gave it an “F,” and while it’s improved over the course of the year, it’s still very much in “F” territory.

Ironically, it is the poor quality of the system that ensures no actual Tesla buyers who use it are confused into thinking it's autonomous. Anybody who treated this tool as self-driving would crash on almost every trip, and would often be getting honked at. There are now hundreds of thousands of Tesla owners with this system, and it's safe to say they are not all crashing every day.

Are drivers confused?

In fact, their safety record is remarkably good. Not the driving ability of the system, but the record of the drivers watching it and grabbing the controls when it goes wrong. Even though that's probably happening 100,000 times a day or more, there are very few reports of crashes. The NHTSA complaint database contains only a handful of (unverified) complaints, none of them describing serious crashes with major car damage or injuries, and definitely no fatalities. There are videos of people deliberately doing demonstrations involving hitting the odd curb, or in one case lightly striking a plastic bollard, and one recent video shows a side mirror clipping a trash can, but reports of real problems aren't surfacing. There are surely more incidents, but if there were large numbers they couldn't stay hidden.

We know this because there are lots of reports of problems, including serious ones, with Tesla's Autopilot system. Even though it is very clearly sold and warned about as a driver-assist tool that needs supervision, it's much better than FSD at what it does, namely following lanes and keeping pace with traffic on freeways. It is sufficiently better that it lulls people into complacency, and that has resulted in a wide range of serious crashes, including fatal ones and impacts with emergency vehicles. The fatal crashes are tracked at an independent website, and NHTSA is investigating the emergency-vehicle crashes. Several prior crashes have also been the subject of NTSB investigations.

Yet reports on such failures with FSD are hard to find. I made multiple requests to the “Dawn Project,” a special effort aimed squarely at shutting down Tesla FSD, funded by wealthy software entrepreneur Dan O’Dowd, and they declined to provide even one example. Some will point at crashes that were blamed on FSD but actually took place on the freeway, where currently FSD does not function — it switches over to Autopilot, though that will soon change, Tesla says.

Safety driving, the system first developed at Waymo of having a human driver ready to take over from the self-driving system if there is doubt, works. It has worked very well for Waymo and others, whose records show that robocars being tested with safety drivers are at fault in fewer accidents than ordinary human drivers. The approach failed once, in a spectacular way, for Uber ATG, when the company used only one safety driver and she ignored her job, watched a TV show, and eventually faced criminal charges for doing so. When the safety drivers pay attention, it works.

And absent other evidence, it seems to work even with "amateur" safety drivers in the form of Tesla owners, as long as they remain diligent. Tesla FSD is too early and too low in quality to let them do anything else. Soon, though, it may improve to the point where, despite all the warnings, they start to treat it like a self-driving system and stop watching the road. Fortunately, this time, most Teslas have a camera watching the driver's gaze, and drivers get nagged if they look away for too long. One hopes that will continue to do the job. Ironically, if people start treating it the way some treat Autopilot, the danger level could go way up as the system gets better.

It's unclear just how useful a driver-assist city-streets Autopilot is. Many drivers find the experience harrowing, unlike the freeway Autopilot, which is relaxing. However, there are drivers who enjoy driving with the FSD prototype.

Back to the name

California's law will prohibit calling a system "full self-driving" if it is driver assist. But it won't stop Tesla from making the division among the features above clearer, and more tightly associating the name FSD only with the not-yet-delivered future product. Indeed, Tesla might change FSD to mean "Future Self-Driving" and make other changes to clarify the difference. It's not clear the California law stops you from calling an actual self-driving product "full self-driving," and as long as it's super-clear that this is a future product customers are buying in advance, rather than one they get today, it should comply with the law. Tesla might simply stop calling the prototype "beta" product it does offer today "FSD Beta." It will find some language to comply with the law, and California won't get what it was hoping for.

It is true that a lot of the public, seeing the name, think Tesla sells a self-driving product today, even if drivers are pretty clear on the fact that it doesn't. That public confusion might persist among drivers even after they read the warnings when they buy, but there does not seem to be much evidence for that.
