Tesla Inc has told a California regulator that it may not achieve full self-driving technology, or Level 5 autonomous driving, by the end of the year, according to a memo from the US state’s Department of Motor Vehicles (DMV). As reported by Reuters, the memo was released by legal transparency group PlainSite, which obtained it under the Freedom of Information Act (FOIA).
The brand’s cars can be equipped with optional Full Self-Driving (FSD) software, which grants limited vehicle autonomy to customers who pay for the add-on. Naturally, the feature is frequently touted by company CEO Elon Musk on Twitter, and he even claimed during an earnings conference call in January that he is “highly confident the car will be able to drive itself with reliability in excess of human [interaction] this year.”
However, several accidents involving Tesla cars have been recorded in the US as of late, totalling up to 20 cases, some of them fatal. In the most recent collision, an unspecified Tesla vehicle crashed into an overturned truck on a highway in California, killing the EV’s driver. Federal highway safety regulators and the state’s highway patrol are currently investigating the cause of the crash, with some speculating that Tesla’s FSD software may be to blame.
The California DMV noted in the memo that Musk’s Twitter claims about the self-driving capability of Tesla’s cars are extrapolations that overstate what the technology can currently do. It confirmed this after a conference call with Tesla representatives, including the company’s Autopilot engineer CJ Moore, who explained that its vehicles can only achieve Level 2 (L2) autonomy at present.
For those unfamiliar, L2 autonomy handles only partial driving tasks such as acceleration, steering and braking, and still requires the driver’s full attention. It is typically used to assist drivers on highways at regulated speeds; think of it as a more advanced version of cruise control. Level 2 is not capable of getting you from your point of origin to your destination automatically.
“Tesla indicated that they are still firmly in L2,” the California DMV said in the memo. “As Tesla is aware, the public’s misunderstanding about the limits of the technology and its misuse can have tragic consequences.”
Another crash in late April, which involved a 2019 Tesla Model S and killed its two occupants, was initially thought to have been caused by the car’s Autopilot system. Police noted that one occupant was found in the front passenger seat and the other in the back of the vehicle, with no one apparently behind the wheel. Musk himself disputed this on Twitter, revealing that data logs recovered from the crashed vehicle indicated that Autopilot was not enabled and that the owner had not purchased FSD.
Regardless, as the California DMV noted, the terms “self-driving” and “autonomous driving” are themselves capable of causing harm, especially as they are often associated with the capabilities of Level 5 autonomy, which requires no human supervision whatsoever. That, however, is a proposed technology milestone that no manufacturer has yet fully achieved, and it may take several more years before it becomes a reality.