Tesla’s Self-Driving Features: What’s the Difference?

By Richard K. Hy, Esq.

If you or a loved one has been injured in a collision involving a Tesla vehicle, you likely have questions about Tesla’s self-driving technology. Tesla offers two different automated driving features – Autopilot and Full Self-Driving (FSD). Understanding the key distinctions between these two systems is important.

Autopilot

Tesla first introduced Autopilot capability in 2015. Autopilot is considered a Level 2 advanced driver assistance system. Level 2 means it can steer, accelerate, and brake automatically within its lane, but a human driver must continually monitor the system and be ready to take control immediately if needed.

Autopilot uses forward-facing cameras, GPS data, and, on earlier hardware versions, forward radar and 12 ultrasonic sensors to detect lane markings, vehicles, pedestrians, cyclists, and other objects surrounding the Tesla. However, Autopilot is intended for highway use and does not recognize traffic lights, stop signs, or other road signs.

Specific Autopilot capabilities include:

  • Traffic-Aware Cruise Control: Maintains a set speed and distance from cars ahead.
  • Autosteer: Keeps the car centered within its lane.
  • Automatic Lane Changes: Changes lanes automatically when engaged by the driver.
  • Autopark: Finds parking spots and parks automatically.
  • Summon: Moves the parked car to the driver’s location.

Despite these features, Autopilot has limitations. It does not make the vehicle autonomous or fully self-driving. The National Highway Traffic Safety Administration (NHTSA) has investigated crashes where drivers misused Autopilot or failed to pay attention to the roadway. Tesla emphasizes that human drivers must keep their hands on the wheel and remain alert while Autopilot is engaged.

Full Self-Driving (FSD)

In contrast to Autopilot, Tesla’s Full Self-Driving system is intended to eventually operate as an SAE Level 4 autonomous system. Level 4 means the car can drive itself without any human input under certain conditions.

FSD uses more cameras, sensors, and computing power than Autopilot to process a detailed view of the driving environment. FSD is intended to autonomously navigate urban streets, obey traffic signs/signals, make turns, change lanes, park, and avoid collisions without human intervention.

However, the current FSD software remains in beta testing. Tesla warns that FSD does not yet make its cars autonomous. Human drivers must continually monitor FSD and be prepared to take control immediately.

According to the NHTSA, there have been 17 crashes involving FSD since 2019. While Tesla believes further refinement will make FSD safer than human drivers, FSD beta still has limitations requiring human supervision at this time. Misuse or over-reliance on current FSD could be catastrophic.

Tesla is rapidly evolving its FSD technology. Tesla initially leaned on hardware sensors, such as radar and ultrasonic sensors, in developing its automated driving systems. On the software side, it built neural network architectures it called “HydraNets” and later added “Occupancy Networks.” Now, in 2023, Tesla is shifting FSD to end-to-end deep learning, a change the company believes will eventually enable full self-driving capability.

2021: Introducing Simple but Powerful Networks

In 2021, Tesla introduced HydraNets, a type of technology that simplified how their cars understand what they see on the road. Imagine a car using just one smart brain to do many things at once, like identifying traffic lights, other cars, and road signs. This was a big step from their earlier systems and made Tesla’s cars smarter in understanding their surroundings.
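For readers who want a concrete picture, the “one smart brain doing many things at once” idea can be sketched in a few lines of toy Python. Everything below is invented for illustration and is not Tesla’s actual code; the function names, the tiny “feature vector,” and the thresholds are stand-ins. The structural point is the only real part: one shared backbone runs once per camera frame, and several task heads all reuse its output.

```python
# Toy sketch of a HydraNet-style multi-task design (illustrative only):
# one shared feature extractor feeds several independent task heads.

def shared_backbone(image):
    """Stand-in for a shared vision network: reduce an image (a list of
    pixel rows) to a tiny 3-number 'feature vector'."""
    flat = [p for row in image for p in row]
    return [sum(flat) / len(flat), max(flat), min(flat)]

def traffic_light_head(features):
    """Toy head: classify overall brightness as a light state."""
    return "green" if features[0] > 0.5 else "red"

def vehicle_head(features):
    """Toy head: report whether a very bright object (a 'car') is present."""
    return features[1] > 0.9

def sign_head(features):
    """Toy head: report whether a very dark region (a 'sign') is present."""
    return features[2] < 0.1

def hydranet(image):
    """Run the shared backbone once, then every head on the same features."""
    features = shared_backbone(image)
    return {
        "traffic_light": traffic_light_head(features),
        "vehicle_detected": vehicle_head(features),
        "sign_detected": sign_head(features),
    }

frame = [[0.0, 0.95], [0.9, 0.8]]  # invented 2x2 'camera frame'
print(hydranet(frame))
# → {'traffic_light': 'green', 'vehicle_detected': True, 'sign_detected': True}
```

The design choice this illustrates is efficiency: because the expensive backbone runs once and its result is shared, adding another task (say, lane markings) costs only one more small head, not a whole new network.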

2022: Making Cars More Aware of Their Environment

In 2022, Tesla added something called the Occupancy Network to their cars. Think of this as giving the car a better sense of space around it, almost like an advanced 3D map. This helps the car know not just what objects are around, like other cars and pedestrians, but also how much space is around it. It’s like having an extra set of eyes that help the car plan its movements more safely and accurately.
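A toy sketch can make the “sense of space” idea concrete. This is not Tesla’s implementation; a real occupancy network infers a 3D volume directly from camera images, while the grid and detections below are invented stand-ins. It shows only the representation: a grid of cells around the car, each marked occupied or free, so a planner knows where there is room to move.

```python
# Toy sketch of an occupancy grid (illustrative only): mark which cells
# around the car are occupied so a planner knows where free space is.

def build_occupancy_grid(size, detections):
    """Return a size x size grid of booleans; True means the cell is
    occupied. `detections` lists (row, col) cells where an object was seen."""
    grid = [[False] * size for _ in range(size)]
    for r, c in detections:
        if 0 <= r < size and 0 <= c < size:
            grid[r][c] = True
    return grid

def free_cells(grid):
    """Count drivable (unoccupied) cells."""
    return sum(not cell for row in grid for cell in row)

# Two invented detections: say, a pedestrian at (0, 1) and a car at (2, 3).
grid = build_occupancy_grid(4, [(0, 1), (2, 3)])
print(free_cells(grid))  # → 14 (16 cells minus the 2 occupied ones)
```

The advantage of this representation is that it does not need to recognize *what* an object is to avoid it: anything that fills space, even an object the classifier has never seen, still shows up as an occupied cell.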

2023: Tesla’s Transition to End-To-End Learning

Tesla’s recent shift to end-to-end deep learning represents a paradigm shift in autonomous driving. In simple terms, end-to-end deep learning is akin to teaching a car to drive by observing and imitating, much like a human learner. Rather than relying on predefined rules for specific scenarios, the car learns appropriate responses by analyzing millions of videos of real-world driving situations taken by Tesla vehicles on the road. This approach integrates various system components into a cohesive, adaptable system, allowing the vehicle to make more intuitive decisions based on extensive data. It mirrors good human driving behaviors, enabling the car to handle complex scenarios with a level of nuance and adaptability that was previously unattainable.
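To make “learning by imitation” concrete, here is a deliberately tiny sketch. A real end-to-end system trains a deep neural network on millions of driving clips; in this invented example, a nearest-neighbor lookup over a handful of recorded human (observation, steering) pairs stands in for the learned mapping. The point is the contrast with rule-based code: no hand-written “if drifting left, steer right” rule appears anywhere, yet that behavior emerges from the demonstrations.

```python
# Toy sketch of end-to-end imitation (illustrative only): the controller
# maps an observation directly to a steering command by copying the most
# similar recorded human example, with no hand-coded driving rules.

# Invented (observation, human_steering) pairs "recorded" from good drivers.
# Observation: lane-center offset in meters (negative = left of center).
# Steering: positive = steer right, negative = steer left.
demonstrations = [
    (-1.0, 0.5),
    (-0.5, 0.25),
    (0.0, 0.0),
    (0.5, -0.25),
    (1.0, -0.5),
]

def end_to_end_policy(observation):
    """Return the steering command of the closest recorded human example."""
    _, steering = min(demonstrations, key=lambda d: abs(d[0] - observation))
    return steering

print(end_to_end_policy(-0.9))  # → 0.5 (drifting left, so steer right)
```

Scaled up, the same idea explains both the promise and the risk discussed above: the system handles nuanced situations it was never explicitly programmed for, but its behavior is only as good as the driving data it imitates.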

This transformative approach in Tesla’s technology signifies a major leap in the realm of autonomous vehicles, highlighting the critical role of machine learning in navigating complex environments. By learning from vast amounts of data and imitating skilled human drivers, Tesla’s vehicles are poised to navigate our roads with unprecedented sophistication and safety.

Who is at Fault: Autopilot versus FSD?

Liability questions arise when injuries are caused by Tesla vehicles using Autopilot or Full Self-Driving capabilities.

Autopilot is only intended to assist, not fully replace, an attentive human driver. So the human driver will likely share fault if they improperly rely on Autopilot or fail to stay alert. However, Tesla may share liability if Autopilot malfunctions.

With FSD beta, Tesla currently requires human monitoring and supervision. So again, the human driver shares fault for misuse or inattention. But once the software finally comes out of beta testing (likely when it is updated to Version 12, which incorporates end-to-end learning techniques), Tesla may share liability if FSD fails to avoid a crash that proper autonomous technology should have prevented. Since both systems have limitations, Autopilot and FSD crashes can involve shared liability between the driver and Tesla, depending on the specific circumstances.

Have Questions?

If you or someone you love has been injured by a Tesla vehicle involving Autopilot or FSD, please contact our law firm immediately. Our experienced attorneys can investigate the collision and pinpoint whether Tesla’s automated systems contributed to the crash. Contact us today for a free consultation.

 

About Attorney Richard K. Hy, Esq.

Richard K. Hy, a partner at Eglet Adams, primarily focuses his practice on complex civil litigation including mass torts and class actions. When the Nevada Legislature is in session, Richard also works closely with the Nevada Justice Association advocating for legislation that protects consumers from false and misleading advertising, deceptive trade practices, and unsafe products. A proponent of new and emerging technologies, Richard constantly explores how these innovations will revolutionize the legal landscape, advocating for their integration and understanding their profound implications within the legal sphere.