Existing Teslas switch to Tesla Vision in latest software update

We've known this was coming for a while, but with today's over-the-air software update (2022.24.6) on my Model 3, the radar is now effectively disabled.

Tesla’s approach to autonomy relies on computer vision: taking inputs from cameras positioned around the car, making inferences about the surrounding environment, then planning a path through it, often informed by the route entered in your navigation.

While all new Model Ys manufactured since June 2022 ship without radar, my 2019 Model 3, like many others, previously used a combination of sensor inputs, including a forward-facing radar (behind the front bar) to adapt its speed to cars and objects ahead, as well as to power a variety of active safety features.

It’s no secret that customers experience phantom braking from time to time, and in his CVPR 2021 keynote, former Tesla AI chief Andrej Karpathy did a great job of explaining why the company was so optimistic about moving to vision only. Karpathy explained that inputs from different sensors were often in conflict, and working out which to believe was complex. When considering what the source of truth should be, they considered which would be more accurate, not just now, but over time.

Radar is ultimately a pretty noisy input, and even back then Tesla believed it could achieve better results using vision alone.

While my car received 2022.24.6 today, checking the release notes revealed a new item, carried over from the previous 2022.20 branch.

Tesla Vision
Your vehicle is now using Tesla Vision! It will rely on camera vision coupled with neural network processing to provide some Autopilot and active safety functionality. Vehicles using Tesla Vision have received the highest safety ratings, and fleet data shows it provides improved overall safety for our customers. Note that with Tesla Vision, the available following distance settings range from 2 to 7 and the maximum Autosteer speed is 140 km/h (85 mph).

The following active safety features previously used radar and now use Tesla Vision:

  • Forward Collision Warning
  • Automatic Emergency Braking
  • Lane Departure Warning/Avoidance
  • Emergency Lane Departure Avoidance
  • Pedal Misapplication Mitigation
  • Auto High Beam
  • Auto Wipers
  • Blind Spot Collision Warning Chime
  • Side Collision Warning

Tesla Vision is a key part of the FSD Beta, which has yet to make its way outside the US and Canada, but with over 100,000 beta users, it’s clearly proving a success. Tesla clearly has the data to show that radar is inferior to vision alone, despite many naysayers and competitors claiming radar and lidar are necessary.

Today being Father’s Day in Australia, I had the opportunity to drive the car nearly 300 km using vision alone, so I can now compare it to my experience with the radar-assisted stack.

When I drive, I spend 95% of my time on Autopilot, activating it wherever it allows, because it makes me a safer driver. Being confident that the car will match the speed of the cars ahead and keep me within the lane lines gives me more opportunity to take in the environment around me.

When you’re in this mode, you trust that the car has your back, but knowing how drastic a change this new software version was, I was more cautious than usual.

The real risk areas are sections of road with gaps in the markings, places where the road widens significantly, where new lanes are added, or where lanes end and you have to merge. There are also behaviours like how the car centres itself in the lane, especially through corners, and in those areas I felt it did a really good job.

On my drive today, the only unintended braking I encountered was when I used the wipers to clean the windscreen. An alert appeared on the screen and I felt the car slow by about 5 km/h. A few seconds later, the windscreen was clean, the wipers stopped, and the speed returned to normal.

There was one part of the trip I made a point of driving, to test whether any improvements had been made. It’s a stretch of road the car had struggled with before: a long left-hand bend where a right-turn lane emerges (blind to the car). The natural flow leaves you, as the driver, crossing the entrance to the right-turn lane, which the dotted lines permit.

Technically you’re probably supposed to move left as you continue straight and avoid the turn lane entirely, but that’s not really the challenge. The car spots the turn lane late, as a human would at 100 km/h, and when it does, it slows before deciding to drive across the entrance to the right-turn lane, as many humans would. Today, there was really no change in this behaviour.

The key takeaway from all of this is ultimately this: phantom braking, cross traffic, confusion at intersections, and a million other things can still happen today with vision alone, but there’s an awful lot of runway here.

Tesla’s autonomous efforts are powered by data from millions of vehicles around the world, and every day that number continues to grow. While things aren’t perfect, Tesla’s approach has the potential to fix problems in a way that virtually no other car on the market can.

Phantom braking occurs in vehicles from other brands too, and where it occurs, it will occur for the life of that vehicle. Tesla’s approach of ingesting fleet data, training on it, and then pushing improved models back to our cars allows the system to make better inferences and improve over time. With noisy radar out of the equation, I expect vision-only to really stretch its legs and deliver improvements in the weeks and months to come.
