My friend recently sent me a message about BYD now offering its “God’s Eye” self‑driving system for free to its drivers. I looked into it further and discovered that the “God’s Eye A” variant - exclusive to BYD’s Yangwang models - comes equipped with three LiDAR sensors. This got me wondering: just how important is LiDAR in autonomous driving, and why did Tesla decide to forgo it (along with radar and ultrasonic sensors) in favor of a vision-only approach?
Let’s first understand what LiDAR is and how it works.
How LiDAR works
LiDAR (Light Detection and Ranging) is a highly precise laser scanner that emits thousands of laser pulses every second and measures how long each pulse takes to return after striking an object. From these round‑trip times, it builds a detailed 3D “point cloud” that accurately maps the surroundings. LiDAR's clarity and precision make it exceptionally useful for tasks where exact distance measurements are critical, such as detecting small obstacles or navigating in low‑visibility conditions.
LiDAR System from Waymo (Video source from Waymo.com)
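To make the time‑of‑flight idea concrete, here is a minimal sketch (in Python with NumPy) of how a single laser return could be converted into a 3D point and how many such returns accumulate into a point cloud. The timing values and firing angles are made‑up illustrations, not any real sensor’s specification.

```python
# Minimal time-of-flight sketch (illustrative assumptions, not a vendor API).
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def pulse_to_point(time_of_flight_s, azimuth_rad, elevation_rad):
    """Convert one pulse's round-trip time and firing angles into an (x, y, z)
    point relative to the sensor."""
    distance = C * time_of_flight_s / 2.0  # divide by 2: the pulse travels out and back
    x = distance * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = distance * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = distance * np.sin(elevation_rad)
    return np.array([x, y, z])

# Thousands of such pulses per second accumulate into a "point cloud".
tof = np.random.uniform(1e-7, 1e-6, size=5)   # round-trip times (~15-150 m ranges)
az = np.random.uniform(0, 2 * np.pi, size=5)  # horizontal sweep angles
el = np.random.uniform(-0.3, 0.3, size=5)     # vertical beam angles
point_cloud = np.stack([pulse_to_point(t, a, e) for t, a, e in zip(tof, az, el)])
print(point_cloud)  # each row is one (x, y, z) return in metres
```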
Besides assistance in autonomous driving, LiDAR also has wide applications in different industries:
Archaeology: In 2018, LiDAR surveys uncovered over 60,000 ancient Maya structures beneath Guatemala's jungle canopy, providing unprecedented insights into the scale of the Maya civilization.
Forestry: Assessing forest canopy density, measuring tree heights, and estimating biomass.
Agriculture: Creating detailed topographic maps of fields, enabling precise irrigation planning, soil analysis, and crop management.
Surveying: Producing high-resolution digital elevation models (DEMs) for urban planning.
Renewable Energy: Assessing wind patterns and turbulence at potential turbine sites to optimize turbine placement and energy production.
Coming back to driving: LiDAR’s ability to generate a rich, three‑dimensional picture of the environment is a massive advantage in autonomous driving. For instance, Waymo relies heavily on LiDAR as a key part of its sensor fusion strategy. Waymo’s vehicles combine LiDAR data with inputs from radar and high‑definition cameras, ensuring the system has multiple layers of perception to interpret the road. This redundant setup helps Waymo’s robotaxis achieve Level 4 autonomy in well‑defined, geo‑fenced areas, allowing them to operate without a human safety driver.
Waymo’s LiDAR Integration Approach
Waymo’s strategy is built on a multi‑sensor fusion approach. Their vehicles are equipped with LiDAR, radar, and high‑definition cameras. LiDAR provides precise 3D mapping, radar tracks the speed and movement of objects even in adverse weather, and cameras add the crucial layer of contextual understanding, such as reading traffic signals or distinguishing between pedestrians and cyclists. This integrated system creates a robust safety net, enabling Waymo’s vehicles to operate autonomously at Level 4 in certain areas. This “redundant” sensor setup minimizes risk and enhances reliability in controlled environments like specific urban corridors and geo‑fenced regions.
Radar Imaging System from Waymo (Video source from Waymo.com)
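To illustrate what “multiple layers of perception” can mean in practice, here is a toy sketch of late sensor fusion: each sensor reports its own detection of the same object, and the system leans on the geometric sensors for distance and on the camera for the semantic label. The weights, class names, and confidence handling are my own assumptions for illustration; they have nothing to do with Waymo’s actual (proprietary) pipeline.

```python
# Toy "late fusion" of LiDAR, radar, and camera detections (assumed weights).
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    label: str         # e.g. "pedestrian", "cyclist", "vehicle"
    distance_m: float  # estimated range to the object
    confidence: float  # sensor-specific confidence in [0, 1]

def fuse(detections: list[Detection]) -> dict:
    """Combine per-sensor detections of one object into a single estimate:
    range mostly from LiDAR/radar, the label mostly from the camera."""
    if not detections:
        raise ValueError("no detections to fuse")
    range_weights = {"lidar": 0.6, "radar": 0.3, "camera": 0.1}  # assumed weighting
    total_w = sum(range_weights[d.sensor] * d.confidence for d in detections)
    distance = sum(range_weights[d.sensor] * d.confidence * d.distance_m
                   for d in detections) / total_w
    # Prefer the camera's label, since it carries the contextual understanding.
    label = max(detections,
                key=lambda d: d.confidence * (2.0 if d.sensor == "camera" else 1.0)).label
    return {"label": label, "distance_m": round(distance, 2)}

print(fuse([
    Detection("lidar", "object", 24.8, 0.95),
    Detection("radar", "object", 25.3, 0.80),
    Detection("camera", "cyclist", 26.0, 0.90),
]))  # -> {'label': 'cyclist', 'distance_m': ...}
```

The point of the redundancy is that if one modality degrades (say, cameras at night or in glare), the fused estimate still has usable distance and motion information from the others.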
Tesla’s Vision‑Only Approach
In contrast, Tesla has chosen to pursue a vision‑only strategy. Rather than incorporating LiDAR - an expensive and complex piece of hardware (although its cost has come down over the years) - Tesla relies on an array of cameras (typically eight per vehicle) to “see” the environment. These cameras feed data into deep neural networks trained on billions of miles of real‑world driving data collected from the huge fleet of Tesla vehicles already on the road.
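For a rough sense of what a vision‑only stack has to do, here is a toy multi‑camera network that predicts a coarse depth map from images alone, i.e. it has to infer the geometry that LiDAR would otherwise measure directly. This is a generic PyTorch sketch with made‑up layer sizes and names, not Tesla’s actual architecture (which is proprietary).

```python
# Toy multi-camera, vision-only depth estimation sketch (illustrative only).
import torch
import torch.nn as nn

class TinyPerCameraEncoder(nn.Module):
    """Extracts a small feature map from one camera image."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, img):
        return self.conv(img)

class TinyVisionDepthHead(nn.Module):
    """Fuses features from all cameras and predicts a coarse depth map."""
    def __init__(self, num_cameras=8):
        super().__init__()
        self.encoder = TinyPerCameraEncoder()
        self.fuse = nn.Conv2d(32 * num_cameras, 1, kernel_size=1)

    def forward(self, images):  # images: (batch, num_cameras, 3, H, W)
        feats = [self.encoder(images[:, i]) for i in range(images.shape[1])]
        return self.fuse(torch.cat(feats, dim=1))  # (batch, 1, H/4, W/4)

cams = torch.randn(1, 8, 3, 128, 256)  # one frame from eight cameras
depth = TinyVisionDepthHead()(cams)
print(depth.shape)  # torch.Size([1, 1, 32, 64])
```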
Tesla’s engineering team initially experimented with different sensors, including LiDAR, for its autonomous driving system. However, CEO Elon Musk has consistently advocated for a vision-based approach, emphasizing cameras over other sensor technologies like radar and LiDAR. He has argued that if humans can drive using just their eyes, a well-trained AI should be able to do the same. This perspective underpins Tesla's decision to adopt a vision-only strategy, aiming for cost-effectiveness and scalability through over-the-air software updates.
Accordingly, Tesla describes the transition on its website:
"In 2021, we began our transition to Tesla Vision by removing radar from Model 3 and Model Y, followed by Model S and Model X in 2022. Today, in most regions around the globe, these vehicles now rely on Tesla Vision, our camera-based Autopilot system."
This method comes with trade‑offs. Tesla’s self‑driving system, known as Full Self‑Driving (FSD), remains at Level 2 automation, which means the driver must always be ready to take control. While Tesla’s vision‑only approach leverages vast amounts of data to improve its system continuously, critics (and even engineers from Tesla!) argue that it may struggle in certain challenging conditions where the depth precision of LiDAR could prove invaluable.
My Opinions
Tesla’s initial decision to exclude LiDAR was influenced by LiDAR's high cost and by the company's vision of achieving full autonomy through camera-based systems. Given that LiDAR was still a costly technology at the time, I agreed with that decision. Tesla's business model also differs from Waymo's ride-hailing model: it needs to equip every vehicle it sells with self-driving capability, so cost is an important consideration.
However, as LiDAR technology becomes more affordable and its benefits in accuracy and reliability become more apparent, incorporating LiDAR into personal vehicles is increasingly seen as a viable and sensible approach. 3D information about the environment from LiDAR is also a key component of Waymo’s neural network training dataset.
As I mentioned at the beginning of this post, BYD has integrated LiDAR into its self-driving system at zero cost to drivers. This has prompted me to explore China's EV and autonomous driving landscape further.
In my next post, I will be sharing more findings about the self-driving scene in China.
If this interests you, subscribe to “A Curious Semi-Expert” newsletter to receive my weekly learnings directly in your inbox. I’ll keep you company on this journey as we adapt to these exciting technologies together.
Cheers.