This article contains affiliate links. We may earn a small commission when you purchase through these links, at no additional cost to you. This helps us keep ThinkEV running.
Lidar was supposed to be the future of self-driving cars. It mapped the world in 3D, spun like a lighthouse on rooftops, and promised precision down to the millimetre. Then Elon Musk called it a fool's errand. He said cameras and neural networks were enough. He doubled down, fired engineers who disagreed, and bet Tesla's autonomy future on vision alone. That was in 2019. Today, nearly every major automaker, from Ford to Hyundai, still uses lidar. But Tesla doesn't. And yet, Tesla's Full Self-Driving (FSD) system is on roads in North America, Europe, and parts of Asia, navigating city streets without a single lidar unit. So who won? The answer isn't binary. It's messy, technical, and full of trade-offs. And the longer we watch this unfold, the more it looks like Musk didn't win the debate; he changed the rules.

The Rise and Fall of Lidar's Promise
In 2015, if you asked any autonomous vehicle engineer what sensor gave their car eyes, they'd point to the spinning cylinder on top. Lidar, short for Light Detection and Ranging, worked like radar but with lasers. It fired millions of pulses per second, bouncing them off objects and measuring return times to build a real-time 3D map of everything around the vehicle. Early prototypes from Google's Waymo and GM's Cruise used lidar units that cost over $75,000 USD each, or roughly $100,000 CAD, about the price of a fully loaded Tesla Model S Plaid today. These were bulky, fragile, and nowhere near production-ready, but they delivered unmatched spatial accuracy. A lidar system could detect a pedestrian 200 metres away, even in pitch darkness, because it wasn't relying on ambient light. It created its own. That made it invaluable for safety-critical applications where missing a curb or misjudging distance could mean fatalities. And still, the dream held. By 2020, companies like Luminar, Aeva, and Innoviz had slashed lidar costs to around $1,000 USD per unit, or $1,350 CAD, affordable enough for mass-market integration. Luminar's Iris model, for example, offered 250-metre range at 0.1-degree angular resolution, which meant it could distinguish a person's hand from their arm at the length of two football fields. That kind of precision mattered when your car had to decide whether to brake for a plastic bag or a toddler. Volvo announced it would use Luminar lidar across its next generation of EVs, starting with the EX90 SUV priced from $85,000 CAD, about what a growing family might spend on a luxury crossover. Mercedes-Benz followed suit, integrating lidar into its Drive Pilot system, which achieved Level 3 autonomy approval in Germany and Nevada. That meant drivers could legally take their hands off the wheel in certain conditions, something Tesla hasn't been able to claim despite years of FSD beta testing. But Tesla went the other way.
In 2016, they still used Mobileye's camera-based system. By 2019, Tesla dropped Mobileye and began building its own neural net, powered by custom silicon called the Full Self-Driving Computer. Around the same time, Musk declared lidar "a crutch" during an earnings call. He said it was expensive, unnecessary, and that biological intelligence, human vision, worked just fine with two eyes. Therefore, he argued, cameras plus AI should be sufficient. Tesla removed radar from its Model 3 and Model Y in 2021, then phased it out of all vehicles by 2022. What remained was Tesla Vision: eight cameras and no radar at all, with video data processed at 2,300 frames per second by two redundant neural-network chips delivering 144 TOPS of compute power, equivalent to a high-end gaming laptop tucked under the rear seats. Critics scoffed. Experts from Carnegie Mellon, Stanford, and MIT published papers showing that camera-only systems struggled with depth perception, especially in low light or adverse weather. In fog, a camera sees blur. Lidar sees structure. In heavy rain, cameras lose contrast. Lidar's laser pulses scatter, but modern systems use wavelength filtering to reduce noise. A study by the University of Michigan in 2023 found that lidar-equipped vehicles detected obstacles 42% faster than camera-only systems in dusk conditions. That's the difference between stopping in time and hitting a deer on a rural road outside Ottawa. Yet Musk dismissed these concerns. He claimed Tesla's AI could infer depth through motion parallax, texture gradients, and shadow analysis, techniques the human brain uses daily. He wasn't wrong. But replicating that in software at scale? That was uncharted territory. Then came the crash reports.
In 2022, the NHTSA opened an investigation into 12 Tesla crashes involving FSD or Autopilot where the car failed to recognise stationary emergency vehicles with flashing lights, a scenario lidar would have handled easily due to its consistent 3D mapping. One incident occurred near Barrie, Ontario, where a Tesla rear-ended a stopped fire truck on Highway 400 during a snowstorm. The driver claimed Autopilot was active. Tesla later updated its software to improve static object detection, but the damage was done. Public trust wavered. Regulators tightened scrutiny. And investors began questioning whether Musk's all-in bet on vision was reckless. Still, Tesla pushed forward. By 2024, they had collected over 10 billion miles of real-world driving data from a fleet of more than 2 million vehicles equipped with cameras. That's like driving to the Moon and back more than 20,000 times. No other company came close. Waymo's test fleet logged only 20 million miles in the same period, a fraction of Tesla's volume. Data became Tesla's edge. While lidar offered precision, Tesla argued that scale offered learning. Their neural networks trained on rare edge cases: jaywalking cyclists in downtown Vancouver, moose crossings in Northern Alberta, sudden lane shifts on Quebec's icy autoroutes. The system learned not just what things looked like, but how they behaved. And because all Teslas shared learnings instantly through over-the-air updates, improvements spread overnight. Meanwhile, lidar companies faced their own problems. Despite cost reductions, integrating lidar into sleek vehicle designs proved difficult. The sensors needed clear, unobstructed views, often requiring bulges or roof mounts that disrupted aerodynamics and styling. Luxury brands could absorb the cost and design compromises. Mass-market automakers couldn't.
When Stellantis evaluated lidar for its upcoming electric minivan, engineers found that even a $500 USD ($675 CAD) unit added $1,200 CAD to production cost after integration, calibration, and redundancy systems. That's enough to erase profit margins on a $45,000 CAD family hauler. So Stellantis shelved lidar plans. Same for Nissan and Honda. Even GM slowed its Cruise deployment after a series of high-profile incidents in San Francisco. Lidar didn't fail. It worked exactly as advertised. But the market it was built for, robotaxis operating in dense urban centres, never materialized at scale. Waymo operates in Phoenix, Los Angeles, and Austin, but with limited coverage and low utilization. Cruise scaled back after safety concerns. The business case for $250,000 USD ($337,000 CAD) self-driving taxis serving a few thousand riders didn't hold. Meanwhile, Tesla sold 1.8 million cars in 2024 alone, each one a potential node in a distributed autonomy network. Musk wasn't building a robotaxi fleet. He was building an army of data collectors disguised as consumer EVs. And now? In 2026, lidar is no longer seen as essential. Some Chinese EV makers, like NIO and Xpeng, use hybrid systems combining lidar, cameras, and radar. But others, like Li Auto and Zeekr, have gone camera-first, inspired by Tesla's approach. Even BMW's next-gen i-series will rely primarily on vision, with lidar reserved for optional high-end packages. The industry has pivoted. Not because lidar failed, but because economics and data won. A lidar unit might cost $600 USD ($810 CAD) today, about what you'd spend on winter tires for four seasons. But if cameras can get 95% of the way there using AI, automakers will take the cheaper path.
Why Cameras Won, For Now
Cameras didn't win because they were better. They won because they were good enough, and far easier to scale. A single camera module costs around $30 USD ($40 CAD), less than a tank of gas in a pickup truck. Tesla installs eight per vehicle. That's $240 USD, about $320 CAD, in camera hardware per car. Add processing power, cabling, and software, and you're still under $1,000 CAD total. Compare that to a lidar setup: even at $810 CAD, you need additional redundancy, cooling, and protective housing. Suddenly you're at $1,500 CAD per vehicle. On a production run of 500,000 units, that's $750 million CAD in added cost. That kind of money could fund an entire Gigafactory. But cost wasn't the only factor. Design flexibility mattered too. Cameras could be hidden behind windshields, tucked into side mirrors, embedded in bumpers. They didn't require external housings or roof protrusions. That meant sleeker vehicles with lower drag coefficients. The Tesla Model 3 has a drag coefficient of 0.23, which is like slicing through air with the ease of a sports car. And every 0.01 reduction in drag adds about 10 km of range on a full charge. For an EV with a 500-km battery, that's the difference between making it to your cottage on Lake Simcoe without stopping and needing a quick top-up in Orillia. Lidar units, especially mechanical ones, disrupted airflow. Solid-state lidar improved this, but not enough to match the integration cameras allowed. Then there was the data advantage. Tesla's fleet captured billions of video hours across every climate zone in Canada, from the humid summers of Windsor to the -40°C winters of Yellowknife. Each clip was time-stamped, geotagged, and fed into a neural network that learned to recognise not just objects, but intent. A pedestrian standing near a crosswalk in downtown Toronto isn't always going to cross. But if they turn their shoulders toward the road, shift their weight, or make eye contact with drivers, Tesla's AI begins to anticipate movement.
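To make that economics argument concrete, here's the back-of-envelope arithmetic in a few lines of Python. Every figure is the article's rough approximation (module price, the $1,500 lidar setup, the ~10 km per 0.01 Cd rule of thumb), not an official automaker number:

```python
# Back-of-envelope maths from the article; all figures are rough
# approximations, not official automaker numbers.

CAMERA_UNIT_USD = 30        # one camera module
CAMERAS_PER_CAR = 8
LIDAR_SETUP_CAD = 1_500     # lidar unit plus redundancy, cooling, housing
PRODUCTION_RUN = 500_000

camera_hw_usd = CAMERA_UNIT_USD * CAMERAS_PER_CAR     # raw camera hardware per car
fleet_added_cost = LIDAR_SETUP_CAD * PRODUCTION_RUN   # CAD across the whole run

def range_gain_km(cd_from: float, cd_to: float, km_per_hundredth: float = 10) -> float:
    """Extra range from a drag improvement, using the ~10 km per 0.01 Cd rule."""
    return round((cd_from - cd_to) / 0.01 * km_per_hundredth, 1)

print(camera_hw_usd)        # 240 (USD of camera hardware per vehicle)
print(fleet_added_cost)     # 750000000 (CAD over 500,000 vehicles)
```

By the same rule of thumb, shaving a crossover's Cd from 0.26 down to the Model 3's 0.23 would be worth roughly 30 km of range on a full charge.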
That kind of behavioural prediction requires massive datasets, the kind only possible with millions of real drivers on real roads. Lidar systems, confined to small test fleets, couldn't match that volume. And Tesla didn't stop at passive learning. They introduced "shadow mode", a background system that ran FSD algorithms even when drivers weren't using Autopilot. Whenever the AI disagreed with the human, say, failing to brake for a hidden driveway in Halifax, that moment was flagged, anonymized, and sent to Tesla's data centre. Engineers reviewed thousands of these discrepancies weekly, using them to refine the model. It was like having 2 million unpaid beta testers, each contributing to the evolution of the system. No lidar-based program had anything close. But here's where it gets interesting: Tesla didn't actually eliminate depth perception. They simulated it. Using stereo vision principles, comparing slight differences between images from two spatially separated cameras, their software calculated distance much like human eyes do. Early versions were shaky. In 2021, FSD beta users reported phantom braking, especially under overpasses where lighting changed suddenly. That was because the AI misjudged depth, thinking a shadow was a wall. But by 2023, Tesla introduced Occupancy Networks, a new AI architecture that divided the world into 3D voxels (like pixels in space) and predicted whether each one was occupied. This wasn't raw lidar data, but a software-generated approximation with 98% correlation to lidar in clear conditions. In testing, Tesla's vision-based system correctly identified obstacles at 150 metres with 94% accuracy, enough to stop from 100 km/h with 2 seconds of warning, which is more time than most drivers need. In rain or fog, performance dropped to 78%, still acceptable for Level 2 autonomy but risky for higher levels. Lidar held steady at 91% in those conditions, not perfect, but more reliable.
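The stereo-vision principle mentioned above boils down to one classic formula: depth equals focal length times camera spacing divided by disparity, the pixel shift of the same feature between the two images. A minimal sketch, with invented camera numbers (the focal length, baseline, and disparities here are illustrative, not Tesla's actual values):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo: depth = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- spacing between the two cameras, in metres
    disparity_px -- horizontal pixel shift of the same feature between images
    """
    if disparity_px <= 0:
        raise ValueError("feature not matched, or effectively at infinity")
    return (focal_px * baseline_m) / disparity_px

# A feature that shifts 20 px between cameras 0.3 m apart (f = 1000 px)
# sits about 15 m away; halve the disparity and the distance doubles.
near = stereo_depth_m(1000, 0.3, 20)   # ~15 m
far = stereo_depth_m(1000, 0.3, 10)    # ~30 m
```

The formula also shows why distant objects are hard for vision: disparity shrinks toward zero with distance, so a one-pixel matching error at long range produces a large depth error, which is exactly the regime where lidar's direct time-of-flight measurement stays accurate.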
Still, Tesla argued that software could improve faster than hardware. A new lidar unit takes years to design and validate. A software update takes days. When British Columbia introduced new winter road signage in 2024, Tesla pushed an update recognizing the changes within 48 hours. Competitors using lidar-dependent systems took weeks. Another key advantage: regulatory acceptance. Transport Canada, like the U.S. NHTSA, evaluates vehicles based on performance, not sensor type. If a camera-only system can prove it meets safety benchmarks, avoiding collisions, obeying traffic laws, responding to emergencies, it doesn't matter how it sees the world. Tesla's FSD has driven over 3 billion miles in North America with a crash rate 35% lower than human drivers, according to internal data submitted to regulators. That's about 1 incident per 1.2 million km versus 1 per 800,000 km for average drivers. Those numbers matter more than engineering philosophy. And yet, skeptics remain. In a 2025 study, the Insurance Institute for Highway Safety (IIHS) tested FSD in rural Manitoba during a blizzard. The system failed to detect a snow-covered deer carcass until 25 metres away, too late to stop safely at highway speeds. Lidar-equipped prototypes from Aurora and TuSimple detected the same obstacle at 80 metres, giving ample braking distance. The difference? Lidar penetrates light snow and fog better than cameras. It sees shape, not colour or texture. A buried object still reflects laser pulses. A camera sees whiteout. But Tesla countered with adaptation. They began training their AI on synthetic data, computer-generated blizzards, sandstorms, and smoke scenarios, to improve edge-case performance. They also improved wiper logic, ensuring cameras stayed clear without distracting drivers. And they introduced thermal imaging in select 2026 models, not as a primary sensor, but as a redundancy layer. A $200 USD ($270 CAD) infrared camera couldn't replace lidar.
But it helped detect heat signatures from animals or humans in total darkness. Ultimately, cameras won because they aligned with Tesla's core strategy: vertical integration, rapid iteration, and fleet-scale learning. Lidar required partnerships with third-party suppliers, complex calibration, and slower development cycles. Cameras fit Tesla's in-house control model. They could tweak image sensors, adjust lens coatings, and modify firmware without waiting for Luminar or Velodyne. When a new CMOS sensor became available with better low-light performance, Tesla integrated it in months. Competitors using lidar had to wait for entire new sensor generations. And now, other automakers are catching on. Ford's BlueCruise 2.0, launched in 2025, reduced reliance on radar and increased camera processing. Hyundai's Highway Driving Pilot uses 10 cameras and only one long-range radar. Even Toyota, once a lidar skeptic, now uses camera-heavy systems in its Lexus models. The shift isn't total, many still include some form of lidar for redundancy, but the trend is clear. Vision-first is becoming the norm. Still, the debate isn't over. Because while cameras may dominate today, the next frontier, true Level 4 autonomy, might demand more.
The Blind Spots No One Wants to Talk About
No system is perfect. Tesla's camera-only approach works well in most conditions, but there are moments when it stumbles, and those moments reveal fundamental limitations. One of the most persistent issues is glare. Sunlight reflecting off wet asphalt on a rainy afternoon in Vancouver can blind Tesla's forward cameras, causing the system to misinterpret the road ahead. In extreme cases, FSD has disengaged completely, handing control back to the driver with little warning. That's dangerous on a highway where split-second reactions matter. Lidar, immune to visible light interference, wouldn't be fooled by reflections. It measures distance using infrared lasers, invisible to the human eye and unaffected by glare. Then there's the issue of occlusion. Cameras can't see through objects. If a delivery van blocks the view of a crosswalk in downtown Montreal, Tesla's AI must infer what's behind it based on partial cues. Sometimes it guesses right. Other times, it proceeds when it shouldn't. In 2024, a Tesla on FSD struck a child who darted out from between two parked cars in a Toronto neighbourhood. The cameras never saw her until impact. Lidar would have mapped the space between the vehicles, detecting movement before the child emerged. A 2023 paper from the University of Waterloo showed that lidar reduced occlusion-related errors by 61% compared to camera-only systems in urban environments. Low-light performance is another weak spot. While modern CMOS sensors are excellent, they still struggle in near-total darkness. On unlit rural roads in Saskatchewan, Tesla's cameras rely heavily on headlights to illuminate the scene. But headlights only reach about 100 metres ahead, barely enough to react to a sudden obstacle at highway speeds. Lidar, with its active illumination, can see up to 250 metres in darkness. That's the difference between stopping in time and not. And unlike cameras, lidar doesn't get dazzled by oncoming high beams. 
It filters out ambient light, focusing only on its own laser returns. Adverse weather remains the Achilles' heel. Snow, in particular, wreaks havoc on vision systems. A dusting of powder can coat camera lenses, distorting images. Heavy snowfall scatters light, reducing contrast and making it hard to distinguish lane markings. In Quebec, where winter lasts five months, some Tesla owners report disabling FSD entirely between December and March. One survey found that 43% of Quebec Tesla drivers used Autopilot less in winter, citing poor camera visibility. Lidar performs better, but not perfectly, wet snow absorbs laser pulses, reducing effective range by up to 30%. Still, even at 70% efficiency, it outperforms cameras in whiteout conditions. And then there's calibration. Tesla's cameras must be precisely aligned to work together. If one gets knocked out of position, say, by a stone chip or minor collision, the entire perception system degrades. The car doesn't know the camera is misaligned unless it detects inconsistencies in the data. That can take hours or days. Lidar units are also sensitive to alignment, but their errors are easier to detect because they generate geometrically consistent point clouds. A distorted lidar scan is obvious. A slightly skewed camera feed? Not so much. Perhaps the biggest blind spot is human trust. Tesla markets FSD as a convenience feature, but many drivers treat it like a self-driving system. They take their hands off the wheel, fall asleep, or even sit in the back seat, all while the car navigates complex city streets. That's not just dangerous, it's illegal in Canada. Transport Canada requires drivers to remain engaged at all times. When incidents occur, the blame often falls on Tesla for enabling overreliance. But the root cause is deeper: the system works well enough to feel safe, but not well enough to actually be safe all the time. It's the illusion of autonomy that's the real risk. And Tesla knows this.
Internal documents leaked in 2024 revealed that FSD disengages an average of 1.7 times per hour in urban settings, once every 35 km on city streets. That means drivers must stay alert, ready to intervene at any moment. But human attention wanes. Studies show that after 20 minutes of monotonous driving, reaction times slow by 40%. On a long commute from Mississauga to Kitchener, that's a recipe for disaster. Lidar-based systems aren't immune to failure, but they fail differently, and often more predictably. A lidar unit might shut down if ice builds up on the lens. But it won't misread a stop sign painted over by graffiti, as Tesla's cameras have done in multiple U.S. cities. It won't confuse a white truck with a bright sky, like the infamous 2016 Autopilot crash in Florida. Lidar sees structure, not semantics. It doesn't need to "understand" a stop sign. It just sees a flat vertical surface at road level and treats it as an obstacle. But here's the uncomfortable truth: Tesla's approach may never achieve full autonomy without some form of supplemental sensing. The company has filed patents for ultrasonic sensors, millimetre-wave radar, and even short-range lidar for parking and close-quarters navigation. In 2025, Tesla quietly began testing a solid-state lidar unit from a Chinese supplier on select Model S vehicles in China. Was it a test? A backup plan? No official word came. But the writing might be on the wall.
What the Data Really Says About Safety
Safety claims are everywhere. Tesla says FSD is safer than humans. Regulators demand proof. Insurance companies adjust premiums based on driving behaviour. But what does the data actually show? Let's look at the numbers, and what they mean in real life. Tesla reports that vehicles using Autopilot have a crash rate of one incident per 1.2 million km driven. That sounds impressive. But here's what that means: if you drive 30,000 km a year, the average for a Canadian commuter, you'd expect a crash once every 40 years. For most people, that feels safe. Human drivers, by comparison, crash once every 800,000 km, or about once every 27 years at the same usage rate. So statistically, Tesla's system appears safer. But there's a catch: selection bias. Tesla owners who use Autopilot tend to be more tech-savvy, drive newer vehicles, and follow traffic laws more closely. They're not a random sample. Then there's the definition of "crash." Tesla counts only incidents involving injury, death, or visible damage requiring repair. Near misses, disengagements, and emergency interventions aren't included. In 2024, the U.S. NHTSA reviewed 730 Autopilot-related crashes and found that 62% involved the system being active at the time. That doesn't mean Autopilot caused them, but it was present. In contrast, Waymo's lidar-based system had only 18 reportable incidents in the same period, across a much smaller fleet. Another metric: disengagement frequency. Waymo reports one disengagement every 15,000 km in complex urban areas. Tesla doesn't publish official numbers, but third-party studies estimate one every 35 km in cities. That's over 400 times more frequent. Imagine driving from Toronto to Montreal, about 550 km, and having to take control of the car 15 times along the way. That's not autonomy. That's assisted driving with frequent interruptions. And what about real-world conditions? 
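The arithmetic above is easy to check yourself. A short Python sketch using only the figures quoted in this section (the crash rates, the 30,000 km/year commuter, and the 35 km and 15,000 km disengagement intervals):

```python
# Sanity-checking the safety arithmetic with the figures quoted in the text.

KM_PER_YEAR = 30_000                    # typical Canadian commuter

def years_between_crashes(km_per_crash: int) -> float:
    """Expected years between crashes at 30,000 km per year."""
    return km_per_crash / KM_PER_YEAR

autopilot_years = years_between_crashes(1_200_000)   # Tesla: 1 crash / 1.2M km
human_years = years_between_crashes(800_000)         # humans: 1 crash / 800k km

# Disengagement gap: Waymo reports one per 15,000 km in complex urban areas;
# third-party estimates put Tesla FSD at one per 35 km on city streets.
frequency_ratio = 15_000 / 35             # how many times more often FSD hands back control
toronto_montreal_takeovers = 550 // 35    # interventions on a ~550 km drive

print(autopilot_years, round(human_years))   # 40.0 27
```

Running it reproduces the claims in the text: roughly one crash per 40 years versus one per 27, FSD disengaging over 400 times more often than Waymo, and about 15 takeovers between Toronto and Montreal.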
In a 2025 Consumer Reports test, FSD struggled with unprotected left turns in heavy traffic, a common scenario in cities like Calgary or Edmonton. The system waited excessively long, blocking traffic, or made jerky, unsafe turns when it finally moved. Lidar-based systems from GM and Ford performed more smoothly, thanks to better depth perception and object tracking. In night driving tests, FSD missed 12% of pedestrians in crosswalks, while lidar-equipped vehicles missed only 3%. But Tesla has one advantage no one can match: data volume. With over 3 billion miles logged, their AI has seen more edge cases than any other system. When a rare scenario occurs, like a horse-drawn carriage on a rural road in PEI, there's a good chance Tesla's network has already learned how to handle it. That kind of breadth is invaluable. And yet, breadth doesn't eliminate risk. In 2024, a Tesla on FSD drove through a red light in Seattle during broad daylight. The cameras misclassified the signal due to sunlight glare. No one was hurt, but it highlighted a critical flaw. Lidar alone wouldn't have prevented it, traffic lights are identified visually. But combined with camera data, it could have cross-verified position and timing to avoid the error. So where does this leave us? Tesla's system is safer than average human driving in good conditions. But it's not ready for full autonomy. And in the most challenging scenarios, low visibility, complex intersections, unpredictable human behaviour, it still falls short.
The Future: Hybrid Systems and the End of Dogma
The future isn't camera versus lidar. It's both. And neither. The next generation of autonomy will use sensor fusion, combining cameras, radar, lidar, ultrasonic sensors, and even V2X (vehicle-to-everything) communication, not as redundant backups, but as complementary inputs. Each sensor fills the gaps the others can't see. BMW's upcoming i7, for example, uses a lidar-like system built into the headlights, a solid-state optical array that scans the road without moving parts. It works in tandem with 10 cameras, 5 radars, and a high-precision GPS. The system doesn't rely on any single sensor. Instead, it uses AI to weigh inputs based on conditions. In fog, it trusts radar and lidar more. In bright light, cameras dominate. This dynamic weighting is the future. And Tesla? They might come around. Despite Musk's rhetoric, the company has been hiring lidar experts, acquiring startups with 3D mapping tech, and testing hybrid prototypes. They know pure vision has limits. The question isn't if they'll adopt lidar, but when, and in what form. Because the real goal isn't to prove a point. It's to save lives. And if that means using every tool available, then dogma has no place.
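The dynamic-weighting idea can be sketched in a few lines: each sensor reports a distance estimate, and the fusion step blends them with weights chosen for the current conditions. This is a toy illustration of the concept, not any production system's logic; the weights and readings are invented:

```python
# Toy sensor fusion: a condition-dependent weighted average of range estimates.
# All weights and readings are invented for illustration.

WEIGHTS = {
    # In clear daylight, trust cameras; in fog, radar and lidar dominate.
    "clear": {"camera": 0.6, "radar": 0.2, "lidar": 0.2},
    "fog":   {"camera": 0.1, "radar": 0.5, "lidar": 0.4},
}

def fuse_range(readings: dict, condition: str) -> float:
    """Blend per-sensor distance estimates (metres) using condition weights."""
    w = WEIGHTS[condition]
    total = sum(w[sensor] for sensor in readings)
    return sum(dist * w[sensor] for sensor, dist in readings.items()) / total

# Three sensors disagree slightly about an obstacle's distance.
readings = {"camera": 48.0, "radar": 52.0, "lidar": 50.0}
clear_estimate = fuse_range(readings, "clear")   # leans toward the camera
fog_estimate = fuse_range(readings, "fog")       # leans toward radar and lidar
```

Real fusion stacks are far more sophisticated (Kalman filters, learned confidence scores, per-object tracking), but the principle is the same: no single sensor is trusted unconditionally, and the blend shifts as conditions change.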