- cross-posted to:
- technology@lemmit.online
BMW tests next-gen LiDAR to beat Tesla to Level 3 self-driving cars::Tesla’s autonomous vehicle tech has been perennially stuck at Level 2 self-driving, as BMW and other rivals try to leapfrog to Level 3.
My understanding was that the challenge in making the next leap in self-driving was not a hardware problem (detecting objects with cameras vs. LiDAR) but a software one. As in, it isn’t as difficult to detect the presence of objects as it is to make consistent and safe decisions based on that information.
But using LIDAR, you increase your data’s accuracy and dimensionality, giving you more options to play with. It probably won’t be a game changer, but it may be better than a camera only system.
Gathering more data and being able to process it seems like an obvious way forward. How much better is this “new” LiDAR?
Edit: it seems Tesla cars don’t even use LiDAR…
That’s not necessarily true. What you get is two separate sensors feeding raw data into one system, and both streams need to be parsed. Sometimes one won’t agree with the other, which can cause issues with how the car thinks it should respond.
Nobody has a fully working system at this point, so it’s premature to make claims about what hardware is and isn’t needed. It may very well be that LIDAR is a requirement, but until somebody figures it out, we’re all just speculating.
You can, today, download an app and go ride in a self-driving car around multiple US cities. All of those cars use LIDARs. Sensor disagreement is not a major issue because sensor fusion is a very well-understood topic.
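To illustrate the point about sensor fusion, here is a minimal sketch of one classic approach, inverse-variance weighting, applied to a single range measurement. All numbers and function names are hypothetical and purely illustrative; real autonomous-driving stacks use far more sophisticated methods (e.g. Kalman filters over full object tracks), but the core idea is the same: a disagreement between sensors is resolved toward the less noisy one, rather than being an unhandled conflict.

```python
def fuse(camera_range_m, camera_var, lidar_range_m, lidar_var):
    """Fuse two noisy range estimates by inverse-variance weighting.

    The sensor with the smaller variance (less noise) gets the larger
    weight, so disagreement is resolved toward the more trusted reading.
    """
    w_cam = 1.0 / camera_var
    w_lid = 1.0 / lidar_var
    fused = (w_cam * camera_range_m + w_lid * lidar_range_m) / (w_cam + w_lid)
    # The fused estimate is less uncertain than either input alone.
    fused_var = 1.0 / (w_cam + w_lid)
    return fused, fused_var

# Hypothetical readings: camera says 52 m with high noise,
# LiDAR says 50 m with low noise.
distance, variance = fuse(52.0, 4.0, 50.0, 0.25)
# The fused distance lands close to the LiDAR reading, and the
# fused variance is smaller than the LiDAR's own variance.
```

The takeaway: two sensors disagreeing isn’t a failure mode to fear; it’s exactly the situation fusion math is built to handle.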
Yes, but those cars are geofenced into areas with the most favorable conditions for autonomous driving. What happens when you take the car onto the freeway, into a suburban neighborhood, or over a mountain pass?