A New Sensor Gives Driverless Cars a Human-Like View of the World
Eyes on the Road
We’re closer to making self-driving cars a reality than ever before, but at the moment the best options available to the public are along the lines of Tesla’s Autopilot system, which offers driver assistance rather than all-out autonomy. To get to the next stage, many would agree that we need better sensors.
AEye, a company founded by Luis Dussan, a veteran of Lockheed Martin, Northrop Grumman, and NASA, is aiming to produce just that. Dussan started out with a plan to create artificial intelligence that could underpin a self-driving car, but soon found that sensors were the weak link.
The majority of autonomous vehicles use lidar sensors, which fire laser beams to map their surroundings. These sensors are limited in two key areas: they’re expensive, and they can only emit beams at preset angles.
AEye wants to use solid-state lidar sensors, which sweep a laser beam back and forth across a scene. Most current applications of this technology follow a fixed, regular scanning pattern, but Dussan wants to use two distinct types of scan instead: low-resolution scans of a wide area, and high-resolution scans of a smaller section, where the priority can be reprogrammed on the fly.
“You can trade resolution, scene revisit rate, and range at any point in time,” Dussan says. “The same sensor can adapt.” Much like the human eye, the hardware would be able to focus in on what’s most important depending on the driving conditions present at any given time.
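As a rough illustration of the trade-off Dussan describes, imagine a scheduler that splits a fixed per-frame budget of laser points between a coarse full-field sweep and a high-resolution region of interest. Everything below (the `plan_scan` function, the point budget, the priority knob) is a hypothetical sketch for intuition, not AEye’s actual design:

```python
# Hypothetical sketch of adaptive lidar scan budgeting (not AEye's firmware):
# a fixed number of laser points per frame is divided between a coarse,
# full-field sweep and a concentrated region of interest (ROI), with the
# split adjustable on the fly.

from dataclasses import dataclass

@dataclass
class ScanPlan:
    wide_points: int    # points spent on the coarse, full-field sweep
    roi_points: int     # points concentrated on the region of interest
    revisit_hz: float   # how often the full frame is refreshed

def plan_scan(budget_points: int, frame_rate_hz: float,
              roi_priority: float) -> ScanPlan:
    """Trade resolution for coverage within one frame.

    roi_priority in [0, 1]: 0 = uniform wide scan, 1 = all points on the ROI.
    """
    if not 0.0 <= roi_priority <= 1.0:
        raise ValueError("roi_priority must be in [0, 1]")
    roi = int(budget_points * roi_priority)
    return ScanPlan(wide_points=budget_points - roi,
                    roi_points=roi,
                    revisit_hz=frame_rate_hz)

# Cruising on an empty highway: mostly wide, low-resolution coverage.
cruising = plan_scan(budget_points=100_000, frame_rate_hz=10.0,
                     roi_priority=0.2)

# A pedestrian is detected ahead: priority shifts to the ROI mid-drive.
alert = plan_scan(budget_points=100_000, frame_rate_hz=10.0,
                  roi_priority=0.8)
```

The point of the sketch is that the total budget never changes; only its allocation does, which is what lets the same sensor “adapt” instead of requiring different hardware for different conditions.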
AEye isn’t the only company looking to push the state of sensors forward. Apple has been working on a self-driving project in relative secrecy, but its AI chief Ruslan Salakhutdinov recently offered some details about what to expect.
Apple is said to be developing sensors capable of identifying objects even when the lens is obscured by conditions like rainfall. Localization and mapping techniques will be used alongside the sensors to create detailed 3D maps that help with decision-making.
Of course, all these attempts to create better sensors have one goal in mind: safer roads. While the masses may take a while to warm up to the idea of self-driving cars, there’s plenty of evidence that autonomy will make traveling by car much safer.
The wrinkle is that we’ll only see the best results when most — if not all — cars are piloted by computers. Self-driving vehicles will be able to communicate among themselves with perfect clarity, but anticipating the actions of a human driver could cause complications.
Still, advanced sensors will help make this a reality sooner. Whether we’re talking about human drivers or pedestrians, self-driving cars will always have to deal with unpredictable human behavior. Receiving as much information as possible, as efficiently as possible, via cutting-edge sensors is one way to improve the safety of our roads.
The post A New Sensor Gives Driverless Cars a Human-Like View of the World appeared first on Futurism.