Ever noticed how self-driving cars end up wearing some weird hats?
The earliest self-driving military trucks looked like they had spinning coffee cans up top. Carnegie Mellon’s iconic self-driving Hummer was topped by a giant ping-pong ball. Waymo’s smiley little prototype wears a siren-shaped dome that makes it look like the world’s most adorable police car.
Inside all three are about a dozen lasers, shooting through telescope-grade optics and sweeping around hundreds of times per minute to generate 300,000 data points per second. It’s called lidar, and without it, these cars would all be blind. It’s also one of the biggest reasons you don’t have a self-driving car in your driveway right now. At around $75,000, a single lidar can easily cost more than the car it rides on. And that’s just one ingredient in the self-driving soup.
But a new technology is popping up everywhere this year: solid-state lidar. With no moving parts, it promises to give self-driving cars sharper, better vision, at a fraction the cost of old-school, electromechanical systems. Solid-state lidar will pave the way for the first self-driving cars you can actually afford. Here’s how it works — and what’s just around the corner.
How lidar works
The term “lidar” comes from mashing together “light” and “radar,” which also makes a handy way of understanding it because … well, it’s radar, but with light.
A refresher from high-school physics: Radar bounces a pulse of radio waves off an object, like a plane, to determine how far away it is, based on how long it takes for the pulse to bounce back. Lidar uses a pulse of light from a laser to do the same thing.
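The round-trip math is simple enough to sketch. Here is a minimal, illustrative example (the function name and the sample pulse time are made up for illustration):

```python
# Time-of-flight ranging: the same math underlies radar and lidar.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_pulse(round_trip_seconds: float) -> float:
    """Distance to the target: the pulse travels out AND back,
    so divide the round trip by two."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that bounces back after 200 nanoseconds came from
# a target roughly 30 meters away.
distance = range_from_pulse(200e-9)
print(f"{distance:.1f} m")  # ~30.0 m
```

Because light travels about a foot per nanosecond, the timing electronics need picosecond-class precision to hit the inch-level accuracy described below.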
Take enough of those lasers, spin them in a circle, and you end up with a three-dimensional “point cloud” of the world around you. You’ve probably seen these rainbow-colored dots depicting cityscapes, mountains, and even Thom Yorke’s singing, disembodied head in Radiohead’s House of Cards music video. That 360-degree 3D map is like a Rosetta Stone to a self-driving car, allowing it to decipher the world around it.
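Each point in that cloud is just a measured range plus the two angles the beam was pointing at when it fired. A hedged sketch of the conversion, with illustrative names and values:

```python
import math

def point_from_return(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (measured range plus the beam's
    horizontal and vertical angles) into an (x, y, z) point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full sweep of a single laser at 0 degrees elevation,
# sampled once per degree: 360 points of the "point cloud."
cloud = [point_from_return(10.0, az, 0.0) for az in range(360)]
```

Stack dozens of lasers at different elevation angles, spin them hundreds of times a minute, and those 360 points become the hundreds of thousands per second the article opens with.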
“You need a combination of cameras, radar, and lidar in order to create a self-driving system,” explains Jada Tapley, VP of Advanced Engineering at Aptiv. She would know. Aptiv built the autonomous Lyft cars that ferried attendees around Las Vegas for CES 2018. In the worst gridlock the city sees all year. And monsoon-like conditions. With zero accidents.
Those cars had nine lidars, ten radars, and four cameras. A combination of all three allows the car to drive itself, but lidar performs the crucial function engineers call localization. “It’s important for the vehicle to be able to identify with a very high degree of accuracy where it is on the map,” Tapley explains. “We use our lidar to do that.”
While GPS can narrow down your location to a circle about 16 feet in diameter, lidar can do it within a circle four inches in diameter. That’s better than a lot of drivers can manage. Tapley remembers one group of wide-eyed journalists wincing as Aptiv’s autonomous car breezed past a parked bus in Las Vegas. They didn’t need to — because the car knew there was plenty of room. “As humans we get intimidated, especially by big, big vehicles like buses or semis. So we tend to kind of edge away from them,” she explains. “But an autonomous vehicle doesn’t need to do that.”
Autonomous car levels explained
International engineering organizations have settled on six levels of automation to talk about the evolution we’ll see between dumb cars and complete autonomy.
Level 0: No autonomy
This is the car you already probably own. Stop texting! You need to do everything.
Level 1: Hands on
Your car will help you in some scenarios, like adaptive cruise control slowing you down on the highway when the car ahead of you does.
Level 2: Hands off
Your car can drive just like you do — under just the right circumstances, like Tesla Autopilot on a divided, marked highway.
Level 3: Eyes off
Go ahead and send that text; this car won’t crash if it doesn’t have your attention. But you’ll still need to grab the wheel if things get complicated, like with Audi Traffic Jam Pilot.
Level 4: Mind off
Go to sleep; your car is under control. But you still need to sit behind a wheel juuust in case.
Level 5: Total autonomy
Your car has no steering wheel, because it can drive better than you can in all scenarios. Go sit in the back, feeble human.
While cameras can identify objects, and radar can tell how far away they are, lidar can achieve both with a degree of precision neither can touch. “Imagine that there’s an 18-wheeler tire tread in the middle of the road,” Tapley says. “Radar will not detect that. Lidar will.”
That’s why a Tesla Model S, which has both cameras and radar, but no lidar, must have a driver prepared to take the wheel at any time. It’s considered a level 2 autonomous vehicle. Almost all car autonomy experts — with the glaring exception of Elon Musk — believe lidar is necessary to achieve true “sleep behind the wheel” level 4 autonomy.
And that’s a tremendous problem if you or I ever hope to own one. The silver Velodyne HDL-64E you see atop many test cars costs $75,000. Even the company’s “budget” Puck model runs $8,000. And this is not a part you want to skimp on. Imagine your car windows going black at 80 mph, and you have a pretty good idea how losing lidar would look to the computer in a self-driving car.
Like all technology, lidar has become cheaper over time, but the precision required and massive spinning parts in electromechanical lidar mean it can’t become cheaper, smaller, and better every year the same way the processor in your phone or computer does.
But what if … you could make lidar from only silicon? Take away all the moving pieces, and the future starts to look a lot brighter.
Welcome to the solid state
Solid-state electronics, which by definition have no moving pieces, have changed the way we do everything from keeping track of time to listening to music. Remember how portable CD players used to skip? That’s what happens when you rely on a laser to read microscopic pits in a spinning disc. But you can put your smartphone in a paint shaker and still listen to Kanye, because the music is stored on solid-state memory chips that don’t mind getting shaken up. Lidar is heading in the same direction.
Like portable CD players, spinning electromechanical lidar is not ideal. “Number one, they’re big,” says Tapley. “Number two, they’re expensive. Solid-state lidar allows us to get smaller, package better in the vehicles, and reduce costs.”
How do you move light around without moving a lens or a mirror? How does lidar get to solid state? Engineers have devised some downright genius ways.
The first is called flash lidar. “Flash is basically where you have a light source and that light source floods the entire field of view one time using a pulse,” Tapley explains. “A time-of-flight imager receives that light and is able to paint the image of what it sees.” Think of it as a camera that sees distance instead of color.
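A time-of-flight imager can be pictured as a grid of stopwatches, one per pixel: every pixel records when the flash bounced back to it. A minimal, hypothetical sketch (the function and the 2x2 “imager” are illustrative, not any vendor’s API):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_frame(return_times_s):
    """Convert a 2-D grid of per-pixel round-trip times (from a
    single flash) into a grid of distances: a 'photo' in which
    each pixel holds a range instead of a color."""
    return [[SPEED_OF_LIGHT * t / 2.0 for t in row]
            for row in return_times_s]

# A tiny 2x2 imager: the top pixels saw the flash return sooner,
# so they map to nearer surfaces (~15 m vs ~30 m).
frame = depth_frame([[100e-9, 100e-9],
                     [200e-9, 210e-9]])
```

The whole scene is captured in one pulse, which is what makes flash lidar simple, and also why that single pulse has to be so powerful.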
But that simplicity comes with some snags. To see very far, you need a powerful burst of light, which makes it more expensive. And the light can’t be so powerful that it damages human retinas, which limits range. One workaround is to blast light at a specific, invisible wavelength that doesn’t affect human eyes. Perfect! Until you bump into yet another catch: Inexpensive silicon imagers won’t “read” blasts of light in the eye-safe spectrum. You need expensive gallium-arsenide imagers, which can boost the cost of these systems as high as $200,000.
“You have to have an extremely powerful light source, or an extremely sensitive receiver, and if you don’t have those things then you have this limited range,” Tapley says. It might be perfect for government planes conducting detailed aerial surveys, but flash lidar probably isn’t fit for your Corolla.
Set phasers to scan
Fortunately, there’s another way. Louay Eldada has been chipping away at the problem since he got his PhD in optoelectronics in the early ’90s, and today he runs Quanergy, one of the preeminent players in solid-state lidar. Eldada and his team arrived at a different approach by looking at how radar works. It is, after all, a close cousin of lidar. As it turns out, radar used to spin just like lidar, until scientists developed a brilliant workaround known as the phased array.
A phased array can broadcast radio waves in any direction — without spinning in circles — by using a microscopic array of individual antennas synced up in a specific way. By controlling the timing — or phase — between each antenna broadcasting its signal, engineers can “steer” one cohesive signal in a specific direction.
Phased arrays have been in use in radar since the 1950s. But Eldada and his team figured out how to use the same technique with light. “We have a large number, typically a million, optical antenna elements,” Eldada explains. “Based on their phased relationship amongst each other, they form a radiation pattern, or spot, that has a certain size and is pointed in a certain direction.”
By intelligently timing the precise flash of a million individual emitters, Quanergy can “steer” light using only silicon. “The interference effect determines in which direction the light goes, not a moving mirror or lens,” Eldada explains.
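For a rough feel of the math, here is a hedged sketch of phase steering for a uniform linear array. It is a drastic simplification of a million-element optical design: the element count, spacing, and wavelength below are made-up illustration values, not Quanergy’s specifications.

```python
import cmath
import math

def element_phases(n_elements, spacing_m, wavelength_m, steer_deg):
    """Phase offset each emitter must fire with so its wavefront
    lines up with its neighbors' in the steering direction."""
    theta = math.radians(steer_deg)
    return [2 * math.pi * n * spacing_m * math.sin(theta) / wavelength_m
            for n in range(n_elements)]

def array_gain(phases, spacing_m, wavelength_m, look_deg):
    """Relative field strength the array produces in a look
    direction: emitters interfere constructively or destructively."""
    theta = math.radians(look_deg)
    k = 2 * math.pi / wavelength_m
    field = sum(cmath.exp(1j * (k * n * spacing_m * math.sin(theta) - p))
                for n, p in enumerate(phases))
    return abs(field) / len(phases)

# Steer a 64-element array (half-wavelength spacing) to 20 degrees:
phases = element_phases(64, 0.45e-6, 0.9e-6, steer_deg=20)
# The beam peaks at the commanded angle and collapses off-axis.
```

Changing where the beam points is just recomputing that list of phases, which is why a phased array can hop between looking wide and zooming in many times per second with nothing physically moving.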
That means the nest of optics and motors inside a $75,000 lidar bucket disappears, and you’re left with only chips. Right now, Quanergy uses several chips and sells the package for $900, but future versions will become a single chip. “At that point, our sales price will become under $100,” Eldada predicts.
Solid state isn’t just cheaper, it’s better. “Being able to effectively change the shape of the lens to any shape you want allows you to zoom in and zoom out,” Eldada explains. “So imagine you’re looking at an object in your lane, and you want to define in high resolution what it is. You reduce the spot size and determine it’s a deer, it’s a tire, it’s a mattress that fell off a truck. At the same time, you can hop between doing that and looking at the big scene.” This “hopping” could happen multiple times per second without a driver even knowing, as an algorithm calls the shots and determines what deserves a closer look.
Solid-state devices also last longer. Electromechanical lidar can run for between 1,000 and 2,000 hours before failure. With the average American spending 293 hours in a car per year, most of us would end up replacing our lidar before our tires. Quanergy claims its solid-state lidar will run for 100,000 hours — more than most cars will ever drive.
Mirror mirror, on the wall
Flash and optical phased arrays are the only truly solid-state approaches to lidar. But there’s a third new way to do lidar, the red-headed stepchild known as microelectromechanical mirrors — or MEMS mirrors.
As the “mechanical” in “microelectromechanical” suggests, there are moving parts, so MEMS mirrors aren’t truly solid-state. But they’re also so tiny that the technology still represents an improvement over large-scale electromechanical lidar.
“The architecture is very simple,” Tapley explains. “You have one laser, one mirror.” The laser fires into the very tiny mirror, which spins like a top, providing the rotation that conventional lidar gets from spinning an entire bucket around.
It’s simple enough, until you want to move the laser up and down in addition to spinning in circles. Then you need to “cascade” it off another mirror, which spins on another axis. Or you can shoot multiple lasers at one mirror. Either way, the cost and complexity begin to build.
“Making sure that everything is aligned perfectly creates challenges,” Tapley explains. “If you’ve got this laser in a mirror that’s rotating on both axes, it can sometimes be susceptible to shock and vibrations.” You know, like the type you might find in a car, bouncing down the road at 70 mph.
Eldada points to other issues. “Micro MEMS mirrors drift out of alignment. They don’t maintain calibration. When there are big changes in temperature, they need to be recalibrated over the lifetime.”
“If the mirrors get stuck, you have an eye safety issue,” he points out. And sunlight can wreak its own havoc. “You have big issues when you’re facing the sun,” Eldada says. “The sunlight is going to hit it, the light is going to get reflected inside the lidar, and saturate the detectors, and drown out the signal.”
With so many differences between all three types of next-gen lidar, Aptiv is hedging its bets by working with – and investing in – all of them. “Each have different tradeoffs relative to field of view, range, and resolution,” Tapley explains. “Depending on where that lidar is positioned on the vehicle, that will dictate which one of those needs to be the most important.”
Side-facing lidar, for instance, might not need the range that front-facing lidar does. By mixing and matching between the variety, Aptiv hopes to harness the best of all worlds.
So where’s my self-driving car?
In 1999, Jaguar introduced the first radar-based cruise control in the XK, a coupe that sold for about $100,000 in today’s dollars. At the time, the sensors were so expensive that as Tapley tells it, “People joked around that you got a free Jag with every radar purchase.”
Today, you can get the same feature in an $18,000 Corolla. “We’re kind of on that same learning curve with lidar,” she says. “Until solid state becomes mature and enters mass production, these vehicles are going to be pretty cost prohibitive for an average consumer to own.”
Quanergy’s $900 solid-state lidar sensor is helping make that happen. The upcoming Fisker EMotion will be the first vehicle to hit the streets with those sensors inside — five of them — when it arrives in 2019. No bigger than the battery pack for a cordless drill, they’re buried in vents, hidden behind chrome grilles, and totally invisible unless you’re looking for them. A long way from the spinning buckets of yesterday.
Eldada believes we’ll see level 4 autonomous cars from a notoriously “aggressive” American manufacturer as early as 2020. “2021, 2022, you will see several more. 2023 is the big year. Most automakers will have self-driving cars.”
While the Fisker will be priced at $130,000, it might end up looking a lot like the Jaguar XK of 1999: An expensive harbinger of technology to come. Ultimately, solid-state lidar means that self-driving cars won’t just be robochauffeurs for the wealthy. “It means that everyone can have a self-driving car,” Eldada says. “It’s not only for the Mercedes S-Class and BMW 7 Series. This means that people driving Toyota Corollas will also have self-driving cars.”
And as fundamental as that shift may sound, cars may be just the beginning for solid-state lidar. “You will see it in devices, you will see it in wearables, in the helmets of firefighters and soldiers. The applications are almost limitless.”
Published at Thu, 15 Mar 2018 10:15:58 +0000