Austin Russell hops in a motorized cart and goes whizzing through a cavernous building on the edge of San Francisco Bay that is normally used to disembark cruise-ship passengers. As the lanky 22-year-old CEO tools around, he passes a mannequin, a tire and a co-worker on a bicycle–all elements of a demonstration to show how well his company’s sensor can monitor the environment. On a nearby screen, those shapes appear in rainbow colors that signify exactly how far away they are. All of it is the result of laser beams shooting out of a black box and bouncing off more than a million points around the room every second. “It’s easy to make an autonomous vehicle that works 99% of the time,” Russell says later. “But the challenge is in that last 1% of all the different edge cases that can be presented to a driver.”
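The measurement behind those rainbow-colored shapes is simple physics: a lidar unit times how long each laser pulse takes to bounce back, and distance is half the round trip at the speed of light. A minimal sketch of that time-of-flight arithmetic (illustrative only, not Luminar's actual code):

```python
# Time-of-flight distance: a lidar fires a laser pulse and times
# the reflection; the object's distance is half the round trip.
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def distance_meters(round_trip_seconds: float) -> float:
    """Distance to a reflecting object from pulse round-trip time."""
    return C * round_trip_seconds / 2.0

# A reflection arriving 666 nanoseconds after the pulse left
# corresponds to an object roughly 100 meters away.
print(round(distance_meters(666e-9), 1))  # 99.8
```

Repeating that calculation for more than a million pulses per second, each aimed in a slightly different direction, is what builds the three-dimensional picture on the screen.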
Edge cases could mean anything from a sudden sleet storm to late-night carousers overflowing into the street. In order for self-driving cars to become a reality, vehicles must “see not just some of the objects some of the time, but all objects all the time,” says Russell, head of a firm called Luminar. Like an increasing number of entrepreneurs and investors, he believes that technology known as lidar–a shortened name for light detection and ranging–is a key part of the answer. More than $400 million has been invested in the field in the past few years. “Everyone’s circling and looking for opportunities within autonomy,” says CB Insights analyst Kerry Wu.
New cars that people buy today might have a dozen sensors on them already. But each type has drawbacks. Cameras, for instance, are helpful for backing up, but they’re hampered by snow and darkness. Radar sensors, which keep cars at a safe cruising distance, aren’t flummoxed by weather–yet they’re better at detecting metal than soft stuff, like humans.
Until now, lidar has been too costly for widespread use. In early prototypes of Google’s self-driving cars, the model that spun around in a bucket on top of the vehicle cost more than the vehicle itself. Self-driving researchers have forked out more than $80,000 for top-of-the-line laser-pulsing sensors because the detailed, three-dimensional picture they can provide, day or night, is hard to beat.
Startups are racing to bring down the price while maintaining a robust picture, often by rethinking the architecture piece by piece. Lidar can tell “what you’re doing with each finger, in the dark, when you’re a hundred yards away,” says Louay Eldada, co-founder of Quanergy, a startup based in Silicon Valley that was valued at more than $1.5 billion last year. His company plans to start shipping an “auto-grade” sensor that costs less than $1,000 in 2018, he says, assuming it checks the boxes in a battery of tests. Oryx Vision, an Israeli startup that is building a test unit for cars, hopes to eventually sell lidar sensors to vehicle manufacturers for about $100.
Robot cars might never get tired or have a fit of road rage, but still “there is no technology that can match the amazing ability of the human eye [and] brain” to comprehend a car’s surroundings, Oryx’s vice president of marketing, Yaron Toren, writes in an email, adding, “the closest we can get is with lidar.” Russell argues that such drops in price must come with trade-offs in quality, but he won’t say how much customers are paying for the 10,000 units Luminar is on track to start shipping by year’s end.
Tesla’s Elon Musk has argued that advanced radar could do the same job as lidar, and other startups are working on super-powered cameras to help cars see more clearly. Experts say it’s all for the better, because autonomous vehicles will almost certainly have a mix of sensors, just like people do. “I wouldn’t want to trust only one sense,” says Andy Petersen, a hardware expert at the Virginia Tech Transportation Institute. “There are ways you can fool any one of these sensors, but it would be hard to fool them all.”
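Petersen's redundancy argument can be sketched as a toy voting scheme: no single fooled sensor gets to decide on its own. The function name, sensor set, and two-of-three threshold below are invented for the example; real fusion systems weigh probabilistic estimates rather than booleans.

```python
# Toy sensor fusion: require agreement from at least two of
# three independent sensors before declaring an obstacle, so
# fooling any single sensor is not enough to fool the system.
def obstacle_detected(camera: bool, radar: bool, lidar: bool) -> bool:
    """Majority vote across three independent sensor verdicts."""
    return sum([camera, radar, lidar]) >= 2

# A camera blinded by darkness is outvoted by radar and lidar.
print(obstacle_detected(camera=False, radar=True, lidar=True))  # True
```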