Sensors in the driving seat – Design World Network
Advances in sensor technology help make autonomous vehicles safe and reliable.
RALF BORNEFELD | Infineon Technologies
Many drivers on today’s roads are already supported by advanced driver-assistance systems (ADAS) for functions such as parking assist, cruise control and automated lighting. Now, more intelligent and connected environmental sensors are making autonomous vehicles the next logical step for the industry. But as the development of driverless cars steams ahead, the debate continues as to what technology the ideal car of the future should offer and whether it should be completely driverless.
Since the invention of the first engine-driven vehicles in the late 1800s, cars have given people the freedom to go wherever and whenever they want in comfort and style. Part of the reward of owning a car is the thrill of being in complete control of a large, fast-moving machine. Passionate drivers might, therefore, feel averse to the idea of completely relinquishing this sense of independence to an autonomous vehicle that drives itself.
However, there are times when driving can be more draining than exciting, such as when commuting in heavy traffic or during long-distance journeys. Americans were stuck in traffic for a cumulative eight billion hours in 2015. The ability to switch on autonomous mode and let the car take over would not only relieve tired drivers but also reduce accidents.
THE MISSION FOR ZERO FATALITIES
In 1965, the U.S. recorded 47,089 deaths from car crashes. Fifty years later, despite the population rising from 198.7 million to an estimated 324 million, statistics show fatalities dropped to 35,092 in 2015. This is thanks to the introduction of safety regulations over the years such as the mandatory use of seat belts and increasingly advanced safety mechanisms such as anti-lock braking systems (ABS).
As the growing number of intelligent, connected sensors designed into new cars steadily makes driving safer, the vehicles of the future will be designed to attain “Vision Zero” – road traffic without any fatalities. It’s thought that human error causes some 90% of motor vehicle crashes, and with the implementation of tried-and-tested autonomous driving technology, it is hoped this figure could be drastically reduced in the future.
In 2006, cars carried an average of 40 sensors. Currently, this figure sits at around 90, and the self-driving car in 2025 might have double that number. Fully automated vehicles are set to go into mass production within the next decade, and this will no doubt improve our lives, offering greater flexibility, comfort and safety.
Technology heavyweights such as Google, Uber, Intel, Apple and even Samsung have partnered with auto manufacturers in the battle to be recognized as the architects of a brave new world of autonomous vehicles. Although the technology is generally not yet advanced enough to permit the driver’s hands to be removed from the wheel or attention to be taken away from the road, soon it will be possible for the driver to completely disconnect while the vehicle is driving itself.
Google is considered one of the earliest pioneers, with its self-driving vehicles clocking 1.5 million autonomous miles by March 2016. Google’s parent company, Alphabet, subsequently spun off its self-driving car project into a new company called Waymo. Tesla Motors rolled out its Autopilot feature in an over-the-air (OTA) software update in January 2016. Autopilot allows the Tesla Model S to act autonomously on limited-access highways, provided the driver remains fully attentive.
Ride-sharing company Uber began trialing a fleet of self-driving Ford Fusions in Pittsburgh in September 2016, each vehicle equipped with 20 cameras, seven lasers, GPS, radar and lidar. Uber’s main rival, Lyft, recently bolstered by a $500 million investment from General Motors and a technology partnership with Waymo, has also announced a forthcoming self-driving car trial in Boston.
Of course, the trials on public roads have inevitably been marred by some wobbles and even tragedy. Uber’s trials in Pittsburgh, San Francisco and Tempe, Ariz. have been plagued by legal issues, and one of its cars in Tempe flipped over in an accident at a yield sign. The first fatality came in May 2016, in a high-speed collision between a Tesla Model S and a turning 18-wheeler. The subsequent inquiry was closed without a recall, and found that the overall Tesla accident rate had dropped by 40% since Autopilot was rolled out.
Lidar, radar and optical are three of the most important sensor technologies for the development of autonomous cars. The ultimate goal is to recreate the human power of reliable judgement, with the ability to make split-second decisions based on a combination of information from the sensors and lessons learned from previous experiences.
While radar uses radio-frequency electromagnetic waves, lidar sends out laser beams to scan an area and then analyzes the reflections that bounce back. Lidar systems can be better than human senses in some cases, for example, in detecting even small objects on the road. However, until recently they have been not only expensive – between $10,000 and $50,000 each – but also insufferably bulky because of their reliance on mirrors positioned to direct the laser beams. Technology companies are using different tactics to shrink lidar proportions, some hoping to design a solid-state lidar without any moving parts and others using flashes of laser light instead of constant beams.
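The ranging principle behind lidar is simple time of flight: a laser pulse goes out, its reflection comes back, and distance follows from the round-trip time. A minimal sketch (illustrative only, not any vendor's implementation):

```python
# Lidar ranging by time of flight: distance d = c * t / 2, since the
# pulse travels to the object and back.

C = 299_792_458.0  # speed of light, m/s

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, in meters."""
    return C * round_trip_seconds / 2.0

# An echo returning after ~1.67 microseconds corresponds to ~250 m,
# roughly the range quoted for modern automotive lidar.
print(round(distance_from_echo(1.668e-6)))  # 250
```

The factor of two is the only subtlety: the measured time covers the out-and-back path, not the one-way distance.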
Infineon has taken a slightly different approach by focusing on a microelectromechanical system (MEMS) developed by Dutch company Innoluce. Infineon acquired Innoluce in October 2016 with the aim of being able to offer technical expertise in all three complementary sensor systems required for autonomous driving.
The MEMS lidar device, which measures just 3×4 mm, consists of an oval-shaped mirror on a bed of silicon. Actuators use electrical resonance to make the mirror oscillate and change the direction of the laser beam. With a range of 250 m and the ability to scan 5,000 data points/sec., the MEMS lidar is small, robust and reliable, and expected to cost automakers less than conventional mechanical lidar systems.
Lidar systems will need to be semiconductor-based – thus becoming more compact, cost-effective and robust – to become a standard feature in all car classes, and they are going to be essential for self-driving cars to accurately identify roadside conditions such as traffic signs, road obstacles, and pavement markings.
The good news is that the burden of detecting all approaching dangers will soon be shared. As more new cars begin to be fitted with the full array of intelligent sensors connected to the network, road users will be able to pass on data to others about things like traffic congestion and dangerous road conditions.
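How such shared hazard data might look can be sketched as a simple timestamped report; the field names here are invented for illustration and do not come from any real V2X standard:

```python
# Hypothetical vehicle-to-vehicle hazard report. A car that detects ice
# or congestion serializes an observation for broadcast; receivers use
# the timestamp to discard stale reports.
import json
import time

def make_hazard_report(lat: float, lon: float, hazard: str) -> str:
    """Serialize a hazard observation for broadcast to nearby vehicles."""
    return json.dumps({
        "lat": lat,
        "lon": lon,
        "hazard": hazard,          # e.g. "ice", "congestion", "debris"
        "timestamp": time.time(),
    })

msg = make_hazard_report(40.44, -79.99, "congestion")
print(json.loads(msg)["hazard"])  # congestion
```

Real deployments would use standardized message sets and signed broadcasts rather than ad-hoc JSON, but the flow — observe, timestamp, share — is the same.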
Cars that drive themselves will also require enhanced computing power, and domain computer architectures connected via high-speed buses will need to become an integral part of the vehicle. It is therefore important to create redundant systems and domain architecture that guarantee fail-safe operation and safety.
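One classic redundancy pattern for fail-safe operation is triple modular redundancy: three independent channels measure the same quantity, and the median is taken so that a single faulty channel cannot corrupt the output. A minimal sketch (channel count and values are illustrative, not from any automotive specification):

```python
# Triple modular redundancy: take the median of three independent
# readings, so one failed channel is outvoted by the other two.

def vote(readings: list[float]) -> float:
    """Return the median of exactly three redundant sensor readings."""
    assert len(readings) == 3, "TMR requires three channels"
    return sorted(readings)[1]

# Two channels agree on ~25 m to an obstacle; the third has failed high.
print(vote([25.1, 24.9, 310.0]))  # 25.1
```

Production systems add plausibility checks and channel-health monitoring on top, but median voting is the core idea that lets a domain computer keep operating safely through a single sensor fault.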
As with every new technology, success will be long in the making and hard-won after the inevitable initial failures and setbacks. Absolute reliability is of utmost importance to ensure that drivers can confidently delegate responsibility to the vehicle, without fears about safety. A combination of all the most innovative sensor technologies will provide a segmented safety cocoon to ensure that autonomous vehicles of the future are as safe as possible.
The 2002 science fiction movie Minority Report depicted cars in the year 2054 as being sleek self-driving vehicles shuttling through a vast networked transit system. However, the futuristic Lexus 2054 that actor Tom Cruise found himself in was still capable of being driven manually to facilitate a dramatic car chase. The manual override compromise on automated vehicles might just be enough to safeguard the driving pleasure of the car enthusiasts well into the future.