Autonomous driving is a rapidly advancing technology and, at the same time, a subject of much controversy. At one extreme, people believe that autonomous cars will ensure a better future, with increased safety on the roads, reduced infrastructure costs, and enhanced mobility for children, the elderly, and the disabled. At the other extreme, many people fear automotive hacking, fatal crashes, and the loss of driving-related jobs. A Pew Research Center survey found that 54% of adults are worried about the development of driverless vehicles, while only 40% feel optimistic about advances in car automation. The research also shows how sharply people's views and attitudes toward self-driving cars differ.
No doubt, autonomous driving is a complex and contentious technology. To understand how safe self-driving cars are, it is important to figure out how they work and what types of autonomous vehicle sensors help them know where to go and recognize objects on the road to prevent accidents. But first, let's look at the different levels of autonomous driving and see where we currently stand in adopting this technology.
From driver assistance to fully autonomous cars, there are five generally accepted levels of self-driving vehicles. They were developed by the Society of Automotive Engineers (SAE) and vary depending on the degree of human involvement in driving. Strictly speaking, the SAE classification contains six levels, but level 0 implies no automation at all: the human retains complete control of the vehicle.
Level 1: Driver Assistance

The human driver remains responsible for all car-operating tasks, including accelerating, steering, braking, and monitoring the surrounding environment, but a driving automation system can assist with either steering or accelerating, though not both at once.
Level 2: Partial Automation

At this level, the car can assist with both steering and acceleration, while the driver remains responsible for most safety-critical functions and for monitoring the environment. Level 2 vehicles are currently the most common on the roads.
Level 3: Conditional Automation

From level 3 onwards, the car itself monitors the environment using autonomous vehicle sensors and performs other dynamic driving tasks, such as braking. The human driver must still be prepared to intervene if the system fails or other unexpected conditions arise.
Level 4: High Automation

Level 4 implies a high degree of automation: the car is capable of completing an entire journey without driver intervention, even handling unexpected events on its own. However, there are restrictions: the driver can switch the vehicle into this mode only when the system detects that traffic conditions are safe and there are no traffic jams.
Level 5: Full Automation

Fully automated cars do not yet exist, but automakers are striving for level 5 autonomy, where the driver simply specifies a destination and the vehicle takes complete responsibility for all driving modes. Accordingly, level 5 cars have no provisions for human control, such as steering wheels or pedals.
Although it sounds fantastic, fully automated vehicles are expected to appear on the market in 2020–2021. For now, partially automated level 2 systems dominate.
Autonomous vehicles are impossible without sensors: they allow the vehicle to see and sense everything on the road and to collect the information needed to drive safely. This information is then processed and analyzed to build a path from point A to point B and to send the appropriate instructions to the car's controls: steering, acceleration, and braking. Moreover, information collected by the sensors, such as the actual path, traffic jams, and obstacles on the road, can be shared between IoT-connected cars. This is known as vehicle-to-vehicle (V2V) communication, and it helps improve driving automation.
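To make the sense-plan-act cycle concrete, here is a deliberately minimal sketch in Python. All names and thresholds (`Obstacle`, `plan_action`, the 2 s and 5 s time-to-collision cutoffs) are illustrative assumptions, not any manufacturer's actual logic; a real planner fuses many sensors and considers far more than the nearest obstacle.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float          # distance ahead of the vehicle, in metres
    closing_speed_mps: float   # how fast we are approaching it, m/s

def plan_action(obstacles, cruise_speed_mps=13.9):
    """One simplified planning step: look at the nearest obstacle and
    decide whether to brake, coast, or hold cruise speed."""
    if not obstacles:
        return "cruise"
    nearest = min(obstacles, key=lambda o: o.distance_m)
    if nearest.closing_speed_mps <= 0:
        return "cruise"  # not approaching it, so no action needed
    # Time to collision if nothing changes.
    ttc = nearest.distance_m / nearest.closing_speed_mps
    if ttc < 2.0:
        return "brake"
    if ttc < 5.0:
        return "coast"
    return "cruise"

print(plan_action([Obstacle(30.0, 20.0)]))   # TTC = 1.5 s, so "brake"
print(plan_action([Obstacle(100.0, 10.0)]))  # TTC = 10 s, so "cruise"
```

The same decision function could just as easily consume obstacle reports received over V2V from other cars as readings from the car's own sensors, which is exactly why shared sensor data improves automation.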
The majority of today’s automotive manufacturers commonly use the following three types of autonomous vehicle sensors: cameras, radars, and lidars.
Camera sensors

How they work
Autonomous cars may use video cameras to see and interpret objects on the road, just as human drivers do with their eyes. By equipping cars with cameras at every angle, the vehicle can maintain a 360° view of its external environment and get a broader picture of the surrounding traffic conditions. Today, 3D cameras are available that display highly detailed, realistic images. Image sensors automatically detect objects, classify them, and determine their distances. For example, cameras can identify other cars, pedestrians, cyclists, traffic signs and signals, road markings, bridges, and guardrails.
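One classic way a single camera can estimate distance is the pinhole-camera model: an object of known real-world size that appears smaller on the sensor must be farther away. The sketch below assumes a hypothetical focal length and object height purely for illustration; production systems use calibrated cameras, stereo pairs, or learned depth estimation.

```python
def distance_from_height(focal_length_px, real_height_m, pixel_height):
    """Pinhole-camera estimate via similar triangles: an object of known
    real height spanning `pixel_height` pixels is roughly this far away."""
    return focal_length_px * real_height_m / pixel_height

# A 1.7 m pedestrian imaged at 100 px tall by a camera whose focal
# length is 1000 px works out to 17 m away.
print(distance_from_height(1000, 1.7, 100))  # → 17.0
```

This also shows why low contrast is so damaging: if the detector cannot find the object's pixel extent in the first place, there is nothing to feed into the geometry.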
Areas for improvement
Unfortunately, camera sensors are far from perfect. Poor weather conditions such as rain, fog, or snow prevent cameras from clearly seeing objects on the road, thereby increasing the chances of an accident. Additionally, there are often situations where camera images simply aren't good enough for a computer to make a sound decision about what to do. For example, when an object's color is similar to the background or the contrast is low, the driving algorithm can fail.
Radar sensors

How they work
Radar (Radio Detection and Ranging) sensors make a crucial contribution to the overall function of autonomous driving: they send out radio waves that detect objects and gauge their distance and speed in real time. Short- and long-range radar sensors are usually deployed all around the car and have different functions. While short‑range (24 GHz) radar applications enable blind spot monitoring, lane-keeping assistance, and parking aids, the role of long‑range (77 GHz) radar sensors includes automatic distance control and brake assistance. Unlike cameras, radar systems typically have no trouble identifying objects during fog or rain.
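The two measurements described above follow directly from the physics: range comes from the echo's round-trip time, and relative speed from the Doppler shift of the returned wave (f_d = 2·v·f0/c). A minimal sketch, using the 77 GHz long-range carrier mentioned above as the default:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def radar_range(round_trip_s):
    """Range from echo round-trip time; the wave travels out and back,
    hence the division by two."""
    return C * round_trip_s / 2

def radar_speed(doppler_shift_hz, carrier_hz=77e9):
    """Relative (closing) speed from the Doppler shift f_d = 2*v*f0/c."""
    return doppler_shift_hz * C / (2 * carrier_hz)

print(radar_range(2e-7))       # echo after 200 ns → about 30 m
print(radar_speed(10_000))     # 10 kHz shift at 77 GHz → about 19.5 m/s
```

The numbers illustrate why radar electronics must be fast: at highway distances the echo returns in a few hundred nanoseconds.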
Areas for improvement
The pedestrian-recognition algorithm definitely needs improvement: current automotive radar sensors correctly identify only about 95% of pedestrians, which is not enough to ensure safety. Also, the widely used 2D radars cannot determine an object's height, since they scan only horizontally, which can cause problems when driving under a bridge. 3D radar, currently in development, promises to solve that issue.
Lidar sensors

How they work
Lidar (Light Detection and Ranging) sensors work similarly to radar systems, the key difference being that they use laser light instead of radio waves. Besides measuring the distances to various objects on the road, lidar can create 3D images of the detected objects and map the surroundings. Moreover, lidar can be configured to produce a full 360-degree map around the vehicle rather than relying on a narrow field of view. These two advantages are why autonomous vehicle manufacturers such as Google, Uber, and Toyota choose lidar systems.
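The 360-degree map is built by converting each beam's polar measurement (angle, distance) into Cartesian coordinates around the vehicle. A minimal sketch, assuming evenly spaced beam angles over a full sweep (real sensors fire thousands of beams per rotation and add elevation angles for the third dimension):

```python
import math

def scan_to_points(ranges_m, vehicle_xy=(0.0, 0.0)):
    """Convert one 360° lidar sweep (a list of distances, one per evenly
    spaced beam angle) into Cartesian points around the vehicle."""
    n = len(ranges_m)
    points = []
    for i, r in enumerate(ranges_m):
        theta = 2 * math.pi * i / n  # beam angle in radians
        points.append((vehicle_xy[0] + r * math.cos(theta),
                       vehicle_xy[1] + r * math.sin(theta)))
    return points

# Four beams at 0°, 90°, 180°, 270°, each hitting something 10 m away:
pts = scan_to_points([10.0, 10.0, 10.0, 10.0])
print(pts[1])  # the 90° beam lands at roughly (0.0, 10.0)
```

Accumulating these point clouds over time, and registering them against each other, is what produces the detailed environment maps lidar is known for.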
Areas for improvement
Because rare-earth metals are needed to produce them, lidar sensors are much more expensive than radar sensors: a system capable of supporting autonomous driving can cost well over $10,000, and the top-end sensor used by Google and Uber costs up to $80,000. Another problem is that snow or fog can block lidar sensors and negatively affect their ability to detect objects.
Autonomous vehicle sensors play an essential role in automated driving: they allow cars to monitor their surroundings, detect oncoming obstacles, and plan their paths. Combined with automotive software and computers, they will allow the system to take full control of the vehicle, freeing people's time for more productive tasks. Given that the average driver spends approximately 50 minutes a day in a car, just imagine how valuable autonomous vehicles can be in our fast-paced world.
Although autonomous vehicle technology appears to be developing apace, no commercially available vehicle has yet reached level 4 autonomy. There is still huge room for technological improvement, and manufacturers need to take it seriously to ensure safety on the roads.