Saved by the Sensor: Vehicle Awareness in the Self-Driving Age

Jan. 18, 2018
Self-driving cars are set to be a major disruptor in the automotive market, and their safety depends on the sensor systems they use to see the world.

Once inconceivable, then a far-fetched dream, self-driving cars are finally starting to make their way into the real world. In most robotic applications today, an error leads to a bad part or perhaps an unswept floor. When driving a car, however, a computer must constantly make potentially life-threatening decisions. The information fed to it therefore has to be extremely accurate, which is why several redundant sensor systems are often used to account for different driving situations.

Below is a list of the sensor systems used to maintain driverless-vehicle safety. In some cases, they also apply to human-driven vehicles for tasks like emergency braking, blind-spot awareness, and adaptive cruise control that keeps pace with the car ahead. Significantly, these sensors cannot just work once, or even a few times in a lab: they must be automotive-grade, able to withstand the abuse of thousands of miles in varied conditions and still produce accurate results every single time.

Vision sensors — Consider how we as humans observe the world: we collect an incredible amount of information via sight. Not only do we detect and process the light reflected into our eyes, we use both eyes together to judge distance. Vision sensors (or cameras, as they are also called) can be extremely useful in self-driving vehicles, taking on tasks such as detecting pedestrians and even reading road signs. They can, however, be fooled in situations that human eyes would normally handle, such as a brightly colored object against a bright sky, or a painted surface that depicts a different scene than reality.
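
To make the stereo idea concrete, here is a minimal sketch of the textbook disparity-to-depth relationship a two-camera rig relies on. The focal length, camera spacing, and pixel shift below are illustrative assumptions, not figures from any production vehicle.

```python
# Minimal sketch: estimating distance from a stereo camera pair.
# depth = (focal_length_px * baseline_m) / disparity_px
# All numbers are illustrative assumptions, not specs of a real system.

def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return distance to an object (in meters) from its pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("Object unmatched or too distant: disparity must be positive")
    return (focal_length_px * baseline_m) / disparity_px

# Example: 1000 px focal length, cameras 30 cm apart, object shifted 12 px between views.
print(stereo_depth_m(focal_length_px=1000.0, baseline_m=0.30, disparity_px=12.0))  # 25.0 m
```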

LiDAR — Short for “light detection and ranging,” this sensing method measures the reflection of near-infrared light off of objects as it scans in 360°. Multiple beams are emitted at angles to each other, and the sensor array physically spins inside the device’s housing to produce a three-dimensional picture of its surroundings. Physically, the devices normally resemble a sort of oversized upside-down coffee cup, and can most famously be seen on top of Google’s Waymo cars, scanning an area as they drive.
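
For a rough sense of how a spinning LiDAR turns a single laser return into one point of that three-dimensional picture, the sketch below converts a measured round-trip time and the beam's pointing angles into x, y, z coordinates. The geometry and numbers are generic assumptions, not the internal processing of any particular commercial unit.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_point(round_trip_s: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LiDAR return into an (x, y, z) point in meters.

    round_trip_s:  time between firing the laser and detecting the reflection.
    azimuth_deg:   rotation angle of the spinning sensor head.
    elevation_deg: fixed vertical angle of this particular beam.
    """
    rng = C * round_trip_s / 2.0  # the light travels out and back
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = rng * math.cos(el) * math.cos(az)
    y = rng * math.cos(el) * math.sin(az)
    z = rng * math.sin(el)
    return x, y, z

# Example: a return arriving 200 ns after firing is roughly 30 m away.
print(lidar_point(round_trip_s=200e-9, azimuth_deg=45.0, elevation_deg=-2.0))
```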

While some consider LiDAR the gold standard in sensing technology for self-driving cars, price has been its major drawback. This type of sensor package cost a whopping $75,000 per unit when Google started experimenting with the technology nearly a decade ago, though prices had dropped to roughly a tenth of that as of 2017.

Solid-state LiDAR — As interest in LiDAR, which physically spins infrared lasers inside the sensor unit, has grown, companies have also begun developing solid-state versions with no moving parts. Importantly, these devices can be much less expensive than “traditional” automotive LiDAR units. Velodyne, for example, announced that it would build such units for hundreds of dollars each in 2018, and startup Innoviz has promised a sensor for only one hundred dollars, also in 2018.

While less expensive and smaller than their coffee-cup brethren, these devices cannot see a full 360 degrees, meaning multiple sensors will be needed for a complete picture of the vehicle's surroundings. Solid-state devices can also be implemented in other exciting ways, such as AEye's version, which uses a vision system and on-sensor processing to focus the LiDAR beam where it is needed.

In many self-driving vehicles, cameras, LiDAR, and radar function as a single observation system to provide safety-redundant 360-degree sensing. (Courtesy of SAE International)

Ultrasound — Ultrasonic sensors operate by pinging their surroundings with a sound at a frequency too high for human ears to register, then measuring the time it takes for that signal to return. The range is limited to a few meters; Tesla's ultrasonic sensors, for example, are rated to detect objects out to roughly 8 meters, which is plenty for working close to the car. In addition to their use on autonomous cars, these sensors appear in “conventional” vehicles for applications such as self-parking and blind-spot detection.
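
The ranging math is the same time-of-flight idea as LiDAR, only with the much slower speed of sound in place of the speed of light. A minimal sketch, assuming a nominal speed of sound in air:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 C; an assumed nominal value

def ultrasonic_distance_m(echo_delay_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasonic ping."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

# Example: an echo arriving 29 ms after the ping puts the obstacle about 5 m away.
print(ultrasonic_distance_m(0.029))  # ~4.97 m
```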

Radar — Radar, or radio detection and ranging, uses reflected radio waves to sense surrounding objects, similar to LiDAR. This technology has been around since the mid-1900s for detection of ships and planes, and is now miniaturized to the point where it can be used on passenger cars. Radar sensors pick up metal objects best, seeing humans as partially translucent, and seeing plastic or wood as nearly transparent. Despite its drawbacks, this sensing method does give cars long-range sensing abilities that can see through dust, fog, rain, and snow.

Tesla’s vehicles forgo LiDAR altogether, instead using radar as the primary sensing method. These vehicles do, however, also feature eight cameras and a set of ultrasonic sensors to supplement the radar. In addition, radar technology is already used for adaptive cruise control and automatic emergency braking in human-driven vehicles.

Because each type has its advantages and disadvantages, sensors will need to continue working together to create a complete picture of a vehicle's surroundings. While a sensor package for a self-driving car today can cost hundreds of thousands of dollars, prices are certain to keep dropping as mass manufacturing comes into full swing. We expect this trend toward automation to continue, with vehicles increasingly supplementing a driver's senses and ultimately replacing them altogether, all to keep us moving safely on the road.
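
As a loose illustration of how redundant sensors can be combined, the toy sketch below fuses independent range estimates of the same obstacle by weighting each reading by its confidence (inverse variance). The readings and uncertainties are invented for the example, and real driving stacks use far more sophisticated fusion than this.

```python
def fuse_ranges(estimates):
    """Fuse independent range estimates given as (value_m, std_dev_m) pairs.

    Sensors that are more certain (smaller std_dev) pull the fused value harder,
    and the fused uncertainty ends up smaller than that of the best single sensor.
    """
    weights = [1.0 / (sigma ** 2) for _, sigma in estimates]
    total = sum(weights)
    fused_value = sum(w * value for w, (value, _) in zip(weights, estimates)) / total
    fused_sigma = (1.0 / total) ** 0.5
    return fused_value, fused_sigma

# Illustrative readings of the same obstacle (meters, assumed standard deviations):
readings = [
    (24.8, 0.5),  # camera-based estimate, less certain at range
    (25.1, 0.1),  # LiDAR, very precise
    (25.4, 0.3),  # radar
]
print(fuse_ranges(readings))  # roughly (25.1 m, 0.09 m)
```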

Zach Wendt and Jeremy S. Cook are engineers who cover emerging technology. Zach, with Arrow Electronics, has a background in consumer product development. Jeremy writes for a variety of technical publications and has worked in manufacturing automation.

Go to Arrow Electronics to learn more about sensor applications.
