Emerging Sensors Driving Automation: LiDAR, Vision & 3D Trends
Key Highlights
- Sensor fusion combines multiple sensing modalities to improve environmental perception, obstacle avoidance, and safety in autonomous systems.
- AI-enabled vision and 3D depth sensing are revolutionizing quality control, navigation, and real-time decision-making in manufacturing and logistics.
- Advanced ranging sensors like LiDAR and ultrasonic technologies provide high-resolution detection with simplified integration, enhancing safety and performance.
In just the past few years, rapid advancements in automation have delivered unprecedented precision across diverse industries. At Machine Design, coverage of sensor and perception technologies highlights a core takeaway: The days of treating sensors and automation as separate elements are disappearing.
Today’s systems are deeply intertwined and form the basis for smarter, more adaptive machines. This curation highlights key sensor technologies and trends reshaping industrial machine design and enabling smarter, safer and more efficient automated systems.
Editor’s note: Read the full coverage at the linked articles.
1. Sensor Fusion Frameworks and Reference Architectures
Sensor fusion, simply described, blends multiple ways of helping machines “see” and understand their environment. Combining optical data (what a camera sees), range data (how far away things are) and inertial data (information about movement and orientation) yields safer autonomous systems, from warehouse robot platforms to self-driving vehicles.
Going beyond what single sensors can deliver, integrated sensor fusion combines vision, LiDAR, IMU (inertial measurement unit) and radar/proximity data to improve robustness, situational awareness and obstacle avoidance.
What design engineers consider: The mechanical design needs to leave enough room for the sensors, ensure they are properly mounted and shield multiple sensors from interference. Fusion algorithms also demand that the data from these sensors be aligned and calibrated across devices so that readings match accurately.
A real-world sensor-fusion implementation: Ati Motors developed an advanced AMR stack that amalgamates multiple sensing modalities, including 3D LiDAR, cameras and IMUs, with onboard AI to generate rich environmental maps and dynamic obstacle avoidance. According to Saurabh Chandra, founder of Ati Motors, the system achieves accuracy “in the high triple nines.”
Chandra pointed out that mobile robotics and self-driving car technologies are moving along similar paths with respect to advancements in sensing and perception. Autonomy in one area typically drives innovation in the other. He pointed out that both mechanical and sensor design are trending toward using a mix of vision and LiDAR sensors, rather than relying on a single sensor type.
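The core idea behind fusing redundant range readings can be sketched with one textbook building block: inverse-variance weighting, which combines two noisy measurements of the same quantity into a single lower-variance estimate. This is a minimal illustration of the principle, not Ati Motors’ actual algorithm; the sensor noise figures are invented for the example.

```python
# Minimal sensor-fusion building block: inverse-variance weighting of two
# independent estimates of the same range. The lower-noise sensor dominates,
# and the fused variance is always smaller than either input variance.
def fuse_estimates(z1: float, var1: float, z2: float, var2: float):
    """Return the minimum-variance combination of two noisy measurements."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical readings: LiDAR sees 4.98 m (low noise), a stereo camera
# sees 5.20 m (higher noise). The fused estimate stays close to the LiDAR.
rng, var = fuse_estimates(4.98, 0.01, 5.20, 0.09)
```

Full fusion stacks generalize this step with Kalman or factor-graph filters, but the weighting logic is the same: trust each modality in proportion to its confidence.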
2. 3D Vision & Depth Sensing
Gone are the days of simple pass/fail inspection. Scratch beneath the surface and it becomes clear why cameras and vision systems are proliferating across the field, accounting for a growing share of the sensor market. The video-as-a-sensor market was projected at $71.19 billion for 2025 and is expected to reach $144.7 billion by 2035, according to Roots Analysis, a business research and consulting firm. That translates to a CAGR of 7.35% over the forecast period.
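The quoted CAGR can be reproduced from the two market-size figures, assuming 2025 as the base year, 2035 as the end year and annual compounding:

```python
# Compound annual growth rate implied by the Roots Analysis figures:
# $71.19B (2025) growing to $144.7B (2035) over a 10-year horizon.
start, end, years = 71.19, 144.7, 10  # $B, $B, forecast horizon in years

cagr = (end / start) ** (1 / years) - 1  # roughly 7.35%
```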
Analysts say the once-passive recording tools are being transformed into intelligent decision-making assets, thanks to AI and machine learning. Growth is also influenced by lower camera costs and compact designs for a broader range of industries.
Mechanical and integration factors: Design engineers must account for vision sensors’ requirements for lighting conditions, optical quality and stable mounting, all of which impose focus and calibration constraints. When vision is integrated with motion systems, such as robotic arms or conveyors, precise timing and high-speed data exchange become critical to image alignment and synchronization with mechanical movement.
An industrial application of 3D vision: ABB’s AI-enabled Flexley Mover is equipped with 3D Visual SLAM, which employs multi-dimensional imaging for real-time mapping and navigation in AMRs. The platform refines localization accuracy and supports efficient payload handling. It highlights how stereo and structured depth sensing can improve spatial perception in dynamic manufacturing environments.
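The depth recovery that underpins stereo 3D vision rests on one pinhole-camera relation: depth is focal length times baseline divided by disparity. This is the general textbook formula, not ABB’s implementation, and the camera parameters below are invented for illustration.

```python
# Stereo-depth arithmetic: a feature seen by two horizontally offset cameras
# shifts by a disparity (in pixels) inversely proportional to its depth.
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from its pixel disparity between two rectified views."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 12 cm baseline, 42 px disparity.
z = stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=42.0)  # meters
```

Because disparity sits in the denominator, depth resolution degrades with distance, which is one reason dense 3D mapping systems fuse stereo with LiDAR or structured light.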
Marc Segura, president of ABB Robotics, states in a press note that “ABB has perfected robot eyes, through 3D AI vision technology; hands, through advanced force sensing, precision dexterity, and machine learning; and independent mobility, through 3D mapping.”
Segura said that the company’s robots gain a comprehensive, fuller view of the surroundings, which supports safer and more autonomous operations in automotive, manufacturing and logistics environments.
3. Networked Sensors with Diagnostics (IO-Link)
Basic sensors such as proximity, photoelectric and limit switches remain vital but are evolving. IO-Link connectivity expands their functionality by enabling two-way communication, remote configuration and predictive diagnostics.
Baumer, a manufacturer of sensors, encoders, measurement instruments and automated imaging components, pairs sensors with IO-Link technology in a way that allows users to configure built-in features.
Enabling the technology: IO-Link is recognized as a standard connectivity layer for sensors and actuators, and its benefits are palpable. Baumer’s Mauricio Lugo and Steffen Schneider noted that IO-Link delivers defined benefits, including easy commissioning and fast sensor exchange; access to diagnostic data for condition monitoring and optimized processes; precise parameterization of sensor functions; and efficient engineering thanks to PC-based sensor tools.
Design priorities: Engineers should plan network topology and connectors (IO-Link hubs, masters) early and incorporate them into the control architecture from the start. They should also assess how built-in sensor diagnostics can integrate with condition monitoring, support predictive maintenance and reduce overhead.
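The remote-configuration pattern IO-Link enables can be sketched in code. The IO-Link specification identifies device parameters by ISDU (Indexed Service Data Unit) index and subindex; standard indices include 0x0010 for Vendor Name and 0x0012 for Product Name. The master class and its method below are invented stand-ins, not any vendor’s real API, but real IO-Link master libraries expose similar index/subindex reads.

```python
# Hypothetical sketch of IO-Link-style parameter access. MockIoLinkMaster is
# an invented stand-in for one port of an IO-Link master; only the ISDU
# index numbers (0x0010 Vendor Name, 0x0012 Product Name) come from the spec.
class MockIoLinkMaster:
    """Illustrative stand-in for an IO-Link master port."""
    ISDU_STORE = {
        (0x0010, 0): b"ExampleVendor",   # index 0x0010: Vendor Name
        (0x0012, 0): b"ExampleSensor",   # index 0x0012: Product Name
    }

    def isdu_read(self, index: int, subindex: int = 0) -> bytes:
        # A real master would issue an ISDU request over the IO-Link wire.
        return self.ISDU_STORE[(index, subindex)]

master = MockIoLinkMaster()
vendor = master.isdu_read(0x0010).decode()
product = master.isdu_read(0x0012).decode()
```

The same index/subindex mechanism is what allows fast sensor exchange: a replacement device can be re-parameterized by writing the stored values back over the network rather than configuring it by hand.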
4. Smart, AI-Enabled Vision for Real-time Control
LiDAR and advanced ranging technology are gaining traction across industries. Typically associated with automotive, LiDAR has expanded into industrial and robotics domains, particularly in applications where high-precision distance measurement and 3D mapping are required.
Simplifying quality control on production lines: Vision systems designed with inference capabilities can produce actionable insights at the source. This is seen in smart vision systems used for automated control and safety.
For example, the In-Sight SnAPP vision sensor from Cognex is an integrated smart sensor that streamlines quality control on production lines. Compared with conventional laser sensors geared for inspection tasks, the solution is touted for its ability to locate parts regardless of their orientation and to identify more subtle flaws, improving reliability.
Another example is the MiR Pallet Jack, which is equipped with AI-enabled vision for real-time control. In this setup, MiR fuses multi-camera and LiDAR data with onboard AI, allowing the robot to recognize and navigate around obstacles while ensuring worker safety and efficient payload movement.
Design priorities: For accurate ranging, engineers must ensure the mechanical housing design preserves optical line-of-sight. Vibration damping is essential for preventing signal errors caused by mechanical disturbances.
5. Advanced Ranging Sensors (LiDAR & Emerging Ultrasonic-3D)
Manufacturers increasingly shun complexity by turning to advanced ranging sensors to simplify automation. The challenge is to automate without sacrificing performance while developing solutions that fit seamlessly into standard industrial equipment. Expanding beyond traditional use cases into robotics and perception systems, LiDAR and emerging 3D ultrasonic technologies are delivering the reliable performance demanded by industrial environments.
Two cutting-edge examples are the Voyant Photonics Carbon LiDAR with FMCW Sensor on a Chip, and Sonair’s 3D Ultrasonic sensing technology.
The Voyant Photonics Carbon LiDAR with FMCW Sensor on a Chip provides high-resolution object detection and segmentation at up to 200 meters in a compact form factor. This is achieved by embedding optics directly into a photonic integrated circuit (PIC).
Voyant notes that the fingernail-sized photonic integrated circuit (IP67-rated, 250 g) delivers high-resolution 3D point clouds with millimeter precision, real-time velocity data up to 63 m/s and detection ranges to 200 m across a 45-deg. vertical by 90-deg. horizontal field of view. Voyant said the solution lowers costs and improves performance relative to traditional time-of-flight (ToF) LiDAR.
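The ranging principle behind FMCW LiDAR can be shown with the standard linear-chirp relation (this is the textbook formula, not Voyant’s design, and the chirp parameters below are invented): a target at range R mixes the return with the outgoing chirp to produce a beat frequency f_b = 2·R·B/(c·T), where B is the chirp bandwidth and T the sweep time.

```python
# Textbook FMCW ranging: invert the beat-frequency relation to recover range.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(f_beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range implied by a measured beat frequency for a linear FMCW chirp."""
    return f_beat_hz * C * chirp_s / (2.0 * bandwidth_hz)

# Illustrative (invented) parameters: 1 GHz chirp bandwidth, 10 microsecond
# sweep; a ~1.33 MHz beat frequency then corresponds to a target near 2 m.
r = fmcw_range(f_beat_hz=1.334e6, bandwidth_hz=1e9, chirp_s=10e-6)
```

Because the measurement lives in frequency rather than pulse timing, FMCW also yields per-point Doppler velocity directly, which is how chip-scale designs report real-time velocity alongside range.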
Sonair’s 3D ultrasonic sensors, based on a technology known as ADAR (acoustic detection and ranging), offer enhanced spatial awareness for autonomous mobile robot (AMR) safety at a cost savings of 50-80% compared with LiDAR-based systems. ADAR works differently from lasers and cameras: it uses airborne ultrasound waves to give robots safe spatial awareness in 3D. The result is quick, safe obstacle detection, and reported benefits include energy savings and detecting objects that laser and camera technologies miss.
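The arithmetic behind any ultrasonic echo ranging, Sonair’s included, is the general pulse-echo relation (shown here generically, not as Sonair’s implementation): distance equals the speed of sound times the round-trip time, divided by two.

```python
# Basic ultrasonic echo ranging: a pulse travels out and back, so the
# one-way distance is half of (speed of sound x round-trip time).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 deg C

def echo_distance(round_trip_s: float) -> float:
    """One-way distance to the reflector for a measured echo delay."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

d = echo_distance(0.01)  # a 10 ms echo delay puts the obstacle at ~1.7 m
```

The slow propagation speed relative to light is what keeps timing electronics simple and power draw low, at the cost of update rate, which is part of the energy-savings argument for acoustic sensing.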
Design priorities: When it comes to embedding sensors, a growing list of choices is coming online to assist with design decisions. Knowledge of the available options enables engineers to balance requirements for cost, energy efficiency, precision and reliability.
About the Author

Rehana Begg
Editor-in-Chief, Machine Design
As Machine Design’s content lead, Rehana Begg is tasked with elevating the voice of the design and multi-disciplinary engineer in the face of digital transformation and engineering innovation. Begg has more than 24 years of editorial experience and has spent the past decade in the trenches of industrial manufacturing, focusing on new technologies, manufacturing innovation and business. Her B2B career has taken her from corporate boardrooms to plant floors and underground mining stopes, covering everything from automation & IIoT, robotics, mechanical design and additive manufacturing to plant operations, maintenance, reliability and continuous improvement. Begg holds an MBA, a Master of Journalism degree, and a BA (Hons.) in Political Science. She is committed to lifelong learning and feeds her passion for innovation in publishing, transparent science and clear communication by attending relevant conferences and seminars/workshops.
Follow Rehana Begg via the following social media handles:
X: @rehanabegg
LinkedIn: @rehanabegg and @MachineDesign