
Machine Learning Picks Up on Factory Sounds for Speedy Diagnoses

Using insight from technicians, this startup trains deep-learning neural networks to diagnose anything from valve congestion to pump failure, just by "listening".

Airborne ultrasonic sensing has been used as a tool for machine diagnostics and predictive maintenance for more than two decades. Using handheld instruments that measure small changes in airborne sound at frequencies as low as 40 kHz, technicians can detect changes in machine behavior caused by damaged moving parts, congestion, and other issues. Acoustic sensing differs from temperature and vibration sensing in that it measures sound waves traveling through the air, rather than relying on data from numerous contact sensors at specific diagnostic sites.
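As a rough illustration of the idea (not the instrument makers' actual method), the energy at a single ultrasonic frequency such as 40 kHz can be measured with the Goertzel algorithm, and a sharp rise against a healthy baseline treated as a warning sign. The sample rate, clip length, and amplitudes below are invented:

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    # Goertzel algorithm: power at one frequency bin, without a full FFT
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2

# Synthetic 10-ms clips sampled at 192 kHz: a quiet "healthy" tone at 40 kHz
# and a much louder one standing in for a damaged part
RATE, N = 192_000, 1_920
healthy = [0.1 * math.sin(2 * math.pi * 40_000 * i / RATE) for i in range(N)]
worn    = [1.0 * math.sin(2 * math.pi * 40_000 * i / RATE) for i in range(N)]

baseline = goertzel_power(healthy, RATE, 40_000)
reading  = goertzel_power(worn, RATE, 40_000)
# A reading far above baseline in the ultrasonic band suggests a developing fault
```

A single-bin measurement like this is far cheaper than a full spectrum, which is one reason handheld ultrasonic detectors can run on simple hardware.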

But even without instruments, experienced technicians can often sense and identify changes in machine behavior just by listening. This skill, which is learned over many years on the job, inspired a team of engineers and computer scientists to develop deep-learning neural networks (DLNN) that can be trained to do the same thing at ultrasonic frequencies. With backgrounds in machine learning for machine vision, the team started a company called 3DSignals that uses deep learning on the Cloud to enable sound-based diagnostics for various components in a factory.

Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) analyze audio data to classify machine failures, anomalies, and machine deterioration.

Co-founder and CEO Amnon Shenfeld explains, “When we ask a technician, ‘What is your intuition in understanding if a machine is working well or not?’ the technician most commonly says, ‘I can hear it.’ With this experience, we came up with an idea to send interesting sound samples to ‘sound experts’ who can give us immediate feedback on what a sound means. Using this feedback, we can very quickly teach our learning algorithms to perform the same sound-based diagnostics.”
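The expert-feedback loop Shenfeld describes can be sketched in miniature. The snippet below is an invented stand-in, not the company's method: it uses a crude amplitude feature and 1-nearest-neighbor matching rather than deep networks, and all clips and labels are synthetic:

```python
def features(samples, n_slices=4):
    # Crude stand-in feature vector: mean |amplitude| per time slice.
    # (A production system would use spectral features and a trained model.)
    size = len(samples) // n_slices
    return [sum(abs(x) for x in samples[i * size:(i + 1) * size]) / size
            for i in range(n_slices)]

labeled = []  # (feature_vector, expert_label) pairs built from expert feedback

def expert_feedback(samples, label):
    # An "interesting" clip the sound expert has labeled becomes training data
    labeled.append((features(samples), label))

def classify(samples):
    # 1-nearest-neighbor match against the expert-labeled clips
    f = features(samples)
    return min(labeled,
               key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], f)))[1]

# The expert labels one steady clip and one clip with a loud second half
expert_feedback([0.1] * 400, "healthy")
expert_feedback([0.1] * 200 + [0.9] * 200, "valve congestion")
```

Each round of expert feedback enlarges the labeled set, so the classifier's coverage grows the same way the article describes the learning algorithms being taught.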

3DSignals uses ultrasonic microphones placed near machines to record sound continually (shown above). The audio data is sent to a Cloud-connected unit for processing by a DLNN, which classifies sounds as healthy operation or as one of the diagnoses defined by the team’s algorithm specialists. Once trained to classify these sounds, the DLNN can identify issues independently within minutes, so technicians on the floor can attend to problems as quickly as possible.
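The monitor-and-alert step can likewise be sketched with a simple loudness threshold; the chunking, baseline, and factor below are invented stand-ins for the trained classifier:

```python
import math

def rms(chunk):
    # Root-mean-square loudness of one audio chunk
    return math.sqrt(sum(x * x for x in chunk) / len(chunk))

def monitor(chunks, baseline, factor=3.0):
    # Flag chunks whose loudness departs sharply from the healthy baseline,
    # so technicians can be alerted within minutes rather than at the next round
    alerts = []
    for i, chunk in enumerate(chunks):
        if rms(chunk) > factor * baseline:
            alerts.append((i, "anomaly"))
    return alerts

# Synthetic stream: quiet chunks with one loud chunk at index 2
stream = [[0.1] * 100, [0.1] * 100, [1.0] * 100, [0.1] * 100]
alerts = monitor(stream, baseline=0.1)
```

A real deployment would replace the threshold with the trained network's per-diagnosis labels, but the shape of the loop, continuous chunks in, timely alerts out, is the same.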

So far, the company has extended its service mainly to customers in steel factories and power-generation plants. It works with companies to create algorithms that classify noises specific to the components in a machine. These may include motors, turbines (gas and hydropower), generators, bearings, compressors, pumps, and valves, among others.

The system also opens new business models for OEMs. Using 3DSignals, they can add Industry 4.0 technologies to their products, and provide proactive maintenance services in response to continuous monitoring. 3DSignals recently partnered with Samson, a German valve company, to develop ultrasonic sensors and listening software that integrates into their valves.

“Now the valves suddenly have ears,” says Shenfeld. “Not only do the smart valves sense changes in flow through them, but they also sense the operations of other components around them.”

The system can be connected with most commercial Cloud services, including Microsoft, Google, and Amazon. Diagnostics and continuous sound profiles can be read on the company's proprietary app.
