Edge compute solves the industrial low-res data problem

By Cameron Archer, April 02, 2020

Throughout our data problem series, we’ve covered several of the data challenges industrial companies face when pursuing digital projects. Some struggle with low-quality data, while others are limited by data silos and legacy systems.

Many of the specific issues we’ve covered are symptoms of the Digital Age. We have more data than ever before at our fingertips, and we are still figuring out what to do with all of it. Over the last several years, our ability to capture, transfer, store, and analyze data has improved significantly. Yet we are still creating more data than we can actively process, and technological advancements like edge compute promise to hasten our digital transformation.

Enter the low-resolution data problem.

Low-resolution (or “low-res”) data is a unique but prevalent challenge: operations data arrives on time, but not at the rate needed to fully evaluate its meaning. Samples may come in reliably at consistent intervals, yet still miss anomalies or insights, and those misses can have massive negative repercussions. Without the right level of sensor output granularity, operators cannot manage their remote assets as well as they otherwise could.

The good news is that the low-res data problem is entirely correctable, and there are many ways to solve it. However, not all of them are feasible or sustainable for every organization, so companies undergoing digital transformation need to think carefully about how best to get the data they need, at the granularity they require, to get the job done.

The problem of low-resolution data is generally created by a tradeoff between communications cost or bandwidth, data storage capacity, and business need. Companies with the financial means can increase data throughput or pay higher comms costs to get raw data back to an analytics engine. Smaller organizations with limited capital budgets need to get more creative and figure out ways to push processing closer to the edge.

Thanks to advances in sensor technology, smart devices can now collect and process raw data before it’s sent into data pipelines. By taking advantage of edge computing, manufacturers can drastically reduce data management expenses and put those savings toward improving data resolution.

For those in the design or build stage, mitigating low-res data problems upfront can yield significant value down the road. Below, we dig into what causes low-res data challenges in industrial IoT deployments.

Low-resolution data is a sign of antiquated technology. Photo by Sven Scheuermeier.

What causes low-resolution data?

There are three primary causes of low-res data.

First, industrial companies often compensate for bandwidth limitations by decreasing the frequency at which they receive data. In other words, they address congestion problems by reducing resolution: when data pathways get clogged, the quick and easy solution is to reduce how much data enters the pipeline. This “quick fix” assumes that any data is valuable, even if it comes at a lower frequency, but the compromise can have an unseen effect on many business processes.
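
To see the scale of that tradeoff, a quick back-of-the-envelope calculation helps. The 64-byte payload below is an illustrative assumption, not a figure from any particular device:

```python
# Back-of-the-envelope: daily payload per sensor at different sample
# intervals, assuming a hypothetical 64 bytes per transmitted reading.
PAYLOAD_BYTES = 64
SECONDS_PER_DAY = 24 * 60 * 60

for interval_s in (1, 60, 900):             # 1 second, 1 minute, 15 minutes
    samples = SECONDS_PER_DAY // interval_s
    kb_per_day = samples * PAYLOAD_BYTES / 1024
    print(f"{interval_s:>4} s interval: {samples:>6} samples/day, "
          f"{kb_per_day:>8.1f} KB/day")
```

Stretching the interval from one second to fifteen minutes cuts the daily payload by a factor of 900, but it cuts visibility into the process by exactly the same factor.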

Second, sensor transmitters may not be configured appropriately for the specific application being measured. This can stem from bandwidth or cost constraints, or from simple oversight: the technician configuring the transmitter may not consider all of the impacted business processes. For example, an industrial sensor programmed to send samples every 15 minutes can miss data anomalies that last only 5 minutes, as the sketch below illustrates.
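
Here is a minimal sketch of that failure mode, using made-up pressure values and a made-up alarm threshold:

```python
import numpy as np

# One day of a simulated pressure signal (one true reading per minute,
# nominally ~100 psi) with a 5-minute transient spike added at ~10:07.
rng = np.random.default_rng(0)
signal = 100 + rng.normal(0, 0.5, 24 * 60)
signal[607:612] += 40                        # the 5-minute anomaly

THRESHOLD = 110                              # hypothetical alarm threshold
for interval_min in (1, 15):                 # 1-minute vs 15-minute sampling
    sampled = signal[::interval_min]
    detected = bool((sampled > THRESHOLD).any())
    print(f"{interval_min:>2}-minute sampling: anomaly detected = {detected}")
```

Both configurations report faithfully at their own cadence; only the higher-resolution one ever sees the event.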

Third, some industrial processes, like predictive maintenance, require incredibly high-resolution monitoring to be effective. In environments with rotating machinery, for example, companies may use accelerometers to track vibration on critical equipment. If transmitters lack sophistication, they may not sample fast enough to alert operators to the anomalies that indicate something is wrong.
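
How fast is fast enough? A useful rule of thumb is the Nyquist criterion: to observe a vibration component at frequency f, you must sample at more than 2f. A quick sketch with hypothetical bearing numbers:

```python
# Hypothetical example: the sample rate needed to see a bearing defect
# signature on a machine spinning at 1,800 RPM. The defect order (the
# defect frequency as a multiple of shaft speed) is an assumed value.
SHAFT_RPM = 1800
DEFECT_ORDER = 7.3

shaft_hz = SHAFT_RPM / 60            # 30 Hz shaft rotation
defect_hz = shaft_hz * DEFECT_ORDER  # ~219 Hz defect signature
min_rate_hz = 2 * defect_hz          # Nyquist minimum; real systems sample well above it

print(f"defect signature: {defect_hz:.0f} Hz")
print(f"minimum sample rate: > {min_rate_hz:.0f} Hz")
```

Against that requirement, a transmitter that reports every 15 minutes (about 0.001 Hz) is roughly five orders of magnitude too slow, which is why vibration monitoring is the canonical case for edge processing.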

Regardless of the cause, low-res data can significantly impact downstream business processes. The most immediate impact is on anomaly detection designed to preserve machine health, but it also shows up in larger-scale analytics, where higher resolution allows more thorough analysis. The difference in sampling frequency can be the difference between an analytical discovery that leads to better business results and an inefficient status quo.

Who is affected by low-resolution data?

Low-res data has an especially significant impact on the day-to-day responsibilities of field workers. These individuals are responsible for addressing operational errors that arise within manufacturing processes or equipment. In essence, they are the first line of defense against anomalies which put machines and processes at risk. If networks suffer from low-res data problems, field workers may not have the intelligence they need to mitigate issues before they occur or respond when they do.

Data aggregators and analysts are also affected by low-resolution data. If they don’t have the appropriate degree of granularity for every remote asset, they can’t synthesize data in a way that truly reflects reality. Consequently, leaders are unable to make informed decisions about their asset base and business processes.

Field workers need to be able to immediately detect fast-acting anomalies that put equipment at risk. Photo by Andrea Ferrario.

What do we need to consider?

The low-res data problem draws attention to higher-level network design. It forces network engineers and architects to think more broadly about how data moves through their industrial monitoring systems.

To maximize value and minimize costs, they have to optimize how data is stored, transferred, and processed at all times. If too much raw data flows back to corporate systems, organizations may incur unnecessarily high communications and networking expenses. In response, leaders may ask field workers to reconfigure devices to transmit fewer samples.

Rather than increasing communications costs, however, industrial companies should strongly consider upgrading edge compute power to push analytics closer to the data source. Not only does this relieve networks of the burden of carrying large payloads of raw data, but it also significantly reduces processing time and lets industrial workers reach conclusions much more quickly. To use our earlier example of rotating machinery: more sophisticated vibration sensors now perform Fast Fourier Transform (FFT) analysis on accelerometer data, condensing thousands of raw vibration samples into a handful of frequency values. This drastically reduces the amount of data sent to the cloud while still conveying the necessary information. And as edge compute power continues to grow, thanks to Moore’s Law and a global economy ever more dependent on silicon, industrial sensors and transmitters can perform increasingly complex analytics before data is ever transferred over a network.
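
To make that concrete, here is a minimal sketch of the edge-side reduction in NumPy. It is an illustration under simple assumptions (a 1 kHz accelerometer and two clean sinusoidal components), not any vendor’s actual firmware:

```python
import numpy as np

# Simulate a one-second burst of raw accelerometer data at an assumed
# 1 kHz sample rate: a 30 Hz shaft component plus a weaker 219 Hz
# defect signature (both frequencies are hypothetical).
SAMPLE_RATE_HZ = 1000
t = np.arange(0, 1.0, 1 / SAMPLE_RATE_HZ)
raw = np.sin(2 * np.pi * 30 * t) + 0.4 * np.sin(2 * np.pi * 219 * t)

# Edge-side reduction: FFT the burst, then keep only the strongest peaks.
spectrum = np.abs(np.fft.rfft(raw))
freqs = np.fft.rfftfreq(raw.size, d=1 / SAMPLE_RATE_HZ)
top = sorted(np.argsort(spectrum)[-2:])   # two dominant frequency bins

summary = [(float(freqs[i]), round(float(spectrum[i]), 1)) for i in top]
print(f"raw samples collected: {raw.size}")
print(f"values transmitted: {len(summary) * 2}")
print("dominant peaks (Hz, magnitude):", summary)
```

A thousand raw samples collapse into two (frequency, magnitude) pairs, a payload small enough to send frequently even over a constrained network.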

When they handle the issue properly, businesses that suffer from low-res data can have the best of both worlds: high-powered processing and analytics, with low capital and operating expenses for communications.

Rotating machinery needs high-resolution data to catch anomalies. Photo by Crystal Kwok.

What’s at stake for your business?

Your business must identify and address low-res data issues quickly, as they can cause major problems for remote workers and processes.

Use the guiding questions below to evaluate your existing capabilities:

  • Which monitored assets could seriously damage your business if anomalies go undetected?
  • Are the associated monitoring devices configured appropriately for the use cases?
  • How much will it cost to increase network throughput to solve the problem?
  • Which cloud analytics processes place the biggest data burden on your networks?
  • Which legacy OT devices could be upgraded to modern edge compute to push processing to the edge?

At WellAware, we help industrial companies overcome the low-res data challenge in several ways so that they don’t miss critical insights related to their remote assets.

We have a portfolio of devices capable of powerful edge compute. We also help system architects and operations managers think through how to update data management techniques to preserve bandwidth across their networks.

Ready to talk to an expert? Contact us today.

And stay tuned for our next post! We’re addressing the tricky problem of Misidentified Data.
