Modern systems are packed with sensors, but transferring all of the resulting data to the cloud is often impractical. A combination of edge and cloud computing makes it possible to transfer less data, and more relevant data.
To detect impending machine failures at an early stage, sensors continuously monitor production and supply systems. Real-time sensor monitoring is used above all in critical infrastructures such as power plants, where faults must be detected quickly. At ever higher sampling rates, sensors measure parameters such as temperature, pressure, motion or humidity. This frequent sampling generates enormous amounts of data that demand high computing capacity and network bandwidth, neither of which is available everywhere.
A paradigm shift is needed to tackle this problem: instead of indiscriminately transferring all data to the cloud and evaluating it only there, the data can be “filtered” directly at its source, the so-called edge. Inconspicuous data is discarded there; only data that could point to an anomaly and requires closer examination is sent to the cloud. Quality rather than quantity is the guiding principle for reducing the amount of data that has to be transported.
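For illustration, here is a minimal sketch in Python of what such pre-filtering at the edge could look like. The baseline values, the sensitivity factor k and the window size are hypothetical and only serve to show the principle:

```python
import numpy as np

def is_suspicious(window: np.ndarray, baseline_mean: float,
                  baseline_std: float, k: float = 3.0) -> bool:
    """Return True if the window deviates noticeably from normal operation
    and should therefore be forwarded to the cloud for closer examination."""
    max_deviation = np.abs(window - baseline_mean).max()
    return max_deviation > k * baseline_std

# Example: a normal temperature window is discarded, a window with an
# outlier is forwarded. All numbers are made up for this sketch.
rng = np.random.default_rng(0)
normal_window = rng.normal(20.0, 0.5, size=256)
spike_window = normal_window.copy()
spike_window[100] = 27.0

for name, window in [("normal", normal_window), ("spike", spike_window)]:
    forward = is_suspicious(window, baseline_mean=20.0, baseline_std=0.5)
    print(name, "-> send to cloud" if forward else "-> discard at edge")
```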
AI finds its way into real-time sensor monitoring
Model-based decision-making systems that work with artificial intelligence (AI) and machine learning (ML) help to evaluate sensor data efficiently in a wide range of applications. These systems continuously learn from the data, apply the findings directly on site and adapt their behavior. This is a significant step forward: previously, processing was often limited to simple local control loops, which can only react to short-term events and cannot recognize long-term trends or recurring patterns.
For effective monitoring of critical infrastructures, sensor data must be considered in context. Depending on the application, it should therefore not be evaluated in isolation but linked to other relevant events and information. To this end, it can also make sense to store and evaluate the collected data over the long term, which often fails due to insufficient storage and computing capacity.
Gas-fired power plants benefit from a combination of edge and cloud
Gas-fired power plants are one example of the use of sensor monitoring. They can be ramped up and down quickly to compensate for natural fluctuations in power generation from renewable energies. These plants are distributed around the world, and in many locations the fast network connections needed to transfer large amounts of data to the cloud are simply not available.
The solution is a combination of edge and cloud computing: sensor data is analyzed locally at the edge, enabling efficient and secure control of the systems in real time. At the same time, the data from various locations that is consolidated in the cloud can be analyzed over the long term in order to identify trends and develop basic models. These models are stable in the long term and only need to be updated at regular intervals.
The challenge is to strike the right balance between edge and cloud: on the one hand, dependency on the cloud should be reduced; on the other hand, the cloud is the indispensable place where sensor data from geographically diverse locations converges and is analyzed in order to identify long-term operational trends and form basic models. Machine learning applications also use the cloud as a central processing location, which is not always ideal: the result is heavy data traffic, latency problems and high energy consumption.
Data processing at the edge relieves the cloud
A concept called “Federated Dataset Distillation” is suitable for reducing the amount of data to be transmitted. In this approach, parts of the global data analysis are moved directly to the local devices on site, i.e. to the edge. Large volumes of data could then be compressed at the edge before being transferred to the cloud. This would significantly reduce transmission requirements, while the data could still be used for training machine learning models; even complex AI models could be trained in the cloud in a resource-efficient manner.
This approach aims to use existing computing capacities more efficiently and reduce dependency on the cloud. However, research is still needed into how real sensor data can be lossily compressed at the edge using this method before it is transferred to the cloud. If this is successful, data compression – just like pre-filtering – could further reduce transmission requirements.
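What lossy compression at the edge could look like in practice is still an open question. As a purely illustrative sketch, and not the method investigated here, the following Python snippet keeps only the strongest frequency components of a signal and reconstructs an approximation from them; the signal, sampling rate and number of retained components are assumptions:

```python
import numpy as np

def compress(signal: np.ndarray, keep: int = 32):
    """Lossy compression: keep only the `keep` strongest frequency components.
    Only the indices and complex amplitudes would be transmitted to the cloud."""
    spectrum = np.fft.rfft(signal)
    idx = np.argsort(np.abs(spectrum))[-keep:]
    return idx, spectrum[idx]

def decompress(idx: np.ndarray, values: np.ndarray, n: int) -> np.ndarray:
    """Reconstruct an approximation of the original signal in the cloud."""
    spectrum = np.zeros(n // 2 + 1, dtype=complex)
    spectrum[idx] = values
    return np.fft.irfft(spectrum, n=n)

# Hypothetical sensor signal: a 50 Hz oscillation with measurement noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)

idx, values = compress(signal, keep=32)
restored = decompress(idx, values, n=signal.size)
print("compression ratio:", signal.size / (2 * len(values)))
print("mean reconstruction error:", float(np.abs(signal - restored).mean()))
```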
Today, the training of prediction models is typically carried out centrally in the cloud. In future, however, AI models could also be trained directly at the edge by pre-processing the sensor data there. Federated dataset distillation could also be used to compress large volumes of historical data so that it can be transferred to the cloud for centralized training with minimal loss.
The SensE project: Sensor on the Edge
The “SensE” cooperation project between IFTA Ingenieurbüro für Thermoakustik GmbH and the Technical University of Munich shows how edge and cloud computing can be combined in practice: with a new generation of powerful embedded systems that require little space and energy.
For the project, which was funded by the Bavarian Research Foundation (BFS), the developers analyzed sensor data in a gas-fired power plant with two turbines. By continuously monitoring the thermoacoustic vibrations, the system was intended to register deviations from the optimal operating range at an early stage. To make edge operation as efficient as possible, the researchers examined various computer architectures and analyzed different processor models and machine learning algorithms.
The demonstrator registers faults days in advance
A first prototype, the demonstrator, initially focused on data processing at the edge. In parallel, the developers integrated key functions such as data compression, filtering and modeling. A performance model helped them better understand the resource requirements of the edge systems and make further optimizations on that basis.
The next step was to expand the demonstrator to enable collaboration between edge and cloud systems. The developers built a test system in the laboratory and integrated data models to optimize data throughput between the edge and cloud.
In the final phase of the project, additional hardware was installed directly in the power plant. This enabled the demonstrator to calculate anomaly parameters depending on the condition of the plant and to predict potential problems. The demonstrator can thus detect potential damage several days before it occurs.
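The project's actual anomaly parameters are not detailed here. As a purely hypothetical illustration, such a parameter could relate the vibration energy in a frequency band to a reference value learned for the current operating condition. All frequencies, reference values and condition names in the following Python sketch are assumptions:

```python
import numpy as np

def band_energy(signal: np.ndarray, fs: float, f_lo: float, f_hi: float) -> float:
    """Energy of the vibration signal within a frequency band of interest."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return float(power[mask].sum())

def anomaly_parameter(signal: np.ndarray, fs: float, condition: str,
                      reference: dict) -> float:
    """Ratio of the current band energy to the reference for the current
    operating condition; values far above 1 hint at a developing fault."""
    energy = band_energy(signal, fs, f_lo=80.0, f_hi=300.0)  # assumed band
    return energy / reference[condition]

# Hypothetical references per operating condition, learned from healthy data.
reference = {"part_load": 1.8e4, "full_load": 3.2e4}

fs = 10_000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
vibration = np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.default_rng(2).normal(size=t.size)
print("anomaly parameter:", round(anomaly_parameter(vibration, fs, "full_load", reference), 2))
```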
“The best of both worlds”: the key is the smart combination of resources
The hybrid approach combines the advantages of both processing locations. What is needed are systems that enable scalable, real-time training of AI models to detect impending failures. To achieve this in a resource-efficient manner, the necessary computing operations must be divided flexibly and efficiently between the edge and the cloud. Sensor data processing is thus evolving from a passive, local edge solution into a dynamic, collaborative architecture. Particularly in data-intensive industrial applications such as gas-fired power plants, edge and cloud systems will become more important as the energy transition progresses, and they will make the decisive difference for trouble-free operation.
What is Real-Time Sensor Monitoring?
Monitor sensor data in real time: Real-time sensor monitoring enables companies to continuously monitor critical parameters of machines and building technology. Networked sensors permanently record measured values such as temperature, vibration or pressure and transmit them for immediate analysis.
The intelligent monitoring systems automatically detect deviations from the normal state and issue an alarm if defined threshold values are exceeded. This enables predictive maintenance: Instead of rigid maintenance intervals, companies can use the continuous analysis of sensor data to detect wear at an early stage and plan their maintenance as required.
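A minimal sketch of such a threshold check in Python; the sensor names and limits are hypothetical examples, real plants define them per sensor and operating state:

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    sensor: str
    upper: float  # alarm if the reading exceeds this value

# Hypothetical limits for illustration only.
LIMITS = [
    Threshold("bearing_temperature_C", 85.0),
    Threshold("vibration_mm_s", 7.1),
]

def check(readings: dict) -> list:
    """Return an alarm message for every reading above its threshold."""
    alarms = []
    for limit in LIMITS:
        value = readings.get(limit.sensor)
        if value is not None and value > limit.upper:
            alarms.append(f"ALARM: {limit.sensor} = {value} exceeds {limit.upper}")
    return alarms

print(check({"bearing_temperature_C": 91.2, "vibration_mm_s": 3.4}))
# -> ['ALARM: bearing_temperature_C = 91.2 exceeds 85.0']
```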
What is Federated Dataset Distillation?
Compress data sets intelligently: In federated dataset distillation, large distributed datasets are intelligently compressed for AI training. Instead of complete model parameters, only a condensed “distillate dataset” is exchanged between the systems.
In this process, the participants first train local AI models and distill a small synthetic dataset with the most important properties. This enables centralized training with significantly reduced resource requirements. An additional advantage: sensitive original data remains stored locally.
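The following Python sketch illustrates only the data flow of this idea. True dataset distillation learns synthetic samples by optimization; here, per-class centroids stand in for the distilled set, so that only a tiny synthetic dataset leaves each edge site. All shapes, labels and sizes are assumptions:

```python
import numpy as np

def distill_locally(features: np.ndarray, labels: np.ndarray,
                    per_class: int = 5, seed: int = 0):
    """Crude stand-in for dataset distillation: represent each class by a few
    k-means centroids. Real methods optimize synthetic samples, but the data
    flow is the same: only this tiny dataset leaves the edge device."""
    rng = np.random.default_rng(seed)
    synth_x, synth_y = [], []
    for cls in np.unique(labels):
        pts = features[labels == cls]
        centers = pts[rng.choice(len(pts), per_class, replace=False)]
        for _ in range(20):  # a few naive k-means iterations
            dist = np.linalg.norm(pts[:, None] - centers[None], axis=2)
            assign = dist.argmin(axis=1)
            for k in range(per_class):
                members = pts[assign == k]
                if len(members):
                    centers[k] = members.mean(axis=0)
        synth_x.append(centers)
        synth_y.append(np.full(per_class, cls))
    return np.vstack(synth_x), np.concatenate(synth_y)

# Each edge site distills its local data; only the distillates go to the cloud.
rng = np.random.default_rng(1)
site_features = rng.normal(size=(10_000, 8))
site_labels = (site_features[:, 0] > 0).astype(int)  # hypothetical fault label
dist_x, dist_y = distill_locally(site_features, site_labels, per_class=5)
print("original:", site_features.shape, "-> distilled:", dist_x.shape)
```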