IoT: Real-Time Data Analysis for More Responsive Manufacturing
Manufacturers around the world sit on very large volumes of heavily underused data. Solution providers are increasingly investing in data management and enrichment to add value to the business.
Industrial equipment produces a very large volume of valuable data about its operating status: temperature, pressure, uptime, number of incidents, and so on.
Real-time analysis of these large volumes of data helps improve equipment availability, streamline production, and thus increase yield. As a final step, predictive maintenance can be put in place to optimize interventions and anticipate outages before they occur. Production then becomes more reactive, moving the agile company toward the Industry of the Future.
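As a rough illustration of the predictive-maintenance idea, the sketch below trains an unsupervised anomaly detector on historical telemetry and flags new readings that drift away from normal behavior. The field names, values, and the choice of scikit-learn's IsolationForest are assumptions for illustration, not a prescribed stack.

```python
# Minimal predictive-maintenance sketch: flag machines whose telemetry
# drifts away from normal behavior, using an unsupervised anomaly detector.
# Field names and values (temperature, pressure, vibration) are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Stand-in for historical readings from a healthy machine:
# columns = [temperature (deg C), pressure (bar), vibration (mm/s)]
normal = rng.normal(loc=[70.0, 5.0, 2.0], scale=[2.0, 0.2, 0.3], size=(1000, 3))

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)

# New readings coming off the line; the second one is clearly abnormal.
latest = np.array([
    [70.5, 5.1, 2.1],
    [88.0, 6.4, 5.2],  # overheating plus high vibration
])
for reading, label in zip(latest, model.predict(latest)):
    status = "OK" if label == 1 else "ANOMALY: schedule inspection"
    print(reading, "->", status)
```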
Connect at all costs and convey the information
On older equipment still in place, this data does not always come from connected sensors. Moreover, these machines have a lifespan of 20 to 30 years! Yet despite this slow renewal rate, solutions exist to capture the information by other means.
For example, Intel’s “Connect the Unconnected” approach replaces a manometer with an electronic equivalent: a video camera records the dial and “digitizes the pressure gauge” using basic image-recognition software.
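A minimal sketch of this principle, assuming OpenCV is available and the gauge’s geometry (dial center, angle sweep, value range) has been calibrated by hand; a production system would be considerably more robust:

```python
# Sketch of "digitizing" an analog pressure gauge from a camera frame.
# The gauge geometry (center, angle range, pressure range) is assumed;
# a real deployment would calibrate these per gauge.
import math
import cv2
import numpy as np

def read_gauge(frame, center, min_angle=-45.0, max_angle=225.0,
               min_value=0.0, max_value=10.0):
    """Estimate the gauge value (bar) from the needle's angle (degrees)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return None
    cx, cy = center
    # The needle is the detected line segment closest to the dial center.
    def dist_to_center(line):
        x1, y1, x2, y2 = line[0]
        mx, my = (x1 + x2) / 2, (y1 + y2) / 2
        return (mx - cx) ** 2 + (my - cy) ** 2
    x1, y1, x2, y2 = min(lines, key=dist_to_center)[0]
    # Use the endpoint farthest from the center as the needle tip.
    tip = max([(x1, y1), (x2, y2)],
              key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    angle = math.degrees(math.atan2(cy - tip[1], tip[0] - cx))
    # Linearly map the needle angle onto the gauge's value range
    # (the angle convention must be calibrated against the real dial).
    frac = (max_angle - angle) % 360 / (max_angle - min_angle)
    return min_value + frac * (max_value - min_value)

frame = cv2.imread("gauge.jpg")                # one frame from the camera
print(read_gauge(frame, center=(320, 240)))    # e.g. 4.7 (bar)
```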
The first stage, data generation, can thus be achieved either through the sensors built into modern equipment or through such substitution approaches.
Intel itself does not manufacture sensors. Nevertheless, the goal is to take the information generated by the equipment (or by its automated monitoring) and make sense of this valuable data to improve production.
Once produced, this data must be conveyed from the machine or production line to the equipment that collects it for processing and analysis. Simple installations already allow cameras to count products or check their quality as they leave the line, for example through pattern recognition. A wireless camera connected over 5G or gigabit Wi-Fi is enough to transfer the data.
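As one possible way to convey such readings, the sketch below publishes per-station counts over MQTT, a lightweight messaging protocol widely used in IoT deployments; the broker address, topic layout, and payload fields are placeholders.

```python
# Sketch: publish per-station counts from a line-side camera over MQTT.
# Broker address, topic, and payload fields are hypothetical.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()   # paho-mqtt 1.x style; v2 also takes a CallbackAPIVersion
client.connect("broker.factory.local", 1883)   # hypothetical plant broker
client.loop_start()                            # handle network I/O in background

def publish_count(station_id: str, count: int) -> None:
    payload = json.dumps({
        "station": station_id,
        "count": count,
        "ts": time.time(),
    })
    # QoS 1: the broker acknowledges receipt at least once.
    client.publish(f"plant/line1/{station_id}/count", payload, qos=1)

publish_count("cam-07", 42)
```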
The Cloud: easily accessible and scalable
Once collected, this data is sent to a solution that provides the necessary processing: real-time or deferred analysis, cleansing, correlation with other internal or external data, and so on. Several approaches are possible for processing this information.
The Cloud can now absorb very large amounts of information. The elasticity of its infrastructure makes it possible to meet hardware needs simply, without requiring specialist intervention.
Intel deploys its high-performance processors with Cloud providers. To satisfy compliance rules or company policy, it is also possible to work on private Clouds (in a company datacenter, for example) or in hybrid environments. Whatever the infrastructure, all this data remains available for further work by the company’s data analysts.
Near the sensors
Whatever the chosen Cloud environment (public, private, or hybrid), the latency between data generation and its processing in a remote Cloud can cause responsiveness problems.
To minimize these delays, computing power must be placed near the sensors or equipment concerned. This is called “Edge Computing”: installing servers close to the sensors and the data they generate, in order to process smaller volumes of information more quickly.
This local processing can filter the data and perform very fast real-time analyses. All or part of the information can then be sent to the Cloud or to the company’s datacenters for deferred processing that requires correlating multiple data sources over very large volumes.
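A minimal sketch of such an edge filter, with illustrative thresholds and a stand-in for the real cloud uplink: it reacts to anomalies locally, without a round trip to the Cloud, and forwards only compact summaries upstream.

```python
# Sketch of an edge-side filter: keep a rolling view of recent readings,
# react locally to anomalies, and forward only periodic summaries upstream.
# Thresholds and the forward_to_cloud() hook are illustrative placeholders.
from collections import deque
from statistics import mean

WINDOW = 60          # last 60 readings (~1 min at 1 Hz)
LIMIT = 90.0         # local alert threshold (deg C), calibrated per machine

window = deque(maxlen=WINDOW)

def forward_to_cloud(summary: dict) -> None:
    print("-> cloud:", summary)   # stand-in for the real uplink

def on_reading(temperature: float) -> None:
    window.append(temperature)
    # React locally and immediately, without waiting for the Cloud.
    if temperature > LIMIT:
        print("local alert: overheating", temperature)
    # Forward a compact summary once per full window instead of raw data.
    if len(window) == WINDOW:
        forward_to_cloud({"n": WINDOW, "mean": mean(window),
                          "max": max(window)})
        window.clear()

for t in [71.2, 70.8, 93.5, 71.0] * 15:
    on_reading(t)
```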
Directly in the sensor
A third level of processing, for even more latency-critical cases, can take place directly in the sensor. A camera on the production line could simply forward its images to an Edge server, but cameras now also embed processors that immediately perform the appropriate processing: counting, shape detection, image-pattern analysis. All of this helps meet quality requirements by identifying rejects or incidents as early as possible in the production line.
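As an illustration of this kind of in-sensor processing, the following sketch counts product-sized blobs in a frame with OpenCV contours; the area bounds and image file name are assumptions that depend entirely on the camera setup.

```python
# Sketch of in-sensor counting: threshold a frame and count product-sized
# blobs with OpenCV contours. Size bounds depend on the camera setup.
import cv2

MIN_AREA, MAX_AREA = 500, 50_000   # pixel-area bounds for one product

def count_products(frame) -> int:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Otsu picks the threshold automatically from the image histogram.
    _, binary = cv2.threshold(blur, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return sum(1 for c in contours
               if MIN_AREA <= cv2.contourArea(c) <= MAX_AREA)

frame = cv2.imread("line_snapshot.jpg")
print("products in frame:", count_products(frame))
```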
Moreover, by correlating this data with information from sensors in other parts of the line, it becomes possible to pinpoint the source of an incident or quality problem as early as possible.
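One simple way to express such a correlation, assuming timestamped event tables and using a pandas as-of join; the station names, readings, and the 3-second tolerance are illustrative:

```python
# Sketch: correlate quality rejects seen at the end of the line with the
# nearest preceding upstream sensor event, to localize the likely source.
import pandas as pd

rejects = pd.DataFrame({
    "ts": pd.to_datetime(["09:00:05", "09:02:10", "09:02:11"]),
}).assign(event="reject")

upstream = pd.DataFrame({
    "ts": pd.to_datetime(["09:00:04", "09:02:09"]),
    "station": ["press-2", "oven-1"],
    "reading": [6.3, 241.0],   # pressure spike, oven over-temperature
})

# For each reject, find the latest upstream event within 3 seconds.
matched = pd.merge_asof(rejects.sort_values("ts"),
                        upstream.sort_values("ts"),
                        on="ts", direction="backward",
                        tolerance=pd.Timedelta("3s"))
print(matched)
```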
A powerful database engine
Orchestrating all these technological evolutions requires a dynamic software vendor with strong knowledge of business processes, one that provides the engine and all the tools for data analysis.
The data engine must be able to handle very large volumes of data directly in memory, with optimal performance. SAP is the reference in this market, offering for example the SAP Cloud Platform, the SAP Leonardo digital transformation engine, analysis tools, image-processing algorithms, Machine Learning, and more.
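To illustrate the in-memory principle only (this is not SAP’s actual API), the sketch below loads readings into an in-memory SQL engine and aggregates them without touching disk; table and machine names are invented for the example.

```python
# Illustration of in-memory processing: the whole database lives in RAM,
# so aggregations over large volumes avoid disk I/O entirely.
import random
import sqlite3

db = sqlite3.connect(":memory:")        # entire database held in memory
db.execute("CREATE TABLE readings (machine TEXT, temp REAL)")
db.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(f"m{i % 10}", random.gauss(70, 3)) for i in range(100_000)],
)

# Aggregate per machine, straight from memory.
for machine, avg_t, max_t in db.execute(
    "SELECT machine, AVG(temp), MAX(temp) FROM readings "
    "GROUP BY machine ORDER BY machine"
):
    print(machine, round(avg_t, 1), round(max_t, 1))
```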