Real-Time Monitoring of Leak Detection Tests Using Azure IoT Suite in the Auto Manufacturing Industry
The customer, an auto component manufacturing company, performs a pressure decay (differential pressure) test to measure the leak rate of its auto components as part of the organization's QA process. Today, the test values are read manually from the display panel or data acquisition system (DAS) and recorded on a printed sheet to decide whether a component is accepted or rejected. Results for failed components are shared with the Product Engineering team only the next day for detailed analysis, so the manual data collection and analysis delay the investigation of failures. The quality status of produced units is collated in an Excel document and delivered to management on a T+1 basis.
Azure Services Used
- Microsoft Azure IOT Hub
- Azure Service Bus
- Azure Service Fabric
- Azure SQL Database
- Azure Stream Analytics
- Azure Notification Hubs
IoT Hardware at the Plant (BeagleBone Black)
- AM335x 1GHz ARM Cortex-A8 processor
- SGX530 graphics accelerator
- NEON floating-point accelerator
- 2x PRU 32-bit 200MHz microcontrollers
- WiLink 8 WL1835MOD 802.11b/g/n 2.4GHz WiFi, Bluetooth and Bluetooth Smart Module
- USB client: power, debug and device
- USB host
- Micro HDMI output
- 512MB DDR3 800MHz RAM
- 4GB Embedded eMMC Flash with Debian Distribution
- MicroSD Card Slot
Data Acquisition at the Plant
- Leak test device (LTD) data was acquired in real time, from the start of the component quality check until verification completed and results were obtained.
- The LTD has a serial (DB9) port that communicates externally over the RS232/RS485 protocol.
- We deployed a BeagleBone Black embedded device to acquire data from the LTD for further processing. The acquired data follows two paths:
- Local storage
- Push to Cloud
- Local Storage – To handle scenarios where wireless communication stalled at either the device or the receiving end, we implemented a local storage mechanism that saves test results in JSON format.
- On Demand – After storing locally, we used the MQTT protocol to push the JSON files to Azure IoT Hub for further processing and to consolidate the production status visualization in real time.
- Sync Service – JSON files in local storage were monitored and pushed to Azure IoT Hub; once a file was pushed to the cloud, it was cleared from local storage.
- We first analysed the device data and the application's current dataflow model.
- The data included:
- Signals, collected dynamically, which contain the test values of the components; values from -7 to +11 psi indicate acceptance.
- Metadata, which contains the component ID of the test and the BeagleBone device ID. These are critical for quality result capture and reporting (similar to an engine/chassis number); the component ID is acquired via OCR (image to text on the fly).
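Put together, a test-result message of this shape might look like the following (field names and values are illustrative, not the production schema):

```json
{
  "deviceId": "bbb-plant1-07",
  "componentId": "CMP-014532",
  "testType": "pressure-decay",
  "psi": 3.2,
  "timestamp": "2016-08-12T09:41:05Z",
  "result": "accepted"
}
```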
- We deployed and configured local storage for cases of stalled communication, and Azure IoT Hub to handle the data ingestion.
- We wrote a Node.js service on the BeagleBone device in the plant to preserve messages, storing those that have not yet been sent to the cloud. This acts as a local queue.
- We defined endpoints in IoT Hub as Azure Service Bus queues.
- The next step was real-time monitoring for the Production Engineering team at the plant level. We built a SignalR-based website, hosted in Azure using App Services, that reads incoming messages from the Service Bus queue and presents them in real time without performance degradation.
Processing and Analytics
- We deployed Azure Stream Analytics to monitor the incoming data against the threshold and drive the workflow capabilities defined in the system. If a PSI value falls outside the acceptance range of -7 to +11 psi, a notification message is pushed to the Azure Service Bus queue, which triggers a notification to the PED team about the test failure.
- It also stores the data in Azure Table storage for later MIS reporting at the head office level.
- One client requirement, from a solution scalability point of view, was to prepare the system to process data from a variety of industrial devices in the plant. These devices have different data formats and require specific parsers to handle the data.
- To map a queue message to a specific parser we used a message factory within our Service Fabric services. Each message is classified, parsed by its specialised parser, and stored in the Azure SQL database in JSON format.
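The factory-plus-parser idea can be sketched as follows. This is a simplified Node.js illustration of the pattern, not the production Service Fabric code; the device-type keys, wire format, and field names are assumptions. The acceptance range (-7 to +11 psi) comes from the QA process described earlier.

```javascript
// Hypothetical parsers keyed by device type. Each parser turns a raw
// queue message into a normalised record; unknown types are rejected.
const parsers = {
  // Leak test device: assumed raw payload like "CMP-014532|3.2"
  'leak-test': raw => {
    const [componentId, psi] = raw.split('|');
    const value = parseFloat(psi);
    return {
      componentId,
      psi: value,
      // Acceptance range from the QA process: -7 to +11 psi.
      result: value >= -7 && value <= 11 ? 'accepted' : 'rejected'
    };
  }
};

function parseMessage(deviceType, raw) {
  const parser = parsers[deviceType];
  if (!parser) throw new Error(`No parser registered for ${deviceType}`);
  return parser(raw);
}

module.exports = { parseMessage };
```

Because the factory is just a lookup, supporting a new device means registering one more parser, without touching the rest of the pipeline.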
Code Snippet for Retrieving Data from the Machine and Storing It Locally
Dashboard for Data Visualization
Conclusion & Business Impact
With this cutting-edge solution, we achieved the objectives of:
- Real-time processing of data
- Seamless delivery of test results to Production Engineering
- Threshold-based notification of test results to Operations heads
- End-of-day (EOD) MIS reporting
A significant design element of the current application architecture is that introducing additional machines at the plant does not impact the system: the Service Fabric approach handles the data through a microservices architecture for parsing and further processing.
Opportunities going forward
Big Data Analysis & Machine Learning – Once a manufactured component is released to the field, any complaint logged against it can be correlated with the test results stored in the Azure database. Other components in the field with similar PSI/test values can then be identified predictively as candidates for the same complaint, and appropriate predictive maintenance can be invoked.