Consider the smartphone in your pocket and all the things it enables you to do. From banking to entertainment, the mobile revolution has opened up entirely new economic sectors and now contributes trillions to the global economy. While each phone is an engineering marvel in its own right, the real potential of smartphones comes not only from hardware like integrated circuits and lithium-ion batteries, but also from software like apps.
In the same way, the Fourth Industrial Revolution is not about any single piece of technology, but about how a wide range of different building blocks come together successfully to create new and improved ways of doing things. These Industry 4.0-enabling technologies run the gamut from data analytics to Internet of Things (IoT) devices and robotics, domains that are being tackled by different research groups across A*STAR.
Hardware that is self-aware
In the realm of robotics, for example, collaborative robots or cobots look poised to make a big impact on how things are manufactured. Promising to alleviate manpower shortages and take over work that is dangerous, dirty and dull, cobots are a relatively small but rapidly growing segment of the industrial robot market, says Alberto De San Bernabé, a Development Scientist at the Advanced Remanufacturing and Technology Centre (ARTC).
When man and machine work together, however, safety becomes an issue. Currently, cobots are equipped with internal sensors to detect collisions, as well as external sensors to monitor nearby human operators. “These systems effectively turn a collision that could knock you out into a gentle push. However, cobots usually carry sharp tools, so most of the time it is much better to completely prevent the collision,” De San Bernabé explained.
External safety sensors such as 2D light detection and ranging (LiDAR) are expensive and typically configured to slow down or stop the cobot when an object crosses predetermined static limits. Furthermore, they do not distinguish between trained operators and bystanders, unnecessarily reducing the cobots’ speed and efficiency when people unexpectedly walk by, De San Bernabé added.
To improve the perception abilities of cobots, De San Bernabé and his team used the depth sensor camera of the Microsoft Kinect gaming accessory to reconstruct the 3D environment around the robot. “Using the RGB-D camera, we were able to determine not only if there are humans in the frame but also the pose of their body, head and limbs. We used that information to determine the speed and direction of the person relative to the robot, feeding that data into an algorithm that dynamically calculates the safety limits.”
Adopting dynamic limits is as safe as using static ones but more efficient, said De San Bernabé. For example, if both cobot and human are moving at low speeds, the minimum safety distance between them can be reduced, allowing the human to stand close to the robot. “If the limit were static, on the other hand, then the robot would stop once the human crosses the limit, even if it is in a situation considered safe by standards such as those from the ISO,” he explained.
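To make the idea concrete, here is a minimal sketch of such a dynamic speed-and-separation check, loosely modelled on the protective separation distance formula in ISO/TS 15066. The parameter values and function names are illustrative assumptions, not the ARTC team’s actual implementation:

```python
# Minimal sketch of a dynamic speed-and-separation check, loosely following
# the protective separation distance formula in ISO/TS 15066.
# All parameter values below are illustrative assumptions, not the figures
# used by the ARTC team.

def protective_distance(v_human: float, v_robot: float,
                        t_react: float = 0.1,   # sensing + control reaction time (s)
                        t_stop: float = 0.3,    # robot stopping time (s)
                        c_margin: float = 0.2) -> float:
    """Minimum human-robot separation (m) for the current speeds."""
    # Distance the human covers while the system reacts and the robot stops,
    # plus the distance the robot itself travels before halting,
    # plus a fixed intrusion/uncertainty margin.
    return (v_human * (t_react + t_stop)
            + v_robot * t_react
            + c_margin)

def safe_to_continue(separation: float, v_human: float, v_robot: float) -> bool:
    """True if the measured separation exceeds the dynamic limit."""
    return separation > protective_distance(v_human, v_robot)

# With both parties moving slowly, the limit shrinks and close work is allowed;
# a fast approach pushes the limit out and would trigger a slowdown or stop.
print(safe_to_continue(separation=0.6, v_human=0.3, v_robot=0.1))  # True
print(safe_to_continue(separation=0.6, v_human=1.6, v_robot=0.5))  # False
```

Because the limit is recomputed from the measured speeds, the same 0.6-metre separation is acceptable when both move slowly but triggers a stop when the person approaches quickly.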
Making the connection
While De San Bernabé and his colleagues work on making individual robots smarter, other researchers at A*STAR are developing ways for smart devices to reach the next level of intelligence by communicating with one another. Connecting industrial Internet of Things (IIoT) devices is particularly challenging, said Min Li Huang and Boon Shyang Lim of the Institute for Infocomm Research (I2R), as the demands are high but the hardware is limited.
“The background noise and interference in industrial environments will affect the quality of the signal and packet reception, causing longer latency or higher packet loss rates. These are not acceptable for IIoT applications which are usually real-time and demand very low data loss rates, very high reliability and very low latency,” Huang said. These issues are compounded by the fact that IIoT sensors are typically low-power devices or ‘edge’ devices that have limited processing and memory capabilities.
Working with Japanese heavy industry manufacturer IHI Corporation, Huang and the team under the leadership of Sumei Sun have developed an IIoT system that can capture sensor data and use it to anticipate machine maintenance needs. Called the 5G-ready Plug & Play IIoT Analytics Module, the technology comprises a wireless communications algorithm and a predictive maintenance algorithm based on machine learning.
Unlike other communications networks, which require extensive human intervention to debug and optimize, the network management module developed by Huang is powered by an algorithm that lets it self-configure and self-heal, enabling the researchers to achieve 99 percent reliability. Data collected from devices at the edge are then transmitted over this high-reliability network to edge gateways, devices with better processors and more memory that support edge analytics, before being sent to the cloud or a local server for storage or further processing.
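The tiered pipeline described here can be sketched in a few lines. In the hypothetical Python below, constrained edge sensors push readings to a better-resourced gateway that batches and summarizes them before forwarding upstream; the class and function names are assumptions, not I2R’s actual interfaces:

```python
# Hedged sketch of the tiered IIoT pipeline: constrained edge sensors push
# readings to a better-resourced edge gateway, which runs analytics locally
# before forwarding results upstream. Names here are hypothetical.

from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class Reading:
    sensor_id: str
    value: float       # e.g. vibration amplitude in mm/s

class EdgeGateway:
    """Aggregates raw readings and forwards summaries to a store."""
    def __init__(self, store):
        self.store = store
        self.buffer: List[Reading] = []

    def ingest(self, reading: Reading) -> None:
        self.buffer.append(reading)
        if len(self.buffer) >= 8:   # small batches keep latency low
            self.flush()

    def flush(self) -> None:
        # Edge analytics: reduce the batch to a summary before uplink,
        # saving bandwidth on the link to the cloud or local server.
        summary = {"n": len(self.buffer),
                   "mean": mean(r.value for r in self.buffer)}
        self.store.append(summary)
        self.buffer.clear()

cloud: list = []                    # stand-in for cloud/local-server storage
gateway = EdgeGateway(cloud)
for i in range(16):
    gateway.ingest(Reading("vib-01", 0.5 + 0.01 * i))
print(cloud)                        # two eight-sample summaries
```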
The predictive maintenance algorithm, on the other hand, uses the collected sensor data to monitor machine condition, detect anomalies and predict a machine’s remaining useful life. A lightweight implementation of the model can run directly on the edge devices, while results produced at the edge gateway can be used for real-time insights and decision-making, Huang said.
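As a rough illustration of the kind of lightweight analytics that can run on a memory-constrained edge device, the sketch below flags anomalies with a rolling z-score over a small fixed window. The window size and threshold are assumptions; the team’s actual machine learning model is not described at this level of detail:

```python
# Illustrative sketch of a lightweight anomaly check suited to a
# memory-constrained edge device: a rolling z-score over a small window.
# Window size and threshold are assumptions, not I2R's actual model.

from collections import deque
from statistics import mean, pstdev

class RollingAnomalyDetector:
    def __init__(self, window: int = 32, z_threshold: float = 3.0):
        self.history = deque(maxlen=window)   # fixed memory footprint
        self.z_threshold = z_threshold

    def update(self, value: float) -> bool:
        """Return True if `value` is anomalous relative to recent history."""
        anomalous = False
        if len(self.history) >= 8:            # need a minimal baseline first
            mu, sigma = mean(self.history), pstdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomalous = True
        self.history.append(value)
        return anomalous

detector = RollingAnomalyDetector()
stream = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 1.0, 5.0]  # spike at end
flags = [detector.update(v) for v in stream]
print(flags[-1])   # True: the spike stands out from the rolling baseline
```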
“But what really sets us apart is that we are system- and platform-agnostic; our intellectual property can be plugged into any of the existing systems or platform tools available,” she added. For example, the network management module supports many different protocols, including 5G, giving users the flexibility to choose the wireless protocol that best suits their needs.
“IIoT supports many use cases and applications, each calling for different communications performance and computing algorithms,” said Sun. “Some use cases call for high data rates, such as for high-resolution images and videos, while others call for very low latency but place lower demands on bandwidth. Depending on the type of data, edge computing algorithms will also need to adapt. The module therefore supports multiple wireless protocols, including 5G. More edge computing functionalities will be enabled as well.”
The team is deploying its system at the Singapore Institute of Manufacturing Technology (SIMTech)’s Model Factory and ARTC’s Next-Generation Hyper-Personalization Line, and is in discussions with a few partners for further deployment, Sun shared.
Machine learning takes to the skies
Nowhere do the twin concerns of safety and predictive maintenance come to the fore more than in the airline industry, where costly delays and cancellations can have a profound impact on a company’s bottom line. With an eye towards using technology to address delays, Singapore Airlines (SIA) tapped the expertise of researchers at I2R to set up the SIA-I2R Joint Lab.
“The partnership forms part of SIA’s Digital Innovation Blueprint to boost our digital capabilities and accelerate the adoption of digital technologies in the aviation and travel industry, helping to transform the aviation industry for the future,” said Hwa Peng Lau, Senior Vice President of Engineering at SIA.
A single flight on an A380 or B777 airplane can generate vast amounts of data, with sensors measuring anywhere between 1,400 and 3,000 different parameters up to eight times per second. Making sense of so much data is simply not possible for humans and, until recently, even machines. But machine learning has changed the game, allowing researchers to probe the health of different aircraft components and estimate the likelihood of a failure based on historical flight recorder data.
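A back-of-envelope calculation shows why. Using the figures quoted above, and assuming a 13-hour long-haul sector and four bytes per recorded value (both assumptions, not figures from SIA):

```python
# Back-of-envelope estimate of per-flight sensor data volume, using the
# figures quoted above. Flight duration and bytes-per-sample are assumptions.

parameters = 3_000          # upper figure from the article
sample_rate_hz = 8          # up to eight samples per second
flight_hours = 13           # assumed long-haul sector length
bytes_per_sample = 4        # assumed 32-bit value per parameter

samples = parameters * sample_rate_hz * flight_hours * 3_600
print(f"{samples:,} samples = {samples * bytes_per_sample / 1e9:.1f} GB")
# 1,123,200,000 samples = 4.5 GB, far beyond what humans can inspect by hand
```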
In particular, the algorithm developed by the joint lab measures the engineering health of a critical component known as the bleed pressure regulating valve, which controls the flow of hot, pressurized air from the engine to the cabin along the length of the wing. Pressure sensors upstream and downstream of this valve indicate whether it is sufficiently open or closed. When combined with other algorithms, this algorithm has helped SIA mitigate more than 500 minutes of flight delay time, according to Lau.
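How such pressure readings might be reduced to a simple health indicator can be sketched as follows; the expected-pressure model, threshold behaviour and numbers are illustrative assumptions, not the joint lab’s algorithm:

```python
# Hedged sketch of a valve-health indicator built from the upstream and
# downstream pressure sensors mentioned above. The expected-ratio model
# and values are illustrative assumptions, not the joint lab's algorithm.

def valve_health(p_upstream: float, p_downstream: float,
                 commanded_open: float) -> float:
    """Return a 0-1 health score: how closely the measured pressure drop
    matches what the commanded valve position would predict."""
    # Toy model: a fully open valve passes most of the upstream pressure;
    # a closed one passes almost none.
    expected_down = p_upstream * (0.1 + 0.85 * commanded_open)
    error = abs(p_downstream - expected_down) / p_upstream
    return max(0.0, 1.0 - error)

# A healthy, fully open valve: downstream tracks the prediction closely.
print(round(valve_health(45.0, 42.0, commanded_open=1.0), 2))  # 0.98
# A sticking valve commanded open but passing little air scores poorly.
print(round(valve_health(45.0, 12.0, commanded_open=1.0), 2))  # 0.32
```

A persistently low score of this kind, tracked flight after flight, is the sort of signal that could prompt maintenance before the valve actually fails.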
The pressure regulating valve model has been deployed in an online decision support tool used by operational staff. Since 2018, the model has helped to identify three valve failures about one to two days before they occurred, Lau shared. “And as with all machine learning algorithms, it will become more accurate over time as we feed more data into it,” Lau said, noting that the team continually adds training events to improve the model’s accuracy and lead time.
“At the same time, as airline operations are dynamic, the team remains poised to embark on new predictive maintenance use cases related to emerging issues, possibly using appropriate new methods including unsupervised learning,” he said.
“Through this applied partnership with I2R, our goal is to develop smart solutions that will help lower maintenance cost, reduce aircraft delays and aid us in enhancing our service standards.”