Introduction
Digitised representations of physical systems or manufacturing processes, known as digital twins, can accelerate innovation by improving efficiency, encouraging collaborative work and enhancing accessibility.
The landscape of digitisation has recently been extended to replicate the climate, construction, social life and even human physiology in virtual reality, to improve the understanding of these systems and their interactions.
For example, an infrastructure digital twin models the interactions between industries and consumers in an effort to understand the impact of operating conditions on manufacturing, product demand and quality within a business.
One of the several ways to improve the quality of a product or component is careful investigation of its material properties, which are a manifestation of its manufacturing route.
This does not require the digitisation of the whole manufacturing unit (a manufacturing digital twin); it is sufficient to simulate the boundary conditions experienced during multiple stages of the life cycle, from manufacturing through service to end of life.
For instance, to estimate the in-service life of a turbine blade, it is vital that the material properties at critical locations are fully understood. In some cases it may be sufficient to replicate the manufacturing processes and in-service conditions within a digital twin of the component alone, rather than simulating the whole manufacturing route. Such a representation of processes on a component alone is known as a component digital twin.
The digital integration work package aims to incorporate the insights of multi-scale modelling with experimental measurements to simulate manufacturing processes. This requires a virtual representation of raw materials in three dimensions, including material properties and relevant processing boundary conditions. To achieve this, standardised architecture and data storage methods need to be developed; the development of such an architecture is the first objective of this work package.
Research is currently ongoing to identify appropriate international standards and to assess their relevance in defining and developing an architecture for digital twins which, in conjunction with an independent standard, would define a universal and efficient data format for storing materials data.
Standards for Digital Twins
The development of a component digital twin is considered here to be supported by three pillars:
- The material layer representation, or the data formats for materials data,
- The flow of data between sections of materials data and between the physical and virtual realities,
- A stable architecture that handles these data formats and data flow.
For a component digital twin to be efficient and successful, it should be built in accordance with certain standards. Compliance with these standards enables the smooth and efficient transfer of digital twin principles and data within and between businesses.
In a recent detailed report by the British Standards Institution (BSI) prepared for the Centre for Digital Built Britain (CDBB) [1], several standards have been identified that streamline compliance for data formats and data flow. Figure 1 shows the published, in-preparation and under-review standards for crucial aspects of data within digital twins: data structure, quality, openness and security.
Figure 1: Summary of published/in-progress standards for data structure, quality, openness and security. Adapted from [1].
While the above figure describes the available standards for the structure and quality of data, standards to define the data flow and format are still under development within the ISO community. In this context, the existing ISO 10303-235 standard for the representation of materials data within engineering products seems promising. The standards applied in the different stages of the product life cycle are summarised in Figure 2 (note that a few of the standards shown in the figure may be withdrawn or outdated). The presence of several standards indicates the challenges involved in integrating the different stages of the product life cycle using a single standard for data format. The recently developed Quality Information Framework (QIF) appears promising as it allows a single XML format to be used throughout the stages of the product life cycle.
Figure 2: The standards used thus far in different stages of component development. Adapted from [2].
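To make the single-format idea concrete, the following sketch builds a minimal, QIF-inspired XML record in Python. The element and attribute names here are hypothetical simplifications for illustration only, not the actual QIF schema; the point is that measurements from different life-cycle stages can share one XML structure.

```python
import xml.etree.ElementTree as ET

def build_component_record(component_id, measurements):
    """Collect stage-by-stage measurements for one component in a single
    XML document (illustrative structure, not the real QIF schema)."""
    root = ET.Element("ComponentRecord", id=component_id)
    for stage, name, value, unit in measurements:
        m = ET.SubElement(root, "Measurement", stage=stage)
        ET.SubElement(m, "Characteristic").text = name
        ET.SubElement(m, "Value", unit=unit).text = str(value)
    return root

# Hypothetical measurements from three stages of a turbine-blade life cycle.
record = build_component_record(
    "blade-001",
    [
        ("forging", "temperature", 1150, "degC"),
        ("machining", "surface_roughness_Ra", 0.8, "um"),
        ("inspection", "wall_thickness", 2.35, "mm"),
    ],
)
xml_text = ET.tostring(record, encoding="unicode")
```

Because every stage writes into the same document structure, downstream tools need to parse only one format rather than one per life-cycle stage.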
An efficient digital twin architecture forms the foundation for handling the standardised data formats. Figure 3 shows the general scope of standards directly related to the architecture and a roadmap for the development of a reference architecture for a generalised digital twin. The standard most relevant to the current project is ISO 23247, still under review, which deals with the definitions, a detailed description of the elements of a digital twin, the data flow between modules of a digital twin and the architecture itself. This standard and ISO 10303-235 are currently under consideration and review, respectively, within this work package to develop an efficient materials data representation and an architecture to handle the component digital twin. In this context, it is worth mentioning the roadmap, shown in Figure 4, set out by the BSI within the CDBB project for the deployment of standards for digital twins, which indicates the sheer complexity involved in developing and implementing such standards.
Figure 3: Standards involved in the development of digital twin architecture. Adapted from [1].
Figure 4: Roadmap set out by the CDBB for the development of standards for digital twins [1].
[2] Lu, Y., Liu, C., Kevin, I., Wang, K., Huang, H. and Xu, X., 2020. Digital Twin-driven smart manufacturing: Connotation, reference model, applications and research issues. Robotics and Computer-Integrated Manufacturing, 61, p.101837.
Local Systems for Data Storage
Security, cost and access requirements mean that local storage of data will remain important. Building systems that aid in capturing data at the point it is generated is key to improving digital integration.
Today’s world of smart manufacturing relies heavily on the generation, application and management of data. Data is typically generated at every stage of the product life cycle, from market analysis and business strategies through to operation and end of life. Figure 2 provides some examples of the data collected.
Figure 2: Different data formats generated in a manufacturing process. Adapted from [2].
In order to utilise the large amount of data generated at each stage of the manufacturing process, it needs to be carefully acquired, amended as necessary, compared against the expected quality, cleaned to remove redundant information and stored in a secure location. Such data can then be collated to obtain a master reference dataset for a given component of a product. The availability of such a master dataset is essential to drive cost-effective business decisions, aid new product development and support the product in operation. Although efficient information management has been proven to support businesses, existing manufacturing systems face several challenges in implementing such streamlined data management:
- a lack of provision to extract and save data at crucial steps of manufacturing, e.g. temperature, composition and strain rates during forging;
- data of different formats generated at each stage of the manufacturing process, as shown in Figure 2, that need to be integrated to obtain a master dataset;
- a lack of a central data-storage system to store these datasets;
- the need for an interface that handles the above limitations, enabling the flow of data to and from the data-storage system and the generation of industrial big data.
The success of this platform lies in the ease of saving raw experimental data in different data formats, which are then integrated to obtain a repository of useful experimental data for further analyses.
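A minimal sketch of the acquire, clean and collate steps described above, assuming two hypothetical raw inputs (a CSV export and a JSON export, inlined here so the example is self-contained); this is illustrative only, not the work package's actual interface:

```python
import csv
import io
import json

# Hypothetical raw data in two different formats, as produced at different
# stages of manufacturing (values are made up for illustration).
csv_raw = (
    "sample_id,stage,temperature_degC\n"
    "S1,forging,1150\n"
    "S1,forging,1150\n"   # verbatim duplicate, to be cleaned out
    "S2,forging,\n"       # incomplete record, to be cleaned out
)
json_raw = '[{"sample_id": "S1", "stage": "testing", "yield_MPa": 910}]'

def acquire():
    """Gather raw records from the differently formatted sources."""
    records = list(csv.DictReader(io.StringIO(csv_raw)))
    records += json.loads(json_raw)
    return records

def clean(records):
    """Drop verbatim duplicates and records with missing values."""
    seen, out = set(), []
    for r in records:
        key = tuple(sorted(r.items()))
        if key in seen or any(v in ("", None) for v in r.values()):
            continue
        seen.add(key)
        out.append(r)
    return out

def collate(records):
    """Build a master dataset keyed by sample, then by processing stage."""
    master = {}
    for r in records:
        sample = master.setdefault(r["sample_id"], {})
        sample[r["stage"]] = {
            k: v for k, v in r.items() if k not in ("sample_id", "stage")
        }
    return master

master = collate(clean(acquire()))
```

The resulting master dataset gathers the forging and testing records for sample S1 under one key, while the duplicate and incomplete rows are discarded before collation.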
Among the challenges listed above, efforts have already been made to address data integration and cleansing, along with the development of storage systems and interfaces. This work has been ongoing at several institutions, namely The University of Sheffield (TUoS) within their Manufacturing with Advanced Powder Processes (MAPP) hub, and the systems integration division of the National Institute of Standards and Technology (NIST).
The MAPP hub has been working with the Henry Royce Institute to develop a cloud-based materials data curation system which can handle the cleaning and integration of the vast amounts of data produced within an academic setting. While this system operates as an interface to store the processing, testing and characterisation data of samples, it only interacts with specific proprietary applications. For this platform to be successful, it will need to save raw data in multiple formats, thus allowing ease of access for further analysis. This is just one example of the data integration and storage systems currently in development.
Although these systems integrate the collected data, they do so only by collecting and segregating the data based on a given reference. For instance, the composition, mechanical testing data, scanning electron microscopy (SEM) images and optical micrographs are collectively stored for a given sample. While such integration and storage is useful for further analyses and provides insight into different samples, it does not ease automatic data interpretation and interoperation. In the case of interpretation, human intervention is still required to analyse the captured SEM images.
The data from such analyses could be further broken down to enable interoperability. For instance, composition maps could be replaced by equivalent statistical representations, and electron backscatter diffraction (EBSD) data by texture indices. This numeric data could then be deployed as material inputs in simulations and in post-processing analyses of a manufacturing digital thread.
Thus, the data integration work package also aims to assist in the generation of such numeric representations of material microstructures, to be integrated into manufacturing process simulations as part of a manufacturing digital thread. These quantities can also be used to validate a well-developed digital thread.
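The reduction of a field quantity to scalar material inputs can be sketched as follows, using a made-up 3x3 composition map (hypothetical wt% values per pixel) and standard summary statistics; a real pipeline would of course operate on measured maps:

```python
import statistics

# Hypothetical composition map: wt% of an alloying element per pixel.
composition_map = [
    [4.1, 4.3, 3.9],
    [4.0, 4.2, 4.4],
    [3.8, 4.1, 4.0],
]

# Flatten the 2-D map and reduce it to a compact statistical representation
# that can feed a process simulation as scalar inputs.
pixels = [v for row in composition_map for v in row]
summary = {
    "mean_wt_pct": statistics.mean(pixels),
    "stdev_wt_pct": statistics.stdev(pixels),
    "min_wt_pct": min(pixels),
    "max_wt_pct": max(pixels),
}
```

A similar reduction could map EBSD orientation data to a texture index: one number replacing a large orientation dataset while remaining usable for simulation inputs and for validating a digital thread.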
[1] Wang, J., Zheng, P., Lv, Y., Bao, J. and Zhang, J., 2019. Fog-IBDIS: Industrial big data integration and sharing with fog computing for manufacturing systems. Engineering, 5(4), pp.662-670.