What data does a digital twin need to be productive?
Dr. Elena García Trelles: A productive digital twin combines process data from the machine control system, sensor data (e.g., temperatures, pressures), material and batch information, and test and inspection results. These are complemented by simulation data and material models (e.g., damage or service-life models). The decisive factor is not so much the ‘amount of data’ as its quality, traceability, and consistent linking in the knowledge graph.
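To make the linking idea concrete, here is a minimal Python sketch of a triple-based knowledge graph that ties one casting to its machine, material batch, sensor readings, and inspection results. All identifiers (e.g., `casting:4711`, `machine:DC-03`) and relation names are invented for illustration, not a specific foundry schema.

```python
# Minimal sketch of linking heterogeneous production data in a simple
# triple-based knowledge graph. All entity and relation names below are
# illustrative assumptions, not a real foundry's data model.

from collections import defaultdict

class KnowledgeGraph:
    """Stores (subject, predicate, object) triples and allows per-entity lookups."""

    def __init__(self):
        self.triples = []
        self.by_subject = defaultdict(list)

    def add(self, subject, predicate, obj):
        self.triples.append((subject, predicate, obj))
        self.by_subject[subject].append((predicate, obj))

    def about(self, subject):
        return self.by_subject[subject]

kg = KnowledgeGraph()

# Link one casting to its process, material, simulation, and inspection context.
kg.add("casting:4711", "produced_on", "machine:DC-03")
kg.add("casting:4711", "uses_batch", "batch:AlSi9Cu3-2024-117")
kg.add("casting:4711", "sensor_reading", {"melt_temp_C": 688.5, "pressure_bar": 912})
kg.add("casting:4711", "inspection_result", {"xray": "pass", "porosity_class": 1})
kg.add("casting:4711", "simulated_by", "simulation:fill-run-0815")

# Traceability: everything known about one casting in a single lookup.
for predicate, obj in kg.about("casting:4711"):
    print(predicate, "->", obj)
```

The point of the triple structure is that each new data source (a sensor, a test bench, a simulation run) only has to add links, not conform to one rigid table schema.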
How can the digital twin take economic and ecological aspects into account?
Dr. Elena García Trelles: Economic and ecological aspects are taken into account by supplementing the knowledge graph with key figures such as scrap rates, energy and material consumption, cycle times, and carbon footprint. Different process variants can then be compared not only in terms of quality but also in terms of cost and sustainability. The digital twin thus supports decisions that weigh product performance, resource efficiency, and climate targets at the same time.
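As a rough illustration of such a comparison, the following sketch scores two hypothetical process variants on the key figures mentioned above. All numbers and weightings are invented; in practice these figures would be drawn from the knowledge graph rather than hard-coded.

```python
# Hypothetical sketch: comparing process variants on quality, cost, and
# sustainability key figures. All values and weightings are invented
# for illustration only.

variants = {
    "variant_A": {"scrap_rate": 0.042, "energy_kwh": 18.3,
                  "cycle_time_s": 62, "co2_kg": 4.1},
    "variant_B": {"scrap_rate": 0.031, "energy_kwh": 21.7,
                  "cycle_time_s": 58, "co2_kg": 4.9},
}

# Simple weighted score: lower is better for all four key figures.
weights = {"scrap_rate": 100.0, "energy_kwh": 0.5,
           "cycle_time_s": 0.2, "co2_kg": 2.0}

def score(kpis):
    return sum(weights[k] * v for k, v in kpis.items())

best = min(variants, key=lambda name: score(variants[name]))
for name, kpis in variants.items():
    print(f"{name}: score={score(kpis):.2f}")
print("preferred variant:", best)
```

The weighting makes the trade-off explicit: shifting weight toward `co2_kg` versus `cycle_time_s` is exactly the economic-versus-ecological decision the twin is meant to support.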
What data do I need as a foundry to make data-driven predictions and gain new insights from my production?
Dr. Johannes Tlatlik: For data-driven predictions in foundries, the more data, the better. It is not enough to simply document the classic target values (setpoints). We recommend consistently measuring and storing what actually happens in the process – from machine data and material batches to environmental influences.
The variance of the collected actual values is crucial, because only the natural dispersion in the process provides the basis for valid evaluations. Our analyses clearly show that unexpected correlations between seemingly unimportant variables often yield valuable insights.
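The point about dispersion can be illustrated with a small sketch: a signal that barely varies cannot support valid evaluations, however precisely it is logged. The signal names, example values, and the cutoff below are assumptions for illustration.

```python
# Sketch of a plausibility check on collected actual values: signals with
# almost no dispersion carry little information for data-driven evaluation.
# Signal names, values, and the cutoff are illustrative assumptions.

import statistics

readings = {
    "melt_temp_C":     [687.2, 689.0, 685.5, 691.3, 688.1],
    "die_temp_C":      [210.0, 210.1, 210.0, 209.9, 210.0],  # nearly constant
    "ambient_hum_pct": [41.0, 55.0, 38.5, 60.2, 47.3],
}

MIN_REL_DISPERSION = 0.005  # assumed cutoff: std/mean below this is "flat"

for name, values in readings.items():
    rel = statistics.stdev(values) / statistics.fmean(values)
    verdict = "useful variance" if rel >= MIN_REL_DISPERSION else "too flat"
    print(f"{name}: relative dispersion {rel:.4f} -> {verdict}")
```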
What advantages do data-driven predictions offer me as a foundry?
Dr. Johannes Tlatlik: The basic prerequisite for data-driven predictions to add value is that the prediction can be used to streamline, or even render obsolete, existing processes such as downstream quality checks. In addition, outliers and anomalies in particular can be detected.
A well-designed digital data structure uses the continuously collected actual measurements from ongoing production to provide a probability forecast of quality for each individual casting. Foundries can then focus specifically on those 'at-risk' components for which the algorithm indicates an elevated risk of defects. This replaces expensive, rigid full inspection, saves time and resources in quality control, and lets specialists concentrate on the areas where the probability of a problem is highest.
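One plausible way such a per-casting probability forecast could look in code is sketched below, using a gradient-boosting classifier on synthetic data. The model choice, the feature set, and the `RISK_THRESHOLD` value are assumptions for illustration, not the approach described in the interview.

```python
# Illustrative sketch of a per-casting defect-probability forecast used to
# flag 'at-risk' parts for targeted inspection. Model, features, data, and
# threshold are assumptions, not the interviewee's actual setup.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic historical data: process actual values with known defect labels.
X_train = rng.normal(size=(500, 3))  # e.g., melt temp, pressure, cycle time
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1]
           + rng.normal(scale=0.5, size=500) > 1.2).astype(int)

model = GradientBoostingClassifier().fit(X_train, y_train)

# New castings from ongoing production: forecast the defect probability.
X_new = rng.normal(size=(10, 3))
defect_prob = model.predict_proba(X_new)[:, 1]

RISK_THRESHOLD = 0.3  # assumed cutoff; only these castings get full inspection
for i, p in enumerate(defect_prob):
    action = "route to inspection" if p >= RISK_THRESHOLD else "release"
    print(f"casting {i}: defect probability {p:.2f} -> {action}")
```

The threshold encodes the trade-off the interview describes: a lower cutoff inspects more castings and catches more defects, a higher one saves more inspection effort.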