ENBIS Spring Meeting 2023

Timezone: Europe/Berlin

Venue: Building 101A, Meeting Room 1, Copenhagen
DTU Lyngby campus, Anker Engelunds Vej 1, 2800 Kgs. Lyngby
Description

ENBIS Spring Meeting on Digital Twins

Copenhagen, Denmark, May 25-26, 2023

 

See Timetable for the final programme

 

Theme of the Spring Meeting

Recent decades have seen a proliferation of computing power that has allowed for more accurate digital representations of physical systems through Digital Twins. Although challenges exist in developing appropriate Digital Twins, they have the potential to provide great benefits. This has led to a fast increase in interest from academia as well as from industry.

At this year's Spring Meeting, we seek to bring together engineers developing Digital Twins with industrial statisticians who have developed powerful methods to analyze the complex datasets generated by these digital replicas. The aim is to foster collaboration, promote knowledge exchange, and advance the development of new techniques that can enhance the accuracy and usefulness of Digital Twins in practical applications. 

This event promises to be an exciting and informative opportunity to learn and engage with like-minded researchers, whether you're an engineer, a statistician, or simply curious about the potential of this cutting-edge technology.

Topics include but are not limited to:

  • Digital Twin development for both products and processes

  • Simulation models for Digital Twins

  • Design and Analysis of computer experiments

  • Industrial implementation of Digital Twins

  • Robustness and optimization through Digital Twins

  • Practical applications and case studies of data analytics using Digital Twins

  • Process surveillance and predictive maintenance through Digital Twins

  • Decision making through Digital Twins

  • Verification and validation of Digital Twins in real life

ENBIS Spring Meeting Highlights

Confirmed keynotes are:

  • Jesper Hattel (Technical University of Denmark): "Digital Twins in Manufacturing Processes – from Hype to Real Examples"
  • Ron Kenett (The KPA Group and the Samuel Neaman Institute): "Digital Twins and Engineering of Performance"

The invited sessions are:

  • Digital Twins for Logistics by Saeed Farahani (Novo Nordisk A/S) and Svend Skovgaard Petersen (Novo Nordisk A/S)
  • Digital Twins for Agrofood by Bart De Ketelaere (KU Leuven) and Tom Leblicq (CNHi)
  • Digital Twins for Robust Design by Tobias Eifler (Technical University of Denmark) and Sören Knuts (GKN Aerospace Sweden)
  • Digital Twins for Processes by Ulrich Kruhne (Technical University of Denmark) and Carina Gargalo (Technical University of Denmark) 

The meeting also includes six contributed paper sessions as well as a poster session. For the tentative programme, please see the Scientific Programme.

 

Registration
ENBIS Spring Meeting 2023 -- Registration
    • 08:00 08:15
      Opening
    • 08:15 09:15
      Keynote lecture: Jesper Hattel
      • 08:15
        Digital twins in manufacturing processes – from hype to real examples 20m

        Society is currently experiencing a dramatic change due to ever-present digitalization. We all know how our lives have been changed by social media, and recently AI-based chatbots may force us to rethink the way we arrange our entire educational system. Production and manufacturing are no exceptions, and a similar disruption of how we produce products industrially is underway, although it is not equally visible.
        This transition is often referred to as the fourth industrial revolution, or Industry 4.0 for short. An important part of that transformation is what we call “digital twins”, where, based on measurements and advanced mathematical models expressing the underlying physics, we can make a digital representation of a product and its manufacturing process chain. Depending on the purpose, these digital twins can be very complex or relatively simple, and range from the nano scale up to the meter scale.
        In the presentation we will give a general background on digital twins in a manufacturing setting and then show examples from advanced 3D printing processes in metal and plastic as well as from more traditionally produced cast parts.

        Speaker: Jesper Hattel (DTU)
    • 09:15 10:15
      Invited session “Digital Twins for Logistics”
      • 09:15
        How do Digital Twin models support intralogistics within pharma industries? 20m

        One of the critical functions within the pharma industry is intralogistics. Intralogistics is the process of automating, optimizing and managing the physical flows of materials inside the warehouse and to/from the warehouse across Active Pharmaceutical Ingredient (API) production, Aseptic Production (AP), assembly and packaging facilities. Simulating the material flow helps users make sure that the design requirements and the material needs for operating different workstations are fulfilled. It is very difficult, and in some cases impossible, to produce products, manufacture or expand pharma facilities without using simulations and digital models. Digital twins (DTs) enable the utilization of these models in operations, leveraging both simulated and operational data simultaneously. The digital journey towards DTs usually starts with 3D modeling of physical assets. This is followed by visualization (e.g., through VR), simulation, emulation (shadow digital twin) and finally the closed-loop DT. Simulation tools have been used extensively in pharma and non-pharma industries for decades to support decision makers regarding project planning, scheduling, and budget forecasting. However, simulation is an imitation of reality, typically under ideal conditions, and usually limited to the design phase. At Novo Nordisk, we are looking into how to combine the advantages of simulation and operational data to develop more accurate emulations of processes, achieve accurate and detailed live insight, and validate the digital models. This paves the way towards closed-loop DTs. The emulation phase, in particular, is where statisticians and data analysts play a key role in the further development of the digital models, e.g., by implementing advanced statistical models, designing DoEs, analyzing big data, and potentially applying self-learning algorithms.

        Speakers: Saeed Farahani (Novo Nordisk), Svend Skovgaard Petersen (Novo Nordisk)
    • 10:15 10:30
      Break 15m
    • 10:30 12:00
      Contributed session “Digital Twins for Production / Operations”
      • 10:30
        The three gears of success: a framework to tackle a need for expanding production capacity 20m

        With diabetes being one of the world’s fastest growing health emergencies, demand for diabetes medication is rising. This calls for expanding production capacity through construction of new facilities as well as increasing capacity within existing facilities. In this work, a systematic method for capacity ramp-up based on data-driven modelling and simulation of an API purification process is presented. The framework of the method can be described as a synergy of three domains (gears): operation, process, and equipment, where accounting for all three leads to success in the capacity expansion.
        The work started by systematically developing the virtual representation of a physical manufacturing system (i.e. a digital twin) to quantify the current state and highlight the limiting step on a production line. Considering a new process description, the model was then applied to explore potential areas for further improvement in all three domains. The model-based analysis led us both to identify the need for process control modeling and to define and execute targeted lab-scale experiments for further modification of the process description and process topology. Finally, to meet future demand, a roadmap of changes to the production line was proposed and split into project packages based on resource availability and the complexity of the changes.
        In summary, the focus of this work was to demonstrate the importance of a systematic approach for continuous evaluation of the design and operational space through simulation, subject to the available knowledge. In addition, the digital twin supported decision-making for the investment project by visualizing the effects of proposed changes, and by defining the required experiments and tests in production. To support decision-making, a series of changes in all three domains was proposed to build a roadmap of projects that takes the process from the current state to the desired future state.

        Speaker: Anton Ochoa Bique (Novo Nordisk)
      • 10:50
        Development of a digital twin for a downstream process in API manufacturing for decision-making and process improvement 20m

        In the field of API production, process control and stability are crucial for ensuring maximum capacity and, most importantly, that the final API meets the required quality, as any deviation from these targets can have dire consequences for the patient. One promising approach to address these challenges in production is the use of a digital twin, a virtual replica of the physical system that can simulate its behavior and provide insights into its dynamics and operation. In this context, a digital twin for a downstream process, namely the recovery process in API production, is developed based on a hybrid model that accurately represents the underlying process dynamics. The model can then be used to simulate different scenarios and test the impact of various process parameters on the system's behavior. By doing so, one can gain a deeper understanding of the process operation and identify opportunities to optimize its performance in terms of stability and yield.

        Another key advantage of the developed digital twin is its ability to perform “what-if” and “what-is-best” analyses, which involve testing hypothetical scenarios to understand how the process might respond under different conditions, without performing real tests and without endangering the process. For example, one could simulate the impact of changing the flow rate of a particular stage of the process, or the flow profile of the next capture step, to see how it affects the overall operation and performance, or test different process design topologies aimed at increasing the yield.

        In summary, by developing a digital twin of the recovery process in downstream API production, a wealth of opportunities was harnessed: it offers a versatile tool and an informative platform that provides direct value co-creation by supporting production in daily operation, whether testing “what-if” scenarios or screening ideas to increase the yield. Hence, the developed model helped achieve targeted decision-making and allowed for a systematic approach to improving process performance.

        Speakers: Dr Fethi Belkhir (Novo Nordisk), Mrs Marta González García (Novo Nordisk)
      • 11:10
        Industry 4.0 Enabled PILOT Facility in DTU Chemical Engineering 20m

        Topics in digitalisation and Industry 4.0 have become the centre point of a new frontier in the chemical and biochemical manufacturing industry over the last decade. In particular, the area of Digital Twins (DT) is diligently being explored by industry and academia to improve large-scale operational efficiency and characterisation through tighter data integration between the physical system and digital entities [1, 2]. The implementation of DT in the biochemical and pharmaceutical industry is severely restricted by the low degree of automation in typical production lines and the complexity of the digital models used in predictive toolboxes [2, 3]. Most publications demonstrating digital twin implementations in biomanufacturing processes are limited to laboratory-scale setups and automated case studies. There is a significant gap in the literature on how to employ tight data integration to improve operations at a larger scale, where operators are responsible for key decision-making.
        To address this demand, the Department of Chemical and Biochemical Engineering at DTU initiated a comprehensive retrofitting and expansion of the existing pilot plant laboratory in 2021 to enable the research and development of digital twins for common large-scale process operations [3]. Since 2021, key infrastructure has been commissioned that achieves easy, tight, and flexible integration between digital and physical entities. The implemented digital infrastructure achieves flexible operation through a two-layer approach. The first layer consists of augmenting each unit with access to a proprietary IoT gateway software (vNode), which allows for the modular addition of sensors. The second layer is the incorporation of a cloud server infrastructure that can easily be scaled based on computational demand.
        This work presents an overview of the completed commissioning of pilot plant 4.0 and a workflow that enables large-scale integration and development of digital twins for unit characterisation and operator support, using a simple batch distillation column as an example. The batch distillation column includes a range of online and offline sensors combined with at-line measurements performed by the operator. The reflux ratio and the product flow are controlled by a set of manual valves that the operator must balance to achieve steady operation. Poor control of the valves leads to a strongly oscillating operation that may result in a violation of the product composition and complete draining of the condenser.

        [1] Bähner F., et al., 2022, Challenges in Optimisation and Control of Biobased Process Systems: An Industrial-Academic Perspective, Industrial and Engineering Chemistry Research, vol 60, issue 42, 14985-15003
        [2] Cimino C. et al., 2019, “Review of digital twin applications in manufacturing”, Computers in Industry, vol 113
        [3] Jones M., et al., 2022, Pilot Plant 4.0: A Review of Digitalization Efforts of the Chemical and Biochemical Engineering Department at the Technical University of Denmark (DTU), Computer Aided Chemical Engineering, vol 49, 1525-1530

        Speaker: Mads Stevnsborg (Technical University of Denmark)
      • 11:30
        A Digital Twin-based Framework for Adaptive Production Control 20m

        In the current era of Industry 4.0 and cyber-physical systems, the way production systems are designed and managed has been significantly transformed by digital technologies. In this context, the Digital Twin (DT) has emerged as a promising simulation paradigm to support decision-making through various engineering applications. In fact, a DT can be considered a virtual replica of a physical system capable of reproducing its behavior by relying on simulation models and real-time data integration. DT typically enables improvements in production processes by simulating alternative scenarios, allowing the evaluation of the impact of changes in how the system is designed and operated. By predicting system behavior, DT can assist in making informed decisions, identifying potential issues, and allowing performance optimization.
        Despite extensive research on DT for production planning and control, the focus remains mainly on scheduling, and very few studies have investigated the use of DT for enhancing existing production control mechanisms or for developing new ones. To address this literature gap, the authors propose a new framework for adaptive production control based on existing control mechanisms and rules, focusing on throughput, work-in-progress, and resource utilization. Leveraging DT predictive capabilities and inspired by model predictive control theory, the proposed framework supports the optimization of production control parameters, allowing it to adapt to changing production conditions and maintain stable production performance over time. Control parameters dynamically modify the mechanisms and rules which determine how orders are sequenced, released, and routed, changing the overall system behavior.
        To validate the effectiveness and quantify the benefits of the proposed framework, experiments were conducted in fully virtualized environments. The framework was proven capable of increasing throughput while reducing WIP, thereby improving the overall performance of the manufacturing system. The proposed framework offers several advantages, drastically improving traditional production control mechanisms. Control can be optimized predictively according to simulation results, and the rolling-horizon approach enables the framework to adapt to changing production conditions, ensuring system stability and robustness over time. In conclusion, this study contributes to the research in production planning and control methods based on DT by proposing a framework for adaptive production control, highlighting the importance of predictive insights for optimization under changing system conditions.
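
        As a minimal illustration of the rolling-horizon idea described above, the sketch below re-optimizes production-control parameters at every period by scoring candidate settings against a digital-twin simulator; the simulate function, the parameter grid and the scoring weights are hypothetical stand-ins, not the authors' implementation.

        # Minimal sketch (hypothetical simulator and parameters, not the authors' code):
        # rolling-horizon re-tuning of production-control parameters against a DT.
        import itertools

        def simulate(params, horizon):
            """Hypothetical stand-in for the digital-twin simulator."""
            throughput = horizon * (10.0 * params["release_rate"] - 0.08 * params["wip_cap"])
            wip = 0.9 * params["wip_cap"]
            return throughput, wip

        def score(throughput, wip):
            # Weighted objective: reward throughput, penalize work-in-progress.
            return throughput - 0.5 * wip

        param_grid = {"release_rate": [0.8, 1.0, 1.2], "wip_cap": [20, 30, 40]}
        candidates = [dict(zip(param_grid, values))
                      for values in itertools.product(*param_grid.values())]

        for period in range(5):                     # rolling horizon: re-optimize every period
            best = max(candidates, key=lambda p: score(*simulate(p, horizon=8)))
            print(f"period {period}: apply {best}")
            # In a real deployment the simulator state would be refreshed here from
            # real-time shop-floor data before the next optimization step.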

        Speaker: Lorenzo Ragazzini (Politecnico di Milano)
    • 12:00 13:00
      Lunch break 1h
    • 13:00 14:30
      Contributed session “Digital Twins for Quality and Reliability”
      • 13:00
        Digital Twins for Data-driven Reliability Assessment of Cyber-Physical Production Systems 20m

        In recent years, there has been a significant increase in the deployment of cyber-physical production systems (CPPS) across various industries. CPPS consist of interconnected devices and systems that combine physical and digital elements to enhance the efficiency, productivity, and reliability of manufacturing processes [1]. However, with the proliferation of these complex systems, there is a growing need for effective and efficient methods to assess their reliability and performance.

        CPPS have introduced new challenges for traditional, expert-based reliability assessment of manufacturing systems. Firstly, reliability assessment is a time-consuming and labor-intensive process that requires significant resources to collect and analyze data manually. Secondly, expert-based assessments are prone to human errors and biases, as they rely on the individual expertise and experience of analysts. This can lead to inconsistent results and misinterpretation of data, which can have serious consequences for the reliability and safety of the manufacturing system. Finally, traditional methods may not be able to capture the dynamic behavior of CPPS, which typically evolve and change their topology and configuration throughout their lifetime [2,3].

        One promising solution to these challenges is the use of digital twins (DTs) for data-driven reliability assessment (DDRA) of CPPS. DTs are virtual representations of physical assets or systems that can be used to simulate and analyze their behavior and performance. Digital twins provide a means to collect and analyze real-time data from physical systems, enabling engineers and operators to identify potential issues and optimize reliability [4]. By leveraging DTs, DDRA can generate accurate reliability models rapidly and reduce the need for subject matter experts. In addition, DDRA can provide manufacturers with better insight into, and understanding of, the root causes of failures. Finally, DDRA can assist decision makers in maintenance planning, machine purchasing, or plant layout configuration, contributing to more efficient resource allocation [2].

        We provide a comprehensive overview of DDRA for CPPS, exploring key concepts, methods, and tools involved. We specifically focus on the data requirements to automate reliability modeling in a manufacturing context and highlight the formalisms that are most suitable for DDRA. Additionally, we present recent research on the validation of data-driven reliability models, providing insights into the accuracy and robustness of results achieved by using DDRA. To demonstrate the real-world benefits of DDRA, we showcase several case studies conducted in collaboration with manufacturers that highlight how this approach can be used to enhance the reliability and performance of CPPS.

        References

        [1] Monostori, L. (2014). Cyber-physical production systems: Roots, expectations and R&D challenges. Procedia Cirp, 17, 9-13.
        [2] Friederich, J., & Lazarova-Molnar, S. (2021). Towards data-driven reliability modeling for cyber-physical production systems. Procedia Computer Science, 184, 589-596.
        [3] Lazarova-Molnar, S., & Mohamed, N. (2019). Reliability assessment in the context of industry 4.0: data as a game changer. Procedia Computer Science, 151, 691-698.
        [4] Friederich, J., Francis, D. P., Lazarova-Molnar, S., & Mohamed, N. (2022). A framework for data-driven digital twins for smart manufacturing. Computers in Industry, 136, 103586.

        Speaker: Mr Jonas Friederich (University of Southern Denmark)
      • 13:20
        Conceptual Digital Twin Framework for Quality Assurance in the Injection Molding Industry: Technical and Digital Skill Perspectives 20m

        In order to maintain a competitive edge in the market, companies strive to improve the quality and performance of their products while also minimizing downtime and maintenance costs. One way to achieve this is through the use of Digital Twins (DTs), which are enabled by the fourth industrial revolution. Hence, DTs can serve as a powerful tool for implementing the necessary quality assurance (QA) processes. Scholars have developed domain-specific DT-driven QA frameworks, for example for the aerospace industry [1], semiconductor manufacturing [2], and metal sheet assembly processes [3]. However, while DTs have shown great promise as a tool for QA, current DT-driven QA frameworks do not provide guidelines for addressing the unique challenges of the injection molding industry, which include a highly complex process chain, collaboration and communication silos, and the need for enhanced quality assurance tools. To alleviate these challenges, this paper proposes a DT-driven quality assurance framework for injection molding. The purpose of the framework is to guide manufacturers in how to set up a DT for injection molding quality assurance that (i) meets the QA needs across the mold lifecycle, (ii) can be constructed from meaningful data available in industry, (iii) has the required personnel with the digital skills to build, operate and maintain the DT, and (iv) adds value to the company on economic and social dimensions. We deliberately focus our analysis on digital skills, as DTs rely heavily on data, and hence being able to read, analyze, interpret, and communicate data is crucial for achieving the full potential of a DT. The framework was developed using the Design Science Research (DSR) methodology in collaboration with a leading Danish injection molding company and DT experts. The DSR method can be used to develop design artifacts and ensures academic and industrial relevance. The DSR method consists of three elements: (i) environment, (ii) knowledge base, and (iii) design artifact. The environment defines the problem space and business needs. The problem space in this work resides in the three injection molding lifecycle phases: engineering, mold manufacturing, and injection molding. The knowledge base contains literature and existing information to ground the development of the design artifact, i.e., the DT-driven QA framework. In this research, the knowledge base consists of digital skills transformation literature, an unstructured review of DT frameworks, and expert interviews from the DT domain. The design artifact, in this paper the DT framework design, describes the iterative process of creating and evaluating the framework design. Inputs to the design artifact are the requirements coming from the environment and fundamental knowledge derived from the knowledge base. The DT-driven QA framework was evaluated and validated by a combination of academic literature review and expert inputs. Overall, we developed a conceptual DT-driven QA framework tailored to the specific requirements of injection molding companies that can help those companies take full advantage of DT technology. Besides the DT framework, this paper makes an academic contribution by outlining the linkage between digital skills and the technical requirements of building, using and maintaining the DT.

        References
        [1] Albrecht Hänel, Thorben Schnellhardt, Eric Wenkler, Andreas Nestler, Alexander Brosius, Christian Corinth, Alexander Fay, and Steffen Ihlenfeldt. The development of a digital twin for machining processes for the application in aerospace industry. Procedia CIRP, 93:1399–1404, 2020.
        [2] Foivos Psarommatis. A generic methodology and a digital twin for zero defect manufacturing (ZDM) performance mapping towards design for ZDM. Journal of Manufacturing Systems, 59:507–521, 2021.
        [3] Rikard Söderberg, Kristina Wärmefjord, Johan S Carlson, and Lars Lindkvist. Toward a digital twin for real-time geometry assurance in individualized production. CIRP Annals, 66(1):137–140, 2017.

        Speakers: Sara Blasco Román (Copenhagen Business School and The LEGO Group), Mr Till Böttjer (Aarhus University and The LEGO Group)
      • 13:40
        Improving Pork Shoulder Processing with a Digital Twin-based Inspection System: Utilizing X-ray Technology and 3D Imaging 20m

        Introduction:

        Pork is one of the most important meat products for the European Union, with the EU being the world's largest pork exporter. In 2020, the EU exported over 4 million tons of pork, with a total value of €7.7 billion (Eurostat, 2021). The largest pork-exporting countries within the EU include Spain, Denmark, and Germany.

        However, despite the significant export value of pork products for the EU, the deboning process remains labor-intensive and manual. There is interest in advanced automation of the processing chain to replace or assist the deboning operations. To that end, inline inspection systems that can analyze the bone structure of pork shoulders are being explored. A computed tomography (CT) system would be suitable, as the X-rays can penetrate large volumes of meat, resulting in high-contrast reconstructed 3D images of the meat and bone. Based on these 3D images, a model of the pork shoulder can be produced and fed to the automation system. However, CT systems have a few disadvantages, such as their size and cost. Using prior knowledge of the pork shoulder, a 3D image can be rendered from a single X-ray projection, assisted by input from a 3D camera and training data from the digital twin. Such a multi-sensor imaging system could be made much simpler and cheaper and would be easier to integrate into existing processing lines.

        Method:

        This Ph.D. research focuses on a new method involving a digital twin that uses a shape model of the pork shoulder's outer and bone shapes, both separately and combined. This approach allows for the use of a multi-sensor inspection system that can work with input from a 3D camera to fit a realistic 3D model of the pork shoulder to the measured point cloud. This shape model can then be used to simulate and improve the X-ray system.

        Instead of relying on CT measurements, specifically trained neural networks are explored to predict the 3D bone structure from a limited number of 2D radiographs. This research created reference 3D models of the outer shape and bone using CT measurements of over 90 pork shoulders of different weight classes on high-resolution systems at the KU Leuven XCT core facility and a gantry CT system at UZ Leuven. Additionally, shape models have been developed to characterize the shape variance of pork shoulders to synthetically create more training data for the neural network.

        Results:

        At the moment, the developed digital twin is capable of simulating a multi-sensor system. It can simulate realistic X-ray images from various angles to inspect pork shoulders and simulate the use of a 3D camera to examine the pork shoulder from different viewpoints. By using the Fminc optimizer, this system can fit realistic 3D models that accurately represent the outer shape and inner structure from the simulated 3D camera view. The bone structure is predicted with an average mean squared error (MSE) of 6.94 mm on the test set and an overall average error of 9.80 mm on the combined shape of the test set.

        Conclusion:

        Based on these results, it can be concluded that the developed digital twin system can be an effective tool for simulating multi-sensor systems and accurately predicting the 3D structure of pork shoulders. The ability to simulate realistic X-ray projections and use a 3D camera to inspect the shoulder from various angles demonstrates the system's versatility and potential for use in various imaging applications.

        However, currently, the error is calculated based on the simulated 3D camera data of the outer shape. These results will be compared to real 3D camera data collected on the inspection machine in the future to determine if domain adaptation will be necessary. Additionally, other possible fitting optimizers need to be explored to further improve these results.
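
        As a rough sketch of the shape-model fitting described above (under the assumption of a PCA-style statistical shape model with known point correspondences, which may differ from the authors' actual pipeline and optimizer), the shape coefficients can be estimated from an observed point cloud by least squares:

        # Minimal sketch (synthetic shapes, not the authors' data or pipeline): fit a
        # statistical shape model (mean shape + modes of variation) to an observed
        # outer-shape point set by least squares.
        import numpy as np

        rng = np.random.default_rng(0)
        n_points, n_modes = 500, 5

        mean_shape = rng.normal(size=3 * n_points)            # stacked (x, y, z) coordinates
        modes = np.linalg.qr(rng.normal(size=(3 * n_points, n_modes)))[0]  # orthonormal modes

        true_coeffs = rng.normal(size=n_modes)
        observed = mean_shape + modes @ true_coeffs + rng.normal(scale=0.01, size=3 * n_points)

        # Least-squares estimate of the shape coefficients from the observed point cloud.
        coeffs, *_ = np.linalg.lstsq(modes, observed - mean_shape, rcond=None)
        fitted = mean_shape + modes @ coeffs

        rmse = np.sqrt(np.mean((fitted - observed) ** 2))
        print("estimated coefficients:", np.round(coeffs, 3))
        print("fit RMSE:", round(float(rmse), 4))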

        Speaker: Michiel Pieters (KU Leuven)
      • 14:00
        Digital twin-driven quality assessment approach in a pharmaceutical assembly line 20m

        The transition from batch to continuous manufacturing in the pharmaceutical industry is well underway, leading to the investigation of new technologies. Digital Twin (DT) is one of the topics promising to optimize processes and increase product quality by integrating the sensor data and simulation models.
        In this presentation, we report on the main steps for implementing the DT for a particular cell that is part of a manufacturing assembly pilot line. We describe the most important aspects of the digital twin, such as the physical system, the data collection and transfer layer, visualization, models, as well as more advanced DT services such as fault detection.
        The physical system includes a snap-fit process carried out by a linear motor, together with the associated measurements. To collect and store the data, a transfer layer based on Apache Kafka has been implemented that continuously collects and maintains the data. Moreover, to visualize different measurements such as applied force, torque and displacement, an event-based data dashboard has been established. A simulation model of the snapping process has been developed to investigate the effect of assembly and geometrical variation and find the optimum settings. Furthermore, this simulation model is able to generate synthetic samples to assist future data-driven models. Eventually, an interpretable fault detection model is developed to detect potential deviations from normal operation.
        In conclusion, DT technology shows great promise in enabling real-time monitoring and optimization of continuous manufacturing. This study is an example of how DT technology can be used to build a fault detection system and optimize a snap-fit process by implementing a data transfer layer, a dashboard, and a simulation model. This presentation demonstrates the potential of DT, which adds to the overall understanding of the process, reduces abnormal situations, and therefore improves product quality.
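
        A minimal sketch of how such a Kafka-based transfer layer might be consumed for simple deviation checks is shown below; the topic name, broker address, message schema and force limits are assumed for illustration and are not the authors' configuration.

        # Minimal sketch (assumed topic, broker, schema and threshold): consume
        # snap-fit force/displacement events from Kafka and flag simple deviations.
        import json
        from kafka import KafkaConsumer   # pip install kafka-python

        consumer = KafkaConsumer(
            "snapfit-measurements",                      # hypothetical topic name
            bootstrap_servers=["localhost:9092"],        # hypothetical broker address
            value_deserializer=lambda m: json.loads(m.decode("utf-8")),
        )

        FORCE_LIMITS = (40.0, 60.0)   # illustrative acceptance window in newtons

        for message in consumer:
            event = message.value                        # e.g. {"force": 52.1, "displacement": 3.4}
            force = event.get("force")
            if force is not None and not (FORCE_LIMITS[0] <= force <= FORCE_LIMITS[1]):
                print(f"possible fault: force {force} N outside {FORCE_LIMITS}")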

        Speaker: Fatemeh Kakavandi (Aarhus University)
    • 14:30 14:45
      Break 15m
    • 14:45 15:45
      Invited session “Digital Twins for Agrofood”
      • 14:45
        Simulation based optimization of forage harvesters 20m

        Forage harvesters are used worldwide to harvest grass, corn or hay. After the crop has been chopped into small pieces, it is accelerated and thrown, via a spout, into a tractor-driven trailer. The interaction between crop and air flow plays an important role in the spout. The design of the spout determines how well it can handle crop in different harvesting conditions. When the trailer is full, the tractor takes the crop to storage silos and bunkers, where it is compacted and fermented to provide feed for livestock or biomass for bioenergy.
        Designing spouts is not a trivial task, as this component needs to operate optimally for different crop types, in different harvesting conditions and for different, operator-chosen, machine settings. Moreover, the time for physical testing is limited to the harvesting window (a limited number of weeks per year). For these reasons, computer simulations are frequently used nowadays when developing agricultural machines. The interaction of the crop flow with functional components is typically simulated using discrete element modelling (DEM), while the air flow is taken into account using computational fluid dynamics (CFD).
        Prior to this research, a simulation model of the crop flow through the functional channel of a forage harvester had been developed and successfully validated using field test data. In the current research, that simulation model was used to determine the optimal spout curvature. Seven parameters (some correlated, some with restrictions between them) were identified as determining the spout shape. In addition to these geometrical parameters, crop properties and machine settings were also taken into consideration (e.g. the crop throughput (tons/h), the length of cut (LOC), the engine RPM (determining the rotational speed of all functional machine components and therefore also the speed of the crop), the spout height and rotation setting, etc.).
        Although simulations are considerably cheaper than building prototypes and testing those in field conditions, coupled DEM-CFD simulations still come at a high computational price (calculation times range from a couple of days to a couple of weeks depending on the required accuracy and the number of particles). To maximize the information gained from a limited number of simulations, a designed test plan was used (a space-filling design). In total, 130 simulations were performed with different combinations of spout shapes, crop properties and machine settings. Gaussian processes were used to link the simulation output (spout efficiency, crop speeds, risk of blockages) to these test plan parameters. Once these surrogate models were available, they were used to find optimal spout shapes. For this, a second space-filling design was used to evaluate the performance of 20,000 spout shapes (taking the restrictions into account). A desirability function (considering all output variables) was used to evaluate the different spout designs. The Pareto front provided the most interesting spout shapes.
        A modular spout, where the shape can be adjusted on the go, was used to evaluate the most promising designs in field conditions.
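
        A minimal sketch of the surrogate-based workflow described above (a Gaussian-process fit to a space-filling design, followed by ranking a large set of candidate settings) is given below; the toy response function, bounds and desirability weighting are invented for illustration and do not reflect the CNHi models.

        # Minimal sketch (invented toy response, bounds and desirability weights):
        # fit a Gaussian-process surrogate to simulated design points, then rank a
        # large candidate set of spout settings by a simple desirability score.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        rng = np.random.default_rng(1)
        d = 3                                           # e.g. three spout-shape parameters

        X_train = rng.uniform(size=(130, d))            # stand-in for the space-filling design
        def toy_simulator(x):                           # placeholder for the DEM-CFD output
            return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 - 0.5 * x[:, 2]
        y_train = toy_simulator(X_train)

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                                      normalize_y=True).fit(X_train, y_train)

        X_cand = rng.uniform(size=(20_000, d))          # second space-filling design
        y_pred, y_std = gp.predict(X_cand, return_std=True)

        # Simple desirability: higher predicted efficiency is better, uncertainty is penalized.
        desirability = (y_pred - y_pred.min()) / (np.ptp(y_pred) + 1e-9) - 0.2 * y_std
        best = X_cand[np.argsort(desirability)[-5:]]
        print("top candidate spout settings:\n", np.round(best, 3))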

        Speaker: Tom Leblicq (CNHi)
      • 15:05
        Digital twins in Quality Engineering: cases, challenges and the role of statisticians 20m

        The pace of digitalization is rapidly accelerating, with businesses and industries utilizing technologies such as IoT, cloud computing, big data analytics, and AI to create digital twins (DTs): virtual replicas of physical systems or environments. DTs offer a powerful framework for integrating the physical and virtual worlds, enabling more efficient and effective decision-making. Industries such as manufacturing, automotive, and healthcare are utilizing DTs to simulate behavior and create personalized models for more efficient optimization, safer vehicle design, and better treatment plans. As the use of DTs continues to grow, it is crucial for businesses to stay updated and leverage them to gain a competitive advantage. DTs require a multidisciplinary approach, bringing together specialists from different fields such as computer science, mathematics, and statistics. The talk will showcase two real-world examples of DTs used in product development and quality inspection to define key challenges and research avenues for statisticians. Whereas statisticians have historically focused on characteristics of the physical world, an expanded focus that includes the virtual representation of DTs, as well as techniques that combine the information gained from both worlds, is of interest and will be discussed.

        Speaker: Dr Bart De Ketelaere (Catholic University of Leuven)
    • 15:45 16:00
      Break 15m
    • 16:00 17:30
      Contributed session “Digital Twins for Modeling and Optimization”
      • 16:00
        Integrated & Computer-Aided Product & Process Design: Liquid-Liquid Extraction of Industrially Relevant Organic Acids in biomanufacturing 20m

        Separation and purification processes are responsible for a large portion of the production cost in the bioprocessing industry (McGlaughlin, 2012). As such, significant interest and efforts have been geared toward improving existing processes or developing new and innovative ones. Extractive recovery of products from fermentation broth provides an attractive way of separating heat-sensitive or high-boiling products, or products that form azeotropes with water (Kertes and King, 1986). The selective removal of the product using in-situ operation could potentially result in kinetic enhancement of the reaction yielding the desired product. The choice of solvent is a key element in ensuring the extraction process’s thermodynamic feasibility and economic viability.
        In this work, we present an integrated product-process design framework that combines computer-aided molecular design (CAMD) with a short-cut method for evaluating key process parameters such as product recovery and selectivity. The framework enables the generation of new molecules with desired properties as well as the evaluation of already existing compounds for specific applications. The framework provides a decision-making tool for a more informed detailed simulation and experiments by limiting the chemical design space of viable candidates. This is done by evaluating the pure component properties as well as the mixture properties, both of which are done through quantitative structure-property relations (QSPRs). QSPRs are predictive models that use a numerical translation of the molecular structure and correlate it to the property of interest.
        The framework is applied to identify solvents that can selectively extract succinic acid from an aqueous solution also containing acetic acid. The problem is decomposed to first investigate compounds that can:
        • Extract succinic acid from an aqueous solution
        • Extract acetic acid from an aqueous solution
        • Selectively extract succinic acid from a solution of water and acetic acid
        The work will aim to investigate whether currently used thermodynamic and property models are capable of such a task and compare the obtained results to studies readily available in the literature.
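
        As a small illustration of the screening step, once distribution coefficients for the two acids have been predicted for a candidate solvent, the selectivity follows directly as their ratio; the solvents and values below are invented placeholders, not QSPR predictions from the framework.

        # Minimal sketch (invented candidate solvents and distribution coefficients):
        # rank solvents by selectivity for succinic acid over acetic acid, where
        # selectivity = D_succinic / D_acetic.
        candidates = {
            # solvent: (D_succinic, D_acetic) -- hypothetical predicted partition ratios
            "solvent_A": (2.4, 0.8),
            "solvent_B": (1.1, 0.2),
            "solvent_C": (3.0, 2.5),
        }

        ranked = sorted(
            ((name, d_sa / d_aa, d_sa) for name, (d_sa, d_aa) in candidates.items()),
            key=lambda t: t[1],
            reverse=True,
        )
        for name, selectivity, recovery in ranked:
            print(f"{name}: selectivity {selectivity:.2f}, D_succinic {recovery:.2f}")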

        References

        Kertes, A.S., King, C.J., 1986. Extraction chemistry of fermentation product carboxylic acids. Biotechnol Bioeng 28, 269–282.
        McGlaughlin, M.S., 2012. An Emerging Answer to the Downstream Bottleneck. Bio-Process International 10 (5) 58–61.

        Speaker: Adem R. N. Aouichaoui (Process and Systems Engineering Center (PROSYS), Department of Chemical and Biochemical Engineering, Technical University of Denmark)
      • 16:20
        Automated Bioreactor Compartmentalization Using Unsupervised Machine Learning 20m

        Digital Twins are essential in Bioprocessing 4.0 as virtual representations of bioprocess components. Existing coupling methods between Computational Fluid Dynamics (CFD) and biological models focus on simple metabolic approaches, resulting in complex simulations that strain high-performance computers.

        We propose an automated compartmentalization approach for bioreactors using unsupervised machine learning and cellular metabolic state analysis, addressing limitations of prior techniques reliant on spatial discretization or 2-D compartmentalization and based on fluid flow characteristics. We also examine the optimal number of compartments for accurate bioreactor representation.

        Our approach enables more complex metabolic models and reveals statistical trends in cellular performance through CFD and black-box model integration. This innovative solution advances Bioprocessing 4.0 by providing a precise understanding of the interplay between physical and biological systems in bioreactors.

        In summary, this work on bioreactor compartmentalization using machine learning and metabolic state analysis contributes to the optimization and efficiency of bioprocessing techniques.
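
        A minimal sketch of the compartmentalization idea, assuming synthetic per-cell features in place of the actual CFD fields, is the following k-means grouping of mesh cells by the local conditions a metabolic model would respond to:

        # Minimal sketch (synthetic CFD-cell features, not the actual simulation data):
        # cluster bioreactor mesh cells into compartments with k-means.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)
        n_cells = 50_000
        # Columns: substrate concentration, dissolved oxygen, shear rate (synthetic stand-ins).
        cell_features = np.column_stack([
            rng.lognormal(mean=0.0, sigma=0.5, size=n_cells),
            rng.uniform(0.0, 0.08, size=n_cells),
            rng.gamma(shape=2.0, scale=1.5, size=n_cells),
        ])

        X = StandardScaler().fit_transform(cell_features)

        # Sweep the number of compartments; inertia (or a metabolic-state criterion)
        # can guide how many compartments represent the reactor well.
        for k in (4, 8, 16):
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
            print(f"{k} compartments: within-cluster inertia = {km.inertia_:.0f}")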

        Speaker: Mr Víctor Puig I Laborda (Danmarks Tekniske Universitet)
      • 16:40
        A computational framework for predictive digital twins of civil engineering structures 20m

        The digital twin concept represents the most appealing opportunity to advance condition-based and predictive maintenance practices. Enabling such a paradigm shift for civil engineering systems would allow for reducing lifecycle (economic and social) costs and increasing system safety and availability. This is now possible as the installation of permanent real-time data collection systems has become affordable, and thanks to recent advances in learning systems and diagnostic activities.

        This work proposes a predictive digital twin approach to the health monitoring, maintenance and management planning of civil structures. The asset-twin coupled dynamical system, and its evolution over time are encoded by means of a probabilistic graphical model. In particular, a dynamic Bayesian network equipped with decision nodes is adopted to rule the observations-to-decisions flow and quantify the related uncertainty. For diagnostic purposes, the dynamic Bayesian network is used to update and track the evolution of the structural health parameters comprising the digital state and describing the variability of the physical asset. The assimilation of observational data is carried out with deep learning models, useful to automatically select and extract damage-sensitive features from raw high-dimensional vibration recordings, and ultimately relate them with the corresponding structural state in real-time. The digital state is continuously updated in a sequential Bayesian inference fashion, and eventually exploited to inform an optimal planning of maintenance and management actions within a dynamic decision making framework. For prognosis purposes, the dynamic Bayesian network is used to forecast the future damage growth, according to control-dependent transition dynamic models describing how the structural health is expected to evolve.

        The digital twinning framework is made computationally efficient through a preliminary offline phase that involves: (i) the population of training datasets through reduced-order numerical models, exploiting the physics-based knowledge about the system response. This is useful to overcome the lack of experimental data for civil applications under varying operational and damage conditions. (ii) the computation of a health-dependent control policy to be applied at each time step of the online phase. Such a control policy is then exploited to map the belief over the digital state onto actions feeding back to the physical asset.

        The strategy is assessed on the simulated monitoring of a cantilever beam and a railway bridge. The obtained results prove the capability of health-aware digital twins to accurately track the evolution of structural health parameters under varying operational conditions and to promptly suggest the most appropriate control input with relatively low uncertainty.

        [1] L. Rosafalco, M. Torzoni, A. Manzoni, S. Mariani, and A. Corigliano. Online structural health monitoring by model order reduction and deep learning algorithms. Computers & Structures, 255:106604, 2021.
        [2] M. Torzoni, L. Rosafalco, A.Manzoni, S.Mariani, and A. Corigliano. SHM under varying environmental conditions: an approach based on model order reduction and deep learning. Computers & Structures, 266:106790, 2022.
        [3] M. G. Kapteyn, J. V. Pretorius, and K. E. Willcox. A probabilistic graphical model foundation for enabling predictive digital twins at scale. Nature Computational Science., 1(5):337-347, 2021.
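
        The observations-to-decisions flow described above can be illustrated with a minimal discrete filter; the damage states, transition matrix and observation likelihoods below are invented for illustration and are not the paper's calibrated model.

        # Minimal sketch (invented damage states, transition matrix and likelihoods):
        # sequential Bayesian update of the belief over a discrete digital state,
        # in the spirit of a dynamic Bayesian network filter.
        import numpy as np

        states = ["healthy", "minor damage", "severe damage"]
        transition = np.array([[0.95, 0.04, 0.01],      # P(next state | current state)
                               [0.00, 0.90, 0.10],
                               [0.00, 0.00, 1.00]])
        # P(observed feature class | state); columns index the classifier's output class.
        likelihood = np.array([[0.85, 0.10, 0.05],
                               [0.15, 0.70, 0.15],
                               [0.05, 0.15, 0.80]])

        belief = np.array([1.0, 0.0, 0.0])              # start from a healthy prior
        observations = [0, 0, 1, 1, 2]                  # e.g. classes from a deep-learning feature model

        for obs in observations:
            belief = transition.T @ belief              # predict step (damage can only grow here)
            belief *= likelihood[:, obs]                # update step with the new observation
            belief /= belief.sum()
            print([f"{s}: {p:.2f}" for s, p in zip(states, belief)])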

        Speaker: Matteo Torzoni (Dipartimento di Ingegneria Civile e Ambientale, Politecnico di Milano)
      • 17:00
        Steps toward a Digital Twin: Biomass and growth rate modelling in batch and fed-batch bioprocess of Lactobacillus rhamnosus 20m

        Lactobacillus rhamnosus (L. rhamnosus) represents valuable potential for applications in the continuously growing, multi-billion-euro functional-food industry. This commensal microorganism is known for its multiple health benefits, including immunomodulatory and gut-stimulating properties. Optimizing the production of L. rhamnosus biomass can enhance the efficacy of its use. In this study, three artificial neural networks (ANNs) were developed to predict biomass and growth rate in batch and fed-batch bioprocesses using a dairy-based substrate. Additionally, the immunomodulatory effect of L. rhamnosus was examined, revealing anti-inflammatory properties.
        This research aims to maximise the production of L. rhamnosus biomass through optimized and robust control of bioprocesses. L. rhamnosus was cultivated in bench-scale studies on industrially suitable dairy-based feedstocks, namely skim milk powder, 90% demineralised whey and whey permeate.
        Based on on-line and at-line measurements of bioprocess data, three ANNs were developed to predict biomass and growth rate in batch and fed-batch bioprocesses. These can be used to estimate biomass and growth rate digitally in real time, creating a digital twin of the process. This estimator can be used to regulate the addition of feed to the bioprocess, leading to more precise control and increased productivity. These findings demonstrate the potential of ANN modelling and digital twin development for bioprocess optimisation of L. rhamnosus for application as a functional food ingredient. The immunomodulatory effect of L. rhamnosus was furthermore investigated, and inactivated cell samples were found to have anti-inflammatory properties.
        Overall, this research provides a comprehensive approach to the optimization of L. rhamnosus production through ANNs, digital twin development, and process control. The findings are significant for the functional food industry, demonstrating the potential for using L. rhamnosus as a functional food ingredient.
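
        As a minimal sketch of the ANN soft-sensor idea (with synthetic data standing in for the on-line and at-line measurements, and hypothetical input variables), a small multi-output neural network regressor could be set up as follows:

        # Minimal sketch (synthetic data, hypothetical inputs, not the study's models):
        # a small neural network predicting biomass and growth rate from bioprocess signals.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n = 400
        # Hypothetical inputs: time, feed rate, pH, base addition (stand-ins only).
        X = np.column_stack([
            rng.uniform(0, 24, n), rng.uniform(0, 2, n),
            rng.normal(6.0, 0.2, n), rng.uniform(0, 5, n),
        ])
        biomass = 0.5 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 0.3, n)
        growth_rate = 0.1 + 0.02 * X[:, 1] - 0.002 * X[:, 0] + rng.normal(0, 0.01, n)
        y = np.column_stack([biomass, growth_rate])     # multi-output target

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000,
                             random_state=0).fit(X_tr, y_tr)
        print("held-out R^2:", round(model.score(X_te, y_te), 3))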

        Speaker: Helena Mylise Sørensen (Dublin City University)
    • 08:00 09:00
      Keynote lecture: Ron Kenett
      • 08:00
        Digital Twins and Engineering for Performance 20m

        Advancements in technology and data access are followed by significant changes in the scope of engineering work. In Industry 4.0, sensor technologies enable improved monitoring, diagnostic, prognostic and prescriptive analytic capabilities. Systems and processes are now paired with digital twins, digital assets that parallel the physical assets. This evolution triggers a shift from engineering of design to engineering for performance, integrating physics-driven mathematical models with statistical and data-driven analytic models. The talk will review the background of digital twins and sketch future pathways, emphasizing engineering for performance in contrast to engineering of design. A case study from the Israeli railway system will be presented.

        References
        • Chinesta, F., Cueto, E., Abisset-Chavanne, E., Duval, J. and El Khaldi, F. (2020) Virtual, Digital and Hybrid Twins: A New Paradigm in Data-Based Engineering and Engineered Data, Archives of Computational Methods in Engineering, 27(1), pp. 105–134.
        • Dattner, I., Bortman, J., Kenett, R.S. and Chinesta, F. (2021) Learning Dynamical Systems: Optimal Integration of Mathematical Models with Machine Learning and Artificial Intelligence, https://mmldt.eng.ucsd.edu/
        • Kenett, R.S., Zonnenshain, A. and Fortuna, G. (2018) A road map for applied data sciences supporting sustainability in advanced manufacturing: The information quality dimensions, Procedia Manufacturing, 21, pp. 141–148.
        • Kenett, R.S. and Zacks, S. (2021) Modern Industrial Statistics: With Applications in R, MINITAB, and JMP, 3rd Edition, John Wiley and Sons.
        • Kenett, R.S. and Bortman, J. (2022) The digital twin in Industry 4.0: a wide-angle perspective, Quality and Reliability Engineering, 38(3), pp. 1357-1366.
        • Kenett, R.S., Zacks, S. and Gedeck, P. (2022) Modern Statistics: A Computer-Based Approach with Python, SITE series, Springer.
        • Kenett, R.S., Zacks, S. and Gedeck, P. (2023) Industrial Statistics: A Computer-Based Approach with Python, SITE series, Springer.

        Speaker: Prof. Ron Kenett (KPA Ltd.)
    • 09:00 10:00
      Invited session “Digital Twins for Robust Design”
      • 09:00
        Robust Design in the age of Digital Twins - an engineering design perspective 20m

        “Cost is more important than quality, but quality is the best way to reduce costs.” In the spirit of this inspirational quote from Genichi Taguchi, traditional Robust Design has developed into a well-established part of the quality-by-design domain. Including widely known tools such as signal-to-noise ratios, quality loss functions, crossed-array experiments, etc., Robust Design offers a coherent and appreciated approach for finding parameter settings that ensure not only sufficient performance but also robustness of the solution against variation and noise.

        However, is this traditional Taguchi Quality Engineering toolbox still appropriate for today’s engineering practices, which are increasingly driven by advanced numerical simulation and optimization tools? This talk approaches the question from a design perspective. Based on an overview of robust engineering examples, the aim is to stimulate a discussion about future challenges of using Digital Twin technology to ensure product robustness throughout an increasingly interdisciplinary development process. This includes (i) insights on basic challenges of DoE and surrogate modelling strategies, which are usually focused on detailed modelling rather than efficient design decisions, and (ii) the underlying requirements on CAD modelling, simulation toolchains, etc. that are necessary to automatically explore a multidimensional design space. The presentation will furthermore highlight current academic research in model-assisted design space exploration and trade-off approaches that extend the parametric exploration of largely matured products towards earlier conceptual and configuration design decisions.
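
        For readers less familiar with the Taguchi quantities mentioned above, the short sketch below computes a nominal-the-best signal-to-noise ratio and the quadratic quality loss for a set of illustrative (invented) responses:

        # Minimal sketch (illustrative numbers only): nominal-the-best signal-to-noise
        # ratio, 10*log10(ybar^2 / s^2), and quadratic quality loss L(y) = k*(y - target)^2.
        import numpy as np

        y = np.array([10.2, 9.8, 10.1, 10.4, 9.9])   # hypothetical responses at one design point
        target, k = 10.0, 2.5                        # hypothetical target and loss coefficient

        sn_nominal_best = 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))
        expected_loss = k * np.mean((y - target) ** 2)

        print(f"S/N (nominal-the-best): {sn_nominal_best:.2f} dB")
        print(f"expected quality loss:  {expected_loss:.3f} per unit")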

        Speaker: Tobias Eifler (Technical University of Denmark)
      • 09:20
        The Digital Twin and uncertainty handling in aiming for a Robust Design 20m

        “All models are wrong, but some are useful.” This fundamental quote from George E. P. Box puts the focus on awareness of the actual uncertainty and approximations that all models have to deal with. In engineering, most product designs nowadays are developed and built with significant simulation effort based on advanced multi-physics models. In aerospace engine design, mechanical, thermo-mechanical or aerodynamic simulations are the dominant models used to arrive at a product design that meets the customer’s requirements. Computer modelling in aerospace design has been practised for a long time, whereas the Digital Twin is a rather young concept. On the other hand, the resolution of analytical models has improved greatly during the last decade, and Computer Aided Models are therefore often experienced as an exact twin copy of reality.
        When performing a design study, parametric Computer Aided Finite Element and Finite Volume models are applied, and studies can be performed with a wide range of varying parameters to map a so-called Knowledge Space of the parameter settings. This means that optimization with respect to the requirements of a desired customer solution can more easily be carried out to meet the desired so-called Design Space. However, the size of the multi-physics optimization problem, combined with the handling of manufacturing capability, with restrictions on dimensions and tolerances as well as the design margins, makes finding a Robust Optimized Design a true challenge. Therefore, certain tricks need to be applied to find a product design that fulfils the customer needs as well as giving safety margins when applied on an aircraft.
        In this presentation, examples are given of how statistical concepts influence how design optimization is carried out, both in the software platform that allows adoption of the latest response surface fitting algorithms, and in the handling of variation and uncertainty when finding the Robust Design Optimized solution. Obstacles and challenges encountered while working towards a desired customer design definition are identified from a statistician’s point of view.

        Speaker: Dr Sören Knuts (GKN Aerospace Sweden)
    • 10:00 10:15
      Break 15m
    • 10:15 11:25
      Contributed session “Digital Twins for Quality and Reliability”
      • 10:15
        Realistic Fault Simulation Platform for Testing Monitoring Strategies in Wastewater Treatment Plants 20m

        Instrumentation, control and automation for industrial chemical and biochemical processes to attain cost-effective and safe process operation are highly dependent on reliability of the real-time measurements. Despite considerable development of online sensors during the past decades, their dependability is still impaired due to various fouling and failing issues. Occasionally unsatisfactory measurement performance can prevent full instrumentation of plant-wide control systems. This is especially important for wastewater treatment plants (WWTPs) where often fault-tolerant control systems needs to be implemented. Small WWTPs generate up to 500 signals (including on-line and off-line signals), whereas larger ones typically register over 30,000. Despite a large number of available signals, data reconciliation and validation for online instrumentation has remained a largely unexplored field with a lack of standardized approaches. Most data are stored unstructured, with lots of gaps, repetition, ambiguity and uncertainty. This has led to “data-rich, information-poor” situations in which data sets are often too large and complex for processing and analysis to be used for decision-making. To turn raw data into useful and actionable information, data need to be validated. This can be achieved through a fault detection procedure.

        The aim of this study is to extend an existing benchmark simulation tool for wastewater treatment processes by including realistic sensor/actuator and process faults that are compatible and unified with the previous developments (influent generator, process models, sensor and actuator models, simulation procedure, evaluation criteria). The different sensor/actuator and process faults were modelled using a Markov-chain approach, given the probability of fault occurrence and the probabilities of transition from one state to another. The output includes different scenarios suitable for testing univariate/multivariate statistical monitoring methods as well as fault-tolerant control strategies. Using this platform, one can test the performance of a fault detection method, which should ideally distinguish between the various faults and isolate the most consequential deviations. As an example, an adaptive dynamic fault detection method was tested using dynamic principal component analysis (dPCA) with a moving window. The model and the database are both freely available online, and we invite others to test their fault detection methods on the provided dataset, which also includes fault labels.
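        A minimal sketch of the Markov-chain fault mechanism described above is given below: a two-state chain (normal/fault) is simulated and an incipient drift fault is superimposed on a synthetic sensor signal whenever the fault state is active. The transition probabilities and the signal model are illustrative placeholders, not the values used in the benchmark platform.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state Markov chain: 0 = normal, 1 = drift fault.
# Rows are current states; entries are transition probabilities (illustrative values).
P = np.array([[0.995, 0.005],
              [0.010, 0.990]])

n = 2000
state = np.zeros(n, dtype=int)
for t in range(1, n):
    state[t] = rng.choice(2, p=P[state[t - 1]])

# "True" process value plus sensor noise; a drift accumulates while the fault is active.
true_value = 5.0 + 0.5 * np.sin(np.linspace(0, 20, n))
drift = np.cumsum(0.002 * (state == 1))
measured = true_value + drift + rng.normal(0, 0.05, n)

print(f"Fraction of time in fault state: {state.mean():.2%}")
```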

        Speaker: Pedram Ramin (Technical University of Denmark, Chemical Engineering)
      • 10:35
        Adaptive cusum control using prediction intervals for condition monitoring 20m

        Shell operates large and complex assets, such as oil refineries, chemical plants and offshore platforms, as one of the key parts of its business. A typical refinery will include thousands of individual pieces of equipment ranging from simple valves to highly complex systems, such as distillation columns.

        These systems are monitored for early signs of failure. Successful early detection might result in increased availability and reduced maintenance costs. Furthermore, the insights from monitoring equipment can be used to improve equipment care strategies and performance of assets.

        One of the large-scale monitoring approaches in Shell is based on comparing a measured output of a simple component to its virtual counterpart. Both the measured output and the inputs to the model used to construct the virtual counterpart are process variables, such as valve opening position, flow rate, temperature, and pressure. The idea is that under normal operation, the measured output and its virtual counterpart are approximately the same; deviations can be an early sign of failure.

        To implement this approach, a prediction target is selected for each monitored component. Other model features are used to predict that target at each point in time as a nowcast. The model predictions are compared to the actual target measurements. The prediction error feeds into a cusum statistic, which in turn is used to raise alerts if a certain threshold is reached.

        In addition to the point estimates produced by the model, we also quantify the uncertainty of the point estimate in terms of a prediction interval. We propose to use these prediction intervals to adaptively change the hyperparameters of the cusum control [1]. More specifically, we can penalize prediction errors when the prediction interval is small and be more tolerant of prediction errors when the prediction interval is large.

        We describe the virtual sensor approach in the context of condition monitoring. Furthermore, we detail adaptive cusum control using prediction intervals and additional challenges to implementing and validating this at scale in an industrial setting.

        [1] Montgomery, D. C. (2009). Statistical quality control (Vol. 7)
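        A minimal sketch of the idea, assuming synthetic data and illustrative tuning constants rather than Shell’s actual implementation, is a one-sided CUSUM on the prediction errors in which the allowance k is scaled with the width of the model’s prediction interval, so that narrow intervals penalize errors more heavily than wide ones:

```python
import numpy as np

rng = np.random.default_rng(42)

n = 500
target = rng.normal(10.0, 0.2, n)          # measured output
pred = target + rng.normal(0.0, 0.2, n)    # nowcast from the virtual counterpart
pred[300:] += 0.4                          # simulated degradation: model and plant drift apart

# Prediction-interval half-width reported by the model (wider during a noisy period).
pi_halfwidth = np.full(n, 0.4)
pi_halfwidth[100:150] = 0.8

# One-sided CUSUM on absolute prediction errors with an adaptive allowance k:
# wide intervals -> larger k -> more tolerance; narrow intervals -> smaller k.
k0, h = 0.2, 2.0
cusum = np.zeros(n)
for t in range(1, n):
    k_t = k0 * pi_halfwidth[t] / pi_halfwidth.mean()
    err = abs(target[t] - pred[t])
    cusum[t] = max(0.0, cusum[t - 1] + err - k_t)

alerts = np.where(cusum > h)[0]
print("First alert at index:", alerts[0] if alerts.size else "none")
```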

        Speaker: Dr Jarno Hartog (Shell)
      • 10:55
        Virtual Tutors as Digital Twins for Education 20m

        Currently, Artificial Intelligence (AI), and more specifically generative models, are substantially criticized due to concerns about the lack of ethical regulation of their use and the impact they could have on society. However, if applied correctly and ethically, AI could give educators the opportunity to automate repetitive tasks and to offer stimulating new experiences to the students. The Department of Chemical and Biochemical Engineering at the Technical University of Denmark (DTU) offers a course in Good Manufacturing Practice (GMP) and quality in the pharmaceutical, biotech and food industries. One of the activities performed during the course is an audit exercise, where students take on the role of an auditor inspecting a company (represented by the teachers) regarding its adherence to the quality management system. During the audit, students can ask questions about the practices adopted by the company, information that is necessary to evaluate whether the company in question could be a valuable business partner. The teachers’ role is to answer the students’ questions and show the available documentation, leaving the students to reflect upon the company’s behavior in quality management. Audio recordings of the audits have been collected for two offerings of the course (2022/2023). In this work, we present an initial investigation and proof of concept of how AI-powered models can be used to automate educational processes. More specifically, we show how to fine-tune a pre-trained generative question-answering model on domain-specific data (the audit recordings previously collected).
        The aim of this initiative is to improve students’ learning experience and increase engagement through stimulating material and gamification, as well as to automate repetitive tasks and allow time saving for the educators.
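        As a hedged illustration of the fine-tuning step described above, the sketch below fine-tunes a small pre-trained sequence-to-sequence checkpoint on a couple of hypothetical audit question–answer pairs using the Hugging Face libraries. The checkpoint, the training arguments and, in particular, the example pairs are placeholders; the actual model choice and the transcribed audit data from the course are not reproduced here.

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments, DataCollatorForSeq2Seq)

# Hypothetical question-answer pairs standing in for transcribed audit recordings.
pairs = [
    {"question": "How do you document deviations in production?",
     "answer": "Deviations are logged in the quality system and assessed by QA."},
    {"question": "How are operators trained on new SOPs?",
     "answer": "Training is recorded and verified by a supervisor before release."},
]

model_name = "google/flan-t5-small"  # any seq2seq checkpoint could serve as a starting point
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def preprocess(example):
    # Encode the question as input and the answer as the target sequence.
    inputs = tokenizer("question: " + example["question"], truncation=True)
    labels = tokenizer(text_target=example["answer"], truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

dataset = Dataset.from_list(pairs).map(preprocess, remove_columns=["question", "answer"])

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="audit-tutor", num_train_epochs=3,
                                  per_device_train_batch_size=2),
    train_dataset=dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```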

        Speaker: Fiammetta Caccavale (Technical University of Denmark (DTU))
    • 11:25 11:55
      Poster session Building 101A, Meeting Room 1

      • 11:25
        Discontinuous Galerkin solver for continuous chromatography 10m

        Column chromatography is a workhorse unit operation for the purification of proteins and pharmaceuticals. Most column chromatography unit operations are run in batch mode, although continuous chromatography has the potential to increase productivity and reduce solvent consumption. Operating continuous chromatography requires having multiple columns in series and/or parallel. The operation is thus significantly more complex than batch chromatography, but modelling can aid their operation. However, modelling of continuous chromatography also becomes more complex than for batch operation for the following reasons: 1) When columns are placed in series, the system consists of coupled partial differential equations that must be solved simultaneously. 2) The system can never reach a steady state because of the nature of the separation process, but it can reach a cyclic steady state. For some continuous chromatography operations, it is necessary to simulate far into the future to reach the cyclic steady state, leading to long simulation times. 3) Continuous chromatography has switches of flow rates (events) leading to periodic discontinuities.
        These factors lead to long simulation times. Thus, optimization of the operation becomes very time consuming. Therefore, fast solvers are needed to enable the real-time optimizations required for developing digital twins.
        The most widespread simulation software for continuous chromatography is the open-source package CADET, which uses a finite volume method to solve the underlying PDEs. So far, a Discontinuous Galerkin spectral element method for solving the underlying PDEs has only been developed and implemented for batch chromatography, by Kristian Meyer [1]. To develop a faster solver, this work is extended to multiple columns and implemented in Julia. The performance of the Discontinuous Galerkin solver is tested by comparing against CADET. The implementation of the Discontinuous Galerkin solver has the potential to decrease the simulation times enough to enable real-time optimizations.

        [1]: Meyer, K., Leweke, S., von Lieres, E., Huusom, J. K., & Abildskov, J. (2020). ChromaTech: A discontinuous Galerkin spectral element simulator for preparative liquid chromatography. Computers and Chemical Engineering, 141, 107012. https://doi.org/10.1016/j.compchemeng.2020.107012
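        For orientation, the following minimal sketch solves a single-column equilibrium-dispersive model with a linear isotherm by a simple finite-difference method of lines. It is only a baseline of the kind of PDE the solvers above discretize; it is neither the Discontinuous Galerkin scheme of [1] nor the CADET finite volume method, and all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Single-column equilibrium-dispersive model with a linear isotherm q = K * c:
# (1 + F*K) dc/dt = -u dc/dz + Dax d2c/dz2,   F = (1 - eps) / eps
L, u, Dax, eps, K = 0.1, 1e-3, 1e-7, 0.4, 2.0
F = (1.0 - eps) / eps
phase_ratio = 1.0 + F * K

N = 100
dz = L / N

def c_in(t):
    return 1.0 if t < 30.0 else 0.0          # rectangular injection pulse

def rhs(t, c):
    # Ghost values: Dirichlet inlet, zero-gradient outlet.
    c_ext = np.concatenate(([c_in(t)], c, [c[-1]]))
    conv = -u * (c_ext[1:-1] - c_ext[:-2]) / dz                       # first-order upwind
    disp = Dax * (c_ext[2:] - 2 * c_ext[1:-1] + c_ext[:-2]) / dz ** 2  # central differences
    return (conv + disp) / phase_ratio

sol = solve_ivp(rhs, (0.0, 600.0), np.zeros(N), method="BDF",
                t_eval=np.linspace(0.0, 600.0, 200))
print("Peak outlet concentration:", sol.y[-1].max())
```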

        Speaker: Jesper Frandsen (Technical University of Denmark)
      • 11:35
        Real time predictions in the process industry using hybrid-models - a case from Viking Malt A/S 10m

        In the chemical, biochemical, and food production industries, vast amounts of process data are being recorded using inline/online sensor measurements and offline laboratory measurements. Putting this process data to work through the implementation of operator assistance tools, control systems, and digital twins has gained renewed interest in industry [Udugama 2020]. The introduction of Big-Data processing methods and Machine Learning (ML) offers opportunities for utilization of the process data. However, the data available from production processes is often not truly “Big Data” [Venkatasubramanian 2019] due to low variation in the data and time-demanding laboratory measurements. As ML algorithms are inherently “data-hungry”, the limitations in the available process data pose problems for the accuracy and extrapolability of the developed models. A way of mitigating the limitations in process data is to introduce hybrid models, in which well-known phenomena in the system are modelled mechanistically, while unknown phenomena are modelled using ML [von Stosch 2014].
        This work applies hybrid modeling to a germination process at the Viking Malt A/S malting plant in Vordingborg, Denmark. Viking Malt A/S is a typical company with regard to data availability (data structures, quantity of data, measurement frequency, number of sensors, etc.), but modeling the biological phenomena in the germination process is complicated by the influence of both external factors and varying process conditions. ML is applied to predict the growth rates of the barley grain sprouting in the germination process, using inputs including the distribution of sprout length, water content, temperature, addition of modification-promoting hormone, ambient air conditions, and raw material properties, while mechanistic insights are introduced through mass, energy, and population balances. The germination process highlights some of the complexities encountered when working with real production data, such as a limited number of sensors, uncertainty in sampling, low-frequency lab data, and an unstructured data storage architecture. The developed models will be applied in real-time prediction applications to help operators in decision making, with the objective of minimizing the energy consumption of the process while satisfying the product quality constraints and yield.
        Keywords: Hybrid-modeling, Machine Learning, Real time prediction, Energy optimization
        Udugama, Isuru A.; Gargalo, Carina L.; Yamashita, Yoshiyuki; Taube, Michael A.; Palazoglu, Ahmet; Young, Brent R.; Gernaey, Krist V.; Kulahci, Murat; Bayer, Christoph, 2020, The Role of Big Data in Industrial (Bio)chemical Process Operations, Industrial and Engineering Chemistry Research, Volume 59, Issue 34, pp. 15283-15297
        von Stosch, Moriz; Oliveira, Rui; Peres, Joana; Feyo de Azevedo, Sebastião, 2014, Hybrid semi-parametric modeling in process systems engineering: Past, present and future, Computers and Chemical Engineering, Volume 60, pp. 86-101
        Venkatasubramanian, Venkat, 2019, The promise of artificial intelligence in chemical engineering: Is it here, finally?, Aiche Journal, Volume 65, Issue 2, pp. 466-478
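        A minimal sketch of the hybrid-modeling idea, on synthetic data rather than the Viking Malt process data, is shown below: a machine-learning regressor is trained to predict an unknown specific growth rate from process conditions, and the learned rate is then embedded in a simple mechanistic mass balance that is integrated forward in time. The variable names, the toy rate expression and all parameter values are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

# --- Synthetic "plant" data: an unknown growth rate depends on temperature and moisture.
def true_rate(temp, moisture):
    return 0.02 + 0.004 * (temp - 15.0) - 0.02 * (moisture - 0.45) ** 2

X = np.column_stack([rng.uniform(12, 20, 200),        # temperature [deg C]
                     rng.uniform(0.40, 0.50, 200)])   # water content [-]
y = true_rate(X[:, 0], X[:, 1]) + rng.normal(0, 0.001, 200)

# --- Data-driven part: learn the rate from process conditions.
rate_model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# --- Mechanistic part: mass balance dm/dt = mu(conditions) * m, explicit Euler steps.
def simulate_hybrid(m0, temps, moistures, dt=1.0):
    m, trajectory = m0, [m0]
    for temp, moist in zip(temps, moistures):
        mu = rate_model.predict([[temp, moist]])[0]
        m = m + dt * mu * m
        trajectory.append(m)
    return np.array(trajectory)

traj = simulate_hybrid(1.0, temps=np.full(72, 16.0), moistures=np.full(72, 0.45))
print("Predicted relative sprout mass after 72 h:", traj[-1])
```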

        Speaker: Peter Jul-Rasmussen (Technical University Of Denmark)
      • 11:45
        Stochastic modelling of fermentation processes: State estimation of perfusion cultivations using the Extended Kalman filter 10m

        Mammalian cells, particularly Chinese Hamster Ovary (CHO) cells, used for the production of recombinant proteins have gained widespread popularity in recent years due to their capability to produce proteins that are more similar to those found in humans [1], an essential feature in the development of therapeutics. CHO cells are recognized as one of the most reliable mammalian cell lines due to their high growth rate, ability to grow in suspension culture, and substantial protein production capacity [2]. Despite their advantages, the metabolic processes and production involved with CHO cells are complex, requiring optimization in order to enhance efficiency, productivity, and product quality.
        Production of therapeutic proteins is frequently achieved through the use of continuous cultivations. This process involves the growth of a cell culture in a bioreactor, where a continuous supply of nutrients is maintained and the product is continuously harvested. Lab-scale fermentation processes of this nature have been run using a specific CHO cell line for the production of monoclonal therapeutic antibodies (mAb). A total of five batches were run to completion − between 36 and 42 days − with various measurements reported during the entire period.
        The present master’s thesis explores the mathematical modelling of continuous cultivations using CHO cells. It aims to develop a system of differential equations that describes the dynamics of CHO cell metabolism in a perfusion cultivation, including the evolution of biomass, glucose, lactate, and product titre. The developed system will be solved using the Extended Kalman filter, an algorithm used to estimate parameters of nonlinear systems subject to statistical noise. The equations are based on previous definitions [3] and model various metabolic pathways of the CHO cell line, as well as the in- and outlet of nutrients and components involved in the perfusion process. The challenge lies in creating a model that is mechanistically meaningful yet not so intricate that it cannot be estimated using Kalman filtering.
        Two models of increasing complexity have been implemented, showing promising results in estimating rates related to cell consumption and production of various metabolites and to cell growth, despite various challenges: these include the sensitivity of the parameter estimation algorithm as the system of equations increases in complexity, and a loss of observability when measurement values reach zero, which complicates the rate estimation for certain metabolites. Although at an early stage, this study has the potential to contribute to the field of mathematical modelling of biochemical phenomena and provide a foundation for future research in the area of cultivation monitoring and/or design of experiments.
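        The following sketch illustrates the Extended Kalman filter machinery referred to above on a deliberately simple two-state (biomass, glucose) model with Monod-type kinetics and a single measured variable. The model equations, noise levels and parameter values are illustrative stand-ins, not the perfusion model developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

# State x = [biomass, glucose]; discrete-time Monod-type model (illustrative parameters).
mu_max, Ks, Y, dt = 0.04, 0.5, 0.5, 1.0

def f(x):
    X, S = x
    S = max(S, 0.0)                          # avoid negative substrate in the rate law
    mu = mu_max * S / (Ks + S)
    return np.array([X + dt * mu * X, x[1] - dt * mu * X / Y])

def jacobian(func, x, eps=1e-6):
    # Finite-difference Jacobian of the state transition.
    J, fx = np.zeros((len(x), len(x))), func(x)
    for i in range(len(x)):
        xp = x.copy(); xp[i] += eps
        J[:, i] = (func(xp) - fx) / eps
    return J

H = np.array([[0.0, 1.0]])                   # only glucose is measured
Q = np.diag([1e-4, 1e-4])                    # process noise covariance
R = np.array([[0.01]])                       # measurement noise covariance

x_true = np.array([0.1, 20.0])
x_hat, P = np.array([0.05, 18.0]), np.eye(2)
for _ in range(150):
    # Simulated "truth" and noisy measurement.
    x_true = f(x_true) + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.multivariate_normal(np.zeros(1), R)

    # EKF predict step.
    F_k = jacobian(f, x_hat)
    x_hat = f(x_hat)
    P = F_k @ P @ F_k.T + Q

    # EKF update step.
    S_k = H @ P @ H.T + R
    K_k = P @ H.T @ np.linalg.inv(S_k)
    x_hat = x_hat + K_k @ (z - H @ x_hat)
    P = (np.eye(2) - K_k @ H) @ P

print("Final estimate [biomass, glucose]:", x_hat)
```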

        Speaker: Torine Reed Herstad (Novo Nordisk)
    • 11:55 12:55
      Lunch 1h
    • 12:55 14:05
      Contributed session “Model Quality for Digital twins” Building 101A, Meeting Room 1

      • 12:55
        Online Validation of Production System Digital Twins 20m

        With the recent technological advances, there is a continuing trend to increase the level of detail of digital twins to correctly capture the behavior of a complex system (Tao et al., 2019). Increasing complexity introduces challenges for maintaining the physical-to-digital alignment (Tan and Matta, 2022).

        Given the short-term decision support that must be provided by digital twins, it is essential to guarantee the validity of the underlying digital models with relatively small sets of data. This means traditional validation techniques cannot be used, and new methods should be developed to address the validity of both the model logic (i.e., topology) and the model inputs (i.e., statistical distributions, data models) (Lugaresi et al. 2019).

        To the best of the authors’ knowledge, no significant contribution on the online validation of digital twins is present in the literature. This work proposes a validation methodology to assess the similarity between a manufacturing system and its simulation-based digital twin by comparing sequences of events and sequences of Key Performance Indicators (KPIs). The validation is done by treating both information types as sequences and applying suitable techniques to measure their similarity.
        The contributions of this work are twofold:

        • To describe the problem of online validation of digital twins within the scope of production planning and control. The same problem can be present in several other contexts in which the physical system can be modeled as a discrete event system, such as logistics and transportation, warehouse management, and supply chain management.
        • To propose a methodology for online validation that can be used at different levels of detail and in an automated way during the production operations.

        Offline experiments are conducted to understand the properties of three techniques identified as suitable. Moreover, the proposed methodology is applied in real time on a lab-scale manufacturing system to reach a proof-of-concept digital twin demonstrator (Lugaresi et al., 2022).
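        A minimal sketch of the two ingredients mentioned above, assuming hypothetical event labels and KPI values, is to compare event sequences with a normalized edit (Levenshtein) distance and KPI series with a dynamic time warping distance; the specific similarity measures selected in this work may differ:

```python
import numpy as np

def edit_distance(a, b):
    # Levenshtein distance between two event sequences.
    d = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
    d[:, 0] = np.arange(len(a) + 1)
    d[0, :] = np.arange(len(b) + 1)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i, j] = min(d[i - 1, j] + 1, d[i, j - 1] + 1, d[i - 1, j - 1] + cost)
    return d[-1, -1]

def dtw_distance(x, y):
    # Classic dynamic time warping between two KPI time series.
    D = np.full((len(x) + 1, len(y) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(x) + 1):
        for j in range(1, len(y) + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]

# Hypothetical event logs from the physical line and its digital twin.
physical_events = ["load", "mill", "drill", "inspect", "unload"]
twin_events = ["load", "mill", "inspect", "drill", "unload"]

# Hypothetical throughput KPI sampled on both systems (parts/hour).
physical_kpi = np.array([10.0, 11.0, 10.5, 9.8, 10.2])
twin_kpi = np.array([10.1, 10.9, 10.4, 10.0, 10.1])

ed = edit_distance(physical_events, twin_events) / max(len(physical_events), len(twin_events))
print("Normalized event-sequence distance:", ed)
print("KPI DTW distance:", dtw_distance(physical_kpi, twin_kpi))
```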

        References:

        Lugaresi, G., Gangemi, S., Gazzoni, G., & Matta, A. (2022, December). Online Validation of Simulation-Based Digital Twins Exploiting Time Series Analysis. In 2022 Winter Simulation Conference (WSC) (pp. 2912-2923). IEEE.

        Lugaresi, G., Aglio, G., Folgheraiter, F., & Matta, A. (2019, August). Real-time validation of digital models for manufacturing systems: A novel signal-processing-based approach. In 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE) (pp. 450-455). IEEE.

        Tan, B., & Matta, A. (2022, December). Optimizing Digital Twin Synchronization in a Finite Horizon. In 2022 Winter Simulation Conference (WSC) (pp. 2924-2935). IEEE.

        Tao, F., Q. Qi, L. Wang, and A. Nee. 2019. Digital twins and cyber–physical systems toward smart manufacturing and industry 4.0: Correlation and comparison. Engineering 5(4):653–661.

        Speaker: Giovanni Lugaresi (CentraleSupelec)
      • 13:15
        On-the-fly uncertainty quantification: algorithms and applications 20m

        Quantification of all sources of uncertainty when modeling a physical system is essential to build an accurate digital twin and make informed decisions in an operational situation [1]. In uncertainty quantification of numerical simulation models, the classical approach for estimating the distribution function or quantile function of a model output variable requires an entire sample of the studied variable to be available (i.e. the outputs of all the simulation model runs) [2]. This approach is not suitable at exascale, as large ensembles of simulation runs would need to gather and store a prohibitively large amount of data [3]. For quantile estimation, this problem can be solved by an on-the-fly (also called iterative or recursive) approach based on the Robbins-Monro algorithm [4]. An algorithm has been proposed for quantile function estimation in the context of uncertainty quantification [5]: it aims to estimate quantiles (at orders ranging from 5% to 95%) from samples of limited size (a few hundred observations). We illustrate it on toy functions and real application cases. We also provide some implementation elements inside the Python package ‘IterativeStatistics’ [6] (which also includes other statistics such as the mean, variance, covariance and first-order Sobol’ indices). This package is integrated into the Melissa system (on-line processing of data produced from large scale ensemble runs [7,8]) and will soon also be integrated into the SALOME platform [9].
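        As a minimal sketch of the on-the-fly idea (with an illustrative step-size constant rather than the robust tuning of [5]), the Robbins-Monro recursion updates a running quantile estimate from one observation at a time, so the full sample never needs to be stored:

```python
import numpy as np

rng = np.random.default_rng(0)

def iterative_quantile(stream, alpha, c=1.0):
    """Robbins-Monro recursion q_{n+1} = q_n + gamma_n * (alpha - 1{x_n <= q_n})."""
    q = None
    for n, x in enumerate(stream, start=1):
        if q is None:
            q = x                      # initialise with the first observation
            continue
        gamma = c / n                  # step size; tuning c is the delicate part
        q = q + gamma * (alpha - (x <= q))
    return q

sample = rng.normal(0.0, 1.0, 500)     # "a few hundred observations", as in the abstract
alpha = 0.95
print("Iterative estimate :", iterative_quantile(sample, alpha, c=2.0))
print("Empirical quantile :", np.quantile(sample, alpha))
```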

        [1] A. Thelen, X. Zhang, O. Fink, Y. Lu, S. Ghosh, B.D. Youn, M.D. Todd, S. Mahadevan, C. Hu, Z. Hu. A comprehensive review of digital twin—part 2: roles of uncertainty quantification and optimization, a battery digital twin, and perspectives. Structural and Multidisciplinary Optimization, 66 (1), 2023.
        [2] R. Smith. Uncertainty quantification, SIAM, 2014.
        [3] A. Ribés, T. Terraz, Y. Fournier, B. Iooss and B. Raffin. Unlocking large scale uncertainty quantification with in-transit iterative statistics, In: In situ visualization for computational science, H. Childs, J. Bennet and C. Garth (Eds), Springer, 2022.
        [4] Robbins, H. and S. Monro. A stochastic approximation method. The Annals of Mathematical Statistics 22, 400–407, 1951.
        [5] B. Iooss and J. Lonchampt. Robust tuning of Robbins-Monro algorithm for iterative uncertainty quantification - Application to wind-farm asset management, Proceedings of the 31st European Safety and Reliability Conference, B. Castanier, M. Cepin, D. Bigaud and C. Berenguer (Eds.), Research Publishing, Singapore, 2021. Conference ESREL 2021, Angers, France, September 2021.
        [6] https://pypi.org/project/iterative-stats/
        [7] T. Terraz, A. Ribés, Y. Fournier, B. Iooss and B. Raffin. Large scale in transit global sensitivity analysis avoiding intermediate files, SC17 (The International Conference for High Performance Computing, Networking, Storage and Analysis), November 2017
        [8] https://gitlab.inria.fr/melissa
        [9] https://www.salome-platform.org/?lang=fr

        Speaker: Bertrand Iooss (EDF R&D)
      • 13:35
        Computer Experiments in the Context of Simulation-Driven Design – A Cautionary Tale 20m

        In light of the advances of simulation technology over the last decades, engineering design increasingly turns towards simulation-driven practices. However, with the enormous possibilities for parametric optimization, it is often neglected that many other design decisions require efficient screening to gain insights about the significance of inputs. Examples include the choice of different design configurations, prioritising design variables for further investigation, or the exploitation of interaction effects for increased product robustness.

        While many engineering applications for this purpose still rely on simple tools such as traditional fractional factorials, this presentation will address the challenges of applying more sophisticated designs and surrogate modelling approaches for a design rather than a detailed modelling task.

        Building on Joseph’s review of space-filling designs [1], sampling designs suitable for computer simulations, each with advantages in specific use cases, are investigated. If the goal is to interpolate between the samples, a metamodel can be created using the kriging method, allowing more information to be gained about the design space. The performance of a metamodel can be characterized using the Cross Validation Error (CVE) metric. Screening for significant effects and their behaviour can be done using functional decomposition, also known as functional ANOVA.

        Four designs, namely the Central Composite Design, the MiniMax, and two versions of Latin Hypercube Designs (LHD), have been used for multi-objective injection molding simulations with three varying inputs in Moldex3D. CVE values and functional decomposition of the main effects for the kriging-based metamodels are investigated for each sampling model resulting in different model accuracy and estimated variable main effects for each design.

        The model dependence of the effectiveness of the sampling methods, apparent in the injection molding case, is eliminated by carrying out a study using known true functions. Analyzing the data using functional decomposition and kriging, the LHD performed best overall, yielding the lowest CVE and matching the true main effects most closely. The influence of interaction effects in the true model is also investigated: introducing interaction effects in the true model leads to distortion of the shape of the main effects. When an interaction is significant but one of its components is not, it may show up as a falsely significant main effect in the functional decomposition, causing misinterpretation of the results unless the interaction effects are included in the functional decomposition.

        Overall, it can be concluded that the performance of sampling designs is heavily model dependent.
        Introducing artificial interaction inputs (AII) in the metamodel for a true model lacking interaction may allow the identification of the significant interaction effects in exemplary cases. Analyzing the AII may lead to more insight into the design space but requires more research before its accuracy and robustness can be estimated.
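        A minimal sketch of the workflow discussed above, on a known true function rather than the Moldex3D simulations, combines a Latin Hypercube Design, a kriging (Gaussian process) metamodel and a leave-one-out cross-validation error; the test function and sample size are illustrative.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.model_selection import LeaveOneOut

# Known "true" function with main effects in x1, x2 and an x1*x3 interaction.
def true_model(X):
    return 2.0 * X[:, 0] + np.sin(np.pi * X[:, 1]) + 1.5 * X[:, 0] * X[:, 2]

# Latin Hypercube Design with three inputs on [0, 1]^3.
sampler = qmc.LatinHypercube(d=3, seed=1)
X = sampler.random(n=30)
y = true_model(X)

# Kriging surrogate (Gaussian process with an anisotropic RBF kernel).
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.3] * 3),
                              normalize_y=True)

# Leave-one-out cross-validation error (the CVE used to judge the metamodel).
errors = []
for train, test in LeaveOneOut().split(X):
    gp.fit(X[train], y[train])
    errors.append(y[test][0] - gp.predict(X[test])[0])
print("LOO-CV RMSE:", np.sqrt(np.mean(np.square(errors))))
```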

        References

        [1] V. Roshan Joseph. “Space-filling designs for computer experiments: A review”. In: Quality Engineering. Vol. 28. 1. Taylor and Francis Inc., Jan. 2016, pp. 28–35. doi: 10.1080/08982112.2015.1100447.

        Speakers: Moritz Leopold Bauchrowitz (Technical University of Denmark), Naganathan Krishnan Athreya (Technical University of Denmark), Herle Kjemtrup Juul-Nyholm (Technical University of Denmark), Tobias Eifler (Technical University of Denmark), Murat Kulahci (Technical University of Denmark)
    • 14:05 14:20
      Break 15m Building 101A, Meeting Room 1

    • 14:20 15:20
      Invited session “Digital Twins for Processes” Building 101A, Meeting Room 1

      • 14:20
        The perspective of digital twins in bioprocesses at different scales 20m

        Scaling the production needs from CMC to flexible scale or large-scale fermentation-based production of APIs (like insulin) is a cumbersome endeavor. Even though key operational process parameters have been validated at development scale and common scale up methods have been applied, the variation in space, time, and yield can differ considerably from site to site. This presentation attempts to give examples of how a digital shadow of the process can help to decide on key parameters for understanding and design of the bioprocess at different scales.
        Computational Fluid Dynamic methods are used to predict the inhomogeneities in mixing efficiencies in reactors. Kinetic models are coupled with the fluid dynamic models in order to understand and describe the performance of the process under those conditions. Ultimately, such models will assist in developing small-scale reactor systems for the experimental investigation of operational conditions. This contribution will include the underlying principles of mass transfer in chemical reactions, present the experimental and simulation results of a microbubble column-bioreactor (µBC) [1], mixing at pilot scale [2], reaction performance at full scale [3], and the practical considerations for digital twins applied to bioprocesses. Finally, some future opportunities for the application of digital twins will be discussed and the realistic, experimental constraints will be considered as well.

        1. Lladó Maldonado, S. et al. Multiphase microreactors with intensification of oxygen mass transfer rate and mixing performance for bioprocess development. Biochem Eng J 139, 57–67 (2018).
        2. Bach, C., Yang, J., Larsson, H., Stocks, S. M., Gernaey, K. V., Albaek, M. O. & Krühne, U. Evaluation of mixing and mass transfer in a stirred pilot scale bioreactor utilizing CFD. Chem Eng Sci 171, 19–26 (2017).
        3. Wright, M. R., Bach, C., Gernaey, K. V. & Krühne, U. Investigation of the effect of uncertain growth kinetics on a CFD based model for the growth of S. cerevisiae in an industrial bioreactor. Chemical Engineering Research & Design 140, 12–22 (2018).
        Speaker: Ulrich Krühne (Technical University of Denmark)
      • 14:40
        Digitalization for intelligent biomanufacturing and engineering education: status and perspectives / new practices 20m

        Digitalization, and especially Digital Twins, has become a new frontier in the biomanufacturing and processing industries. The development of these technologies promises to improve lean and intelligent production in these sectors by addressing model complexity with more data-driven and hybrid modeling approaches, which rely on a tight integration between the digital and physical entities. An essential supporting element of their implementation is therefore a proactive response to the ever-evolving market demand for chemical engineers with expanded data-science knowledge, and to the challenges of implementing this type of production system in industrial settings. Hence, tools such as high-fidelity modelling (white, black, hybrid), process systems engineering (methods and software) and data management should be part of any engineer's toolbox. Furthermore, the know-how needed to tackle unforeseen challenges is leading, now more than ever, to a high demand for both theoretical and hands-on knowledge.
        The first requirement for the successful development of a digital twin/shadow is a bidirectional physical-to-digital information loop, which should be prudently developed and curated as the system evolves. It requires continuous synchronization between the physical entity (data) and the digital representation. Notably, embedding a quasi-flawless flow of information enables the prediction of optimal states and system interventions, becoming a tool for knowledgeable online decision-making that minimizes manual intervention. However, a major caveat in the biomanufacturing industry is the uniquely strict requirements on product characteristics and regulation.
        Data collection methods (e.g., smart and soft sensors), signal processing and models (e.g., mechanistic, data-driven or hybrid) are therefore among the most essential tools in making this transformation a reality and should be second nature to the engineer.
        This presentation will cover a general review of ongoing research activities within digitalization in the biomanufacturing field, our interpretation of digitalization, up-to-date solutions, and enabling technologies and applications. Two ongoing development cases will be presented. The first example covers the development and application of digital twins as virtual tutors for education; we will show a proof of concept for how AI-powered models can be used to automate educational processes. The second, more practical example presents an approach to develop a digital twin of a pilot-scale fermentation, highlighting the use of non-linear state estimation for improving induction timing in GFP production. The developed tool can help operators by predicting an estimated time for induction based on a glucose concentration target.

        Speaker: Carina L. Gargalo (PROSYS, DTU)
    • 15:20 15:35
      Break 15m Building 101A, Meeting Room 1

    • 15:35 16:25
      Contributed session “Digital Twins for Modeling and Optimization” Building 101A, Meeting Room 1

      • 15:35
        Multiphysics Modelling for Polymer Additive Manufacturing Technologies 20m

        Polymer additive manufacturing (AM) technology is a relatively young and highly versatile manufacturing process which focuses on creating intricate geometries that would be difficult or impossible to achieve with other manufacturing methods. It involves the layer-by-layer construction of three-dimensional objects using polymers as the raw material. Polymer AM has the potential to revolutionize the manufacturing industry, enabling faster and more cost-effective production of parts and components. This technology offers immense opportunities for innovation and design, making it an exciting area of research and development. This study addresses multiphysics modelling for two leading polymer AM processes: the selective thermoplastic electrophotographic process (STEP) and selective laser sintering (SLS).

        STEP utilizes a combination of electrostatic forces, heat, and pressure to deposit and fuse thermoplastic powder into 2D layers, creating a final 3D structure. Notably, STEP is capable of producing parts with complex details using multiple materials, and it is highly efficient, making it ideal for high-volume production. A thermomechanical model was proposed to simulate warpage and dimensional defects of the final products and was validated against sensor data, measurements, and statistical models. In addition, the work provides a comprehensive parametric study of the process. SLS employs a laser as a heat source to sinter polymer particles together and create a 3D object. Compared to conventional manufacturing methods, SLS is easy to set up and can be carried out quickly. SLS is widely used for rapid prototyping and experimental applications due to its ability to produce high-fidelity, robust, and isotropic products. In SLS, a thermo-fluid dynamic model was used for melt pool morphology prediction, which has a significant impact on the stability of the process and the quality of the product, such as surface roughness and stiffness. Similarly, the model was validated by experiments.

        The incorporation of a Multiphysics model is a crucial aspect of constructing digital twins, as it provides a comprehensive understanding of processes from a physical standpoint. By analyzing the physics behind the system, the Multiphysics model can simulate the behaviour and performance of the digital twin in different scenarios, enabling better prediction of its real-world behaviour. The development of digital twins for an additive manufacturing process represents a crucial step toward improving the efficiency and effectiveness of manufacturing processes. The potential benefits of using a digital twin for process optimization are significant, including reductions in production costs and improvements in product quality. By simulating the manufacturing process digitally, engineers can identify potential problems before they occur and make adjustments to the process to prevent them.

        Speaker: Mr Kenneth Meinert (Technical University of Denmark )
      • 15:55
        Mechanistic modelling of industrial crystallization processes – Challenges and opportunities at the initial solution stage 20m

        Crystallization processes are important and widely used unit operations in the pharmaceutical industry, and there is a need to understand and improve their performance in terms of, e.g., yield and batch time, as well as the performance of downstream processes affected by crystal size and shape, e.g. the yields and processing times of subsequent centrifugation, filtration and drying steps. The non-linear, dynamic nature of the process, along with the high number of operational degrees of freedom, makes experimental improvement and optimization of such processes a complex task. However, the process can be described mathematically using mechanistic models coupling the mass, energy, and population balances of the system, allowing for advanced process understanding, targeted experiments, optimization, etc. The development of such models becomes increasingly challenging with the complexity of the phenomena involved (e.g. non-linearity, a large number of phenomena, coupling, etc.) [1]. In this work, a mechanistic model for an industrial crystallization process has been developed using a systematic framework, with the numerical method and model complexity selected based on phenomena identified from available data and observations. At this initial solution stage, the model has been applied to, e.g., explain observations in production, explore the design space, and perform what-if analyses for risk evaluation. Next, the goal is to estimate the model parameters so that the model can be applied for process optimization. However, with limited data availability and a high number of phenomena/parameters, an appropriate experimental strategy is needed to generate the data required to estimate the parameters of the individual but simultaneously occurring phenomena.
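        For illustration, the sketch below integrates a seeded batch cooling crystallization using the method of moments of the population balance coupled to a solute mass balance, with power-law growth and nucleation kinetics. The kinetic expressions, solubility curve and every parameter value are invented placeholders and do not represent the industrial process modelled in this work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative kinetics and solubility (not fitted to any real system).
kg, g = 5e-6, 1.5              # growth:     G = kg * S^g   [m/s]
kb, b = 1e8, 2.0               # nucleation: B = kb * S^b   [#/(kg s)]
kv, rho_c = 0.5, 1300.0        # shape factor [-], crystal density [kg/m3]
T0, cooling_rate = 40.0, 1.5e-3    # deg C, deg C/s (linear cooling profile)

def c_sat(T):
    return 0.10 + 0.004 * T    # solubility [kg solute / kg solvent]

def rhs(t, y):
    mu0, mu1, mu2, mu3, C = y
    T = T0 - cooling_rate * t
    S = max(C - c_sat(T), 0.0)          # supersaturation driving force
    G = kg * S ** g
    B = kb * S ** b
    dmu0 = B                            # moment equations of the population balance
    dmu1 = G * mu0
    dmu2 = 2.0 * G * mu1
    dmu3 = 3.0 * G * mu2
    dC = -3.0 * rho_c * kv * G * mu2    # solute consumed by crystal growth
    return [dmu0, dmu1, dmu2, dmu3, dC]

# Seeded start: seed loading expressed through its moments (1e6 seeds/kg at 20 um).
y0 = [1e6, 1e6 * 2e-5, 1e6 * (2e-5) ** 2, 1e6 * (2e-5) ** 3, 0.28]
sol = solve_ivp(rhs, (0.0, 7200.0), y0, method="LSODA",
                rtol=1e-6, atol=1e-10, t_eval=np.linspace(0.0, 7200.0, 200))

mu = sol.y[:4, -1]
print("Final crystal mass (kg per kg solvent):", rho_c * kv * mu[3])
print("Volume-weighted mean size (m):", mu[3] / mu[2])
```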

        Speakers: Mrs Marta González García (Novo Nordisk), Marta Gonzalez Garcia (Novo Nordisk), Rune Lorits (Novo Nordisk)
    • 16:25 16:40
      Closing Building 101A, Meeting Room 1
