The emergence of Industry 4.0 has created a data-rich environment in which most companies accumulate a vast volume of historical data from daily production, usually involving some unplanned excitations. The problem is that these data generally exhibit high collinearity and rank deficiency, whereas the data-driven models used for process optimization perform best when the input variables vary independently, since independent variation is what ensures causality (i.e., data obtained from a design of experiments, DoE, that guarantees this requirement).
In this work, we propose a retrospective DoE methodology aimed at harnessing the potential of this type of data (i.e., data collected from daily production operations) for optimization purposes. The approach consists of (i) retrospectively fitting two-level experimental designs, from classical full and fractional factorial designs to orthogonal arrays, by filtering the database for the observations closest to the corresponding design points, and (ii) subsequently carrying out the analysis typically used in DoE. We also investigate the possibility of imputing any missing treatment conditions. Finally, we conduct a meta-analysis of the results of all the retrospective experimental designs to extract consistent conclusions. Here, raster plots play a crucial role: they enable the detection of aliasing patterns, as well as of factors appearing consistently in the best models, thereby pointing to the potential active factors.
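Step (i) above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes a historical factor matrix `X` and response `y`, codes each factor to the [-1, +1] scale using chosen low/high settings (`lows`, `highs`), and matches observations that fall within a tolerance `tol` of each corner of a 2^k full factorial design. All function and parameter names are hypothetical.

```python
import itertools
import numpy as np

def retrospective_factorial(X, y, lows, highs, tol=0.15):
    """Filter historical data for observations close to 2^k design points.

    X: (n, k) historical factor settings; y: (n,) responses.
    lows/highs: per-factor low/high settings defining the coded scale
    (illustrative choice; in practice taken from the operating range).
    tol: max distance, in coded units, for an observation to count as
    matching a design point. Returns {design point: mean response or None}.
    """
    lows, highs = np.asarray(lows, float), np.asarray(highs, float)
    # Code each factor to [-1, +1] relative to the chosen settings
    coded = 2 * (np.asarray(X, float) - lows) / (highs - lows) - 1
    y = np.asarray(y, float)
    design = {}
    for point in itertools.product([-1, 1], repeat=coded.shape[1]):
        # Keep observations within tol of this corner in every factor
        mask = np.all(np.abs(coded - np.array(point)) <= tol, axis=1)
        design[point] = float(y[mask].mean()) if mask.any() else None
    return design
```

Design points left as `None` are the missing treatment conditions that the imputation step would then address; the filled-in dictionary feeds the standard DoE effects analysis of step (ii).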
The proposed methodology is illustrated with an industrial case study. It is expected to be useful for screening, for gaining insight into potential active factors, and for providing data-driven input towards efficient subsequent designed experimentation.
| Classification | Both methodology and application |
| --- | --- |
| Keywords | DOE, Historical Data, Industry 4.0 |