The discipline of statistical engineering has gained recognition within NASA by spurring innovation and efficiency, and it has demonstrated significant impact. Aerospace research and development benefits from an application-focused statistical engineering perspective to accelerate learning, maximize knowledge, ensure strategic resource investment, and inform data-driven decisions. In...
User-generated content, including both review texts and user ratings, provides important information about the customer-perceived quality of online products and services. Quality improvement of online products and services will benefit from a general framework for analyzing and monitoring this user-generated content. This study proposes a modeling and monitoring method for online...
Randomization is a fundamental principle underlying the statistical planning of experiments. In this talk, we illustrate the impact when the experimenter either cannot or chooses not to randomize the application of the experimental factors to their appropriate experimental units for split-plot experiments (Berni et al., 2020). The specific context is an experiment to improve the production...
The need to handle ordinal and nominal data is currently being addressed in ongoing work among ontology organisations and various standards bodies dealing with concept systems, in response to big data and machine reading in applications such as the medical field. At the same time, some prominent statisticians have been reluctant to accept someone else telling them what scales they...
It is common practice in metrology that the standard uncertainty associated with the average of repeated observations is taken as the sample standard deviation of the observations divided by the square root of the sample size. This uncertainty is an estimator of the standard deviation of the sample mean when the observations have the same mean and variance and are uncorrelated.
It often...
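As a reading aid (standard notation, not part of the abstract), for $n$ repeated observations $x_1,\ldots,x_n$ with sample mean $\bar{x}$ and sample standard deviation $s$, this rule reads
$$ u(\bar{x}) = \frac{s}{\sqrt{n}}, \qquad s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2, $$
so that $u^2(\bar{x}) = s^2/n$ estimates $\mathrm{Var}(\bar{x})$ under the stated assumptions of a common mean and variance and no correlation.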
In human sciences, mediation refers to a causal phenomenon in which the effect of an exposure variable $X$ on an outcome $Y$ can be decomposed into a direct effect and an indirect effect via a third variable $M$ (called the mediator variable).
In mediation models, the natural direct effects and the natural indirect effects are among the parameters of interest. For such models, we construct different...
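As an illustrative sketch (the linear special case, with notation assumed here rather than taken from the abstract), writing
$$ M = \alpha_0 + aX + \varepsilon_M, \qquad Y = \beta_0 + c'X + bM + \varepsilon_Y, $$
the total effect of $X$ on $Y$ decomposes as $c' + ab$, with direct effect $c'$ and indirect (mediated) effect $ab$; the natural direct and indirect effects reduce to these quantities only under linearity and no exposure-mediator interaction.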
Monitoring a manufacturing process with a huge number of variables is a challenging task for traditional multivariate statistical process control (MSPC) methods. However, this rich information often contains only a few rare suspicious variables, which should be screened out and monitored. Even though some control charts based on variable selection algorithms have proven effective for...
The sparse and smooth clustering (SaS-Funclust) method proposed in [1] is presented. The aim is to cluster functional data while jointly detecting the most informative portion(s) of the functional data domain. The SaS-Funclust method relies on a general functional Gaussian mixture model with parameters estimated by maximizing the sum of a log-likelihood function penalized by a functional...
The ISO 2859 and ISO 3951 series provide acceptance sampling procedures for lot inspection, allowing both the sample size and the acceptance rule to be determined, starting from a specified value of either the consumer or the producer risk. However, insufficient resources often prohibit the implementation of “ISO sampling plans.” In cases where the sample size is already known, determined as it is by...
George Box made many profound and enduring theoretical and practical contributions to the statistical design of experiments and response surface methodology, which have influenced industrial engineering and quality control applications. His focus on using statistical tools in the right way to solve the right real-world problem has been an inspiration throughout my career. Our statistical...
I will present a brief overview of my most recent experiences in disseminating statistical culture: participation in the virtual event STEMintheCity 2020 and the creation of statistics “pills” for the general public, available on the Outreach website of the National Research Council of Italy.
I will conclude with a short presentation of the ongoing multidisciplinary research activity on...
Discrete choice experiments are frequently used to quantify consumer preferences by having respondents choose between different alternatives. Choice experiments involving mixtures of ingredients have been largely overlooked in the literature, even though many products and services can be described as mixtures of ingredients. As a consequence, little research has been done on the optimal design...
Automated procedures for real estate price estimation and prediction have been used in the real estate sector for 15 years. Various providers of real estate price predictions are available, e.g., the platform Zillow, or Immoscout 24 from Germany. At the same time, the problem of real estate price prediction has become a subject of the statistical and machine learning literature. The current...
We explore the concept of parameter design applied to the production of glass beads in the manufacture of metal-encapsulated transistors. The main motivation is to complete the analysis hinted at in the original publication by Jim Morrison in 1957, which was possibly the first example of exploring the idea of transmitted variation in engineering design, and an influential paper in the...
The presence of variation is an undesirable (but natural) factor in processes. Quality improvement practitioners search constantly for efficient ways to monitor it, a primary requirement in SPC. Generally, inspections by attributes are cheaper and simpler than inspections by variables, although they present poor performance in comparison. The $S^2$ chart is widely applied in monitoring process...
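For reference (a standard construction, not necessarily the exact chart discussed here), probability limits for an $S^2$ chart with subgroups of size $n$ and in-control variance $\sigma_0^2$ are commonly taken as
$$ LCL = \frac{\sigma_0^2}{n-1}\,\chi^2_{\alpha/2,\,n-1}, \qquad UCL = \frac{\sigma_0^2}{n-1}\,\chi^2_{1-\alpha/2,\,n-1}, $$
since $(n-1)S^2/\sigma_0^2$ follows a chi-squared distribution with $n-1$ degrees of freedom under normality.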
The autoclave curing process is one of the important stages in manufacturing. In this process, multiple parts are loaded into the autoclave as a batch; they are heated up to their curing temperature (heating phase) and cured at that temperature for their dwell period. Two main considerations affect how parts are placed in the autoclave. Firstly, if some parts reach the curing...
Injection molded parts are widely used in power system protection products. One of the biggest challenges in an injection molding process is shrinkage and warpage of the molded parts. These geometrical variations may have an adverse effect on product quality, functionality, cost and time-to-market. Our aim is to predict the spread of the functional dimensions and geometrical...
Large-scale dynamic forecasting of non-negative count series is a major challenge in many areas like epidemic monitoring or retail management. We propose Bayesian state-space models that are flexible enough to adequately forecast high and low count series and exploit cross-series relationships with a multivariate approach. This is illustrated with a large scale sales forecasting problem faced...
Numerous countries are making electric vehicles their key priority to reduce emissions in their transport sector. This emerging market is subject to multiple unknowns, in particular the charging behaviour of electric vehicles. The lack of data describing the interactions between electric vehicles and charging points hinders the development of statistical models describing this interaction...
Track geometry is critical for railway infrastructures, and the geometry condition and the expected degradation rate are vital for planning maintenance actions to assure the tracks’ reliability and safety. The degradation prediction accuracy is, therefore, essential. The Wiener process has been widely used for degradation analysis in various applications based on degradation measurements. In...
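As background (the standard formulation, assumed here rather than quoted from the abstract), a basic Wiener degradation model with drift $\mu$ and diffusion $\sigma$ is
$$ X(t) = \mu\,\Lambda(t) + \sigma B(\Lambda(t)), $$
where $B(\cdot)$ is standard Brownian motion and $\Lambda(t)$ a (possibly nonlinear) time transformation; with $\Lambda(t)=t$, the first time $X(t)$ crosses a failure threshold $D$ follows an inverse Gaussian distribution.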
Experimental data are often highly structured due to the use of experimental designs. This does not only simplify the analysis, but it allows for tailored methods of analysis that extract more information from the data than generic methods. One group of experimental designs that are suitable for such methods are the orthogonal minimally aliased response surface (OMARS) designs (Núñez Ares and...
In many data analytics applications based on machine learning algorithms, the main focus is usually on predictive modeling. In certain cases, as in many applications in manufacturing, understanding the data-driven model plays a crucial role in complementing the engineering knowledge about the production process. There is therefore a growing interest in describing the contributions of the input...
Rail transport demand in Europe has increased over the last few years, and passenger thermal comfort has been playing a key role in the fierce competition among different transportation companies. Furthermore, European standards set operational requirements for passenger rail coaches in terms of air quality and comfort level. To meet these standards and the increasing passenger thermal...
For the last twenty years, a plethora of new ``memory-type'' control charts have been proposed. They share some common features: (i) deceptively good zero-state average run-length (ARL) performance, but poor steady-state performance, (ii) design, deployment and analysis significantly more complicated than for established charts, (iii) comparisons made to unnecessarily weak competitors, and...
Accurate estimation of internal migration is crucial for successful policy making and community planning. This report aims to estimate internal migration between municipalities in Sweden.
Traditionally, spatial flows of people have been modelled using gravity models, which assume that each region attracts or repels people based on the populations of regions and distances between them....
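For orientation (a generic form, with symbols assumed here), a gravity model for the flow $T_{ij}$ between regions $i$ and $j$ with populations $P_i$, $P_j$ and distance $d_{ij}$ is often written as
$$ T_{ij} = k\,\frac{P_i^{\alpha} P_j^{\beta}}{d_{ij}^{\gamma}}, $$
with the parameters typically estimated by a Poisson (log-linear) regression of the observed flows.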
Organizations, both large and small, have a difficult time trying to standardize. In the field of statistical methods, standardizing on a software package is especially difficult. There are over 50 commercial options, over 40 open-source options, and add-ins for spreadsheets and engineering tools. Educational licenses provide low costs to universities, but graduates often find their...
In this work, we develop and study upper and lower one-sided CUSUM control charts for monitoring correlated counts with finite range. Often in practice, data of that kind can be adequately described by a first-order binomial integer-valued ARCH model (or BINARCH(1)). The proposed charts are based on the likelihood ratio and can be used for detecting upward or downward shifts in process mean...
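For context (the parametrization usually found in the count time series literature; the talk's exact formulation may differ), a BINARCH(1) process with range $\{0,\ldots,n\}$ satisfies
$$ X_t \mid X_{t-1}, X_{t-2}, \ldots \sim \mathrm{Bin}(n, \pi_t), \qquad \pi_t = a_0 + a_1 \frac{X_{t-1}}{n}, $$
and a likelihood-ratio CUSUM then accumulates $C_t = \max\{0,\; C_{t-1} + \log[f_1(X_t \mid X_{t-1}) / f_0(X_t \mid X_{t-1})]\}$, signalling when $C_t$ exceeds a control limit $h$.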
When the Covid19 pandemic is no longer the prime burden on British health services, it might be possible to refocus on the three concerns that threatened to overwhelm the National Health Service in 2019, namely heart disease, cancer and obesity.
Whilst the NHS can reasonably claim to have made progress with the first two, it is faced with an ever-increasing level of obesity. To non-clinical...
Maintenance planning is one of the most important problems for manufacturing enterprises. The maintenance strategies applied in industry are corrective and preventive maintenance. The development of sensor technologies has led to widespread use of preventive maintenance methods. However, it can be costly for small and medium-sized enterprises to install such sensor systems. This...
Quality evaluation of resistance spot welding (RSW) joints of metal sheets in the automobile sector generally depends on expensive and time-consuming offline testing, which is impracticable in full-scale, high-volume manufacturing. A great opportunity to address this problem is the increasing digitization within the Industry 4.0 framework, which makes on-line measurements of process...
In Statistical Process Control/Monitoring (SPC/M), our interest is in detecting when a process deteriorates from its “in control” state, typically established after a long phase I exercise. Detecting shifts in short-horizon data from a process with unknown parameters (i.e., without a phase I calibration) is quite challenging.
In this work, we propose a self-starting Bayesian change point...
Because of the curse of dimensionality, high-dimensional processes present challenges to traditional multivariate statistical process monitoring (SPM) techniques. In addition, the unknown underlying distribution and complicated dependencies among variables, such as heteroscedasticity, increase the uncertainty of estimated parameters and decrease the effectiveness of control charts. Moreover, the...
This talk will concern developing and disseminating statistical tools for addressing some industrial issues. It will be fully based on my 20 years’ experience as a statistician, research engineer and expert at the French nuclear energy research institute (CEA) and the French electricity company (EDF). I will particularly focus on the domain of uncertainty quantification in numerical...
Nowadays, tourist websites make extensive use of images to promote their establishment and its location. Many images, such as landscapes, are used extensively on destination tourism websites to draw tourists’ interest and influence their choices. The use of eye-tracking technology has improved our knowledge of how different types of pictures are observed. An eye-tracker makes it possible to...
The development of drive assist systems, such as traffic sign recognition and distance regulation, is one of the most important tasks on the way to autonomous driving. With focus on the definition of reliability as the ability to perform a required function under specific conditions over a given period of time, the most challenging aspect appears to be the description of the usage conditions....
Multiple statistically independent data streams are being observed sequentially and we are interested in detecting, as soon as possible, a change in their statistical behavior. We study two different formulations of the change detection problem. 1) In the first, a change appears in a single unknown stream but then starts switching from one stream to another following a switching...
Complexity manifests itself in many ways when attempting to solve different problems, and different tools are needed to deal with the different dimensions underlying that complexity. Not all complexity is created equal. We find that most treatments of complexity in problem-solving within both the statistical and quality literature focus narrowly on technical complexity, which includes the...
Bersimis et al. (2007), motivated by a statement of Woodall and Montgomery (1999), published an extensive review of the field of MSPM. According to Bersimis et al. (2007), open problems in the field of MSPM include, among others, the robust design of monitoring procedures and non-parametric control charts. In this work, we introduce a non-parametric control scheme based on convex hulls. The proposed...
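A minimal sketch of the general idea (illustrative only; this is not the authors' scheme, and the SciPy calls and the in-control data below are assumptions): a convex hull is built from Phase I observations, and a new point signals if it falls outside.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)

# Phase I: in-control reference data (two quality characteristics)
phase1 = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=200)

# The Delaunay triangulation of the reference data lets us test
# whether a new point lies inside the Phase I convex hull.
hull = Delaunay(phase1)

def out_of_control(x):
    """Signal if the new observation falls outside the convex hull."""
    return hull.find_simplex(np.atleast_2d(x)) < 0

# Phase II: monitor new observations
new_points = rng.multivariate_normal([2.5, 2.5], np.eye(2), size=5)
for i, x in enumerate(new_points, 1):
    print(f"obs {i}: out of control = {bool(out_of_control(x)[0])}")
```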
In multistage manufacturing systems, modeling multiple quality indices based on the process sensing variables is important. However, the classic modeling technique predicts each quality variable one at a time, which fails to consider the correlation within or between stages. We propose a deep multistage multi-task learning framework to jointly predict all output sensing variables in a unified...
A control chart is used to monitor a process variable over time by providing information about the process behavior. Monitoring a process with several related variables is usually called a multivariate quality control problem. Multivariate control charts, needed when dealing with more than one quality variable, rely on very specific models for the data-generating process. When large historical data...
In the development process of railway vehicles, several requirements concerning reliability and safety have to be met. These requirements are commonly assessed by using Multi-Body-Dynamics (MBD) simulations and on-track measurements.
In general, the vehicle/track interaction is significantly influenced by varying, unknown or non-quantifiable operating conditions (e.g. coefficient of friction)...
Blocking is often used to reduce known variability in designed experiments by collecting together homogeneous experimental units. A common modeling assumption for such experiments is that responses from units within a block are dependent. Accounting for such dependencies in both the design of the experiment and the modeling of the resulting data when the response is not normally distributed...
Non-linear predictive machine learning models (such as deep learning) have emerged as a successful approach in many industrial applications, as the accuracy of their predictions often surpasses that of classical statistical approaches in a significant and practically relevant way. Predictive maintenance tasks (such as predicting change points or detecting anomalies) are particularly susceptible to this...
Detecting abrupt structural changes in a dynamic graph is a classic problem in statistics and machine learning. In this talk, we present an online network structure change detection algorithm called spectral-CUSUM to detect such changes through a subspace projection procedure based on the Gaussian model setting. Theoretical analysis is provided to characterize the average run length (ARL) and...
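As a reminder of the quantities involved (generic CUSUM notation, not the specific spectral statistic of the talk), with per-sample statistic $\ell_t$ and drift $d$,
$$ S_t = \max\{0,\; S_{t-1} + \ell_t - d\}, \qquad S_0 = 0, \qquad T = \inf\{t : S_t \ge b\}, $$
where the ARL is the expectation of $T$ when no change occurs and the expected detection delay is evaluated after the change point.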
Despite the vast amount of literature on supersaturated designs (SSDs), there is a scant record of their use in practice. We contend this imbalance is due to conflicting recommendations regarding SSD use in the literature as well as the designs' inabilities to meet practitioners' analysis expectations. To address these issues, we first summarize practitioner concerns and expectations of SSDs...
This contribution offers an active coffee break. This 15-minute activity is essentially an informal survey and may even be a bit of fun. Together, we create a quick picture of two selected topics within the ENBIS community: a) the fashionable topic of Artificial Intelligence and b) ENBIS in general, which might give ideas for future events and developments within ENBIS. Everybody is invited to take...
Meeting customer quality expectations and delivering high-quality products is key to operational excellence. In this study, a printed circuit board (PCB) assembly process is considered for improvement. Associations between the defects as well as patterns of the defects over time are investigated. The Apriori algorithm for association rule mining and Sequential Pattern Discovery using...
About the Session:
Are you interested in case studies and real-world problems for active learning of statistics? Then come and join us in this one-hour interactive session organised by the SIG Statistics in Practice. The session follows on from a similar event in ENBIS 2020.
A famous project for students to apply their acquired knowledge of design of experiments is Box's paper...
The research examines the zero-state and the steady-state behavior of the Shiryaev-Roberts (SR) procedure for Markov-dependent count time series, using the Poisson INARCH(1) model as the representative data-generating count process. For the purpose of easier evaluation, the performance is compared to existing CUSUM results from the literature. The comparison shows that SR performs at least as...
Non-parametric control charts based on data depth and resampling techniques are designed to monitor multivariate independent and dependent data.
Phase I
Dependent and independent case
- The depths $D_F(X_i)$ are obtained and ordered in ascending order.
- The lower control limit (LCL) is calculated as the quantile at the $\alpha$ level of the observations under the null...
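A minimal Python sketch of this Phase I step (illustrative; the Mahalanobis depth and plain bootstrap used below are assumptions, not the specific depth or resampling scheme of the proposal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Phase I reference sample (p = 3 quality characteristics)
X = rng.multivariate_normal(np.zeros(3), np.eye(3), size=300)

mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis_depth(x):
    """Mahalanobis depth: larger values correspond to more central observations."""
    d2 = np.einsum("ij,jk,ik->i", x - mu, S_inv, x - mu)
    return 1.0 / (1.0 + d2)

depths = np.sort(mahalanobis_depth(X))          # depths in ascending order

# Bootstrap the alpha-quantile of the depths to set the lower control limit
alpha, B = 0.05, 2000
boot_q = [np.quantile(rng.choice(depths, size=depths.size, replace=True), alpha)
          for _ in range(B)]
LCL = float(np.mean(boot_q))

# Phase II: an observation signals if its depth falls below the LCL
x_new = np.array([[3.0, 3.0, 3.0]])
d_new = mahalanobis_depth(x_new)[0]
print("depth =", d_new, "signal =", d_new < LCL)
```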
As the field of reliability engineering continues to grow and adapt with time, accelerated life tests (ALT) have progressed from luxury to necessity. ALT subjects test units to higher stress levels than normal conditions, thereby generating more failure data in a shorter time period. In this work, we study a progressively Type-I censored k-level step-stress ALT under interval monitoring. In...
We propose a three-step approach to forecasting time series of electricity consumption at different levels of household aggregation. These series are linked by hierarchical constraints (global consumption is the sum of regional consumption, for example). First, benchmark forecasts are generated for all series using generalized additive models; second, for each series, the aggregation algorithm...
The context is the statistical fusion of air quality measurements coming from different monitoring networks: the first is made up of high-quality fixed sensors (the reference network) and the second of lower-quality micro-sensors. Pollution maps are obtained by correcting numerical model outputs using the measurements from the monitoring stations of the air quality networks. Increasing...
Advancements in manufacturing have significantly extended product lifetimes while at the same time making it harder to perform life testing at normal operating conditions because of the extremely long life spans. Accelerated life tests (ALT) can mitigate this issue by testing units at higher stress levels so that lifetime information can be acquired more quickly. The lifetime of a...
COVID-19 data analysis is essential for policymakers in analyzing the outbreak and managing containment. Many approaches based on traditional time series clustering and forecasting methods, such as hierarchical clustering and exponential smoothing, have been proposed to cluster and forecast COVID-19 data. However, most of these methods do not scale up with the high volume of cases....
We propose a methodology for the selection and/or aggregation of probabilistic seismic hazard analysis (PSHA) models, which uses Bayesian theory to optimally exploit all available observations, in this case the seismic and accelerometric databases. Compared to the current calculation method, the proposed approach, which is simpler to implement, allows a significant reduction in computation...
Recent improvements in data acquisition technologies have produced data-rich environments in every field. Particularly relevant is the case where data are apt to be modelled as functions defined on a multidimensional domain, which are referred to as functional data. A typical problem in industrial applications deals with evaluating the stability over time of some functional quality...
Issue Resolution is a critical process in the manufacturing sector to sustain productivity and quality, especially in the Food and Beverage Industry, where aseptic performance is critical. As a leader in this industry, Tetra Pak has built a database regarding quality events reported by Tetra Pak technicians, each containing domain knowledge from experts. In this paper, we present a model...
In this work, we investigate order-restricted Bayesian cost constrained design optimization for progressively Type-I censored simple step-stress accelerated life tests with exponential lifetimes under continuous inspections. Previously we showed that using a three-parameter gamma distribution as a conditional prior ensures order restriction for parameter estimation and that the conjugate-like...
Future grid management systems will coordinate distributed production and storage resources to manage, in a cost-effective fashion,
the increased load and variability brought by the electrification of transportation and by a higher share of weather-dependent production.
Electricity demand forecasts at a low level of aggregation will be key inputs for such systems. In this talk, I'll focus on...
A gearbox is a fundamental component of a rotating machine; therefore, detecting a fault or malfunction early is indispensable to avoid accidents, plan maintenance activities and reduce downtime costs. The vibration signal is widely used to monitor the condition of a gearbox because it reflects the dynamic behavior in a non-invasive way. The objective of this research was to perform a ranking...
The need to analyze complex nonlinear data coming from industrial production settings is fostering the use of deep learning algorithms in Statistical Process Control (SPC) schemes. In this work, a new SPC framework based on orthogonal autoencoders (OAEs) is proposed. A regularized loss function ensures the invertibility of the covariance matrix when computing the Hotelling $T^2$ statistic and...
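For reference (a generic form with symbols assumed here), the Hotelling statistic computed on the $k$-dimensional latent features $\mathbf{z}$ returned by the encoder is
$$ T^2 = (\mathbf{z} - \bar{\mathbf{z}})^{\top} \mathbf{S}_z^{-1} (\mathbf{z} - \bar{\mathbf{z}}), $$
where $\bar{\mathbf{z}}$ and $\mathbf{S}_z$ are the mean and covariance of the latent features on the reference data; the regularization mentioned above keeps $\mathbf{S}_z$ well conditioned and hence invertible.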
A protocol for a bio-assay involves a substantial number of steps that may affect the end result. To identify the influential steps, screening experiments can be employed with each step corresponding to a factor and different versions of the step corresponding to factor levels. The designs for such experiments usually include factors with two levels only. Adding a few four-level factors would...
The use of eXplainable Artificial Intelligence (XAI) in many fields, especially in finance, has been an important issue not only for researchers but also for regulators and beneficiaries. In this paper, in line with recent research in which XAI methods are utilized to improve the explainability and interpretability of opaque machine learning models, we consider two of the most widely used model-agnostic...
In this work, we predict precipitation using different functional regression models (FRM), and the best fit is selected among: the Functional Linear Model with Basis Representation (FLR), the Functional Linear Model with a Functional Basis of Principal Components (PC), the Functional Linear Model with a Functional Basis of Principal Components by Partial Least Squares (PLS) and the...
Adhesives are increasingly used in the manufacturing industry because of their desirable characteristics, e.g. high strength-to-weight ratio, design flexibility, damage tolerance and fatigue resistance. The manufacturing of adhesive joints involves a complex, multi-stage process in which product quality parameters, such as joint strength and failure mode, are highly impacted by the applied...
Early fault detection in the process industry is crucial to mitigate potential impacts. Despite being widely studied, fault detection remains a practical challenge. Principal components analysis (PCA) has been commonly used for this purpose. This work employs a PCA-based local approach to improve fault detection efficiency. This is done by adopting individual control limits for the principal...
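A minimal sketch of per-component limits (illustrative assumptions: scikit-learn's PCA and simple three-sigma limits, which need not match the limits used in this work):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)

# Normal operating condition (NOC) data: 500 samples, 10 process variables
X_noc = rng.normal(size=(500, 10))

pca = PCA(n_components=4).fit(X_noc)
scores_noc = pca.transform(X_noc)

# Individual (per principal component) control limits: mean +/- 3 std
center = scores_noc.mean(axis=0)
spread = scores_noc.std(axis=0)
lcl, ucl = center - 3 * spread, center + 3 * spread

def fault_detected(x_new):
    """Flag a sample if any retained PC score leaves its individual limits."""
    t = pca.transform(x_new.reshape(1, -1))[0]
    return bool(np.any((t < lcl) | (t > ucl)))

x_test = rng.normal(size=10) + np.array([4] + [0] * 9)   # fault on variable 1
print("fault detected:", fault_detected(x_test))
```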
Wind energy is an immensely popular renewable energy source, due to increasing environmental awareness, the depletion of fossil fuels, and rising costs. Therefore, the amount of energy produced in wind farms should be estimated accurately. Although wind turbine manufacturers estimate energy production depending on wind speed and wind direction, mostly actual...
Principal component analysis (PCA) is a basic tool for reducing the dimension of a space of variables. In modern industrial environments, large variable space dimensions of up to several thousand are common, where data are recorded live at high time resolution and have to be analysed without time delay. Classical batch PCA procedures start from the full covariance matrix and construct the exact...
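As one illustration of a streaming, non-batch PCA update (using scikit-learn's IncrementalPCA as a stand-in; the procedure developed here may differ):

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)

p, k, batch = 2000, 10, 200        # high-dimensional variable space
ipca = IncrementalPCA(n_components=k)

# Data arrive in small batches and the projection is updated on the fly,
# without ever forming the full p x p covariance matrix.
for _ in range(50):
    X_batch = rng.normal(size=(batch, p))
    ipca.partial_fit(X_batch)

# Project the latest batch onto the current k leading components
scores = ipca.transform(X_batch)
print(scores.shape, ipca.explained_variance_ratio_[:3])
```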
Optimal experimental designs are extensively studied in the statistical literature. In this work we focus on the notion of robustness of a design, i.e., the sensitivity of a design to the removal of design points. This notion is particularly important when, at the end of the experimental activity, the design may be incomplete, i.e., response values are not available for all the points of the design...
Identifiability of polynomial models is a key requirement for multiple regression. We consider an analogue of the so-called statistical fan, the set of all maximal identifiable hierarchical models, for cases of noisy design of experiments or measured covariate vectors with a given tolerance vector. This gives rise to the definition of the numerical statistical fan. It includes all maximal...
Computer-age statistics, machine learning and, in general, data analytics are having a ubiquitous impact on industry, business and services. This data transformation requires a growing workforce that is up to the job in terms of knowledge, skills and capabilities. The deployment of analytics needs to address organizational needs, invoke proper methods, build on adequate infrastructures and...
In real applications, time series often exhibit missing observations, such that standard analytical tools cannot be applied. While there are approaches for handling missing data in quantitative time series, the case of categorical time series seems not to have been treated so far. For both ordinal and nominal time series, solutions are developed that make it possible to analyze their...
Imbalanced classes often occur in classification tasks including process industry applications. This scenario usually results in the overfitting of the majority classes. Imbalanced data techniques are then commonly used to overcome this issue. They can be grouped into sampling procedures, cost-sensitive strategies and ensemble learning. This work investigates some of them for the...
Emerging technologies ease the recording and collection of high-frequency data produced by sensor networks. From a statistical point of view, these data can be viewed as discrete observations of random functions. Our industrial goal is to detect abnormal measurements. Statistically, this consists in detecting outliers in a multivariate functional data set.
We propose a robust procedure based on...
The decommissioning of nuclear infrastructures such as power plants arises as these facilities age and come to the end of their lifecycle. Decommissioning projects require a complete radiological characterization of the site, of both the soil and the civil engineering structures, to optimize efficiency and minimize the costs of the project. To achieve this goal, statistical tools such as...
We address a real case study of Statistical Process Monitoring (SPM) of a Surface Mount Technology (SMT) production line at Bosch Car Multimedia, where more than 17 thousand product variables are collected for each product. The basic assumption of SPM is that all relevant “common causes” of variation are represented in the reference dataset (Phase 1 analysis). However, we argue and demonstrate...
Accreditation of statisticians has previously been offered by the ASA and the RSS, and since 2020 by FENStatS.
The purpose of accreditation is to focus on the professionality, development and quality of applied statistical work. We believe that the need for good statistics and good statisticians is increasing and an accreditation programme can provide one tool in this process.
The accreditation...
We summarize eleven months of pro-bono work on statistical modeling and analysis of Covid-19 topics. For each of the papers and tutorials included here we provide a one-paragraph summary and commentary, including methods used, results, and possible public health applications, as well as the ResearchGate url to access them. Section 1 is an Introduction. In Section 2 we describe the web page...
In real estate models, spatial variation is an important factor in predicting land prices. Spatial dependency factors (SDFs) under spatial variation play a key role in predicting land prices. The objective of this study was to develop a novel real estate model that is suitable for Sri Lanka by exploring the factors affecting the prediction of land prices using ordinary least squares regression...
In a linear regression model, endogeneity (i.e., a correlation between some explanatory variables and the error term) makes the classical OLS estimator biased and inconsistent. When instrumental variables (i.e., variables that are correlated with the endogenous explanatory variables but not with the error term) are available to partial out endogeneity, the IV estimator is consistent and widely...
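For reference (standard notation assumed), with outcome $y$, regressors $X$ and instruments $Z$, the just-identified IV estimator and its two-stage least squares (2SLS) generalization are
$$ \hat{\beta}_{IV} = (Z^{\top}X)^{-1} Z^{\top} y, \qquad \hat{\beta}_{2SLS} = (X^{\top} P_Z X)^{-1} X^{\top} P_Z\, y, \qquad P_Z = Z(Z^{\top}Z)^{-1} Z^{\top}. $$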
An important issue in classification problems is the comparison of classifiers' predictive performance, commonly measured as the proportion of correct classifications and often referred to as accuracy or a similarity measure.
This paper suggests a two-step fixed-sequence approach in order to identify the best performing classifiers among those selected as suitable for the problem at hand. At the...
In comparative statistical tests of parallel treatment groups, a new drug is commonly considered superior to the current version if the results are statistically significant. Significance is then based on confidence intervals and p-values, the reporting of which is requested by most top-level medical journals. However, in recent years there have been ongoing debates on the usefulness of these...
Spatial dependence in environmental data is an influential criterion in clustering processes, since the results obtained provide relevant information. Because classical methods do not consider spatial dependence, ignoring this structure produces unexpected results and groupings of curves that are not similar in shape/behavior.
In this work, the clustering is performed using the modified...
This contribution is joint work between academics and a research group of STMicroelectronics (Italy), a leading company in semiconductor manufacturing.
The problem under investigation refers to a predictive maintenance manufacturing system in Industry 4.0. Modern predictive maintenance is a condition-driven preventive maintenance program that uses possibly huge amounts of data for monitoring...
A new methodology based on bootstrap resampling techniques is proposed to estimate the distribution of Mandel's h and k statistics, which are commonly applied to identify laboratories that supply inconsistent results, i.e., to detect outlier laboratories by testing the hypotheses of reproducibility and repeatability (R & R), in the framework of Interlaboratory Studies (ILS)....
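For context (the standard ISO 5725-2 definitions; the bootstrap scheme itself is the contribution of the talk), for laboratory $i$ with cell mean $\bar{x}_i$ and cell standard deviation $s_i$ among $p$ laboratories,
$$ h_i = \frac{\bar{x}_i - \bar{\bar{x}}}{\sqrt{\tfrac{1}{p-1}\sum_{j=1}^{p}(\bar{x}_j - \bar{\bar{x}})^2}}, \qquad k_i = \frac{s_i}{\sqrt{\tfrac{1}{p}\sum_{j=1}^{p} s_j^2}}. $$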
Millions of measuring instruments are verified each year before being placed on the markets worldwide. In the EU, such initial conformity assessments are regulated by the Measuring Instruments Directive (MID) and its modules F and F1 allow for statistical acceptance sampling.
This paper re-interprets the acceptance sampling conditions formulated by the MID in the formal framework of...
As the market share of wind energy has been rapidly growing, wake effect analysis is gaining substantial attention in the wind industry. Wake effects represent a wind shade cast by upstream turbines in the downwind direction, resulting in power deficits in downstream turbines. To quantify the aggregated influence of wake effects on a wind farm's power generation, various simulation models have...
JMP 16 marks a substantial expansion of JMP’s capabilities. In the area of DoE, JMP 16 introduces the candidate set designer, which gives the user complete control over the possible combinations of factor settings that will be run in the experiment. The candidate set design capability is also very useful as an approach to Design for Machine Learning, where we use principles of optimal design...
The book "Applications of DoE in Engineering and Science" by Leonard Lye contains a wealth of design of experiments (DOE) case studies, including factorial designs, fractional factorial designs, various RSM designs, and combination designs. A selection of these case studies will be presented using the latest version of Design Expert®, a software package developed for use in DOE applications,...
In manufacturing systems, many quality measurements are in the form of images, including overlay measurements in semiconductor manufacturing, and dimensional deformation profiles of fuselages in an aircraft assembly process. To reduce the process variability and ensure on-target quality, process control strategies should be deployed, where the high-dimensional image output is controlled by one...
Coordinate metrology is a key technology supporting the quality infrastructure associated with manufacturing. Coordinate metrology can be thought of as a two-stage process, the first stage using a coordinate measuring machine (CMM) to gather coordinate data $\mathbf{x}_{1:m} = \{\mathbf{x}_i,i =1, \ldots,m\}$, related to a workpiece surface, the second extracting a set of parameters...
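For orientation (a generic orthogonal least-squares formulation; notation beyond $\mathbf{x}_{1:m}$ is assumed here), the second stage typically extracts the surface parameters $\mathbf{a}$ by solving
$$ \hat{\mathbf{a}} = \arg\min_{\mathbf{a}} \sum_{i=1}^{m} d\!\left(\mathbf{x}_i, S(\mathbf{a})\right)^2, $$
where $d(\mathbf{x}_i, S(\mathbf{a}))$ is the orthogonal distance from the measured point to the model surface $S(\mathbf{a})$.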
The industrial cyber-physical systems (ICPS) will accelerate the transformation of offline data-driven modeling to fast computation services, such as computation pipelines for prediction, monitoring, prognosis, diagnosis, and control in factories. However, it is computationally intensive to adapt computation pipelines to heterogeneous contexts in ICPS in manufacturing.
In this paper, we...
In modern metrology an exact specification of unknown characteristic values, such as shape parameters or material constants, is often not possible due to e.g. the ever decreasing size of the objects under investigation. Using non-destructive measurements and inverse problems is both an elegant and economical way to obtain the desired information while also providing the possibility to...
Randomized experiments are a quintessential methodology in science, engineering, business and industry for assessing the causal effects of interventions on outcomes. Randomization tests, conceived by Fisher, are useful tools for analyzing data obtained from such experiments because they assess the statistical significance of estimated treatment effects without making any assumptions about the...
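A minimal sketch of a Fisher-style randomization test for a two-group experiment (illustrative data and a simple difference-in-means statistic; these are assumptions, not the specific tests discussed in the talk):

```python
import numpy as np

rng = np.random.default_rng(7)

# Observed outcomes and treatment assignment from a completely randomized experiment
y = np.array([12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 12.7, 10.9])
treated = np.array([1, 0, 1, 0, 1, 0, 1, 0], dtype=bool)

def diff_in_means(y, treated):
    """Estimated treatment effect: mean(treated) - mean(control)."""
    return y[treated].mean() - y[~treated].mean()

obs_stat = diff_in_means(y, treated)

# Re-randomize the treatment labels many times under the sharp null of no effect
n_perm = 10_000
perm_stats = np.array([diff_in_means(y, rng.permutation(treated)) for _ in range(n_perm)])

# Two-sided randomization p-value
p_value = np.mean(np.abs(perm_stats) >= abs(obs_stat))
print(f"observed difference = {obs_stat:.2f}, randomization p-value = {p_value:.3f}")
```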