Abstract
A robust approach should be considered when estimating regression coefficients in multiresponse problems. Many models are derived from the least squares method; because outliers are unavoidable in most real cases and the least squares method is sensitive to such points, robust regression appears to be a more reliable and suitable approach to this problem. Additionally, in many problems more than one response must be analyzed, so multiresponse formulations have broader application. The robust regression approach used in this paper is based on M-estimator methods. One of the most widely used weighting functions in robust regression estimation is Huber's function. In multiresponse surfaces, estimating each response individually can cause problems in later inferences because of separate outlier detection schemes. To address this obstacle, a simultaneous independent multiresponse iterative reweighting (SIMIR) approach is suggested. By introducing a coincident outlier index (COI) criterion that reflects the realistic number of outliers in a multiresponse problem, the performance of the proposed method is illustrated. Two well-known cases from the literature are presented as numerical examples. The results show that the proposed approach performs better than the classic estimation, and the proposed index demonstrates the efficiency of the proposed approach.
Introduction
A common method of explaining and analyzing the results of experiments is response surface modeling. This term is used for a regression equation that describes the overall behavior of the control variables, the nuisance factors, and the response or responses. The estimated function can be used to predict the response at any value of the controllable factors. After gathering experimental data, a relationship between the factors (input data) and the response or responses (output results) should be defined to complete the analysis procedure. If we cannot construct a suitable model that defines the precise relation between the input variables and the response or responses, then the interpretations will not be reliable. After determining an experimental design and performing the experiments, the next steps are the statistical analysis and the selection of the optimal input variables.
One of the most common approaches to regression coefficient estimation is the Least Squares (LS) method. An LS solution determines the coefficient values that minimize the sum of squared residuals, in other words, the sum of squared differences between the experimental response values and those calculated from the fitted equation.
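As a minimal sketch (using synthetic data, not the experimental data analyzed later in this paper), the LS solution that minimizes the sum of squared residuals can be computed directly:

```python
import numpy as np

# Synthetic illustration: fit y = b0 + b1*x1 + b2*x2 by least squares,
# i.e., minimize the sum of squared residuals sum_i (y_i - x_i' b)^2.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.uniform(-1, 1, size=(20, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=20)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat  # orthogonal to the columns of X at the optimum
```

At the optimum the residual vector is orthogonal to the design columns, which is the property that a single gross outlier exploits to drag the whole fit.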
The quality of a manufactured product is often evaluated by several performance measures, called quality characteristics, each of which is described by a response variable. The values of these response variables are affected by one or more process parameters, the input variables. Processes with two or more response variables often behave in a conflicting way. A group of responses frequently characterizes the performance of a manufactured product; these responses are usually correlated and measured on different measurement scales. A decision-maker must therefore resolve the parameter selection problem to optimize each response. This is the multiresponse optimization problem, which is subject to different response requirements.
It is usually difficult to realize an optimal level of the input variables that can result in values close to the ideal or target values for all of the response variables. The main goal of multiresponse optimization is, therefore, to find the settings of the input variables that achieve an optimal compromise in the response variables.
In many cases, especially in experimental results, some of the data should be treated as outliers. Such points, which may arise from operator reading faults or similar factors, can have a confusing effect on the overall interpretation of the results. A data observation, or a group of data points, that is well separated from the majority of the pattern of observations, in other words, data that deviate from the general pattern, is called an outlier. To some extent, however, such points are unavoidable in practice. The main concept in robustness is the presence of outliers and, more precisely, changes in the distribution of the data. A common way to address outliers under LS is to identify the bad observations. To detect outliers, graphical procedures such as normal probability plots and numerical regression diagnostics have been proposed; these procedures are described in Weisberg (1985). Wisnowski et al. (2001) studied multiple outlier detection procedures for the linear regression model, using Monte Carlo simulation to compare different approaches and discussing the performance and limitations of each method. Outlier detection in multivariate problems is harder to visualize; simple visual methods can be applied to describe the problem. Fernandez Pierna et al. (2002) compared one such method, the convex hull method, with classical techniques and robust methods.
The concept of outlier data is qualitative in the sense that outliers are not the same as incorrect data but rather data that differ from the majority. Often, the presence of outlier data indicates an unexpected phenomenon at the start of experimentation, one that can possibly be explained by experimental causes. A problem often encountered in applications of regression is the presence of one or more outliers in the data. Outliers can be generated by a simple operational mistake, a small sample size, or other factors. Even one outlying observation can destroy an LS estimation, resulting in parameter estimates that provide no useful information for the majority of the data. Robust regression analysis was developed to improve LS estimation in the presence of outliers and to provide additional information about the valid observations. Its primary purpose is to fit a model that represents the information in the majority of the data.
To address this obstacle, robust regression approaches were proposed by different authors to handle the above-mentioned problems. Hampel (1971) introduced a general notion of robustness, and computational approaches include Huber's (1981) robust statistics and different estimation algorithms. One common robust estimation approach is the M-estimator, which is based on maximum likelihood estimation (MLE); LS can be derived as a special case of this type of estimation with a particular residual function. The main idea of M-estimators is to replace the squared residuals with another function. The M-estimator works by an iterative procedure; as a consequence, several authors (e.g., Cummins and Andrews 1995) have called this estimator iteratively reweighted least squares, or the IRLS method. In our case as well, the iterative weighting method can be applied to obtain robust coefficient estimates. One M-estimator function is Huber's (1981), which has become increasingly popular. Since then, more robust approaches have been discussed by investigators. However, since M-estimators are simple to understand, and since some recent methods are so sensitive that they wrongly identify good points as outliers (Hund et al. 2002), we chose to use this type of estimator. Maronna et al. (2006) explain the most recent robust regression algorithms.
Morgenthaler and Schumacher (1999) discussed robust response surfaces in chemistry based on the design of experiments. Hund et al. (2002) presented various methods of outlier detection and evaluated robustness tests with different experimental designs, comparing robust regression methods with experimental design reconstruction methods. Wiens and Wu (2010) conducted a comparative study of M-estimators and presented designs that remain near-optimal across a range of possible regression models.
In multiresponse problems, the first step is the accurate determination of the regression coefficients, because contamination and outlier data can have a negative effect on the models. The robustness concept in multiresponse settings has been presented by different authors; robust design, in particular, was developed by Taguchi (1986, 1987). This approach is often used in process improvement projects to redesign processes with the aim of increasing customer satisfaction by improving operational performance. In robust design, the model parameters are usually estimated by LS. The methodology specifically utilizes both experimentation and optimization to determine the system's optimum operating conditions. Koksoy (2006, 2008) presented MSE as a robust design criterion in multiresponse problems and used genetic algorithms and the generalized reduced gradient method in the solution stage. These studies present a general framework for multivariate problems in which data are collected from a combined array. For example, Quesada and Del Castillo (2004) proposed a dual-response approach to multivariate robust parameter designs.
Several papers also consider correlations between responses, allowing the variance-covariance structure of the multiple responses to be accounted for; in this case, multivariate techniques can be applied. Daszykowski et al. (2007) reviewed robust models, univariate and multivariate outliers, and their effects on data analysis. One of the most efficient and useful robust multivariate regression methods is the minimum covariance determinant estimator proposed by Rousseeuw et al. (2004).
In multiresponse problems, robust regression approaches can be used to decrease the effects of contamination and to identify outliers. In this paper, it is assumed that there is no correlation between responses; a similar assumption is made in other studies, such as Koksoy (2008). However, if each response is considered separately and univariate M-estimators are used to estimate the coefficients, a problem can occur: outlier detection needs to consider all of the responses simultaneously. An observation that appears outlying in only one response cannot be considered a wrong observation while the other responses show normal behavior. If the responses are treated individually, an observation could be flagged as outlier data for one response and, in an iterative procedure, be downweighted more than is appropriate. Considering all of the responses simultaneously leads to the real number of outliers in a multiresponse problem. In this paper, we suppose that outlier data occur because of a mistake in the experimentation, not in the recording of the data; hence, treating an observation as an outlier for one response while the others are considered uncontaminated is not rational. A brief review of the literature is given in Table 1, as follows:
To the best of our knowledge, there are few studies on multiresponse robust regression, and this paper focuses on multiresponse robust regression that considers the response residuals simultaneously. To estimate the regression coefficients of the multiresponse problem, we propose a procedure that applies simultaneous independent multiresponse iterative reweighting with a modified M-estimator weighting function, so that a more precise estimate of each response can be obtained. Based on this procedure, a new criterion named the coincident outlier index (COI) is proposed, and the performance of the procedure is analyzed.
This paper is organized as follows. The 'Using M-estimators for robust estimation of regression coefficients' section presents the robust M-estimator procedure and the modification of the response surface by an iterative weighting procedure. The proposed method for the multiresponse problem is defined in the 'Robust simultaneous estimation of multiresponse problem' section. To illustrate the proposed method, numerical examples are presented before the 'Conclusions' section. Finally, the last section provides the conclusions of this paper.
Using M-estimators for robust estimation of regression coefficients
The M-estimator proposed by Huber (1981) is a generalized form of maximum likelihood estimators (MLEs). This part follows Maronna et al. (2006). The M-estimator is the solution of Equation (1), as follows:
where ρ is a function with specific properties. Suppose that f is a density function and ρ = −log f; then θ̂ is the MLE of the parameter. There are several ρ functions; one common choice is Huber's (1981), defined in Equation (2):
By considering this function, defining ψ = ρ′, and defining a weighting function by Equation (3), the iterative algorithm for estimating the unknown parameter can be stated.
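Huber's ρ, its derivative ψ, and the implied weight W(x) = ψ(x)/x can be sketched as follows (the tuning constant C is the analyst's choice; 1.345 is a common default for 95% efficiency under normal errors, while the examples later in this paper use C = 1.37):

```python
import numpy as np

def huber_rho(x, C=1.345):
    # Quadratic near zero, linear in the tails (Equation (2)-style)
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= C, 0.5 * x**2, C * np.abs(x) - 0.5 * C**2)

def huber_psi(x, C=1.345):
    # psi = rho': the identity, clipped at +/- C
    x = np.asarray(x, dtype=float)
    return np.clip(x, -C, C)

def huber_weight(x, C=1.345):
    # W(x) = psi(x)/x = min(1, C/|x|); equals 1 at x = 0
    x = np.asarray(x, dtype=float)
    return np.minimum(1.0, C / np.maximum(np.abs(x), 1e-12))
```

Small residuals keep full weight 1, while a residual twice the threshold is halved, which is what bounds the influence of any single observation.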
M-estimates of regression coefficients are developed in the same way as in the previous part. Equation (4) should be considered, and the coefficients can be obtained by solving the following equation:
The scale σ̂ in Equation (4) can be estimated separately by Equation (5), as follows, or solved simultaneously with Equation (4); r_i is the residual of the i-th observation.
To make the estimation procedure invariant with respect to the scale of the residuals, the r_i are divided by s. The value of s is often taken to be 1.4826 MAD, where MAD is the median of the absolute deviations of the residuals from their median, and 1.4826 is a bias adjustment that makes the estimate consistent for the standard deviation under the normal distribution.
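This scale estimate can be written as a short helper (a sketch of the Equation (5)-style estimate described above):

```python
import numpy as np

def robust_scale(residuals):
    # s = 1.4826 * MAD, where MAD is the median of the absolute
    # deviations of the residuals from their median
    r = np.asarray(residuals, dtype=float)
    return 1.4826 * np.median(np.abs(r - np.median(r)))
```

Unlike the sample standard deviation, this estimate barely moves when a single gross outlier is added, which is exactly why it is used to scale the residuals.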
An iterative reweighting method can be defined as follows. First, compute an initial estimate β̂_0 and compute σ̂ from Equation (5). Then, for k = 0, 1, 2, …:

(a)

Given β̂_k, for i = 1, …, n, compute r_{i,k} = y_i − X_i′β̂_k and w_{i,k} = W(r_{i,k}/σ̂).

(b)

Compute β̂_{k+1} by solving the following:

Σ_{i=1}^{n} w_{i,k} X_i (y_i − X_i′β̂_{k+1}) = 0 (6)

Finally, the algorithm stops when max_i |r_{i,k} − r_{i,k+1}|/σ̂ < ε. This algorithm converges if W(x) is nonincreasing for x > 0 (Maronna et al. 2006). If ψ is monotone, the solution is essentially unique, so the choice of the starting point influences the number of iterations but not the final result. This procedure is called iteratively reweighted least squares (IRWLS).
In summary, the procedure computes initial coefficients of the regression model, then the residuals and weights, and then new coefficients via Equation (6). Because the coefficients, residuals, and weights change from one pass to the next, the procedure is repeated until the change in the estimate between iterations is sufficiently small. Like the LS estimators, these robust coefficient estimators are essentially unbiased.
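The loop above, steps (a) and (b) with the stopping rule, can be sketched for a single response as follows. The per-iteration scale update and the small numerical floors are implementation choices of this sketch, not prescribed by the text, and the data are hypothetical:

```python
import numpy as np

def irwls(X, y, C=1.345, eps=1e-6, max_iter=100):
    """Iteratively reweighted least squares with Huber weights (a sketch)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # initial LS estimate
    r = y - X @ beta
    for _ in range(max_iter):
        # scale: 1.4826 * MAD, floored to avoid division by zero
        sigma = max(1.4826 * np.median(np.abs(r - np.median(r))), 1e-8)
        # step (a): Huber weights W(u) = min(1, C/|u|) on scaled residuals
        w = np.minimum(1.0, C / np.maximum(np.abs(r / sigma), 1e-12))
        # step (b): solve the weighted normal equations X'WX beta = X'Wy
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)
        r_new = y - X @ beta
        if np.max(np.abs(r - r_new)) / sigma < eps:  # stopping rule
            return beta
        r = r_new
    return beta

# Hypothetical data: a straight line with one grossly corrupted run.
x = np.arange(10.0)
X = np.column_stack([np.ones(10), x])
y = 2.0 + 3.0 * x
y[-1] += 50.0
beta_robust = irwls(X, y)  # stays close to (2, 3) despite the outlier
```

The plain LS fit of the same data pulls the slope far from 3, while the reweighted fit recovers it, illustrating the bounded influence of the outlying run.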
Robust simultaneous estimation of multiresponse problem
In a multiresponse problem, similar to a single-response problem, robust estimation of the regression model is an important issue. A simple approach is to consider the responses individually and to estimate each model by the robust M-estimator approach. However, this approach can cause problems. Assume that in one experiment a specific response residual appears to be an outlier, while the other responses show no sign of unacceptable data for that experiment. Outlier data of the kind considered here arise from a fault in the experimentation, so it is not rational to label an experiment's result an outlier for one response when it is acceptable for the other responses. Under such reasoning, a large share of the experiments could be flagged as outliers, and for each response some points would mistakenly be assumed to be outliers. Thus, assuming independence between responses, a simultaneous independent multiresponse iterative reweighting (SIMIR) approach is proposed to solve this problem. In this approach, based on M-estimators, the weighting step is modified so that the residuals are downweighted by considering every response of the multiresponse problem in each iteration simultaneously. The variable r(i)_j denotes the residual of the i-th experiment for the j-th response. The proposed weighting function is given in Equation (7):
The proposed pseudocode is as follows:

1. Perform all of the experiments and record the actual value of each response in each experiment.

2. Estimate the regression coefficients of the initial regression model by an appropriate method.

While max_i |r_{i,k} − r_{i,k+1}|/σ̂ ≥ ε do (ε is determined by the analyst):

3. Calculate the residuals of each response in all of the experiments.

4. Compute σ̂ by Equation (5).

5. If all r(i)_j are smaller than the threshold of the Huber M-estimator, set the weights of the residuals in this iteration equal to 1 and go to step 7; else, go to step 6.

6. Downweight the residuals by considering the values of the residuals in all of the responses, using the function in Equation (7).

7. Estimate the regression coefficients by solving Equation (6).
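Because Equation (7) itself is not reproduced above, the following sketch of the joint weighting step uses an assumed rule: an experiment keeps weight 1 when every scaled residual is within Huber's threshold C, and is otherwise downweighted by the size of its residuals summed over all responses, against the threshold T = (⌊n/2⌋ + 1) × C that appears in the COI definition. This aggregate rule is our assumption, not necessarily the paper's exact Equation (7):

```python
import numpy as np

def simir_weights(R_scaled, C=1.37):
    """Joint experiment weights from the scaled residuals of all responses.

    R_scaled has one row per experiment and one column per response.
    The aggregation rule here is an assumption standing in for Eq. (7).
    """
    n_resp = R_scaled.shape[1]
    T = (n_resp // 2 + 1) * C                        # COI-style threshold
    total = np.sum(np.abs(R_scaled), axis=1)
    within = np.all(np.abs(R_scaled) <= C, axis=1)   # step 5: all small?
    # step 6: downweight jointly when some response residual is large
    return np.where(within, 1.0, np.minimum(1.0, T / np.maximum(total, 1e-12)))
```

Each response's coefficients are then re-estimated by weighted least squares using this common weight vector, so no experiment is discarded for one response while being retained for the others.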
The performance of the proposed method is presented by numerical examples in the next section. By applying the SIMIR procedure, the squared errors (SE) of the estimated parameters (the regression coefficients) are reduced compared with the least squares estimation of each response, although this strategy is somewhat less precise than robust individual estimation. One important issue in multiresponse problems is determining the real number of outliers.
The SE criterion is computed in Equation (8):
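Assuming Equation (8) is the usual sum of squared deviations of the estimated coefficients from the actual ones (an assumption, as the equation is not reproduced here), it can be sketched as:

```python
import numpy as np

def squared_error(theta_actual, theta_hat):
    # SE = sum over coefficients of (theta - theta_hat)^2
    theta_actual = np.asarray(theta_actual, dtype=float)
    theta_hat = np.asarray(theta_hat, dtype=float)
    return float(np.sum((theta_actual - theta_hat) ** 2))
```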
Residuals flagged for a single response are not necessarily outliers in the whole multiresponse problem, so it is not correct to count outliers response by response. We present the COI to detect the real number of outliers in a robust estimation of the regression coefficients of a multiresponse problem. The index is computed as follows: define a threshold from the constant C of the Huber procedure (as used in Equation (7)) applied to the scaled residuals. If n is the number of responses, the proposed threshold is T = (⌊n/2⌋ + 1) × C. If the sum of an experiment's scaled residuals exceeds this threshold, that experiment is treated as an outlier; the COI is thus the number of experiments exceeding T. Equation (9) defines the proposed COI, as follows:
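Under this definition (summing absolute scaled residuals per experiment; taking absolute values is our assumption where the text says "the sum of the residuals"), the COI can be computed as:

```python
import numpy as np

def coi(R_scaled, C=1.37):
    """Coincident outlier index: count of experiments whose summed absolute
    scaled residuals exceed T = (floor(n/2) + 1) * C, with n responses."""
    n_resp = R_scaled.shape[1]
    T = (n_resp // 2 + 1) * C
    return int(np.sum(np.sum(np.abs(R_scaled), axis=1) > T))
```

With four responses and C = 1.37, for example, T = 4.11, so an experiment counts as a coincident outlier only when its residuals are large taken jointly, not when a single response alone is mildly aberrant.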
where N is the number of experiments.
Numerical example
In this section, the efficiency of the proposed approach is compared with existing approaches in two cases. In case one, the number of coinciding outliers differs from the number of outliers detected by an individual procedure. In case two, most of the outliers stem from a true fault by the experimenter, and the number of coinciding outliers does not differ significantly from the number detected individually.
Case 1: Tire tread compound problem
In this section, we illustrate the proposed method on the well-known tire tread compound problem, originally presented by Derringer and Suich (1980). The model involves three chemical factors, namely silica (x_1), silane (x_2), and sulfur (x_3), and four responses. The experimental data are given in Table 2.
As a first step, we find a primary regression model for the four responses. A central composite design (CCD) with six center points is applied to describe the model. All of the controllable variables satisfy −1.63 ≤ x_i ≤ 1.63, i = 1, 2, 3. The regression coefficients obtained by least squares estimation according to the CCD are given in Table 3, as follows:
The scaled residuals of this multiresponse problem are reported in Table 4, as follows:
For the first response, the residuals of experiments 2, 7, 15, 16, and 19 appear to be outliers; for the second response, those of experiments 5, 8, and 13; and for the third response, those of experiments 11, 17, and 18. The fourth model contains no residual that implies an outlier. By omitting the implied outliers, actual regression models can be obtained by least squares, because least squares is the most efficient method in the absence of outlier data; this is reported as the actual result in Table 5. In addition, the robust individual regression approach and the SIMIR approach are applied to obtain the robust regression models, and they can be compared by the SE criterion.
First, we consider each response individually and apply the M-estimator procedure to each response; the constant C in this example is taken as 1.37. Finally, the SIMIR procedure is applied to the data. The coefficients obtained by the actual, robust individual, and SIMIR procedures are given in Table 5.
Then, to evaluate the mentioned procedures (actual, individual robust, and SIMIR), the SE of the estimated parameters is computed using Equation (8), and the results are reported in Table 6. In Equation (8), θ is the regression coefficient from the actual method and θ̂ is the regression coefficient from the method under consideration.
Additionally, the evaluation is given in Figure 1, as follows:
A comparison among the three approaches by considering the sum of squared errors (SSE) is computed in Table 6 and illustrated in Figure 1. In this figure, for the first three responses, the scaled values of the SSE are given.
The results show that the robust individual regression estimation is more precise than the least squares estimation or the proposed SIMIR approach, but the coincident outlier index presented in the previous section is more reliable and realistic. Moreover, the results indicate that in the absence of outliers, LS performs better than both the robust individual and the SIMIR procedures. In this example, counting the outliers of the multiresponse problem response by response would yield 11 experiments. By the proposed COI index, however, the real number of outliers in the multiresponse problem is 2, and it is more plausible that roughly 10% of the 20 experiments result in outliers than that more than half do.
Case 2: Elastic element of a force transducer problem
We provide another example to illustrate the efficiency of the proposed method. The following example was presented as a case study in Romano et al. (2004), concerning the elastic element of a force transducer. It involves a combined array design with three control (x) and two noise (z) variables. The control factors are the three parameters that describe the element configuration, namely the lozenge angle (x1), the bore diameter (x2), and the half-length of the vertical segment (x3). The noise factors are the deviation of the lozenge angle from its nominal value (z1) and the deviation of the bore diameter from its nominal value (z2). These internal noise factors are assumed to be independent. The two indicators, nonlinearity (y1) and hysteresis (y2), define the responses. Table 7 displays the data from this experiment.
First, we need to find a primary regression model for the two responses. All of the controllable variables satisfy −1 ≤ x_i ≤ 1, i = 1, 2, 3. The regression coefficients obtained by the least squares estimation method are given in Table 8, as follows:
The scaled residuals of this multiresponse problem are reported in Table 9, as follows:
Consequently, for the first response, outliers appear in the residuals of experiments 17, 18, 21, 22, 23, and 25; for the second response, in experiments 19, 20, 21, and 22. By omitting these values, the actual regression models can be obtained. To obtain the robust regression models, two different approaches are applied and compared by the SE criterion.
First, we consider each response individually and apply the M-estimator procedure to each response, with the constant C again taken as 1.37; the SIMIR procedure is then applied. Thus, three groups of coefficients are reported in Table 10.
To evaluate the three procedures, the proposed SE criterion is calculated, and the results are given in Table 11, as follows:
In addition, to provide more illustration, Figure 2 is given as follows:
Similar to the previous example, the results show that the robust individual regression estimation performs better than the least squares method, though not significantly so, with the SIMIR approach close behind; the COI, however, is more realistic and accurate. In this example, counting the outliers of the multiresponse problem response by response would yield eight experiments, whereas by the proposed COI index the real number of outliers in the multiresponse problem is six.
Therefore, by considering these two examples, the number of detected outliers calculated by both the classical method and the SIMIR proposed approach is shown in Figure 3, as follows:
Conclusions
As discussed in the previous sections, this paper presented a robust simultaneous estimation of regression coefficients in the multiresponse problem when contaminated data exist. The results showed that the proposed approach identifies as real outliers of the multiresponse problem a set of points different from those flagged by individual robust regression. An aggregative weighting function was therefore proposed, in which all of the responses are considered together. The SIMIR approach performed better than the classic method for detecting outliers and estimating regression coefficients, and the proposed COI index gives a better account of outlier detection than the classical approach. For future research, other robust regression approaches can be studied; considering a problem with correlated responses is another direction for related future work.
Authors' information
MB is an associate professor of the Department of Industrial Engineering, Shahed University, and his research interests are Multiple Response Optimization and Facilities Location Problem. AM is a Ph.D. student of the Department of Industrial Engineering at Amirkabir University of Technology (Tehran Polytechnic).
References
Cummins DJ, Andrews CW: Iteratively reweighted partial least squares: a performance analysis by Monte Carlo simulation. J Chemometr 1995, 9: 489–507. 10.1002/cem.1180090607
Daszykowski M, Kaczmarek K, Vander Heyden Y, Walczak B: Robust statistics in data analysis - a review: basic concepts. Chemometr Intell Lab 2007, 85: 203–219. 10.1016/j.chemolab.2006.06.016
Derringer G, Suich R: Simultaneous optimization of several response variables. J Qual Technol 1980, 12: 214–219.
Fernandez Pierna JA, Wahl F, De Noord OE, Massart DL: Methods for outlier detection in prediction. Chemometr Intell Lab 2002, 63: 27–39. 10.1016/S0169-7439(02)00034-5
Hampel FR: A general definition of qualitative robustness. Ann Math Statist 1971, 42: 1887–1896. 10.1214/aoms/1177693054
Hund E, Massart DL, Smeyers-Verbeke J: Robust regression and outlier detection in the evaluation of robustness tests with different experimental designs. Anal Chim Acta 2002, 463: 53–73. 10.1016/S0003-2670(02)00337-9
Huber PJ: Robust statistics. New York: Wiley; 1981.
Maronna RA, Martin RD, Yohai VJ: Robust statistics: theory and methods. New York: Wiley; 2006.
Morgenthaler S, Schumacher MM: Robust analysis of a response surface design. Chemometr Intell Lab 1999, 47: 127–141. 10.1016/S0169-7439(98)00199-3
Koksoy O: A nonlinear programming solution to robust multiresponse quality problem. Appl Math Comput 2008, 196: 603–612. 10.1016/j.amc.2007.06.023
Koksoy O: Multiresponse robust design: mean square error (MSE) criterion. Appl Math Comput 2006, 175: 1716–1729. 10.1016/j.amc.2005.09.016
Quesada GM, Del Castillo E: A dualresponse approach to the multivariate robust parameter design problem. Technometrics 2004, 46: 176–187. 10.1198/004017004000000220
Romano D, Varetto M, Vicario G: Multiresponse robust design: a general framework based on combined array. J Qual Technol 2004, 36: 27–37.
Rousseeuw P, Van Aelst S, Van Driessen K, Agull J: Robust Multivariate Regression. Technometrics 2004, 46: 293–305. 10.1198/004017004000000329
Taguchi G: Introduction to quality engineering. White Plains, NJ: Kraus International Publications; 1986.
Taguchi G: System of Experimental Design: Engineering Methods to Optimize Quality and Minimize Cost. White Plains, NJ: Quality Resources; 1987.
Weisberg S: Applied linear regression. 2nd edition. New York: Wiley; 1985.
Wiens D, Wu EKH: A comparative study of robust designs for Mestimated regression models. Comput Stat Data An 2010, 54: 1683–1695. 10.1016/j.csda.2010.01.033
Wisnowski JW, Montgomery DC, Simpson JR: A comparative analysis of multiple outlier detection procedures in the linear regression model. Comput Stat Data An 2001, 36: 351–382. 10.1016/S0167-9473(00)00042-6
Acknowledgments
Authors are thankful to the reviewers for their valuable comments.
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
MB has worked on the literature and modeling; he also proposed the response surface methodology. AM has performed the robust concept and numerical example. Both authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite this article
Bashiri, M., Moslemi, A. Simultaneous robust estimation of multiresponse surfaces in the presence of outliers. J Ind Eng Int 9, 7 (2013). https://doi.org/10.1186/2251-712X-9-7
Keywords
 Multiresponse problem
 Robust regression
 Outliers
 Mestimator