Robust Least Squares
Ordinary least squares estimators are sensitive to the presence of observations that lie outside the norm for the regression model of interest. The sensitivity of conventional regression methods to these outlier observations can result in coefficient estimates that do not accurately reflect the underlying statistical relationship.
Robust least squares refers to a variety of regression methods designed to be robust, or less sensitive, to outliers. EViews offers three different methods for robust least squares: M-estimation (Huber, 1973), S-estimation (Rousseeuw and Yohai, 1984), and MM-estimation (Yohai, 1987). The three methods differ in their emphases:
M-estimation addresses dependent variable outliers, where the value of the dependent variable differs markedly from the regression model norm (large residuals); the basic reweighting idea is sketched in the example below.
S-estimation is a computationally intensive procedure that focuses on outliers in the regressor variables (high-leverage observations).
MM-estimation combines S-estimation and M-estimation: the procedure first performs S-estimation, then uses the resulting estimates as the starting point for M-estimation. Because it combines the two approaches, MM-estimation addresses outliers in both the dependent and the independent variables.
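To illustrate the reweighting idea behind M-estimation, the following sketch fits a line by iteratively reweighted least squares using the Huber weight function and a median-absolute-deviation scale estimate. It is written in Python with NumPy for exposition only and is not EViews code or the exact EViews algorithm; the function names m_estimate and huber_weights, the tuning constant k = 1.345, and the simulated data are illustrative choices.

import numpy as np

def huber_weights(u, k=1.345):
    # Huber weights: 1 for standardized residuals within k, k/|u| beyond it
    a = np.abs(u)
    return np.where(a <= k, 1.0, k / a)

def m_estimate(X, y, k=1.345, tol=1e-8, max_iter=50):
    # Huber M-estimation via iteratively reweighted least squares (IRLS)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS starting values
    for _ in range(max_iter):
        resid = y - X @ beta
        # Robust scale estimate: median absolute deviation, rescaled for consistency
        scale = np.median(np.abs(resid - np.median(resid))) / 0.6745
        w = huber_weights(resid / scale, k)
        Xw = X * w[:, None]                            # apply weights row by row
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y) # weighted normal equations
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Example: a handful of contaminated observations barely moves the robust fit
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100)
y[:5] += 15.0                                          # dependent-variable outliers
X = np.column_stack([np.ones_like(x), x])
print(m_estimate(X, y))                                # roughly [1, 2] despite the outliers

Downweighting observations with large residuals in this way limits the influence of dependent variable outliers; as described above, S-estimation and MM-estimation extend robustness to high-leverage observations in the regressors as well.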
Least squares diagnostics for outlier detection are described in greater detail in “Leverage Plots” and “Influence Statistics”.