%0 Journal Article %T Multiscale Latent Variable Regression %A Mohamed N. Nounou %A Hazem N. Nounou %J International Journal of Chemical Engineering %D 2010 %I Hindawi Publishing Corporation %R 10.1155/2010/935315 %X Multiscale wavelet-based representation of data has been shown to be a powerful tool for feature extraction from practical process data. In this paper, this characteristic of multiscale representation is utilized to improve the prediction accuracy of latent variable regression models, such as Principal Component Regression (PCR) and Partial Least Squares (PLS), by developing a multiscale latent variable regression (MSLVR) modeling algorithm. The idea is to decompose the input-output data at multiple scales using wavelet and scaling functions, construct latent variable regression models at multiple scales using the scaled signal approximations of the data, and then, using cross-validation, select among all MSLVR models the one that best describes the process. The main advantage of the MSLVR modeling algorithm is that it inherently accounts for the presence of measurement noise in the data through the low-pass filters used in multiscale decomposition, which improves the model's robustness to measurement noise and enhances its prediction accuracy. The advantages of the developed MSLVR modeling algorithm are demonstrated using a simulated inferential model that predicts the distillate composition from measurements of some of the trays' temperatures.

1. Introduction

Process models are an essential part of many process operations, such as model-based control [1, 2]. However, constructing empirical models from measurements of the process variables involves many difficulties, including dealing with collinearity or redundancy in the variables and accounting for the presence of measurement noise in the data.
Collinearity is common in models that involve a large number of variables, such as Finite Impulse Response (FIR) models [3, 4] and inferential models. Collinearity inflates the variance of the estimated model parameters, which degrades their estimation accuracy. Many modeling techniques have been developed to deal with collinearity, including Ridge Regression (RR) [5–7] and latent variable regression [3–5]. RR reduces the variations in the model parameters by imposing a penalty on the norm of their estimated values. Latent variable regression models, on the other hand, use singular value decomposition to reduce the dimension of the input variables, providing a better-conditioned set of inputs. Popular latent variable regression model estimation techniques include the well-known Principal Component Regression (PCR) and Partial Least Squares (PLS) methods [3–5]. Also, the
%U http://www.hindawi.com/journals/ijce/2010/935315/
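The SVD-based conditioning that the introduction attributes to latent variable regression can be illustrated with a bare-bones PCR. This is a hedged sketch in NumPy, not the paper's algorithm: the collinear data and the choice of two components are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Collinear inputs: the third column is almost a combination of the first two,
# so ordinary least squares on X is badly conditioned.
n = 200
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
X = np.column_stack([x1, x2, x1 + x2 + 1e-3 * rng.standard_normal(n)])
y = 2 * x1 - x2 + 0.1 * rng.standard_normal(n)

# PCR: project the centered inputs onto their k leading right singular
# vectors, then regress on those well-conditioned scores.
k = 2
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:k].T                       # latent scores (n x k)
b = np.linalg.lstsq(T, y - y.mean(), rcond=None)[0]
beta = Vt[:k].T @ b                     # fold back to original coordinates

y_hat = Xc @ beta + y.mean()
mse = float(np.mean((y - y_hat) ** 2))
```

Dropping the near-zero singular direction is what reduces the parameter variance: the scores `T` have a far smaller condition number than `Xc`, so the regression in the latent space is numerically stable.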