“Two-Pass Orthogonal Least Squares Algorithm to Train and Reduce Fuzzy Logic Systems”
by Jörg Hohensohn and Jerry M. Mendel
August 1994
Fuzzy logic systems (FLS's) can be designed from training data (i.e., given numerical input/output pairs) using supervised learning algorithms. An FLS can be viewed as a linear combination of nonlinear fuzzy basis functions (FBF's), so linear optimization methods such as orthogonal least squares (OLS) can be applied to tune its parameters. The designed FLS should be as small as possible while still fulfilling the required function-approximation task, so as to reduce computation when it is applied to new data. OLS can be used to select a subset of the most significant FBF's out of an initial pool set up by the training data. The drawback to OLS is that the resulting system still contains information from all $M$ initial rules derived from the training points, even though only the most important $M_{S}$ rules have been selected by OLS. This is due to a normalization of the FBF's, and it leads to excessive computation times during further processing. Our solution is to construct new FBF's from the reduced rulebase and to run OLS a second time. The resulting system not only has reduced computational complexity, but also behaves very similarly to the unreduced system. The second OLS pass can be applied to a larger set of training data, which significantly improves precision. We illustrate our two-pass OLS algorithm on prediction of the Mackey-Glass chaotic time series. Extensive simulations are given in which we examine tradeoffs among the design parameters of an FLS for this application.
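The two-pass procedure summarized above can be sketched as follows. This is a minimal 1-D illustration, not the authors' implementation: the Gaussian membership functions, the width `sigma`, the toy sine target, and the greedy error-reduction-ratio selection (classical OLS subset selection) are all assumptions chosen for illustration. The key point it demonstrates is that the FBF normalization couples every rule, so after selecting $M_S$ rules the FBF's are rebuilt (renormalized) over the reduced rulebase alone before the second least-squares fit.

```python
import numpy as np

def fbf_matrix(x, centers, sigma):
    # Gaussian membership of each input to each rule center,
    # normalized across rules -- this normalization is what couples
    # every FBF to all rules in the current rulebase.
    g = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))
    return g / g.sum(axis=1, keepdims=True)

def ols_select(P, d, n_select):
    # Greedy OLS subset selection: repeatedly pick the column of P with
    # the largest error-reduction ratio w.r.t. the target d, after
    # orthogonalizing it against the columns already chosen.
    selected, Q = [], []
    for _ in range(n_select):
        best, best_err, best_w = None, -1.0, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].copy()
            for q in Q:                      # Gram-Schmidt step
                w -= (q @ P[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:
                continue
            err = (w @ d) ** 2 / (denom * (d @ d))  # error-reduction ratio
            if err > best_err:
                best, best_err, best_w = j, err, w
        selected.append(best)
        Q.append(best_w)
    return selected

# Toy setup: one rule per training point (M = 40), target is a sine.
x = np.linspace(0.0, 1.0, 40)
d = np.sin(2 * np.pi * x)
centers = x.copy()

# Pass 1: build FBF's from all M rules, select the M_s most significant.
P1 = fbf_matrix(x, centers, sigma=0.1)
keep = ols_select(P1, d, n_select=6)        # M_s = 6

# Pass 2: rebuild (renormalize) the FBF's from the reduced rulebase only,
# then refit the consequent parameters by least squares.
P2 = fbf_matrix(x, centers[keep], sigma=0.1)
theta, *_ = np.linalg.lstsq(P2, d, rcond=None)
y = P2 @ theta                              # reduced-FLS output
```

After the second pass, evaluating the FLS on new data touches only the $M_S$ retained rules; in the one-pass design, the row-wise normalization in `fbf_matrix` would still require summing memberships over all $M$ original rules.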