A successive least-squares approach is proposed to find an optimal model of a flat neural network in a short period of time. It is built on a minimum description length (MDL) neural network, which uses the MDL principle as its stopping criterion. Unlike conventional algorithms for flat neural networks, which apply the least-squares technique only to the weights between the hidden layer and the output layer, the proposed approach extends the least-squares technique to the weights between the input layer and the hidden layer. We apply this algorithm to the chaotic Mackey-Glass time series and a chaotic laser time series. The results show that it provides satisfactory predictions within a small amount of time.
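The abstract does not give the algorithm's details, so the following is only a minimal sketch of the conventional flat-network baseline it improves upon: fixed random input-to-hidden weights, output-layer weights solved by least squares, and an MDL-style score (fit cost plus model cost) used to stop growing the hidden layer. The successive least-squares refinement of the input-layer weights described in the paper is not reproduced here, and all function names, parameters, and the synthetic Mackey-Glass data are illustrative assumptions.

```python
import numpy as np

def mackey_glass(n, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, seed=0):
    # Euler-integrated Mackey-Glass delay equation (illustrative data,
    # not the series used in the paper)
    rng = np.random.default_rng(seed)
    x = list(np.full(tau + 1, 1.2) + 0.01 * rng.standard_normal(tau + 1))
    for _ in range(n):
        x_tau = x[-tau - 1]
        x.append(x[-1] + dt * (beta * x_tau / (1 + x_tau ** p) - gamma * x[-1]))
    return np.array(x[-n:])

def embed(series, dim=4, horizon=1):
    # Delay embedding: each row is [x_t, ..., x_{t+dim-1}], target x_{t+dim+horizon-1}
    X = np.array([series[i:i + dim] for i in range(len(series) - dim - horizon + 1)])
    y = series[dim + horizon - 1:]
    return X, y

def fit_flat_net(X, y, max_hidden=60, seed=1):
    # Grow the hidden layer in steps; at each size solve the output weights
    # by least squares and score the model with an MDL-style criterion:
    # 0.5*n*log(mse) for the data, 0.5*h*log(n) for the parameters.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    best = None
    for h in range(5, max_hidden + 1, 5):
        W = rng.standard_normal((d, h))   # random input-to-hidden weights (assumed)
        b = rng.standard_normal(h)
        H = np.tanh(X @ W + b)
        beta_out, *_ = np.linalg.lstsq(H, y, rcond=None)
        mse = np.mean((H @ beta_out - y) ** 2)
        mdl = 0.5 * n * np.log(mse) + 0.5 * h * np.log(n)
        if best is None or mdl < best[0]:
            best = (mdl, W, b, beta_out)
    return best[1], best[2], best[3]

series = mackey_glass(600)
X, y = embed(series)
W, b, beta_out = fit_flat_net(X[:400], y[:400])
pred = np.tanh(X[400:] @ W + b) @ beta_out
print("test MSE:", float(np.mean((pred - y[400:]) ** 2)))
```

Because the hidden layer is only ever fitted through a linear solve, each candidate model costs one `lstsq` call, which is the source of the method's speed relative to gradient-based training.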