The 2007 International Joint Conference on Neural Networks (IJCNN 2007), held in Orlando, FL, marked 20 years of neural network conferences. IJCNN has long been a collaboration between the INNS and the IEEE on the technology and applications of neural computing. One of its sessions of peer-reviewed presentations brought together a diverse set of papers under the overall theme of “Temporal Data Analysis”. This issue of Neural Computing and Applications contains four expanded articles from that session, which I chaired. The articles are notable for the diversity and quality of their applications of neural computing, and they also highlight our journal’s international dimension, with authors from Brazil, Japan, Portugal, and the United States.

The examples of temporal data analysis represented in this issue of NC&A cover a wide variety of applications: time-dependence studies involving musical sounds, chemical reactions, Federal Funds rates, and general forecasting models.

Mizuki Ihara and co-authors (Nara Institute of Science and Technology and Kyoto University, Japan) present an interesting application of the source-filter model from speech synthesis to identifying musical instruments from the parameters of their time-varying sounds. Their model takes into account the temporal continuity of pitch and loudness. The parameters of their probabilistic model are estimated by minimizing the free energy, and once these parameters have been learned, instrument identification is carried out.
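For readers unfamiliar with the source-filter idea borrowed from speech synthesis, the minimal sketch below illustrates the general principle only: a periodic excitation (the source) is shaped by a resonant filter and a loudness envelope to produce a time-varying tone. The pitch, filter coefficients, and envelope are arbitrary illustrative values and do not reflect the authors’ probabilistic model or its free-energy estimation.

```python
import numpy as np
from scipy.signal import lfilter

# Minimal source-filter sketch (illustrative only, not the authors' model):
# a periodic excitation is shaped by a resonant all-pole filter, and a
# slowly varying envelope models loudness over time.
fs = 16000                       # sample rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)  # one second of audio
f0 = 220.0                       # hypothetical pitch (Hz)

# Source: an impulse train at the fundamental frequency.
source = (np.diff(np.floor(f0 * t), prepend=0.0) > 0).astype(float)

# Filter: a single resonance near 1 kHz (arbitrary "instrument" coefficients).
r, fc = 0.97, 1000.0
a = [1.0, -2 * r * np.cos(2 * np.pi * fc / fs), r ** 2]
tone = lfilter([1.0], a, source)

# Loudness: a smooth attack-decay envelope giving temporal continuity.
envelope = np.minimum(t / 0.05, 1.0) * np.exp(-3.0 * t)
signal = envelope * tone
```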

Petia Georgieva and co-authors (University of Aveiro and University of Porto, Portugal) use neural networks to estimate chemical process reaction rates. They formulate a hybrid model, combining a neural network with a mechanistic process model, that outperforms traditional reaction-rate estimation methods, and they propose a new procedure for supervised training when target outputs are not available. They successfully test their approach on two benchmark problems: estimating the precipitation rate of calcium phosphate and estimating sugar crystallization growth rates.
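The general flavor of such a hybrid formulation can be sketched as follows: a known mechanistic mass balance is integrated numerically, while the unknown reaction-rate term is supplied by a small neural network trained on measured data. The rate law, network size, and data below are illustrative placeholders, not the authors’ model or their training procedure for missing target outputs.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Schematic hybrid model: the mass balance dC/dt = -r(C) is known
# (mechanistic part), while the reaction rate r(C) is approximated by a
# small neural network trained on (concentration, rate) pairs.
rng = np.random.default_rng(0)

# Illustrative training data: a second-order rate law r(C) = k * C**2 plus
# noise stands in for rates derived from process measurements.
k = 0.5
C_train = rng.uniform(0.0, 2.0, size=(200, 1))
r_train = k * C_train[:, 0] ** 2 + 0.01 * rng.standard_normal(200)

rate_net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
rate_net.fit(C_train, r_train)

# Mechanistic part: explicit Euler integration of the mass balance, with the
# neural network supplying the rate term at each step.
dt, C = 0.01, 1.5
for _ in range(500):
    r_hat = rate_net.predict(np.array([[C]]))[0]
    C = C - dt * max(r_hat, 0.0)   # a physical reaction rate is non-negative
print(f"predicted concentration after 5 time units: {C:.3f}")
```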

A. G. Malliaris and Mary Malliaris (Loyola University Chicago, USA) study four competing methods for forecasting the temporal behavior of the short-term Federal Funds interest rate, using monthly data from 1958 to 2005. Their results indicate that the neural network model performs best when the data sample is divided into periods in which the Federal Funds rate was low, medium, or high.
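The regime-splitting idea can be illustrated schematically: the rate series is divided into low, medium, and high regimes, and a separate small neural network is fitted to lagged values within each regime. The series, thresholds, and lag structure in the sketch below are synthetic stand-ins, not the authors’ data, thresholds, or models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative regime-specific forecasting: split a monthly rate series into
# low / medium / high regimes and fit a separate small neural network to
# lagged values in each regime.
rng = np.random.default_rng(1)
rates = np.cumsum(rng.normal(0, 0.2, 576)) + 5.0   # synthetic monthly series
rates = np.clip(rates, 0.5, 19.0)

n_lags = 3
X = np.column_stack([rates[i:len(rates) - n_lags + i] for i in range(n_lags)])
y = rates[n_lags:]

# Hypothetical regime thresholds on the most recent lagged rate (percent).
low, high = 4.0, 9.0
regimes = {
    "low": X[:, -1] < low,
    "medium": (X[:, -1] >= low) & (X[:, -1] <= high),
    "high": X[:, -1] > high,
}

models = {}
for name, mask in regimes.items():
    if mask.sum() > 20:   # fit only if the regime has enough observations
        m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        models[name] = m.fit(X[mask], y[mask])
```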

Aloísio Carlos de Pina and Gerson Zaverucha (Federal University of Rio de Janeiro, Brazil) go beyond traditional time-series forecasting methods to study the potential advantages of particle filters as a generalization of Kalman-filter methods. They advocate regression error characteristic (REC) curves for visualizing and comparing forecasters, and use them to show that the particle filters perform better.
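An REC curve plots, for each error tolerance, the fraction of predictions whose error falls within that tolerance, so a curve that rises faster indicates a better forecaster. The sketch below shows how such a curve can be computed; the residuals are synthetic and merely stand in for the errors of two competing filters.

```python
import numpy as np

def rec_curve(errors):
    """Regression error characteristic curve: for each tolerance, the
    fraction of predictions whose absolute error is within that tolerance."""
    tolerances = np.sort(np.abs(errors))
    accuracy = np.arange(1, len(errors) + 1) / len(errors)
    return tolerances, accuracy

# Synthetic residuals standing in for two forecasters' errors on a test set:
# the curve that reaches high accuracy at smaller tolerances is better.
rng = np.random.default_rng(0)
err_particle = rng.normal(0, 0.5, 300)   # hypothetical particle-filter errors
err_kalman = rng.normal(0, 1.0, 300)     # hypothetical Kalman-filter errors

for name, err in [("particle filter", err_particle), ("Kalman filter", err_kalman)]:
    tol, acc = rec_curve(err)
    # tolerance needed to cover 90% of the predictions (smaller is better)
    tol90 = tol[np.searchsorted(acc, 0.9)]
    print(f"{name}: 90% of errors within ±{tol90:.2f}")
```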

I hope the readers of this journal will find this issue, with its variety of applications, interesting and useful. Whether the area of temporal data analysis is new to you or you are already deeply involved in it, the authors of these articles will be eager to hear from you, to share further information, and to learn of your ideas and projects on this topic.