Abstract
The values selected for the free parameters of Artificial Neural Networks usually have a strong impact on their performance. As a result, several studies have investigated the use of optimization techniques, mainly metaheuristics, to select values related to the network architecture, such as the number of hidden neurons, the number of hidden layers, and the activation function, and to the learning algorithm, such as the learning rate and the momentum coefficient. Many of these studies use Genetic Algorithms for parameter optimization. More recently, other bioinspired optimization techniques, such as Ant Colony Optimization and Particle Swarm Optimization, have also been successfully applied. Although bioinspired optimization techniques have been successfully adopted to tune neural network parameter values, little is known about the relation between the quality of the fitness estimates used to guide the search process and the quality of the solution obtained by the optimization method. In this paper, we describe an empirical study of this issue. To focus our analysis, we restricted the datasets to the domain of gene expression analysis. Our results indicate that, although the computational power saved by using simpler estimation methods can be used to evaluate more solutions during the search, the use of accurate fitness estimates to guide that search is the most important factor in obtaining good solutions.
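The trade-off investigated in the abstract can be illustrated with a minimal sketch: under a fixed budget of network trainings, a search may either evaluate many candidates with a cheap, noisy fitness estimate (e.g. a single train/test split) or fewer candidates with a more accurate one (e.g. averaging several splits, as in cross-validation). The fitness surface, noise model, and parameter ranges below are invented for illustration only and are not the paper's experimental setup.

```python
import random

random.seed(0)

# Hypothetical "true" validation accuracy of an MLP as a function of the
# number of hidden neurons and the learning rate. Peaks at (30, 0.1).
# This surface is an assumption made for illustration, not real data.
def true_fitness(hidden, lr):
    return 1.0 - ((hidden - 30) / 100) ** 2 - (lr - 0.1) ** 2

# Cheap estimate: one noisy measurement (e.g. a single train/test split).
def cheap_estimate(hidden, lr):
    return true_fitness(hidden, lr) + random.gauss(0, 0.05)

# Accurate estimate: average of k noisy measurements (e.g. k-fold
# cross-validation), costing k network trainings per candidate.
def accurate_estimate(hidden, lr, k=5):
    return sum(true_fitness(hidden, lr) + random.gauss(0, 0.05)
               for _ in range(k)) / k

def random_candidate():
    # Sample a (hidden neurons, learning rate) configuration.
    return (random.randint(1, 100), random.uniform(0.001, 1.0))

def search(estimator, budget, cost_per_eval):
    # Random search under a fixed budget of network trainings: the cheaper
    # the estimator, the more candidates can be tried.
    best, best_est = None, float("-inf")
    for _ in range(budget // cost_per_eval):
        cand = random_candidate()
        est = estimator(*cand)
        if est > best_est:
            best, best_est = cand, est
    return best

budget = 500  # total number of network trainings we can afford
cheap_best = search(cheap_estimate, budget, cost_per_eval=1)       # 500 candidates
accurate_best = search(accurate_estimate, budget, cost_per_eval=5)  # 100 candidates

print("cheap estimate  :", cheap_best,
      "true fitness:", round(true_fitness(*cheap_best), 3))
print("accurate estimate:", accurate_best,
      "true fitness:", round(true_fitness(*accurate_best), 3))
```

Random search stands in here for the bioinspired metaheuristics studied in the paper; the same budget accounting applies to a Genetic Algorithm or PSO, where each fitness evaluation likewise costs one or more network trainings.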
© 2009 Springer-Verlag Berlin Heidelberg
Rossi, A.L.D., Soares, C., Carvalho, A.C.P.L.F. (2009). Bioinspired Parameter Tuning of MLP Networks for Gene Expression Analysis: Quality of Fitness Estimates vs. Number of Solutions Analysed. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03040-6_31
Print ISBN: 978-3-642-03039-0
Online ISBN: 978-3-642-03040-6