Telecommunication Systems

Telecommunication Systems, Volume 46, Issue 3, pp 217–243

A line search approach for high dimensional function optimization

  • Crina Grosan
  • Ajith Abraham
  • Aboul Ella Hassanien


This paper proposes a modified line search method that uses partial derivatives and restarts the search after a given number of iterations, modifying the search boundaries based on the best solution obtained in the previous iteration (or set of iterations). Using several high-dimensional benchmark functions, we illustrate that the proposed Line Search Re-Start (LSRS) approach is well suited to high-dimensional global optimization problems. Performance of the proposed algorithm is compared with two popular global optimization approaches, namely a genetic algorithm and the particle swarm optimization method. Empirical results for up to 10,000 dimensions show that the proposed approach performs very well on the tested high-dimensional functions.


Keywords: Global optimization · Line search · Re-start · High dimensions
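The abstract describes a derivative-based line search that is periodically restarted, with the search boundaries re-centered and shrunk around the best solution found so far. The sketch below illustrates that general idea in Python with NumPy; the function names, finite-difference gradient, backtracking rule, and shrink factor are illustrative assumptions, not the paper's exact LSRS algorithm.

```python
import numpy as np

def sphere(x):
    # Benchmark function (sum of squares); global minimum 0 at the origin.
    return float(np.sum(x ** 2))

def line_search_restart(f, dim, lo, hi, restarts=10, iters=50, shrink=0.5, seed=0):
    """Illustrative line-search-with-restart loop (a sketch, not the paper's exact method)."""
    rng = np.random.default_rng(seed)
    lo = np.full(dim, lo, dtype=float)
    hi = np.full(dim, hi, dtype=float)
    best_x = rng.uniform(lo, hi)
    best_f = f(best_x)
    for _ in range(restarts):
        x = best_x.copy()
        for _ in range(iters):
            # Central finite-difference estimate of the partial derivatives.
            h = 1e-6
            g = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h)
                          for e in np.eye(dim)])
            # Backtracking line search along the negative gradient direction.
            step = 1.0
            while step > 1e-12:
                cand = np.clip(x - step * g, lo, hi)
                if f(cand) < f(x):
                    x = cand
                    break
                step *= 0.5
            else:
                break  # no improving step found; stop this inner search
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
        # Restart: shrink the boundaries around the best solution so far.
        radius = shrink * (hi - lo) / 2
        lo = np.maximum(lo, best_x - radius)
        hi = np.minimum(hi, best_x + radius)
    return best_x, best_f
```

Each restart narrows the feasible box around the incumbent best point, so later line searches explore a progressively smaller region; this mirrors the boundary-modification idea in the abstract, though the concrete schedule here is an assumption.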




Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Crina Grosan, Department of Computer Science, Babes-Bolyai University, Cluj-Napoca, Romania
  • Ajith Abraham, Machine Intelligence Research Labs (MIR Labs), Scientific Network for Innovation and Research Excellence, Washington, USA
  • Aboul Ella Hassanien, Quantitative Methods and IS Department, College of Business Administration, Kuwait University, Safat, Kuwait
