Derivative Free Stochastic Discrete Gradient Method with Adaptive Mutation
In data mining we encounter many problems, such as function optimization or parameter estimation for classifiers, for which a good search algorithm is essential. In this paper we propose a stochastic derivative-free algorithm for unconstrained optimization problems. Many derivative-based local search methods exist, but they usually get stuck in local solutions on non-convex optimization problems. Global search methods, on the other hand, are very time consuming and work only for a limited number of variables. We investigate a derivative-free multi-search gradient-based method that overcomes the problem of local minima and produces a global solution in less time. We have tested the proposed method on many benchmark problems from the literature and compared the results with those of other existing algorithms. The results are very promising.
Keywords: Descent Direction, Nonsmooth Optimization, Adaptive Mutation, Discrete Gradient, Local Optimization Method