Parameter Optimization in Convolutional Neural Networks Using Gradient Descent
The aim of the present study is to develop an algorithm by examining the basic structure and operation of convolutional neural networks (CNNs) together with the mechanics of backpropagation (BP) and gradient descent (GD). A hierarchical approach that combines BP and GD yields a reliable algorithm. Like most CNN algorithms, it exploits weight sharing to reduce the number of parameters, and its cost function is minimized jointly through BP and GD. Backpropagation provides backward error feedback that improves reliability by reducing error, while gradient descent addresses the optimization problems that arise in deep learning and other machine learning algorithms.
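The interplay of backward error feedback and gradient-descent weight updates described above can be sketched on a toy model. The following is a minimal illustration, not the paper's CNN: a single linear neuron with one shared weight `w` is fit to the hypothetical target relation y = 2x by stepping `w` against the gradient of a squared-error cost.

```python
def train(xs, ys, lr=0.05, epochs=200):
    """Gradient descent on a one-weight linear model (illustrative sketch)."""
    w = 0.0  # single shared weight, loosely analogous to CNN weight sharing
    for _ in range(epochs):
        grad = 0.0
        for x, y in zip(xs, ys):
            pred = w * x          # forward pass
            err = pred - y        # backward feedback: prediction error
            grad += 2 * err * x   # dJ/dw for this sample (squared-error cost)
        grad /= len(xs)           # mean gradient over the data
        w -= lr * grad            # gradient-descent update
    return w

if __name__ == "__main__":
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]     # assumed target relation y = 2*x
    print(round(train(xs, ys), 3))
```

The same update rule, applied layer by layer via the chain rule, is what BP and GD perform over a full CNN's weights.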