Sensitivity Analysis for Neural Networks, pp 29–31
Sensitivity Analysis with Parameterized Activation Function
Chapter
Abstract
Among all the traditional methods introduced in Chap. 2, none has involved the activation function in the calculation of sensitivity analysis. This chapter attempts to generalize Piché's method by parameterizing antisymmetric squashing activation functions; from this parameterization, a universal expression for the MLP's sensitivity is derived without any restriction on input or output perturbations.
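As a rough illustration of the idea (not the chapter's derivation), the sketch below defines one possible parameterized antisymmetric squashing function, tanh(λx), and a Monte Carlo estimate of a single neuron's output sensitivity to Gaussian input perturbations; the function name, the choice of tanh, and the perturbation model are all assumptions for illustration only.

```python
import numpy as np

def squash(x, lam=1.0):
    # Illustrative parameterized antisymmetric squashing function:
    # tanh(lam * x). Antisymmetric, since squash(-x) == -squash(x),
    # and squashing, since outputs are bounded in (-1, 1).
    return np.tanh(lam * x)

def mc_sensitivity(w, x, sigma, lam=1.0, n=20000, seed=0):
    # Monte Carlo estimate of a neuron's output sensitivity: the expected
    # absolute output deviation |f(w·(x+dx)) - f(w·x)| when the input x is
    # perturbed by zero-mean Gaussian noise dx with std sigma (an assumed
    # perturbation model, not the one used in the text).
    rng = np.random.default_rng(seed)
    y0 = squash(w @ x, lam)
    dx = rng.normal(0.0, sigma, size=(n, x.size))
    y = squash((x + dx) @ w, lam)
    return np.mean(np.abs(y - y0))
```

Because the activation parameter λ controls the slope of the squashing function, a larger λ yields a larger estimated sensitivity for the same perturbation, which is the sense in which the activation function enters the sensitivity expression.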
Copyright information
© Springer-Verlag Berlin Heidelberg 2009