Fully Complex Multi-Layer Perceptron Network for Nonlinear Signal Processing

Designing a neural network (NN) to process complex-valued signals is a challenging task because a complex nonlinear activation function (AF) cannot be both analytic and bounded everywhere in the complex plane ℂ. To avoid this difficulty, the traditional approach has been ‘splitting’, i.e., applying a pair of real sigmoidal functions separately to the real and imaginary components. However, this ad hoc compromise, adopted to avoid the unbounded nature of nonlinear complex functions, yields a nowhere-analytic AF whose error back-propagation (BP) uses the split derivatives of the real and imaginary components instead of well-defined fully complex derivatives. In this paper, a fully complex multi-layer perceptron (MLP) structure that yields a simplified complex-valued BP algorithm is presented. The simplified BP verifies that the fully complex BP weight-update formula is the complex conjugate form of the real BP formula, and that split complex BP is a special case of fully complex BP. This generalization is made possible by employing elementary transcendental functions (ETFs) that are almost everywhere (a.e.) bounded and analytic in ℂ. The properties of the fully complex MLP are investigated, and the advantage of ETFs over split complex AFs is demonstrated in numerical examples where nonlinear magnitude and phase distortions of non-constant-modulus modulated signals are successfully restored.
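The distinction between the two activation styles can be sketched in a few lines of NumPy. This is an illustrative comparison, not the paper's implementation: `split_sigmoid` is one common choice of split AF (a real logistic sigmoid applied componentwise), and `tanh` stands in for an ETF that is a.e. bounded and analytic in ℂ (it has isolated singularities at z = i(π/2 + kπ) but is analytic elsewhere, so fully complex derivatives exist a.e.).

```python
import numpy as np

def split_sigmoid(z):
    """'Split' activation: a real logistic sigmoid applied separately to
    the real and imaginary parts. Bounded, but nowhere analytic in C,
    so BP must use split (componentwise) derivatives."""
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    return sig(z.real) + 1j * sig(z.imag)

def fully_complex_tanh(z):
    """Fully complex ETF activation: tanh is analytic in C except at
    isolated singularities z = i*(pi/2 + k*pi), i.e. it is a.e.
    bounded and analytic, so well-defined complex derivatives
    (here tanh'(z) = 1 - tanh(z)**2) can drive BP."""
    return np.tanh(z)

z = np.array([0.0 + 0.0j, 0.5 + 0.5j, -1.0 + 2.0j])
print(split_sigmoid(z))       # componentwise-bounded output
print(fully_complex_tanh(z))  # genuinely complex nonlinearity
```

Note that the split AF treats the real and imaginary channels independently, so it cannot rotate or couple them, whereas the complex tanh mixes magnitude and phase, which is what allows a fully complex MLP to compensate joint magnitude and phase distortions.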