A Statistical Approach to the Asymptotic Behavior of a Class of Generalized Nonlinear Schrödinger Equations
A statistical relaxation phenomenon is studied for a general class of dispersive wave equations of nonlinear Schrödinger type which govern non-integrable, non-singular dynamics. In a bounded domain the solutions of these equations have been shown numerically to tend in the long-time limit toward a Gibbsian statistical equilibrium state consisting of a ground-state solitary wave on the large scales and Gaussian fluctuations on the small scales. The main result of the paper is a large deviation principle that expresses this concentration phenomenon precisely in the relevant continuum limit. The large deviation principle pertains to a process governed by a Gibbs ensemble that is canonical in energy and microcanonical in particle number. Some supporting Monte Carlo simulations of these ensembles are also included to show the dependence of the concentration phenomenon on the properties of the dispersive wave equation, especially the high-frequency growth of the dispersion relation. The large deviation principle for the process governed by the Gibbs ensemble is based on a large deviation principle for Gaussian processes, for which two independent proofs are given.
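An ensemble that is canonical in the energy H but microcanonical in the particle number N can be sampled by a Metropolis scheme whose proposal moves conserve N exactly, so that only the energy enters the accept/reject step. The sketch below is illustrative only and is not the paper's method: it assumes a focusing cubic nonlinearity, a periodic one-dimensional finite-difference discretization, and arbitrary parameter values (`M`, `h`, `beta`, `N0`), none of which come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 64          # grid points (illustrative choice)
h = 1.0 / M     # lattice spacing on the unit interval, periodic BC
beta = 10.0     # inverse temperature: canonical in the energy H
N0 = 1.0        # fixed particle number: microcanonical constraint

def hamiltonian(psi):
    """Discrete NLS energy: kinetic term minus a focusing quartic term
    (cubic nonlinearity assumed for illustration)."""
    grad = (np.roll(psi, -1) - psi) / h
    return 0.5 * h * np.sum(np.abs(grad) ** 2) - 0.5 * h * np.sum(np.abs(psi) ** 4)

def particle_number(psi):
    return h * np.sum(np.abs(psi) ** 2)

# Start on the constraint surface N(psi) = N0.
psi = rng.standard_normal(M) + 1j * rng.standard_normal(M)
psi *= np.sqrt(N0 / particle_number(psi))

def metropolis_sweep(psi, E):
    """One sweep of N-preserving Metropolis moves."""
    for _ in range(M):
        i, j = rng.choice(M, size=2, replace=False)
        trial = psi.copy()
        # Phase rotation at site i: leaves every modulus, hence N, unchanged.
        trial[i] *= np.exp(1j * rng.uniform(-0.5, 0.5))
        # Redistribute modulus between sites i and j, preserving
        # |psi_i|^2 + |psi_j|^2 and therefore N exactly.
        r2 = np.abs(trial[i]) ** 2 + np.abs(trial[j]) ** 2
        s = rng.uniform(0.0, 1.0)
        trial[i] = np.sqrt(s * r2) * np.exp(1j * np.angle(trial[i]))
        trial[j] = np.sqrt((1.0 - s) * r2) * np.exp(1j * np.angle(trial[j]))
        E_trial = hamiltonian(trial)
        # Accept with probability min(1, exp(-beta * dE)).
        if rng.uniform() < np.exp(min(0.0, -beta * (E_trial - E))):
            psi, E = trial, E_trial
    return psi, E

E = hamiltonian(psi)
for _ in range(200):
    psi, E = metropolis_sweep(psi, E)
```

Both proposal moves have symmetric proposal densities, so the chain samples the Gibbs weight exp(-βH) restricted to the sphere N = N0. In such a sketch one would diagnose the concentration phenomenon by monitoring how the low-wavenumber Fourier modes of `psi` condense while the high-wavenumber modes fluctuate.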