Communications in Mathematical Physics, Volume 244, Issue 1, pp 187–208
Date: 13 Nov 2003

A Statistical Approach to the Asymptotic Behavior of a Class of Generalized Nonlinear Schrödinger Equations

Abstract

A statistical relaxation phenomenon is studied for a general class of dispersive wave equations of nonlinear Schrödinger type which govern non-integrable, non-singular dynamics. In a bounded domain the solutions of these equations have been shown numerically to tend in the long-time limit toward a Gibbsian statistical equilibrium state consisting of a ground-state solitary wave on the large scales and Gaussian fluctuations on the small scales. The main result of the paper is a large deviation principle that expresses this concentration phenomenon precisely in the relevant continuum limit. The large deviation principle pertains to a process governed by a Gibbs ensemble that is canonical in energy and microcanonical in particle number. Some supporting Monte Carlo simulations of these ensembles are also included to show the dependence of the concentration phenomenon on the properties of the dispersive wave equation, especially the high-frequency growth of the dispersion relation. The large deviation principle for the process governed by the Gibbs ensemble is based on a large deviation principle for Gaussian processes, for which two independent proofs are given.
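For orientation, the objects named in the abstract can be summarized in a standard form. The following display is an illustrative sketch only: the particular nonlinearity f, the Dirichlet boundary condition, and the normalizations are assumptions for concreteness, not the exact hypotheses of the paper, and the mixed ensemble is written formally (in the paper it is defined on finite-dimensional truncations before the continuum limit is taken).

```latex
% Illustrative (assumed) form of a generalized NLS equation on a bounded domain \Omega,
% together with the conserved Hamiltonian H, the particle number N, and the mixed
% Gibbs ensemble (canonical in H, microcanonical in N) described in the abstract.
\begin{align}
  i\,\partial_t \psi &= -\Delta \psi - f(|\psi|^2)\,\psi ,
  \qquad \psi|_{\partial\Omega} = 0 , \\
  H(\psi) &= \int_\Omega \Big( |\nabla \psi|^2 - F(|\psi|^2) \Big)\, dx ,
  \qquad F(s) = \int_0^s f(r)\, dr , \\
  N(\psi) &= \int_\Omega |\psi|^2 \, dx , \\
  P_{\beta,B}(d\psi) &\propto
  \exp\!\big( -\beta\, H(\psi) \big)\,
  \delta\big( N(\psi) - B \big)\, \Pi(d\psi) .
\end{align}
```

With these conventions the equation is the Hamiltonian flow i\,\partial_t \psi = \delta H / \delta \bar\psi, and both H and N are conserved; the ensemble P_{\beta,B} weights states canonically in the energy H at inverse temperature \beta while constraining the particle number N to the fixed value B.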

Communicated by P. Constantin
This research was supported in part by grants from the Department of Energy (DE-FG02-99ER25376) and from the National Science Foundation (NSF-DMS-0202309).
This research was partially supported by a Mathematical Sciences Postdoctoral Research Fellowship from the National Science Foundation.
This research was supported in part by grants from the Department of Energy (DE-FG02-99ER25376) and from the National Science Foundation (NSF-DMS-0207064).