Wave breaking in beam-plasma system with finite geometry
We study wave breaking in a beam-plasma system of finite geometry immersed in an infinite magnetic field. A purely nonlinear, nondispersive equation is derived with the help of a reductive perturbation technique. Numerical analysis clearly shows that the initial wave profile (either parabolic or circular) steepens with time, leading to a discontinuous wavefront: the phenomenon of wave breaking.
Keywords: Magnetic Field; Field Theory; Elementary Particle; Quantum Field Theory; Wave Breaking