Abstract
A typical microwave burst observed on 1968 January 11 at 1700 UT is used to demonstrate that the radiation spectrum at maximum phase can be described by gyromagnetic absorption. A model of the source is derived from the observed spectrum. With the aid of this model, we attempt to explain the decay phase of the burst intensity. Satisfactory agreement with observation is obtained when one assumes that the cooling of the burst plasma is caused by heat conduction parallel to the magnetic field lines.
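The conductive-cooling picture invoked in the abstract can be illustrated with a rough timescale estimate. The sketch below assumes classical Spitzer conduction parallel to the magnetic field and uses generic flare-plasma parameters; neither the coefficient nor the parameter values are taken from the paper itself.

```python
# Rough estimate of the conductive cooling time of a hot flare plasma,
# assuming Spitzer heat conduction parallel to the field,
# kappa_parallel ~ kappa0 * T**2.5 (cgs units, kappa0 ~ 1e-6 for lnLambda ~ 20).
# All parameter values below are illustrative, not taken from the paper.

def conductive_cooling_time(n_e, T, L, kappa0=1.0e-6):
    """Cooling time tau ~ 3 n_e k_B L^2 / (kappa0 * T^(5/2)), in seconds.

    n_e    : electron density [cm^-3]
    T      : plasma temperature [K]
    L      : conduction length along the field [cm]
    kappa0 : Spitzer conductivity coefficient [erg s^-1 cm^-1 K^-7/2]
    """
    k_B = 1.380649e-16  # Boltzmann constant [erg/K]
    return 3.0 * n_e * k_B * L**2 / (kappa0 * T**2.5)

# Generic post-burst plasma: n_e = 1e10 cm^-3, T = 1e7 K, L = 1e9 cm,
# which yields a decay time of order ten seconds, comparable to the
# observed decline of microwave burst intensity.
tau = conductive_cooling_time(1.0e10, 1.0e7, 1.0e9)
print(f"conductive cooling time ~ {tau:.1f} s")
```

For these assumed parameters the estimate comes out at roughly ten seconds; the strong T^(5/2) dependence means the cooling slows markedly as the plasma temperature drops.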
Fürst, E. The intensity decrease of microwave bursts. Sol Phys 28, 159–168 (1973). https://doi.org/10.1007/BF00152920