Identifying Flux Rope Signatures Using a Deep Neural Network

Abstract

One of the main current challenges in space weather is forecasting the internal magnetic configuration of interplanetary coronal mass ejections (ICMEs). Classifying this configuration is essential for predicting geomagnetic disturbances. A monotonic and coherent magnetic configuration observed in situ is interpreted as the signature of a spacecraft crossing a large flux rope, a structure of helical magnetic field lines. This article combines machine learning with an established analytical flux rope model to identify and further understand the internal structure of ICMEs. We trained an image-recognition artificial neural network on analytical flux rope data generated over a wide range of possible spacecraft trajectories through cylindrical (circular and elliptical cross-section) models. The trained network was then evaluated against the ICMEs observed by Wind during 1995–2015.

The methodology developed in this article correctly classifies 84% of simple real cases and achieves a 76% success rate when extended to a broader set with 5% noise applied, although it exhibits a bias in favor of positive flux rope classification. These results are promising as a first step towards a generalizable classification and parameterization tool. With further tuning and refinement, our model has strong potential to become a robust tool for identifying flux rope configurations from in situ data.
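
As a rough, self-contained illustration of the kind of image-recognition classifier described above, the sketch below defines a small convolutional network in PyTorch (one of the tools cited in the references). The layer sizes, the 64 x 64 single-channel input resolution, and the two-class flux-rope/non-flux-rope output are assumptions made for this example, not the authors' published architecture.

    import torch
    import torch.nn as nn

    # Minimal CNN sketch for labeling synthetic magnetic-field "images".
    # All sizes here (64x64 single-channel input, two output classes) are
    # illustrative assumptions, not the architecture used in the article.
    class FluxRopeCNN(nn.Module):
        def __init__(self, n_classes: int = 2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),   # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),   # 32x32 -> 16x16
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, n_classes),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    # Example: classify a batch of eight 64x64 synthetic inputs.
    model = FluxRopeCNN()
    logits = model(torch.randn(8, 1, 64, 64))
    print(logits.shape)  # torch.Size([8, 2])

A network of this kind would be trained on images rendered from the analytical flux rope model and then applied to the Wind observations, as the abstract describes.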

References

  • Barshan, E., Fieguth, P.: 2015, Stage-wise training: An improved feature learning strategy for deep models. In: Storcheus, D., Rostamizadeh, A., Kumar, S. (eds.) Feature Extraction: Modern Questions and Challenges, Proc. Machine Learning Res. 44, 49.

  • Burlaga, L., Sittler, E., Mariani, F., Schwenn, R.: 1981, Magnetic loop behind an interplanetary shock: Voyager, Helios, and IMP 8 observations. J. Geophys. Res. 86(A8), 6673. DOI. ADS.

  • Camporeale, E.: 2019, The challenge of machine learning in space weather: Nowcasting and forecasting. Space Weather 17(8), 1166. DOI. ADS.

  • Chetlur, S., Woolley, C., Vandermersch, P., Cohen, J., Tran, J., Catanzaro, B., Shelhamer, E.: 2014, cuDNN: Efficient primitives for deep learning. arXiv e-prints. arXiv. ADS.

  • Ciresan, D.C., Meier, U., Masci, J., Gambardella, L.M., Schmidhuber, J.: 2011, Flexible, high performance convolutional neural networks for image classification. In: Proc. 22nd Int. Joint Conf. on Artificial Intelligence.

  • Goodfellow, I., Bengio, Y., Courville, A.: 2016, Deep Learning, MIT Press, Cambridge. http://www.deeplearningbook.org.

  • Hunter, J.D.: 2007, Matplotlib: A 2D graphics environment. Comput. Sci. Eng. 9(3), 90. DOI. ADS.

  • Kingma, D.P., Ba, J.: 2015, Adam: A method for stochastic optimization. In: Bengio, Y., LeCun, Y. (eds.) 3rd Int. Conf. on Learning Representations.

  • Klein, L.W., Burlaga, L.F.: 1982, Interplanetary magnetic clouds at 1 AU. J. Geophys. Res. 87(A2), 613. DOI. ADS.

  • LeCun, Y., Bengio, Y.: 1995, Convolutional networks for images, speech, and time series. In: The Handbook of Brain Theory and Neural Networks 3361(10).

  • Lepping, R.P., Jones, J.A., Burlaga, L.F.: 1990, Magnetic field structure of interplanetary magnetic clouds at 1 AU. J. Geophys. Res. 95(A8), 11957. DOI. ADS.

  • Lepping, R.P., Acuña, M.H., Burlaga, L.F., Farrell, W.M., Slavin, J.A., Schatten, K.H., Mariani, F., Ness, N.F., Neubauer, F.M., Whang, Y.C., Byrnes, J.B., Kennon, R.S., Panetta, P.V., Scheifele, J., Worley, E.M.: 1995, The Wind magnetic field investigation. Space Sci. Rev. 71(1–4), 207. DOI. ADS.

  • Manchester, W., Kilpua, E.K.J., Liu, Y.D., Lugaz, N., Riley, P., Török, T., Vršnak, B.: 2017, The physical processes of CME/ICME evolution. Space Sci. Rev. 212(3–4), 1159. DOI. ADS.

  • McKinney, W.: 2010, Data structures for statistical computing in Python. In: van der Walt, S., Millman, J. (eds.) Proc. 9th Python Sci. Conf., 56. DOI.

  • Nair, V., Hinton, G.E.: 2010, Rectified linear units improve restricted Boltzmann machines. Proc. 27th Int. Conf. on Machine Learning, 807. ISBN 978-1-60558-907-7.

  • Nieves-Chinchilla, T., Linton, M.G., Hidalgo, M.A., Vourlidas, A., Savani, N.P., Szabo, A., Farrugia, C., Yu, W.: 2016, A circular-cylindrical flux-rope analytical model for magnetic clouds. Astrophys. J. 823(1), 27. DOI. ADS.

  • Nieves-Chinchilla, T., Linton, M.G., Hidalgo, M.A., Vourlidas, A.: 2018a, Elliptic-cylindrical analytical flux rope model for magnetic clouds. Astrophys. J. 861(2), 139. DOI. ADS.

  • Nieves-Chinchilla, T., Vourlidas, A., Raymond, J.C., Linton, M.G., Al-haddad, N., Savani, N.P., Szabo, A., Hidalgo, M.A.: 2018b, Understanding the internal magnetic field configurations of ICMEs using more than 20 years of Wind observations. Solar Phys. 293(2), 25. DOI. ADS.

  • Nieves-Chinchilla, T., Jian, L.K., Balmaceda, L., Vourlidas, A., dos Santos, L.F.G., Szabo, A.: 2019, Unraveling the internal magnetic field structure of the Earth-directed interplanetary coronal mass ejections during 1995 - 2015. Solar Phys. 294(7), 89. DOI. ADS.

  • Ogilvie, K.W., Chornay, D.J., Fritzenreiter, R.J., Hunsaker, F., Keller, J., Lobell, J., Miller, G., Scudder, J.D., Sittler, J.E.C., Torbert, R.B., Bodet, D., Needell, G., Lazarus, A.J., Steinberg, J.T., Tappan, J.H., Mavretic, A., Gergin, E.: 1995, SWE, a comprehensive plasma instrument for the Wind spacecraft. Space Sci. Rev. 71(1–4), 55. DOI. ADS.

  • Paszke, A., Gross, S., Chintala, S., Chanan, G., Yang, E., DeVito, Z., Lin, Z., Desmaison, A., Antiga, L., Lerer, A.: 2017, Automatic differentiation in PyTorch. NeurIPS Autodiff Workshop.

  • Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, É.: 2011, Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 12(85), 2825.

  • Reback, J., McKinney, W., jbrockmendel, den Bossche, J.V., Augspurger, T., Cloud, P., gfyoung, Sinhrks, Klein, A., Hawkins, S., Roeschke, M., Tratner, J., She, C., Ayd, W., Petersen, T., MomIsBestFriend, Garcia, M., Schendel, J., Hayden, A., Jancauskas, V., Battiston, P., Saxton, D., Seabold, S., McMaster, A., chris-b1, h-vetinari, Hoyer, S., Dong, K., Overmeire, W., Winkel, M.: 2020. pandas-dev/pandas: Pandas 1.1.0. DOI.

  • Richardson, I.G., Cane, H.V.: 2004, Identification of interplanetary coronal mass ejections at 1 AU using multiple solar wind plasma composition anomalies. J. Geophys. Res. 109(A9), A09104. DOI. ADS.

  • Stehman, S.V.: 1997, Selecting and interpreting measures of thematic classification accuracy. Remote Sens. Environ. 62(1), 77. DOI. ADS.

  • Tremblay, J., Prakash, A., Acuna, D., Brophy, M., Jampani, V., Anil, C., To, T., Cameracci, E., Boochoon, S., Birchfield, S.: 2018, Training deep networks with synthetic data: Bridging the reality gap by domain randomization. In: Proceedings IEEE Conf. on Computer Vision and Pattern Recognition, 969. ADS.

  • van der Walt, S., Colbert, S.C., Varoquaux, G.: 2011, The NumPy array: A structure for efficient numerical computation. Comput. Sci. Eng. 13(2), 22. DOI. ADS.

  • Virtanen, P., Gommers, R., Oliphant, T.E., Haberland, M., Reddy, T., Cournapeau, D., Burovski, E., Peterson, P., Weckesser, W., Bright, J., van der Walt, S.J., Brett, M., Wilson, J., Jarrod Millman, K., Mayorov, N., Nelson, A.R.J., Jones, E., Kern, R., Larson, E., Carey, C., Polat, İ., Feng, Y., Moore, E.W., VanderPlas, J., Laxalde, D., Perktold, J., Cimrman, R., Henriksen, I., Quintero, E.A., Harris, C.R., Archibald, A.M., Ribeiro, A.H., Pedregosa, F., van Mulbregt, P., and SciPy 1.0 Contributors: 2020, SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 17, 261. DOI. https://rdcu.be/b08Wh.

  • Vourlidas, A.: 2014, The flux rope nature of coronal mass ejections. Plasma Phys. Control. Fusion 56(6), 064001. DOI. ADS.

Acknowledgements

We thank Dr. Barbara Thompson and Tiago Pinho Da Silva, M.S., for their discussions and reviews throughout this work. We thank Marta Florido-Llinas, BSA, for making the flux rope interactive tool available. Resources supporting this work were provided by the NASA High-End Computing (HEC) Program through the NASA Center for Climate Simulation (NCCS) at Goddard Space Flight Center. This material is based upon work supported by the National Science Foundation under Grant No. AGS-1433086. T. N-C also acknowledges the Goddard Strategic Collaboration Initiative. We acknowledge the tools used in this work: CUDA/cuDNN for GPU processing (Chetlur et al., 2014); NumPy (van der Walt, Colbert, and Varoquaux, 2011), Pandas (McKinney, 2010; Reback et al., 2020), SciPy (Virtanen et al., 2020), and scikit-learn (Pedregosa et al., 2011) for data analysis and processing; and Matplotlib (Hunter, 2007) for all plots.

Author information

Corresponding author

Correspondence to Luiz F. G. dos Santos.

Ethics declarations

Disclosure of Potential Conflicts of Interest

The authors declare that there are no conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article belongs to the Topical Collection:

Towards Future Research on Space Weather Drivers

Guest Editors: Hebe Cremades and Teresa Nieves-Chinchilla

Appendices

Appendix A: Metrics

\(Accuracy\) (Equation 1) is the ratio of correctly classified instances (true positives plus true negatives) to the total number of instances. \(Precision\) (Equation 2) is the fraction of relevant instances among the retrieved instances, while \(Recall\) (Equation 3) is the fraction of the total number of relevant instances that were retrieved. Both \(Precision\) and \(Recall\) must be taken into account when evaluating the performance of a predictive model. The \(F_{1}~Score\) (Equation 4) is a well-established measure of a predictor's performance that combines \(Precision\) and \(Recall\); its best value is one and its worst value is zero.

$$Accuracy = \frac{True~Positive + True~Negative}{Total~Number~of~Instances}, \tag{1}$$
$$Precision = \frac{True~Positive}{True~Positive + False~Positive}, \tag{2}$$
$$Recall = \frac{True~Positive}{True~Positive + False~Negative}, \tag{3}$$
$$F_{1}~Score = \frac{2 \times Precision \times Recall}{Precision + Recall}. \tag{4}$$
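
As a minimal illustration of Equations 1 – 4, the following sketch (a hypothetical helper written for this appendix, not code from the article) computes the four metrics from binary labels and predictions, treating 1 as a positive flux rope classification:

    import numpy as np

    def classification_metrics(y_true, y_pred):
        """Compute Accuracy, Precision, Recall, and F1 Score (Equations 1-4)."""
        y_true = np.asarray(y_true, dtype=bool)
        y_pred = np.asarray(y_pred, dtype=bool)

        tp = np.sum(y_true & y_pred)      # true positives
        tn = np.sum(~y_true & ~y_pred)    # true negatives
        fp = np.sum(~y_true & y_pred)     # false positives
        fn = np.sum(y_true & ~y_pred)     # false negatives

        accuracy = (tp + tn) / y_true.size                    # Equation 1
        precision = tp / (tp + fp) if (tp + fp) else 0.0      # Equation 2
        recall = tp / (tp + fn) if (tp + fn) else 0.0         # Equation 3
        f1 = (2 * precision * recall / (precision + recall)   # Equation 4
              if (precision + recall) else 0.0)
        return float(accuracy), float(precision), float(recall), float(f1)

    # Example: four events, three classified correctly (one flux rope missed).
    print(classification_metrics([1, 0, 1, 1], [1, 0, 0, 1]))
    # (0.75, 1.0, 0.6666666666666666, 0.8)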

Appendix B: Complete Classification

Table 4 contains the classification results for all 353 cases and provides the information needed to compare the labels from the reference catalog with the classifications produced by the DCNN model under different amounts of noise. The events marked with an asterisk (*) were used in the evaluation subset during training.

Table 4 List of all the ICMEs from Nieves-Chinchilla et al. (2019). For each selected ICME, the table presents the event number, the ICME start date, the label assigned by the reference catalog, and the classifications from the no-noise, \(5\%\)-noise, and \(10\%\)-noise models. The events marked with an asterisk (*) were used in the evaluation subset during training.

Cite this article

dos Santos, L.F.G., Narock, A., Nieves-Chinchilla, T. et al. Identifying Flux Rope Signatures Using a Deep Neural Network. Sol Phys 295, 131 (2020). https://doi.org/10.1007/s11207-020-01697-x
