An Information Distance Metric Preserving Projection Algorithm

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 849)

Abstract

This paper proposes a novel dimensionality reduction algorithm, coined information distance metric preserving projection (IDPP), which aims to identify the complicated intrinsic structure of a high-dimensional space. IDPP employs the geodesic information distance to evaluate the relationship between each pair of data points and yields a distance-preserving projection that maps sample data from the high-dimensional observation space to a low-dimensional feature space. IDPP preserves the intrinsic structure of the high-dimensional space globally, and it possesses an explicit projection formula, which makes it easy to apply to new sample data. Unsupervised and supervised approaches constructed on the basis of IDPP were evaluated on financial data. Experimental results show that the trustworthiness of IDPP is almost the same as that of ISOMAP, and that IDPP performs much better than the rival algorithms.
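The abstract describes the core recipe: estimate geodesic distances between all pairs of samples, then fit a linear (hence explicit) projection that preserves those distances, so new samples can be mapped without recomputing the embedding. The sketch below illustrates that idea under stated assumptions: it combines Isomap-style geodesic distance estimation with a least-squares linear fit to the resulting embedding. It is not the authors' implementation; the function name `idpp_sketch` and its parameters are illustrative only.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph


def idpp_sketch(X, n_components=2, n_neighbors=10):
    """Illustrative distance-preserving projection (not the paper's exact method).

    Returns an explicit projection matrix W and the embedding X @ W.
    """
    n = X.shape[0]
    # 1. Estimate geodesic distances from a k-nearest-neighbor graph,
    #    as in Isomap (Tenenbaum et al., 2000).
    G = kneighbors_graph(X, n_neighbors, mode="distance")
    D = shortest_path(G, directed=False)
    # Guard against disconnected components (infinite distances).
    finite = np.isfinite(D)
    D[~finite] = D[finite].max()

    # 2. Classical-MDS double centering of squared geodesic distances.
    H = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * H @ (D ** 2) @ H

    # 3. Low-dimensional coordinates from the top eigenpairs of B.
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:n_components]
    Y = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

    # 4. Fit an explicit linear projection W by least squares so that
    #    X @ W approximates Y; W then maps new samples directly.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W, X @ W
```

Because `W` is an explicit matrix, an out-of-sample point `x_new` is embedded simply as `x_new @ W`, which is the practical advantage the abstract attributes to IDPP over embeddings that must be recomputed for new data.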

Keywords

Dimensionality reduction · Manifold learning · Financial analysis

Notes

Acknowledgments

This work is supported in part by the Beijing Social Science Foundation (No. 16SRB021) and the Beijing Philosophy and Social Science Foundation (No. 16YJB029).

References

  1. Moyer, R.C., McGuigan, J., Rao, R.: Contemporary Financial Management. Cengage Learning, Ohio (2012)
  2. Berry, M.W., Browne, M., et al.: Algorithms and applications for approximate nonnegative matrix factorization. Comput. Stat. Data Anal. 52, 155–173 (2007)
  3. Yu, L., Wang, S., Lai, K.: A neural-network-based nonlinear metamodeling approach to financial time series forecasting. Appl. Soft Comput. 9, 563–574 (2009)
  4. Canbas, S., Cabuk, A., Bilgin Kilic, S.: Prediction of commercial bank failure via multivariate statistical analysis of financial structures: the Turkish case. Eur. J. Oper. Res. 166, 528–546 (2005)
  5. Tenenbaum, J.B., de Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290(5500), 2319–2323 (2000)
  6. Roweis, S., Saul, L.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)
  7. He, X., Yan, S., et al.: Face recognition using Laplacianfaces. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 328–340 (2005)
  8. Huang, Y., Kou, G.: A kernel entropy manifold learning approach for financial data analysis. Decis. Support Syst. 64, 31–42 (2014)
  9. Carter, K.M., Raich, R., et al.: FINE: Fisher information nonparametric embedding. IEEE Trans. Pattern Anal. Mach. Intell. 31(11), 2093–2098 (2009)
  10. Carter, K.M., Raich, R., et al.: Information-geometric dimensionality reduction. IEEE Signal Process. Mag. 28, 89–99 (2011)
  11. de Silva, V., Tenenbaum, J.B.: Global versus local methods in nonlinear dimensionality reduction. In: Advances in Neural Information Processing Systems, vol. 15, pp. 705–712 (2003)
  12. Cai, D., He, X., Han, J.: Spectral regression for efficient regularized subspace learning. In: ICCV 2007 (2007)
  13. Venna, J., Kaski, S.: Local multidimensional scaling with controlled tradeoff between trustworthiness and continuity. In: Workshop on Self-Organizing Maps, pp. 695–702 (2005)

Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. Information School, Capital University of Economics and Business, Beijing, China
  2. School of Statistics and Mathematics, Central University of Finance and Economics, Beijing, China