Correlation, Independence and Inverse Modeling

  • V. Vigneron
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5099)

Abstract

Learning from examples takes a wide variety of forms, depending on what is to be learned from which available information. One of these forms is y = f(x), where the input-output pair (x, y) is the available information and f represents the process mapping \({\bf x}\in\cal X\) to \({\bf y}\in\cal Y\). In general, and for real-world problems, it is not reasonable to expect an exact representation of f, a fortiori when the dimension of x is large and the number of examples is small. In this paper, we introduce a new model capable of reducing the complexity of many ill-posed problems without loss of generality. The underlying Bayesian artifice is presented as an alternative to the currently used frequentist approaches, which do not offer a compelling criterion in the case of high-dimensional problems.
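
As a minimal illustration of the ill-posed, high-dimensional setting the abstract describes (not taken from the paper; the linear forward model, dimensions and regularisation parameter below are assumptions), the following sketch contrasts an unregularised least-squares inversion of y = Ax with the MAP estimate under a zero-mean Gaussian prior on x, i.e. Tikhonov/ridge regularisation:

```python
# Illustrative sketch only: an ill-posed linear inverse problem y = A x
# with far fewer observations than unknowns. All names, dimensions and the
# ridge parameter are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_obs, n_dim = 20, 100                       # few examples, high-dimensional x
A = rng.normal(size=(n_obs, n_dim))          # assumed linear forward operator
x_true = np.zeros(n_dim)
x_true[:5] = rng.normal(size=5)              # only a few active components
y = A @ x_true + 0.01 * rng.normal(size=n_obs)   # noisy observations

# Unregularised fit: the minimum-norm least-squares solution via the
# pseudoinverse matches the data but is unstable when n_obs << n_dim.
x_ls = np.linalg.pinv(A) @ y

# Tikhonov / ridge solution, i.e. the MAP estimate under a Gaussian prior
# on x and Gaussian noise: (A^T A + lambda I)^{-1} A^T y.
lam = 1e-1
x_map = np.linalg.solve(A.T @ A + lam * np.eye(n_dim), A.T @ y)

print("reconstruction error, least squares:", np.linalg.norm(x_ls - x_true))
print("reconstruction error, MAP/Tikhonov :", np.linalg.norm(x_map - x_true))
```

The regularised estimate trades a small bias for a large reduction in variance, which is the kind of gain the Bayesian treatment is meant to deliver when examples are scarce.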

Keywords

Inverse Modeling · Hyperspectral Imaging · Independent Component Analysis · Blind Signal · High Dimensional Problem

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • V. Vigneron 1, 2
  1. IBISC-lab, CNRS FRE 3190, Université d’Evry, Evry Cedex, France
  2. Équipe MATISSE-SAMOS, CES, CNRS-UMR 8173, Paris Cedex 13, France
