Kernel Based Subspace Projection of Near Infrared Hyperspectral Images of Maize Kernels

  • Rasmus Larsen
  • Morten Arngren
  • Per Waaben Hansen
  • Allan Aasbjerg Nielsen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5575)


In this paper we present an exploratory analysis of hyperspectral 900–1700 nm images of maize kernels. The imaging device is a line-scanning hyperspectral camera using broadband NIR illumination. In order to explore the hyperspectral data we compare a series of subspace projection methods, including principal component analysis and maximum autocorrelation factor analysis. The latter exploits the fact that interesting phenomena in images exhibit spatial autocorrelation. However, linear projections often fail to capture the underlying variability in the data. Therefore we propose to use so-called kernel versions of the two aforementioned methods. The kernel methods implicitly transform the data to a higher-dimensional space using non-linear transformations while retaining the computational complexity. Analysis of our data example illustrates that the proposed kernel maximum autocorrelation factor transform outperforms the linear methods as well as kernel principal components in producing interesting projections of the data.
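The kernel mechanism the abstract refers to can be illustrated with a minimal NumPy sketch of kernel PCA: an RBF kernel stands in for the inner products of the implicit high-dimensional mapping, so only the n×n Gram matrix is ever formed. This is a generic illustration under assumed parameters (the RBF kernel choice, `gamma`, and the toy data are hypothetical), not the authors' implementation.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel (illustrative parameters)."""
    # Pairwise squared Euclidean distances between samples
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)                      # Gram matrix k(x_i, x_j)
    # Centre implicitly in feature space: K <- K - 1K - K1 + 1K1
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition of the centred Gram matrix (dual problem)
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale dual coefficients so feature-space axes have unit norm
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    # Scores of the training samples on the kernel principal axes
    return Kc @ alphas

# Toy usage: 100 "pixels", each with 5 spectral bands
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
scores = kernel_pca(X, n_components=2, gamma=0.1)
print(scores.shape)  # (100, 2)
```

The cost is governed by the number of samples n (the eigenproblem is n×n), not by the dimension of the implicit feature space, which is the complexity-retention point made in the abstract. The kernel MAF variant the paper proposes additionally brings spatial autocorrelation into the objective, which this sketch does not cover.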


Keywords: Hyperspectral Image · Hyperspectral Data · Kernel Principal Component Analysis · Maize Kernel · Minimum Noise Fraction



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Rasmus Larsen (1)
  • Morten Arngren (1, 2)
  • Per Waaben Hansen (2)
  • Allan Aasbjerg Nielsen (3)

  1. DTU Informatics, Technical University of Denmark, Kgs. Lyngby, Denmark
  2. FOSS Analytical A/S, Hillerød, Denmark
  3. DTU Space, Technical University of Denmark, Kgs. Lyngby, Denmark
