Abstract
Data classification is a core technology in pattern recognition and machine learning, with considerable theoretical significance and application value. As means of data acquisition, storage, and transmission improve and data volumes grow, extracting the essential attributes from massive data and classifying it accurately has become an important research topic. The inverse nth-power gravitational field is essentially a generalization of the inverse-square law of Newtonian gravitation, and it can effectively describe the interactions among all particles in a gravitational field. This paper proposes a new inverse nth power gravitation (I-n-PG) based clustering method for data classification. Randomly generated data samples as well as several well-known classification data sets are used to verify the proposed I-n-PG classifier. The experiments show that the proposed I-n-PG classifier performs well on both kinds of test sets.
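The paper's full formulation is not included in this excerpt, so the following is only a minimal sketch of the idea the abstract describes: each training point attracts a query point with a "force" that decays as an inverse nth power of distance, and the query is assigned to the class exerting the largest aggregate pull. The function name `inpg_classify` and all details (summing per-class forces, the `eps` regularizer for zero distance) are assumptions for illustration, not the authors' exact method.

```python
import numpy as np

def inpg_classify(X_train, y_train, X_test, n=2, eps=1e-12):
    """Hypothetical sketch of an inverse-nth-power gravitation classifier:
    assign each test point to the class whose members exert the largest
    total inverse-nth-power attraction on it."""
    X_train = np.asarray(X_train, dtype=float)
    X_test = np.asarray(X_test, dtype=float)
    y_train = np.asarray(y_train)
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Euclidean distance from the query point to every training point
        d = np.linalg.norm(X_train - x, axis=1)
        # inverse nth-power "gravitational" attraction; eps avoids division by zero
        force = 1.0 / (d ** n + eps)
        # total attraction exerted by each class
        totals = [force[y_train == c].sum() for c in classes]
        preds.append(classes[int(np.argmax(totals))])
    return np.array(preds)

# Usage on two well-separated synthetic clusters:
X = [[0, 0], [0, 1], [5, 5], [5, 6]]
y = [0, 0, 1, 1]
print(inpg_classify(X, y, [[0, 0.5], [5, 5.5]], n=2))  # -> [0 1]
```

With larger n the force decays faster, so classification becomes more local (closer in spirit to nearest-neighbor voting); smaller n lets distant points contribute more.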
Acknowledgment
This work was supported by the National Natural Science Foundation of China under research project 61273290.
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Xu, H., Hao, L., Jianag, C., Haq, E.U. (2017). A New Inverse Nth Gravitation Based Clustering Method for Data Classification. In: Xing, C., Zhang, Y., Liang, Y. (eds) Smart Health. ICSH 2016. Lecture Notes in Computer Science(), vol 10219. Springer, Cham. https://doi.org/10.1007/978-3-319-59858-1_2
DOI: https://doi.org/10.1007/978-3-319-59858-1_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-59857-4
Online ISBN: 978-3-319-59858-1