1. S. Aeberhard, D. Coomans, and O. de Vel, “Comparison of classifiers in high dimensional settings,” Technical Report 92–02, Departments of Computer Science and of Mathematics and Statistics, James Cook University of North Queensland, 1992.
2. K. P. Bennett, “Decision tree construction via linear programming,” in Proceedings of the 4th Midwest Artificial Intelligence and Cognitive Science Society Conference, Utica, Illinois, pp. 97–101, 1992.
3. K. P. Bennett and E. J. Bredensteiner, “Geometry in learning,” in C. Gorini, E. Hart, W. Meyer, and T. Phillips, editors, Geometry at Work, Mathematical Association of America: Washington, D.C., 1998, to appear.
4. K. P. Bennett and O. L. Mangasarian, “Neural network training via linear programming,” in P. M. Pardalos, editor, Advances in Optimization and Parallel Computing, North Holland, pp. 56–67, 1992.
5. K. P. Bennett and O. L. Mangasarian, “Multicategory discrimination via linear programming,” Optimization Methods and Software, vol. 3, pp. 27–39, 1994.
6. K. P. Bennett and O. L. Mangasarian, “Serial and parallel multicategory discrimination,” SIAM Journal on Optimization, vol. 4, pp. 722–734, 1994.
7. K. P. Bennett, D. H. Wu, and L. Auslender, “On support vector decision trees for database marketing,” R.P.I. Math Report No. 98–100, Rensselaer Polytechnic Institute, Troy, NY, 1998.
8. V. Blanz, B. Schölkopf, H. Bülthoff, C. Burges, V. Vapnik, and T. Vetter, “Comparison of view-based object recognition algorithms using realistic 3D models,” in C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, and B. Sendhoff, editors, Artificial Neural Networks — ICANN'96, vol. 1112 of Lecture Notes in Computer Science, Springer: Berlin, 1996.
9. P. S. Bradley and O. L. Mangasarian, “Feature selection via concave minimization and support vector machines,” in J. Shavlik, editor, Machine Learning Proceedings of the Fifteenth International Conference (ICML '98), Morgan Kaufmann: San Francisco, CA, pp. 82–90, 1998.
10. E. J. Bredensteiner and K. P. Bennett, “Feature minimization within decision trees,” Computational Optimization and Applications, vol. 10, pp. 111–126, 1998.
11. C. Cortes and V. N. Vapnik, “Support vector networks,” Machine Learning, vol. 20, pp. 273–297, 1995.
12. R. Courant and D. Hilbert, Methods of Mathematical Physics, J. Wiley: New York, 1953.
13. Y. Le Cun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. D. Jackel, “Backpropagation applied to handwritten zip code recognition,” Neural Computation, vol. 1, pp. 541–551, 1989.
14. G. B. Dantzig, Linear Programming and Extensions, Princeton University Press: Princeton, New Jersey, 1963.
15. I. W. Evett and E. J. Spiehler, “Rule induction in forensic science,” Technical report, Central Research Establishment, Home Office Forensic Science Service, Aldermaston, Reading, Berkshire RG7 4PN, 1987.
16. O. L. Mangasarian, “Linear and nonlinear separation of patterns by linear programming,” Operations Research, vol. 13, pp. 444–452, 1965.
17. O. L. Mangasarian, “Multi-surface method of pattern separation,” IEEE Transactions on Information Theory, vol. IT-14, pp. 801–807, 1968.
18. O. L. Mangasarian, Nonlinear Programming, McGraw-Hill: New York, 1969.
19. O. L. Mangasarian, “Mathematical programming in machine learning,” in G. DiPillo and F. Giannessi, editors, Proceedings of Nonlinear Optimization and Applications Workshop, Plenum Press: New York, pp. 283–295, 1996.
20. O. L. Mangasarian, “Arbitrary-norm separating plane,” Mathematical Programming Technical Report 97–07, University of Wisconsin-Madison, 1997.
21. O. L. Mangasarian, W. N. Street, and W. H. Wolberg, “Breast cancer diagnosis and prognosis via linear programming,” Operations Research, vol. 43, pp. 570–577, 1995.
22. P. M. Murphy and D. W. Aha, “UCI repository of machine learning databases,” [http://www.ics.uci.edu/~mlearn/MLRepository.html], Department of Information and Computer Science, University of California, Irvine, California, 1994.
23. B. A. Murtagh and M. A. Saunders, “MINOS 5.4 user's guide,” Technical Report SOL 83.20, Stanford University, 1993.
24. A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Networks, vol. 8, pp. 179–202, 1995.
25. A. Roy, L. S. Kim, and S. Mukhopadhyay, “A polynomial time algorithm for the construction and training of a class of multilayer perceptrons,” Neural Networks, vol. 6, pp. 535–545, 1993.
26. A. Roy and S. Mukhopadhyay, “Pattern classification using linear programming,” ORSA Journal on Computing, vol. 3, pp. 66–80, 1990.
27. B. Schölkopf, C. Burges, and V. Vapnik, “Incorporating invariances in support vector machines,” in C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, and B. Sendhoff, editors, Artificial Neural Networks — ICANN'96, vol. 1112 of Lecture Notes in Computer Science, Springer: Berlin, pp. 47–52, 1996.
28. B. Schölkopf, K. Sung, C. Burges, F. Girosi, P. Niyogi, T. Poggio, and V. Vapnik, “Comparing support vector machines with Gaussian kernels to radial basis function classifiers,” AI Memo No. 1599; CBCL Paper No. 142, Massachusetts Institute of Technology, Cambridge, 1996.
29. V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, 1995.
30. V. N. Vapnik and A. Ja. Chervonenkis, Theory of Pattern Recognition, Nauka: Moscow, 1974 (in Russian).
31. W. H. Wolberg and O. L. Mangasarian, “Multisurface method of pattern separation for medical diagnosis applied to breast cytology,” Proceedings of the National Academy of Sciences U.S.A., vol. 87, pp. 9193–9196, 1990.