Abstract
In this paper we propose a method for optimizing the parameters of a Support Vector Machine that is more accurate than the commonly applied grid search. The method is based on Iterated Local Search, a classic metaheuristic that performs multiple local searches in different regions of the parameter space. When a local search reaches a local optimum, a perturbation step computes the starting point of the next local search from the previously found local optimum. In this way, exploration of the parameter space is balanced against the time spent in regions that do not yield good results. We present a preliminary evaluation of our method with a radial basis function kernel on sample data, showing that it is more accurate than grid search applied to the same problem. The method is applicable to other kernels, and future work should establish to what extent our Iterated Local Search based method outperforms standard grid search on heterogeneous datasets from different domains.
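As a concrete illustration of the approach, a minimal Iterated Local Search loop for tuning the C and gamma parameters of an RBF-kernel SVM could be sketched as below. This is only an illustrative sketch, not the authors' implementation: the dataset, the log2 search scale, step sizes, perturbation strength, acceptance rule, and all function names are assumptions made for this example.

```python
# Minimal Iterated Local Search (ILS) sketch for tuning an RBF-kernel SVM.
# All names, step sizes, and the acceptance rule are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

def score(log_c, log_gamma):
    # Cross-validated accuracy of one (C, gamma) candidate, searched on a log2 scale.
    clf = SVC(C=2.0 ** log_c, gamma=2.0 ** log_gamma)
    return cross_val_score(clf, X, y, cv=5).mean()

def local_search(point, step=0.5):
    # Hill-climb over the four axis-aligned neighbours until none improves.
    best, best_score = point, score(*point)
    improved = True
    while improved:
        improved = False
        for dc, dg in [(step, 0), (-step, 0), (0, step), (0, -step)]:
            cand = (best[0] + dc, best[1] + dg)
            s = score(*cand)
            if s > best_score:
                best, best_score, improved = cand, s, True
    return best, best_score

def perturb(point, strength=2.0):
    # Jump away from the current optimum to seed the next local search.
    return (point[0] + rng.uniform(-strength, strength),
            point[1] + rng.uniform(-strength, strength))

# ILS main loop: start from an initial guess (C = 1, gamma = 2^-3),
# then repeatedly perturb the incumbent and restart the local search.
best, best_score = local_search((0.0, -3.0))
for _ in range(10):
    candidate, cand_score = local_search(perturb(best))
    if cand_score > best_score:  # "better" acceptance criterion
        best, best_score = candidate, cand_score

print(f"best C = 2^{best[0]:.2f}, gamma = 2^{best[1]:.2f}, "
      f"cross-validated accuracy = {best_score:.4f}")
```

Unlike grid search, which evaluates every point of a fixed lattice regardless of how promising it looks, the perturbation step restarts each local search near the incumbent, so evaluations concentrate around good regions while the search can still escape local optima.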
Notes
- 1.
Note that automatic algorithm configuration is the same problem faced in hyper-parameter tuning for machine learning; the two terms are simply different names for it.