Part of the book series: SpringerBriefs in Applied Sciences and Technology (BRIEFSINTELL)

Abstract

In supervised learning scenarios, the objective is to learn a functional model f that best explains a set of observed patterns with their corresponding labels.
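
The abstract's supervised-learning statement can be made concrete with a small numerical sketch. The code below fits a functional model f to labelled observations using a Nadaraya-Watson kernel regression estimator; the toy data, the Gaussian kernel, the bandwidth value, and all function names are illustrative assumptions and are not taken from the chapter itself.

```python
import numpy as np

# Hypothetical toy data: observed patterns X with labels y (a noisy sine curve).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 50))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

def nadaraya_watson(x_query, X, y, bandwidth=0.3):
    """Nadaraya-Watson estimate f(x_query): a weighted average of the
    observed labels, with Gaussian kernel weights centred on the query."""
    weights = np.exp(-0.5 * ((x_query - X) / bandwidth) ** 2)
    return np.dot(weights, y) / np.sum(weights)

# Evaluate the learned functional model f on a grid of query points.
grid = np.linspace(0.0, 2.0 * np.pi, 200)
predictions = np.array([nadaraya_watson(x, X, y) for x in grid])
```

The bandwidth controls how strongly neighbouring labels are smoothed into each prediction; in practice it would be tuned, for example by cross-validation, rather than fixed at the value assumed here.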

Author information

Correspondence to Oliver Kramer.

Copyright information

© 2014 The Author(s)

About this chapter

Cite this chapter

Kramer, O. (2014). Kernel Evolution. In: A Brief Introduction to Continuous Evolutionary Optimization. SpringerBriefs in Applied Sciences and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-03422-5_7

  • DOI: https://doi.org/10.1007/978-3-319-03422-5_7

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-03421-8

  • Online ISBN: 978-3-319-03422-5

  • eBook Packages: Engineering, Engineering (R0)
