000 04049nam a22005415i 4500
001 978-981-13-5956-9
003 DE-He213
005 20240423124959.0
007 cr nn 008mamaa
008 190522s2019 si | s |||| 0|eng d
020 _a9789811359569
_9978-981-13-5956-9
024 7 _a10.1007/978-981-13-5956-9
_2doi
050 4 _aQ334-342
050 4 _aTA347.A78
072 7 _aUYQ
_2bicssc
072 7 _aCOM004000
_2bisacsh
072 7 _aUYQ
_2thema
082 0 4 _a006.3
_223
100 1 _aZhou, Zhi-Hua.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aEvolutionary Learning: Advances in Theories and Algorithms
_h[electronic resource] /
_cby Zhi-Hua Zhou, Yang Yu, Chao Qian.
250 _a1st ed. 2019.
264 1 _aSingapore :
_bSpringer Nature Singapore :
_bImprint: Springer,
_c2019.
300 _aXII, 361 p. 59 illus., 20 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _a1. Introduction -- 2. Preliminaries -- 3. Running Time Analysis: Convergence-based Analysis -- 4. Running Time Analysis: Switch Analysis -- 5. Running Time Analysis: Comparison and Unification -- 6. Approximation Analysis: SEIP -- 7. Boundary Problems of EAs -- 8. Recombination -- 9. Representation -- 10. Inaccurate Fitness Evaluation -- 11. Population -- 12. Constrained Optimization -- 13. Selective Ensemble -- 14. Subset Selection -- 15. Subset Selection: k-Submodular Maximization -- 16. Subset Selection: Ratio Minimization -- 17. Subset Selection: Noise -- 18. Subset Selection: Acceleration.
520 _aMany machine learning tasks involve solving complex optimization problems, such as those with non-differentiable, non-continuous, or non-unique objective functions; in some cases it can prove difficult even to define an explicit objective function. Evolutionary learning applies evolutionary algorithms to address optimization problems in machine learning, and has yielded encouraging outcomes in many applications. However, due to the heuristic nature of evolutionary optimization, most outcomes to date have been empirical and lack theoretical support. This shortcoming has kept evolutionary learning from being well received in the machine learning community, which favors solid theoretical approaches. Recently there have been considerable efforts to address this issue. This book presents a range of those efforts, divided into four parts. Part I briefly introduces readers to evolutionary learning and provides some preliminaries, while Part II presents general theoretical tools for the analysis of running time and approximation performance in evolutionary algorithms. Based on these general tools, Part III presents a number of theoretical findings on major factors in evolutionary optimization, such as recombination, representation, inaccurate fitness evaluation, and population. In closing, Part IV addresses the development of evolutionary learning algorithms with provable theoretical guarantees for several representative tasks, in which evolutionary learning offers excellent performance.
650 0 _aArtificial intelligence.
650 0 _aAlgorithms.
650 0 _aComputer science
_xMathematics.
650 1 4 _aArtificial Intelligence.
650 2 4 _aAlgorithms.
650 2 4 _aMathematical Applications in Computer Science.
700 1 _aYu, Yang.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aQian, Chao.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9789811359552
776 0 8 _iPrinted edition:
_z9789811359576
856 4 0 _uhttps://doi.org/10.1007/978-981-13-5956-9
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c172869
_d172869