LDR      03967nam a22004695i 4500
001      978-3-540-68431-2
003      DE-He213
005      20170515111447.0
007      cr nn 008mamaa
008      121227s1997    gw |    s    |||| 0|eng d
020  __  $a 9783540684312
024  7_  $a 10.1007/3-540-62685-9 $2 doi
050  _4  $a Q334-342
050  _4  $a TJ210.2-211.495
072  _7  $a UYQ $2 bicssc
072  _7  $a TJFM1 $2 bicssc
072  _7  $a COM004000 $2 bisacsh
082  04  $a 006.3 $2 23
245  10  $a Computational Learning Theory $h [electronic resource] : $b Third European Conference, EuroCOLT '97 Jerusalem, Israel, March 17–19, 1997 Proceedings / $c edited by Shai Ben-David.
264  _1  $a Berlin, Heidelberg : $b Springer Berlin Heidelberg, $c 1997.
300  __  $a CCCXLVIII, 338 p. $b online resource.
336  __  $a text $b txt $2 rdacontent
337  __  $a computer $b c $2 rdamedia
338  __  $a online resource $b cr $2 rdacarrier
347  __  $a text file $b PDF $2 rda
490  1_  $a Lecture Notes in Computer Science, Lecture Notes in Artificial Intelligence, $x 0302-9743 ; $v 1208
505  0_  $a Sample compression, learnability, and the Vapnik-Chervonenkis dimension -- Learning boxes in high dimension -- Learning monotone term decision lists -- Learning matrix functions over rings -- Learning from incomplete boundary queries using split graphs and hypergraphs -- Generalization of the PAC-model for learning with partial information -- Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability -- Closedness properties in team learning of recursive functions -- Structural measures for games and process control in the branch learning model -- Learning under persistent drift -- Randomized hypotheses and minimum disagreement hypotheses for learning with noise -- Learning when to trust which experts -- On learning branching programs and small depth circuits -- Learning nearly monotone k-term DNF -- Optimal attribute-efficient learning of disjunction, parity, and threshold functions -- Learning pattern languages using queries -- On fast and simple algorithms for finding maximal subarrays and applications in learning theory -- A minimax lower bound for empirical quantizer design -- Vapnik-Chervonenkis dimension of recurrent neural networks -- Linear algebraic proofs of VC-dimension based inequalities -- A result relating convex n-widths to covering numbers with some applications to neural networks -- Confidence estimates of classification accuracy on new examples -- Learning formulae from elementary facts -- Control structures in hypothesis spaces: The influence on learning -- Ordinal mind change complexity of language identification -- Robust learning with infinite additional information.
520  __  $a This book constitutes the refereed proceedings of the Third European Conference on Computational Learning Theory, EuroCOLT'97, held in Jerusalem, Israel, in March 1997. The book presents 25 revised full papers carefully selected from a total of 36 high-quality submissions. The volume spans the whole spectrum of computational learning theory, with a certain emphasis on mathematical models of machine learning. Among the topics addressed are machine learning, neural nets, statistics, inductive inference, computational complexity, information theory, and theoretical physics.
650  _0  $a Computer science.
650  _0  $a Computers.
650  _0  $a Mathematical logic.
650  _0  $a Artificial intelligence.
650  14  $a Computer Science.
650  24  $a Artificial Intelligence (incl. Robotics).
650  24  $a Mathematical Logic and Formal Languages.
650  24  $a Computation by Abstract Devices.
700  1_  $a Ben-David, Shai. $e editor.
710  2_  $a SpringerLink (Online service)
773  0_  $t Springer eBooks
776  08  $i Printed edition: $z 9783540626855
830  _0  $a Lecture Notes in Computer Science, Lecture Notes in Artificial Intelligence, $x 0302-9743 ; $v 1208
856  40  $u http://dx.doi.org/10.1007/3-540-62685-9