000 04243nam a22005655i 4500
001 978-3-540-68431-2
003 DE-He213
005 20240423132449.0
007 cr nn 008mamaa
008 121227s1997 gw | s |||| 0|eng d
020 _a9783540684312
_9978-3-540-68431-2
024 7 _a10.1007/3-540-62685-9
_2doi
050 4 _aQ334-342
050 4 _aTA347.A78
072 7 _aUYQ
_2bicssc
072 7 _aCOM004000
_2bisacsh
072 7 _aUYQ
_2thema
082 0 4 _a006.3
_223
245 1 0 _aComputational Learning Theory
_h[electronic resource] :
_bThird European Conference, EuroCOLT '97, Jerusalem, Israel, March 17-19, 1997, Proceedings /
_cedited by Shai Ben-David.
250 _a1st ed. 1997.
264 1 _aBerlin, Heidelberg :
_bSpringer Berlin Heidelberg :
_bImprint: Springer,
_c1997.
300 _aCCCXLVIII, 338 p.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aLecture Notes in Artificial Intelligence,
_x2945-9141 ;
_v1208
505 0 _aSample compression, learnability, and the Vapnik-Chervonenkis dimension -- Learning boxes in high dimension -- Learning monotone term decision lists -- Learning matrix functions over rings -- Learning from incomplete boundary queries using split graphs and hypergraphs -- Generalization of the PAC-model for learning with partial information -- Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability -- Closedness properties in team learning of recursive functions -- Structural measures for games and process control in the branch learning model -- Learning under persistent drift -- Randomized hypotheses and minimum disagreement hypotheses for learning with noise -- Learning when to trust which experts -- On learning branching programs and small depth circuits -- Learning nearly monotone k-term DNF -- Optimal attribute-efficient learning of disjunction, parity, and threshold functions -- Learning pattern languages using queries -- On fast and simple algorithms for finding maximal subarrays and applications in learning theory -- A minimax lower bound for empirical quantizer design -- Vapnik-Chervonenkis dimension of recurrent neural networks -- Linear algebraic proofs of VC-dimension based inequalities -- A result relating convex n-widths to covering numbers with some applications to neural networks -- Confidence estimates of classification accuracy on new examples -- Learning formulae from elementary facts -- Control structures in hypothesis spaces: The influence on learning -- Ordinal mind change complexity of language identification -- Robust learning with infinite additional information.
520 _aThis book constitutes the refereed proceedings of the Third European Conference on Computational Learning Theory, EuroCOLT'97, held in Jerusalem, Israel, in March 1997. The book presents 25 revised full papers carefully selected from a total of 36 high-quality submissions. The volume spans the whole spectrum of computational learning theory, with particular emphasis on mathematical models of machine learning. Among the topics addressed are machine learning, neural nets, statistics, inductive inference, computational complexity, information theory, and theoretical physics.
650 0 _aArtificial intelligence.
650 0 _aMachine theory.
650 0 _aComputer science.
650 1 4 _aArtificial Intelligence.
650 2 4 _aFormal Languages and Automata Theory.
650 2 4 _aTheory of Computation.
700 1 _aBen-David, Shai.
_eeditor.
_4edt
_4http://id.loc.gov/vocabulary/relators/edt
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783540626855
776 0 8 _iPrinted edition:
_z9783662213094
830 0 _aLecture Notes in Artificial Intelligence,
_x2945-9141 ;
_v1208
856 4 0 _uhttps://doi.org/10.1007/3-540-62685-9
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
912 _aZDB-2-LNC
912 _aZDB-2-BAE
942 _cSPRINGER
999 _c188037
_d188037