Bayesian reasoning and machine learning
By: Barber, David
Material type: Book
Publisher: New Delhi : Cambridge University Press, ©2012
Description: xxiv, 697 p. : ill. ; 26 cm.
ISBN: 9781107439955
Subject(s): Machine learning | Bayesian statistical decision theory | COMPUTERS / Computer Vision & Pattern Recognition
Online resources: Cover image | Contributor biographical information | Publisher description | Table of contents only
|Item type|Current location|Collection|Call number|Status|Date due|Barcode|Item holds|Course reserves|
|Highly Demanded Book|IIITD General Stacks|Computer Science and Engineering|006.31 BAR-B (Browse shelf)|Checked out|03/02/2020|007315| | |
|Highly Demanded Book|IIITD General Stacks|Computer Science and Engineering|006.31 BAR-B (Browse shelf)|Checked out|04/02/2020|007316| | |
|Highly Demanded Book|IIITD General Stacks|Computer Science and Engineering|006.31 BAR-B (Browse shelf)|Checked out|04/02/2020|006254| | |
|Highly Demanded Book|IIITD General Stacks|Computer Science and Engineering|006.31 BAR-B (Browse shelf)|Available| |005761| | |
|Reference|IIITD Reference|Computer Science and Engineering|REF 006.31 BAR-B (Browse shelf)|Not For Loan| |004434| | |
Includes bibliographical references and index.
Machine generated contents note:
Preface;
Part I. Inference in Probabilistic Models: 1. Probabilistic reasoning; 2. Basic graph concepts; 3. Belief networks; 4. Graphical models; 5. Efficient inference in trees; 6. The junction tree algorithm; 7. Making decisions;
Part II. Learning in Probabilistic Models: 8. Statistics for machine learning; 9. Learning as inference; 10. Naive Bayes; 11. Learning with hidden variables; 12. Bayesian model selection;
Part III. Machine Learning: 13. Machine learning concepts; 14. Nearest neighbour classification; 15. Unsupervised linear dimension reduction; 16. Supervised linear dimension reduction; 17. Linear models; 18. Bayesian linear models; 19. Gaussian processes; 20. Mixture models; 21. Latent linear models; 22. Latent ability models;
Part IV. Dynamical Models: 23. Discrete-state Markov models; 24. Continuous-state Markov models; 25. Switching linear dynamical systems; 26. Distributed computation;
Part V. Approximate Inference: 27. Sampling; 28. Deterministic approximate inference;
Appendix. Background mathematics; Bibliography; Index.
"Machine learning methods extract value from vast data sets quickly and with modest resources. They are established tools in a wide range of industrial applications, including search engines, DNA sequencing, stock market analysis, and robot locomotion, and their use is spreading rapidly. People who know the methods have their choice of rewarding jobs. This hands-on text opens these opportunities to computer science students with modest mathematical backgrounds. It is designed for final-year undergraduates and master's students with limited background in linear algebra and calculus. Comprehensive and coherent, it develops everything from basic reasoning to advanced techniques within the framework of graphical models. Students learn more than a menu of techniques: they develop analytical and problem-solving skills that equip them for the real world. Numerous examples and exercises, both computer-based and theoretical, are included in every chapter. Resources for students and instructors, including a MATLAB toolbox, are available online"--
"Vast amounts of data present a major challenge to all those working in computer science, and its many related fields, who need to process and extract value from such data. Machine learning technology is already used to help with this task in a wide range of industrial applications, including search engines, DNA sequencing, stock market analysis and robot locomotion. As its usage becomes more widespread, no student should be without the skills taught in this book. Designed for final-year undergraduate and graduate students, this gentle introduction is ideally suited to readers without a solid background in linear algebra and calculus. It covers everything from basic reasoning to advanced techniques in machine learning, and crucially enables students to construct their own models for real-world problems by teaching them what lies behind the methods. Numerous examples and exercises are included in the text. Comprehensive resources for students and instructors are available online"--