MARC details
000 - LEADER |
fixed length control field |
02374nam a22002777a 4500 |
003 - CONTROL NUMBER IDENTIFIER |
control field |
IIITD |
005 - DATE AND TIME OF LATEST TRANSACTION |
control field |
20240815020005.0 |
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION |
fixed length control field |
240518b |||||||| |||| 00| 0 eng d |
020 ## - INTERNATIONAL STANDARD BOOK NUMBER |
International Standard Book Number |
9781316519332 |
040 ## - CATALOGING SOURCE |
Original cataloging agency |
IIITD |
082 00 - DEWEY DECIMAL CLASSIFICATION NUMBER |
Classification number |
CB 006.3 |
Item number |
ROB-P |
100 1# - MAIN ENTRY--PERSONAL NAME |
Personal name |
Roberts, Daniel A. |
245 14 - TITLE STATEMENT |
Title |
The principles of deep learning theory : |
Remainder of title |
an effective theory approach to understanding neural networks |
Statement of responsibility, etc |
by Daniel A. Roberts and Sho Yaida |
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT) |
Place of publication, distribution, etc |
New York : |
Name of publisher, distributor, etc |
Cambridge University Press, |
Date of publication, distribution, etc |
©2022 |
300 ## - PHYSICAL DESCRIPTION |
Extent |
x, 460 p. : |
Other physical details |
ill. ; |
Dimensions |
26 cm. |
500 ## - GENERAL NOTE |
General note |
This book includes an index. |
504 ## - BIBLIOGRAPHY, ETC. NOTE |
Bibliography, etc |
Includes bibliographical references and index. |
505 ## - FORMATTED CONTENTS NOTE |
Title |
Pretraining -- Neural networks -- Effective theory of deep linear networks at initialization -- RG flow of preactivations -- Effective theory of the NTK at initialization -- Kernel learning -- Representation learning |
520 ## - SUMMARY, ETC. |
Summary, etc |
"This textbook establishes a theoretical framework for understanding deep learning models of practical relevance. With an approach that borrows from theoretical physics, Roberts and Yaida provide clear and pedagogical explanations of how realistic deep neural networks actually work. To make results from the theoretical forefront accessible, the authors eschew the subject's traditional emphasis on intimidating formality without sacrificing accuracy. Straightforward and approachable, this volume balances detailed first-principle derivations of novel results with insight and intuition for theorists and practitioners alike. This self-contained textbook is ideal for students and researchers interested in artificial intelligence with minimal prerequisites of linear algebra, calculus, and informal probability theory, and it can easily fill a semester-long course on deep learning theory. For the first time, the exciting practical advances in modern artificial intelligence capabilities can be matched with a set of effective principles, providing a timeless blueprint for theoretical research in deep learning"-- |
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name as entry element |
Deep learning (Machine learning) |
650 #7 - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name as entry element |
SCIENCE / Physics / Mathematical & Computational |
650 #7 - SUBJECT ADDED ENTRY--TOPICAL TERM |
Topical term or geographic name as entry element |
Pretraining |
700 ## - ADDED ENTRY--PERSONAL NAME |
Personal name |
Yaida, Sho |
776 08 - ADDITIONAL PHYSICAL FORM ENTRY |
Display text |
Online version: |
Main entry heading |
Roberts, Daniel A. |
Title |
Principles of deep learning theory |
Edition |
1. |
Place, publisher, and date of publication |
New York : Cambridge University Press, 2022 |
International Standard Book Number |
9781009023405 |
Record control number |
(DLC) 2021060636 |
942 ## - ADDED ENTRY ELEMENTS (KOHA) |
Source of classification or shelving scheme |
Dewey Decimal Classification |
Koha item type |
Books |
Koha issues (borrowed), all copies |
1 |
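
The field/subfield structure shown above can also be built programmatically. The following is a minimal sketch, assuming the pymarc 5.x Python library, that reconstructs a few of the fields from this record (020, 100, 245); the variable names are illustrative only and this is not the complete record.

# Minimal sketch (assuming pymarc 5.x) of a few fields from the record above.
from pymarc import Record, Field, Subfield

record = Record()

# 020: International Standard Book Number
record.add_field(
    Field(
        tag="020",
        indicators=[" ", " "],
        subfields=[Subfield(code="a", value="9781316519332")],
    )
)

# 100 1#: Main entry -- personal name
record.add_field(
    Field(
        tag="100",
        indicators=["1", " "],
        subfields=[Subfield(code="a", value="Roberts, Daniel A.")],
    )
)

# 245 14: Title statement ($a title, $b remainder of title, $c statement of responsibility)
record.add_field(
    Field(
        tag="245",
        indicators=["1", "4"],
        subfields=[
            Subfield(code="a", value="The principles of deep learning theory :"),
            Subfield(code="b", value="an effective theory approach to understanding neural networks /"),
            Subfield(code="c", value="by Daniel A. Roberts and Sho Yaida."),
        ],
    )
)

print(record)  # prints the fields in a human-readable MARC layout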