000 | 01880 a2200253 4500 | ||
---|---|---|---|
003 | OSt | ||
005 | 20241125170846.0 | ||
008 | 241125b xxu||||| |||| 00| 0 eng d | ||
020 | _a9781316519332 | ||
040 | _cIIT Kanpur | ||
041 | _aeng | ||
082 | _a006.31 _bR541p | ||
100 | _aRoberts, Daniel A. | ||
245 | _aPrinciples of deep learning theory _ban effective theory approach to understanding neural networks _cDaniel A. Roberts and Sho Yaida | ||
260 | _bCambridge University Press _c2022 _aCambridge | ||
300 | _ax, 460 p. | ||
500 | _aBased on research in collaboration with Boris Hanin. | ||
520 | _aThis textbook establishes a theoretical framework for understanding deep learning models of practical relevance. With an approach that borrows from theoretical physics, Roberts and Yaida provide clear and pedagogical explanations of how realistic deep neural networks actually work. To make results from the theoretical forefront accessible, the authors eschew the subject's traditional emphasis on intimidating formality without sacrificing accuracy. Straightforward and approachable, this volume balances detailed first-principle derivations of novel results with insight and intuition for theorists and practitioners alike. This self-contained textbook is ideal for students and researchers interested in artificial intelligence with minimal prerequisites of linear algebra, calculus, and informal probability theory, and it can easily fill a semester-long course on deep learning theory. For the first time, the exciting practical advances in modern artificial intelligence capabilities can be matched with a set of effective principles, providing a timeless blueprint for theoretical research in deep learning. | ||
650 | _aDeep learning | ||
650 | _aMachine learning | ||
650 | _aMathematical & Computational | ||
700 | _aYaida, Sho | ||
942 | _cBK | ||
999 | _c567271 _d567271 | ||
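
The tagged rows above follow the MARC 21 bibliographic format: the leader in 000, control fields 003–008, and data fields whose `_a`/`_b`/`_c` markers are subfield codes. As a minimal sketch of how a record like this could be assembled programmatically, the example below assumes the pymarc library (version 5-style `Subfield` objects), blank indicators where the display does not show them, and only a representative subset of the fields.

```python
# Minimal sketch: rebuilding part of the record above with pymarc.
# Assumptions: pymarc >= 5 (Subfield objects), blank indicators.
from pymarc import Record, Field, Subfield

record = Record()
record.leader = "01880 a2200253 4500"  # 000: record leader

# Control fields carry a single data string, no indicators or subfields.
record.add_field(Field(tag="003", data="OSt"))
record.add_field(Field(tag="005", data="20241125170846.0"))
record.add_field(Field(tag="008", data="241125b xxu||||| |||| 00| 0 eng d"))

# Data fields: each Subfield mirrors an _a/_b/_c code shown in the table.
record.add_field(Field(tag="020", indicators=[" ", " "],
                       subfields=[Subfield("a", "9781316519332")]))
record.add_field(Field(tag="100", indicators=[" ", " "],
                       subfields=[Subfield("a", "Roberts, Daniel A.")]))
record.add_field(Field(tag="245", indicators=[" ", " "], subfields=[
    Subfield("a", "Principles of deep learning theory"),
    Subfield("b", "an effective theory approach to understanding neural networks"),
    Subfield("c", "Daniel A. Roberts and Sho Yaida"),
]))
record.add_field(Field(tag="700", indicators=[" ", " "],
                       subfields=[Subfield("a", "Yaida, Sho")]))

print(record.title)        # title assembled from field 245
print(record["020"]["a"])  # ISBN via tag/subfield lookup
```

The same field/subfield lookups work in reverse when reading an exported `.mrc` file with pymarc's `MARCReader`, which is the usual way such a catalog row would be consumed downstream.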