000 02058nam a22002777a 4500
003 IIITD
005 20240808020004.0
008 240508b |||||||| |||| 00| 0 eng d
020 _a9780367767341
040 _aIIITD
082 0 0 _aCB 006.3
_bKAM-T
100 1 _aKamath, Uday
245 1 0 _aTransformers for machine learning :
_ba deep dive /
_cby Uday Kamath, Kenneth L. Graham and Wael Emara.
260 _aNew York :
_bChapman and Hall,
_c©2022
300 _axxv, 257 p. :
_bill. ;
_c23 cm.
440 _aChapman & Hall/CRC machine learning & pattern recognition
504 _aIncludes bibliographical references and index.
505 0 0 _tDeep Learning and Transformers: An Introduction --
_tTransformers: Basics and Introduction --
_tBidirectional Encoder Representations from Transformers (BERT) --
_tMultilingual Transformer Architectures --
_tTransformer Modifications --
_tPre-trained and Application-Specific Transformers --
_tInterpretability and Explainability Techniques for Transformers.
520 _a"Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, Speech Recognition, Time Series, and Computer Vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. Transformers for Machine Learning: A Deep Dive is the first comprehensive book on transformers. The theoretical explanations of the state-of-the-art transformer architectures will appeal to postgraduate students and researchers (academic and industry) as it will provide a single entry point with deep discussions of a quickly moving field. The practical hands-on case studies and code will appeal to undergraduate students, practitioners, and professionals as it allows for quick experimentation and lowers the barrier to entry into the field"--
650 0 _aNeural networks (Computer science)
650 0 _aComputational intelligence.
650 0 _aMachine learning.
700 1 _aGraham, Kenneth L.
700 1 _aEmara, Wael
942 _2ddc
_cBK
_02
999 _c172364
_d172364