- Publisher: Mercury Learning and Information
- Published: 15th December 2023
- ISBN: 9781683928980
- Language: English
- Pages: 364 pp.
- Size: 6" x 9"
- Publisher: Mercury Learning and Information
- Published: 21st November 2023
- ISBN: 9781683928966
- Language: English
- Pages: 364 pp.
- Size: 6" x 9"
- Publisher: Mercury Learning and Information
- Published: 21st November 2023
- ISBN: 9781683928973
- Language: English
- Pages: 364 pp.
- Size: 6" x 9"
This book provides a comprehensive treatment of the Transformer architecture, BERT models, and the GPT series, including GPT-3 and GPT-4. Spanning ten chapters, it begins with foundational concepts such as the attention mechanism and tokenization techniques, explores the nuances of the Transformer and BERT architectures, and culminates in advanced topics on the latest in the GPT series, including ChatGPT. Key chapters provide insights into the evolution and significance of attention in deep learning, the intricacies of the Transformer architecture, a two-part exploration of the BERT family, and hands-on guidance on working with GPT-3. The concluding chapters present an overview of ChatGPT, GPT-4, and visualization using generative AI. In addition to the primary topics, the book also covers influential AI organizations such as DeepMind, OpenAI, Cohere, and Hugging Face. Readers will gain a comprehensive understanding of the current landscape of NLP models, their underlying architectures, and practical applications. Features companion files with numerous code samples and figures from the book.
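To give a flavor of the attention mechanism that opens the book, here is a minimal sketch of scaled dot-product attention for a single query, written in plain Python. The function names and toy vectors are illustrative assumptions, not taken from the book's companion files:

```python
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector.

    score_i = (q . k_i) / sqrt(d); weights = softmax(scores);
    output  = sum_i weights_i * v_i.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim_v = len(values[0])
    return [sum(w * v[j] for w, v in zip(weights, values))
            for j in range(dim_v)]

# Toy example: the query matches the first key more closely,
# so the output leans toward the first value vector.
q = [1.0, 0.0]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, K, V)
```

In a full Transformer, the queries, keys, and values are learned linear projections of the input embeddings, and many such attention heads run in parallel; this sketch shows only the core weighting step.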
FEATURES:
- Provides a comprehensive group of topics covering the details of the Transformer architecture, BERT models, and the GPT series, including GPT-3 and GPT-4.
- Features companion files with numerous code samples and figures from the book.
TABLE OF CONTENTS:
1: The Attention Mechanism
2: Tokenization
3: Transformer Architecture Introduction
4: Transformer Architecture in Greater Depth
5: The BERT Family Introduction
6: The BERT Family in Greater Depth
7: Working with GPT-3 Introduction
8: Working with GPT-3 in Greater Depth
9: ChatGPT and GPT-4
10: Visualization with Generative AI
Index
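As a minimal illustration of the tokenization topic covered in Chapter 2, a naive word-level tokenizer can be sketched as follows. Real NLP models use subword schemes such as BPE or WordPiece; this simplified version (with assumed function names) only maps whitespace-separated words to integer ids:

```python
def build_vocab(corpus):
    # assign an integer id to each unique lowercased word, in order seen
    vocab = {}
    for text in corpus:
        for tok in text.lower().split():
            if tok not in vocab:
                vocab[tok] = len(vocab)
    return vocab

def encode(text, vocab, unk_id=-1):
    # map each word to its id; unknown words get unk_id
    return [vocab.get(tok, unk_id) for tok in text.lower().split()]

vocab = build_vocab(["the cat sat", "the dog"])
ids = encode("the cat", vocab)
```

Subword tokenizers avoid the unknown-word problem this scheme has by splitting rare words into smaller, known pieces.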
Oswald Campesato
Oswald Campesato specializes in Deep Learning, Python, Data Science, and generative AI. He is the author/co-author of over forty-five books including Google Gemini for Python, Large Language Models, and GPT-4 for Developers (all Mercury Learning).