Introduction to Transformers for NLP: With the Hugging Face Library and Models to Solve Problems (Paperback)
Jain, Shashank Mohan
- Publisher: Apress
- Publication date: 2022-10-21
- List price: $1,250
- VIP price: 5% off ($1,188)
- Language: English
- Pages: 165
- Binding: Quality Paper - also called trade paper
- ISBN: 1484288432
- ISBN-13: 9781484288436
Related categories:
Text-mining
Description
Get a hands-on introduction to Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing.
This book covers Transformer architecture and its relevance in natural language processing (NLP). It starts with an introduction to NLP and the progression of language models from n-grams to a Transformer-based architecture. Next, it offers some basic Transformer examples using the Google Colab engine. Then, it introduces the Hugging Face ecosystem and the different libraries and models it provides. Moving forward, it explains language models such as Google BERT with some examples before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation.
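The tasks named above map onto the Hugging Face pipeline interface. The following is a minimal, illustrative sketch (not code from the book); the summarization model name, facebook/bart-large-cnn, is an assumption and may differ from the models the book's examples use.

```python
# Illustrative sketch only (not taken from the book): the Hugging Face
# pipeline API applied to two of the tasks mentioned above.
from transformers import pipeline

# Sentiment analysis with the library's default sentiment model
sentiment = pipeline("sentiment-analysis")
print(sentiment("Transformers have changed the NLP landscape."))
# Typical output format: a list like [{'label': 'POSITIVE', 'score': ...}]

# Summarization with an explicitly chosen model (an assumed choice here)
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
text = (
    "Transformer-based language models such as BERT rely on self-attention "
    "to capture long-range dependencies in text, and the Hugging Face "
    "library exposes them through a simple pipeline interface."
)
print(summarizer(text, max_length=40, min_length=10, do_sample=False))
```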
After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.
What You Will Learn
- Understand language models and their importance in NLP and NLU (Natural Language Understanding)
- Master Transformer architecture through practical examples
- Use the Hugging Face library to work with Transformer-based language models
- Create a simple code generator in Python based on Transformer architecture (a minimal sketch follows this list)
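The last item can also be approached through the pipeline interface. The sketch below assumes a generic text-generation model; "gpt2" is only a placeholder, and the book's own code-generation example may use a different model and prompt format.

```python
# Minimal sketch of a Transformer-based code generator, assuming a generic
# text-generation pipeline. "gpt2" is a placeholder model choice, not
# necessarily what the book uses.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Prompt the model with the start of a Python function and let it complete it
prompt = "# Python function that computes the factorial of n\ndef factorial(n):"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(outputs[0]["generated_text"])
```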
Who This Book Is For
Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding)
About the Author
Shashank Mohan Jain has been working in the IT industry for around 20 years, mainly in the areas of cloud computing, machine learning, and distributed systems. He has keen interests in virtualization techniques, security, and complex systems. Shashank holds software patents in the areas of cloud computing, IoT, and machine learning. He is a speaker at multiple reputed cloud conferences and holds Sun, Microsoft, and Linux kernel certifications.