Pretrain Vision and Large Language Models in Python: End-to-end techniques for building and deploying foundation models on AWS

Webber, Emily

  • Publisher: Packt Publishing
  • Publication date: 2023-05-31
  • List price: $2,010
  • VIP price: $1,910 (95% of list)
  • Language: English
  • Pages: 258
  • Binding: Quality Paper - also called trade paper
  • ISBN: 180461825X
  • ISBN-13: 9781804618257
  • Related categories: Amazon Web Services, LangChain, Python
  • Imported title, ordered from overseas (checked out separately)


Product Description

Master the art of training vision and large language models with conceptual fundamentals and industry-expert guidance. Learn about AWS services and design patterns, with relevant coding examples.

 

Key Features:

  • Learn to develop, train, tune, and apply foundation models with optimized end-to-end pipelines.
  • Explore large-scale distributed training for models and datasets with AWS and SageMaker examples.
  • Evaluate, deploy, and operationalize your custom models with bias detection and pipeline monitoring.

 

Book Description:

Foundation models have forever changed machine learning. From BERT to ChatGPT, CLIP to Stable Diffusion, when billions of parameters are combined with large datasets and hundreds to thousands of GPUs, the result is nothing short of record-breaking. The recommendations, advice, and code samples in this book will help you pretrain and fine-tune your own foundation models from scratch on AWS and Amazon SageMaker, while applying them to hundreds of use cases across your organization.

 

With advice from seasoned AWS and machine learning expert Emily Webber, this book helps you learn everything you need to go from project ideation to dataset preparation, training, evaluation, and deployment for large language, vision, and multimodal models. With step-by-step explanations of essential concepts and practical examples, you'll go from mastering the concept of pretraining to preparing your dataset and model, configuring your environment, training, fine-tuning, evaluating, deploying, and optimizing your foundation models.
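
A minimal sketch of what the training step can look like in practice, assuming the SageMaker Python SDK and its Hugging Face estimator; the script name, S3 paths, instance type, and framework versions below are illustrative placeholders, not values from the book:

```python
# Hypothetical sketch: launching a distributed training job on Amazon SageMaker.
# Script name, S3 URIs, instance type, and framework versions are placeholders.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # assumes you run this inside SageMaker

estimator = HuggingFace(
    entry_point="pretrain.py",         # your training script
    source_dir="scripts",              # local folder containing the script
    role=role,
    instance_type="ml.p4d.24xlarge",   # 8 x A100 GPUs per instance
    instance_count=2,                  # scale out across two instances
    transformers_version="4.28",
    pytorch_version="2.0",
    py_version="py310",
    hyperparameters={"epochs": 1, "per_device_train_batch_size": 8},
    # Enable SageMaker's distributed data parallel library.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

# Point the job at a dataset already staged in S3.
estimator.fit({"train": "s3://my-bucket/pretraining-data/"})
```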

 

You will learn how to apply scaling laws, distribute your model and dataset over multiple GPUs, remove bias, achieve high throughput, and build deployment pipelines.
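
To make the scaling-law idea concrete, here is an illustrative back-of-the-envelope helper (not code from the book) based on the widely cited Chinchilla rule of thumb of roughly 20 training tokens per parameter and the common approximation that training compute is about 6 * N * D FLOPs:

```python
# Illustrative scaling-law estimate: compute-optimal token budget and rough
# training FLOPs for a given parameter count. The 20-tokens-per-parameter ratio
# and the 6*N*D compute approximation are rules of thumb, not exact values.

def compute_optimal_plan(num_params: float, tokens_per_param: float = 20.0) -> dict:
    """Return an estimated token budget and training-compute figure."""
    tokens = tokens_per_param * num_params
    flops = 6.0 * num_params * tokens
    return {"params": num_params, "tokens": tokens, "train_flops": flops}

if __name__ == "__main__":
    for n in (1e9, 7e9, 70e9):  # 1B, 7B, and 70B parameter models
        plan = compute_optimal_plan(n)
        print(f"{n / 1e9:>4.0f}B params -> ~{plan['tokens'] / 1e9:,.0f}B tokens, "
              f"~{plan['train_flops']:.2e} training FLOPs")
```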

 

By the end of this book, you'll be well equipped to embark on your own project to pretrain and fine-tune the foundation models of the future.

 

What You Will Learn:

  • Find the right use cases and datasets for pretraining and fine-tuning
  • Prepare for large-scale training with custom accelerators and GPUs
  • Configure environments on AWS and SageMaker to maximize performance
  • Select hyperparameters based on your model and constraints
  • Distribute your model and dataset using many types of parallelism
  • Avoid pitfalls with job restarts, intermittent health checks, and more
  • Evaluate your model with quantitative and qualitative insights
  • Deploy your models with runtime improvements and monitoring pipelines (see the sketch after this list)
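
As a taste of the deployment and monitoring step referenced above, here is a hypothetical sketch using the SageMaker Python SDK; the model artifact location, IAM role, bucket names, and instance type are placeholders, not values from the book:

```python
# Hypothetical sketch: deploying a trained model to a SageMaker endpoint with
# request/response data capture enabled so a monitoring pipeline can review it.
# Model data, role ARN, S3 locations, and instance type are placeholders.
from sagemaker.huggingface import HuggingFaceModel
from sagemaker.model_monitor import DataCaptureConfig

model = HuggingFaceModel(
    model_data="s3://my-bucket/model-artifacts/model.tar.gz",
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    transformers_version="4.28",
    pytorch_version="2.0",
    py_version="py310",
)

capture = DataCaptureConfig(
    enable_capture=True,
    sampling_percentage=100,  # capture every request for downstream monitoring
    destination_s3_uri="s3://my-bucket/monitoring/captured/",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.xlarge",
    data_capture_config=capture,
)

print(predictor.endpoint_name)  # the live endpoint to invoke and monitor
```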

 

Who this book is for:

If you're a machine learning researcher or enthusiast who wants to start a foundation modeling project, this book is for you. Applied scientists, data scientists, machine learning engineers, solution architects, product managers, and students will all benefit from this book. Intermediate Python is a must, along with introductory concepts of cloud computing. A strong understanding of deep learning fundamentals is needed, and more advanced topics are explained as they arise. The content covers advanced machine learning and cloud techniques, explained in an actionable, easy-to-understand way.
