Udemy

Natural Language Processing: NLP With Transformers in Python

  • 30,041 students
  • Updated 8/2022
4.6
(2,346 ratings)

Course Details

Enrolment date
Year-round enrolment
Course level
Learning mode
Duration
11 hours 30 minutes
Language of instruction
English
Instructor
James Briggs
Rating
4.6
(2,346 ratings)

Course Overview


Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more

Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive and powerful models for language by a large margin, beating all major language-based benchmarks time and time again.

In this course, we cover everything you need to get started building cutting-edge NLP applications using transformer models like Google AI's BERT or Facebook AI's DPR.

We cover several key NLP frameworks including:

  • HuggingFace's Transformers

  • TensorFlow 2

  • PyTorch

  • spaCy

  • NLTK

  • Flair

And we learn how to apply transformers to some of the most popular NLP use-cases:

  • Language classification/sentiment analysis

  • Named entity recognition (NER)

  • Question answering

  • Similarity/comparative learning

Throughout each of these use-cases we work through a variety of examples to make clear what transformers are, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one for sentiment analysis of financial Reddit data, and another covering a fully fledged open-domain question-answering application.
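To give a concrete flavour of the language-classification use-case, here is a minimal, illustrative sketch of sentiment analysis with HuggingFace's pipeline API; this is our own example rather than the course's code, and the sample sentence and default model choice are assumptions:

    from transformers import pipeline

    # Load a pretrained sentiment-analysis pipeline; on first run this
    # downloads a default English model fine-tuned on SST-2.
    classifier = pipeline("sentiment-analysis")

    # Classify a sample sentence; returns a label and a confidence score.
    print(classifier("Transformers have completely changed modern NLP."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.9998}]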

All of this is supported by several other sections that teach us how to better design, implement, and measure the performance of our models, such as:

  • History of NLP and where transformers come from

  • Common preprocessing techniques for NLP

  • The theory behind transformers

  • How to fine-tune transformers (a minimal sketch follows this list)
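
To make the fine-tuning idea concrete, here is a minimal, illustrative sketch using HuggingFace's Trainer API; the model name, dataset, and hyperparameters below are assumptions for illustration, not the course's own project code:

    from datasets import load_dataset
    from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                              Trainer, TrainingArguments)

    # Load a pretrained BERT with a fresh 2-class classification head.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)

    # A small slice of IMDB reviews keeps this sketch quick to run.
    data = load_dataset("imdb", split="train[:1000]")
    data = data.map(
        lambda batch: tokenizer(batch["text"], truncation=True,
                                padding="max_length", max_length=128),
        batched=True)

    # Fine-tune for one epoch with mostly default settings.
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="finetune-out",
                               num_train_epochs=1,
                               per_device_train_batch_size=8),
        train_dataset=data)
    trainer.train()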

We cover all this and more. I look forward to seeing you in the course!

Course Chapters

  • 10 chapters
  • 104 lessons
  • Chapter 1: Introduction
  • Chapter 2: NLP and Transformers
  • Chapter 3: Preprocessing for NLP
  • Chapter 4: Attention
  • Chapter 5: Language Classification
  • Chapter 6: [Project] Sentiment Model With TensorFlow and Transformers
  • Chapter 7: Long Text Classification With BERT
  • Chapter 8: Named Entity Recognition (NER)
  • Chapter 9: Question and Answering
  • Chapter 10: Metrics For Language

Course Content

  • Industry-standard NLP using transformer models
  • Build full-stack question-answering transformer models
  • Perform sentiment analysis with transformer models in PyTorch and TensorFlow
  • Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS), sketched after this list
  • Create fine-tuned transformer models for specialized use-cases
  • Measure performance of language models using advanced metrics like ROUGE
  • Vector-building techniques like BM25 or dense passage retrievers (DPR)
  • An overview of recent developments in NLP
  • Understand attention and other key components of transformers
  • Learn about key transformers models such as BERT
  • Preprocess text data for NLP
  • Named entity recognition (NER) using spaCy and transformers
  • Fine-tune language classification models
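
As an illustrative aside on the similarity-search items above, here is a minimal sketch of dense vector search with FAISS; the random vectors stand in for real DPR or sentence embeddings, and the sizes are assumptions rather than the course's own code:

    import numpy as np
    import faiss  # pip install faiss-cpu

    # 1,000 fake passage embeddings; real ones would come from a model
    # such as DPR (dimension 768 is typical for BERT-family encoders).
    dim = 768
    passages = np.random.rand(1000, dim).astype("float32")
    faiss.normalize_L2(passages)  # normalize so inner product = cosine similarity

    # Exact inner-product index; add all passage vectors to it.
    index = faiss.IndexFlatIP(dim)
    index.add(passages)

    # Embed a query the same way, then retrieve the top-5 nearest passages.
    query = np.random.rand(1, dim).astype("float32")
    faiss.normalize_L2(query)
    scores, ids = index.search(query, 5)
    print(ids[0], scores[0])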


Reviews

  • Sangeetha Vajiram (4.5): good
  • MAHAMMAD JAVEED SHAIK (5.0): Nice explanation
  • SUDULA JAGAN MOHAN REDDY (4.5): it was a good experience
  • Santhosh Kumar H V (5.0): good
