Udemy

Natural Language Processing: NLP With Transformers in Python

Enroll Now
  • 30,041 Students
  • Updated 8/2022
4.6
(2,346 Ratings)
CTgoodjobs selects quality courses to enhance professionals' competitiveness. By purchasing courses through links on our site, we may receive an affiliate commission.

Course Information

Registration period
Year-round Enrollment
Course Level
Study Mode
Duration
11 Hour(s) 30 Minute(s)
Language
English
Taught by
James Briggs
Rating
4.6
(2,346 Ratings)
1 view

Course Overview

Natural Language Processing: NLP With Transformers in Python

Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more

Transformer models are the de facto standard in modern NLP. They have proven themselves the most expressive, powerful models for language by a large margin, beating all major language benchmarks time and time again.

In this course, we cover everything you need to get started building NLP applications with cutting-edge performance, using transformer models like Google AI's BERT or Facebook AI's DPR.

We cover several key NLP frameworks including:

  • HuggingFace's Transformers

  • TensorFlow 2

  • PyTorch

  • spaCy

  • NLTK

  • Flair

And learn how to apply transformers to some of the most popular NLP use-cases:

  • Language classification/sentiment analysis

  • Named entity recognition (NER)

  • Question answering (Q&A)

  • Similarity/comparative learning
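
As a taste of the similarity use-case, here is a minimal pure-Python sketch of cosine similarity over embedding vectors. The toy vectors below are illustrative stand-ins for what a transformer encoder would produce; this is not code from the course.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical sentence embeddings; in practice these come from a
# transformer encoder, and similarity search over many of them is
# what tools like FAISS accelerate.
emb_cat = [0.9, 0.1, 0.3]
emb_kitten = [0.8, 0.2, 0.35]
emb_car = [0.1, 0.9, 0.05]

print(cosine_similarity(emb_cat, emb_kitten))  # high: similar meanings
print(cosine_similarity(emb_cat, emb_car))     # lower: dissimilar meanings
```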

Throughout each of these use-cases we work through a variety of examples to ensure we understand what transformers are, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one on sentiment analysis of financial Reddit data, and another building a fully-fledged open-domain question-answering application.

All of this is supported by several other sections that help us learn to better design, implement, and measure the performance of our models, covering:

  • History of NLP and where transformers come from

  • Common preprocessing techniques for NLP

  • The theory behind transformers

  • How to fine-tune transformers
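
To preview the theory section, the core operation inside every transformer is scaled dot-product attention. The following is a minimal sketch for a single query vector, in pure Python for readability; real implementations batch this as matrix operations in PyTorch or TensorFlow, and the toy numbers are illustrative only.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector.

    Scores each key against the query, normalizes the scores with
    softmax, then returns the weighted sum of the value vectors.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return output, weights

# The query matches the first key, so most attention lands there.
query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
output, weights = attention(query, keys, values)
```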

We cover all this and more. I look forward to seeing you in the course!

Course Content

  • 10 section(s)
  • 104 lecture(s)
  • Section 1 Introduction
  • Section 2 NLP and Transformers
  • Section 3 Preprocessing for NLP
  • Section 4 Attention
  • Section 5 Language Classification
  • Section 6 [Project] Sentiment Model With TensorFlow and Transformers
  • Section 7 Long Text Classification With BERT
  • Section 8 Named Entity Recognition (NER)
  • Section 9 Question Answering
  • Section 10 Metrics For Language

What You’ll Learn

  • Industry-standard NLP using transformer models
  • Build full-stack question-answering transformer models
  • Perform sentiment analysis with transformer models in PyTorch and TensorFlow
  • Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
  • Create fine-tuned transformer models for specialized use-cases
  • Measure performance of language models using advanced metrics like ROUGE
  • Vector-building techniques like BM25 and dense passage retrieval (DPR)
  • An overview of recent developments in NLP
  • Understand attention and other key components of transformers
  • Learn about key transformer models such as BERT
  • Preprocess text data for NLP
  • Named entity recognition (NER) using spaCy and transformers
  • Fine-tune language classification models
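
As an illustrative sketch (not material taken from the course), the simplest ROUGE variant, ROUGE-1, scores a generated text against a reference by unigram overlap:

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """ROUGE-1 F1: unigram overlap between candidate and reference text."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 5 of 6 unigrams overlap in each direction, so F1 = 5/6.
score = rouge1_f1("the cat sat on the mat", "the cat lay on the mat")
```

Production code would typically use a tested package such as `rouge-score` rather than a hand-rolled version like this.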


Reviews

  • S
    Sangeetha Vajiram
    4.5

    good

  • M
    MAHAMMAD JAVEED SHAIK
    5.0

    Nice explanation

  • S
    SUDULA JAGAN MOHAN REDDY
    4.5

    it was a good experience

  • S
    Santhosh Kumar H V
    5.0

    good

