Course Information
Course Overview
BERT, GPT, Deep Learning, Machine Learning, & NLP with Hugging Face, Attention in Python, TensorFlow, PyTorch, & Keras
Interested in the field of Natural Language Processing (NLP)? Then this course is for you!
Ever since Transformers arrived on the scene, deep learning hasn't been the same.
Machine learning can now generate text that is essentially indistinguishable from text written by humans.
We've reached new state-of-the-art performance on many NLP tasks, such as machine translation, question answering, entailment, named entity recognition, and more.
In this course, you will learn very practical skills for applying transformers, and if you want, the detailed theory behind how transformers and attention work.
There are several reasons why this course is different from any other. First, it covers all the basic natural language processing techniques, so you will understand what natural language processing is. Second, it covers GPT-2, NER, and BERT, which are very popular in natural language processing. Finally, you will build many practice projects, each with a step-by-step notebook and detailed explanations that you can review in your free time.
The course is split into 4 major parts:
Basic natural language processing
Fundamental Transformers
Text generation with GPT-2
Text classification
PART 1: Basic Natural Language Processing
In this section, you will learn the fundamentals of natural language processing. It is really important to understand basic natural language processing before learning transformers. In this section we will cover:
What is natural language processing (NLP)?
What are stemming and lemmatization?
What is chunking?
What is a bag of words?
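To make the bag-of-words idea concrete, here is a minimal, self-contained sketch in plain Python. The toy suffix-stripping stemmer and the helper names are illustrative assumptions (a course like this would typically use NLTK's `PorterStemmer` and a proper tokenizer), but the counting logic is the standard bag-of-words construction.

```python
from collections import Counter

def naive_stem(word):
    # Toy suffix-stripping stemmer, for illustration only; real code
    # would use something like NLTK's PorterStemmer.
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def bag_of_words(sentences):
    # Tokenize, stem, build a sorted vocabulary, then count each
    # vocabulary word per sentence to get one count vector per sentence.
    tokenized = [[naive_stem(w) for w in s.lower().split()] for s in sentences]
    vocab = sorted({w for toks in tokenized for w in toks})
    vectors = [[Counter(toks)[w] for w in vocab] for toks in tokenized]
    return vocab, vectors

vocab, vectors = bag_of_words(["the cat sits", "the cats sat sitting"])
print(vocab)    # sorted stemmed vocabulary
print(vectors)  # one count vector per input sentence
```

Each sentence becomes a vector of word counts over the shared vocabulary, which is exactly the representation a simple classifier (like the gender identifier below) can be trained on.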
In this section, we will build 3 small projects. These projects are:
Gender identification
Sentiment analyzer
Topic modelling
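As a taste of the sentiment analyzer project, here is a deliberately simple lexicon-based sketch. The word lists and the scoring rule are hypothetical stand-ins; the actual course project would likely train a classifier rather than hand-pick words.

```python
# Tiny hand-picked lexicons (illustrative assumptions, not course data).
POSITIVE = {"good", "great", "excellent", "love", "wonderful"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "boring"}

def sentiment(text):
    # Score = (# positive words) - (# negative words).
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("what a great and wonderful film"))
print(sentiment("boring and terrible"))
```

A trained model replaces the fixed lexicons with weights learned from labeled examples, but the input/output shape of the task is the same.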
PART 2: Fundamental Transformers
In this section, you will learn how transformers really work. We will also introduce Hugging Face Transformers and GPT-2 to give you a sense of just how powerful transformers are.
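The core mechanism behind "how transformers really work" is scaled dot-product attention, softmax(QK^T / sqrt(d)) V. The sketch below implements it with plain Python lists (no NumPy) so every step is visible; the function names are ours, not from any particular library.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: for each query, score every key,
    # turn scores into weights with softmax, then take the weighted
    # average of the value vectors.
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# A query aligned with the first key attends mostly to the first value.
print(attention([[10, 0]], [[1, 0], [0, 1]], [[1, 0], [0, 1]]))
```

Real transformers run this with learned projection matrices, many heads in parallel, and matrix math on tensors, but the computation per head is exactly this.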
In this section, we will implement two projects.
IMDB project
Q&A project implementation
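Both projects map naturally onto the Hugging Face `pipeline` API. The sketch below shows the general shape under the assumption that the course uses `transformers` pipelines; the example sentences are ours, and each pipeline downloads a default pretrained model on first use.

```python
from transformers import pipeline

# Sentiment classification, as in the IMDB project: the pipeline
# returns a label ("POSITIVE"/"NEGATIVE") and a confidence score.
classifier = pipeline("sentiment-analysis")
print(classifier("This movie was an absolute delight!"))

# Extractive question answering: the model selects the answer span
# from the supplied context.
qa = pipeline("question-answering")
print(qa(question="What does the course cover?",
         context="The course covers transformers, GPT-2, and BERT."))
```

The same two-line pattern (build a pipeline, call it on text) covers most of the tasks in this course, which is what makes the library so convenient for practice projects.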
PART 3: Project: Text generation with GPT-2
In this project, we will generate text with GPT-2. This project lets us practice and reinforce what we have learned so far, and it also demonstrates how quickly a transformer can generate text.
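To show the shape of the project, here is a minimal GPT-2 generation sketch, assuming the Hugging Face `transformers` library (the prompt and parameter values are just examples; the weights are downloaded on first use).

```python
from transformers import pipeline

# Load GPT-2 behind the text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# Continue a prompt; the returned text includes the prompt itself.
outputs = generator("Deep learning has changed",
                    max_new_tokens=20,
                    num_return_sequences=1)
print(outputs[0]["generated_text"])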
PART 4: Token classification
In this section, we will learn how to classify the individual tokens of a text using a transformer. We will also learn about NER, a popular application of token classification. The main project in this section is a Q&A project, more advanced than the previous Q&A project.
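NER with a transformer again fits the `pipeline` pattern. The sketch below assumes the Hugging Face default NER model (downloaded on first use); `aggregation_strategy="simple"` merges sub-word pieces back into whole entities.

```python
from transformers import pipeline

# Token classification / named entity recognition.
ner = pipeline("ner", aggregation_strategy="simple")

# Each result has an entity group (e.g. ORG, LOC), the matched text,
# and a confidence score.
for entity in ner("Hugging Face is based in New York City"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Unlike sequence classification, which assigns one label to the whole input, token classification assigns a label to every token, which is why it can locate entities inside a sentence.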
Course Content
- 6 section(s)
- 34 lecture(s)
- Section 1 Introduction
- Section 2 Basic Natural Language Processing (NLP)
- Section 3 Fundamental Transformers
- Section 4 Project: Text generation with GPT-2
- Section 5 Token Classification
- Section 6 Thank you
What You’ll Learn
- Chunking
- Bag of Words
- Hugging Face Transformers
- POS tagging
- TF-IDF
- GPT-2
- Token Classification
- BERT
- Stemming
- Lemmatization
- NER
- Preprocessing data
- Attention
- Fine-tuning
Skills covered in this course
Reviews
-
Nguyen Chuong
This course helped me understand artificial intelligence significantly more deeply. The content is well structured and highly informative. It is also an invaluable resource for anyone who wants to expand their knowledge in this rapidly growing field.
-
Lê Hoàng Phong
"This course has significantly deepened my understanding of artificial intelligence. The content is well-structured and highly informative. Additionally, it is an invaluable resource for anyone wanting to improve their knowledge in this rapidly growing field."
-
Ái My
The lessons helped me a lot, the price is reasonable, the explanations are easy to understand, and the information provided is detailed.
-
Thuy Trang
This course helps me a lot, it is an excellent course. Good teacher, easy to understand.