Intro to NLP
03. Structured Languages (1:31)
04. Grammar (0:35)
05. Unstructured Text (1:17)
06. Counting Words
07. Context Is Everything (1:34)
08. NLP and Pipelines (0:48)
09. How NLP Pipelines Work (1:11)
10. Text Processing (1:56)
11. Feature Extraction (2:48)
12. Modeling (0:54)
Text Processing
02. Coding Exercises
03. Introduction to GPU Workspaces
04. Workspaces Best Practices
05. Workspace
06. Capturing Text Data (1:18)
07. Cleaning (6:24)
08. Normalization (2:19)
09. Tokenization
10. Stop Word Removal
11. Part-of-Speech Tagging
12. Named Entity Recognition
13. Stemming and Lemmatization
14. Summary
Spam Classifier with Naive Bayes
01. Intro (0:22)
02. Guess the Person (2:01)
03. Known and Inferred
04. Guess the Person Now
05. Bayes Theorem
06. Quiz: False Positives
07. Solution: False Positives
08. Bayesian Learning 1
08.2 Bayesian Learning 1
09. Bayesian Learning 2
10. Bayesian Learning 3
11. Naive Bayes Algorithm 1
12. Naive Bayes Algorithm 2
13. Building a Spam Classifier
14. Project
15. Spam Classifier – Workspace
16. Outro
Part of Speech Tagging with HMMs
01. Intro
02. Part of Speech Tagging
03. Lookup Table
04. Bigrams
05. When bigrams won’t work
06. Hidden Markov Models (3:07)
07. Quiz: How many paths
08. Solution: How many paths (1:24)
09. Quiz: How many paths now
10. Quiz: Which path is more likely
11. Solution: Which path is more likely
12. Viterbi Algorithm Idea
13. Viterbi Algorithm
14. Further Reading
15. Outro
Project Part of Speech Tagging
01. Lesson Plan – Week 3
02. Introduction
03. Workspace Part of Speech Tagging
Project Description – Part of Speech Tagging
Project Rubric – Part of Speech Tagging
Feature Extraction and Embeddings
01. Feature Extraction
02. Bag of Words
03. TF-IDF
04. One-Hot Encoding
05. Word Embeddings
06. Word2Vec
07. GloVe
08. Embeddings for Deep Learning
09. t-SNE
10. Summary
Topic Modeling
01. Intro
02. References
03. Bag of Words
04. Latent Variables
05. Matrix Multiplication
06. Matrices
07. Quiz: Picking Topics
08. Solution: Picking Topics
09. Beta Distributions
10. Dirichlet Distributions
11. Latent Dirichlet Allocation
12. Sample a Topic
12.2 Sample a Topic
13. Sample a Word
13.2 Sample a Word
14. Combining the Models
15. Outro
16. Notebook Topic Modeling
17. [SOLUTION] Topic Modeling
18. Next Steps
Sentiment Analysis
01. Intro
02. Sentiment Analysis with a Regular Classifier (2:28)
05. Sentiment Analysis with RNN (0:19)
06. Notebook Sentiment Analysis with an RNN
07. [SOLUTION] Sentiment Analysis with an RNN
08. Optional Material
09. Outro
Sequence to Sequence
01. Introducing Jay Alammar
02. Previous Material
03. Jay Introduction (4:02)
04. Applications
05. Architectures
06. Architectures in More Depth
07. Outro
Deep Learning Attention
01. Introduction to Attention (4:21)
02. Sequence to Sequence Recap (2:55)
03. Encoding — Attention Overview (1:21)
04. Decoding — Attention Overview (2:31)
05. Attention Overview
06. Attention Encoder
07. Attention Decoder
08. Attention Encoder & Decoder
09. Bahdanau and Luong Attention
10. Multiplicative Attention
11. Additive Attention
12. Additive and Multiplicative Attention
13. Computer Vision Applications
14. NLP Application: Google Neural Machine Translation
15. Other Attention Methods
16. The Transformer and Self-Attention (4:25)
17. Notebook Attention Basics
18. [SOLUTION] Attention Basics
19. Outro
RNN Keras Lab
01. Intro
02. Machine Translation
02.2 Machine Translation
03. Deciphering Code with character-level RNNs
04. [SOLUTION] Deciphering Code with character-level RNNs
05. Congratulations!
Cloud Computing Setup Instructions
01. Overview
02. Create an AWS Account
03. Get Access to GPU Instances
04. Launch Your Instance
05. Remotely Connecting to Your Instance
Project Machine Translation
01. Introduction to GPU Workspaces
02. Workspaces: Best Practices
03. NLP Machine Translation Workspace
04. Project: Machine Translation
Project Description – Project: Machine Translation
Project Rubric – Project: Machine Translation
Intro to Voice User Interfaces
01. Welcome to Voice User Interfaces!
02. VUI Overview
03. VUI Applications
(Optional) Alexa History Skill
Speech Recognition
01. Intro
02. Challenges in ASR
03. Signal Analysis
04. References: Signal Analysis
05. Quiz: FFT
06. Feature Extraction with MFCC
07. References: Feature Extraction
08. Quiz: MFCC
09. Phonetics
10. References: Phonetics
11. Quiz: Phonetics
12. Voice Data Lab Introduction
13. Lab: Voice Data
14. Acoustic Models and the Trouble with Time
15. HMMs in Speech Recognition
16. Language Models
17. N-Grams
18. Quiz: N-Grams
19. References: Traditional ASR
20. A New Paradigm
21. Deep Neural Networks as Speech Models
22. Connectionist Temporal Classification (CTC)
23. References: Deep Neural Network ASR
24. Outro
Project DNN Speech Recognizer
01. Overview
02. Introduction to GPU Workspaces
03. Workspaces Best Practices
04. Tasks
05. VUI Speech Recognizer Workspace
Project Description – Project DNN Speech Recognizer
Project Rubric – Project DNN Speech Recognizer
Recurrent Neural Networks
01. Introducing
02. RNN Introduction
03. RNN History
04. RNN Applications
05. Feedforward Neural Network – Reminder
05.2 Feedforward Neural Network – Reminder
06. The Feedforward Process
06.2 The Feedforward Process
06.3 The Feedforward Process
07. Feedforward Quiz
07.2 Feedforward Quiz
08. Backpropagation – Theory
08.2 Backpropagation – Theory
09. Backpropagation – Example (part a)
10. Backpropagation – Example (part b)
11. Backpropagation Quiz
12. RNN (part a)
13. RNN (part b)
14. RNN – Unfolded Model
15. Unfolded Model Quiz
16. RNN – Example
17. Backpropagation Through Time (part a)
18. Backpropagation Through Time (part b)
19. Backpropagation Through Time (part c)
20. BPTT Quiz 1
21. BPTT Quiz 2
22. BPTT Quiz 3
23. Some more math
24. RNN Summary
25. From RNN to LSTM
26. Wrap Up
Long Short-Term Memory Networks (LSTM)
01. Intro to LSTM
02. RNN vs LSTM
03. Basics of LSTM
04. Architecture of LSTM
05. The Learn Gate
06. The Forget Gate
07. The Remember Gate
08. The Use Gate
09. Putting it All Together
10. Quiz
11. Other architectures
12. Outro LSTM
Sentiment Analysis with Andrew Trask
01. Introduction
02. Materials
03. The Notebooks
04. Framing the Problem
05. Mini Project 1
06. Mini Project 1 Solution
07. Transforming Text into Numbers
08. Mini Project 2
09. Mini Project 2 Solution
10. Building a Neural Network
11. Mini Project 3
12. Mini Project 3 Solution
13. Understanding Neural Noise
14. Mini Project 4
15. Understanding Inefficiencies in our Network
16. Mini Project 5
17. Mini Project 5 Solution
18. Further Noise Reduction
19. Mini Project 6
20. Mini Project 6 Solution
21. Analysis: What’s Going on in the Weights
22. Conclusion
Sentiment Prediction RNN
01. Intro
02. Sentiment RNN
03. Data Preprocessing
04. Creating Testing Sets
05. Building the RNN
06. Training the Network
07. Solutions
Embeddings and Word2Vec
01. Additional NLP Lessons
02. Embeddings Intro
03. Implementing Word2Vec
04. Subsampling Solution
05. Making Batches
06. Batches Solution
07. Building the Network
08. Negative Sampling
09. Building the Network Solution
10. Training Results
Project Part of Speech Tagging
01. Introduction
Project Description – Part of Speech Tagging
Project Rubric – Part of Speech Tagging
Reinforcement Learning from Human Feedback