
Part 01 : Introduction to Python
Start coding with Python, drawing upon libraries and automation scripts to solve complex problems quickly.
-
Module 01: Lessons
-
Lesson 01: Why Python Programming
Welcome to Introduction to Python! Here's an overview of the course.
-
Lesson 02: Data Types and Operators
Familiarize yourself with the building blocks of Python! Learn about data types and operators, built-in functions, type conversion, whitespace, and style guidelines.
- Concept 01: Introduction
- Concept 02: Arithmetic Operators
- Concept 03: Quiz: Arithmetic Operators
- Concept 04: Solution: Arithmetic Operators
- Concept 05: Variables and Assignment Operators
- Concept 06: Quiz: Variables and Assignment Operators
- Concept 07: Solution: Variables and Assignment Operators
- Concept 08: Integers and Floats
- Concept 09: Quiz: Integers and Floats
- Concept 10: Booleans, Comparison Operators, and Logical Operators
- Concept 11: Quiz: Booleans, Comparison Operators, and Logical Operators
- Concept 12: Solution: Booleans, Comparison and Logical Operators
- Concept 13: Strings
- Concept 14: Quiz: Strings
- Concept 15: Solution: Strings
- Concept 16: Type and Type Conversion
- Concept 17: Quiz: Type and Type Conversion
- Concept 18: Solution: Type and Type Conversion
- Concept 19: String Methods
- Concept 20: String Methods
- Concept 21: Another String Method - Split
- Concept 22: Quiz: String Methods Practice
- Concept 23: Solution: String Methods Practice
- Concept 24: "There's a Bug in my Code"
- Concept 25: Conclusion
- Concept 26: Summary
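As a quick, hedged illustration of the building blocks this lesson covers (arithmetic operators, variables, booleans, strings, and type conversion), here is a minimal sketch using only standard Python; the values and variable names are illustrative, not taken from the course materials:

```python
# Arithmetic operators and variable assignment
mon_visitors = 508
tues_visitors = 423
total = mon_visitors + tues_visitors        # addition -> 931
average = total / 2                         # true division -> 465.5 (a float)

# Booleans, comparison operators, and logical operators
is_busy = average > 450 and total < 1000    # True

# Strings, string methods, and type conversion
greeting = "welcome to python"
print(greeting.title())                     # "Welcome To Python"
print(greeting.split())                     # ['welcome', 'to', 'python']
print("Total visitors: " + str(total))      # convert int -> str before concatenating
print(type(average))                        # <class 'float'>
```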
-
Lesson 03: Data Structures
Use data structures to order and group different data types together! Learn about the types of data structures in Python, along with more useful built-in functions and operators.
- Concept 01: Introduction
- Concept 02: Lists and Membership Operators
- Concept 03: Quiz: Lists and Membership Operators
- Concept 04: Solution: List and Membership Operators
- Concept 05: Why Do We Need Lists?
- Concept 06: List Methods
- Concept 07: Quiz: List Methods
- Concept 08: Check for Understanding: Lists
- Concept 09: Tuples
- Concept 10: Quiz: Tuples
- Concept 11: Sets
- Concept 12: Quiz: Sets
- Concept 13: Dictionaries and Identity Operators
- Concept 14: Quiz: Dictionaries and Identity Operators
- Concept 15: Solution: Dictionaries and Identity Operators
- Concept 16: Quiz: More With Dictionaries
- Concept 17: When to Use Dictionaries?
- Concept 18: Check for Understanding: Data Structures
- Concept 19: Compound Data Structures
- Concept 20: Quiz: Compound Data Structures
- Concept 21: Solution: Compound Data Structures
- Concept 22: Practice Questions
- Concept 23: Solution: Practice Questions
- Concept 24: Conclusion
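For a quick taste of the containers covered here (lists, tuples, sets, dictionaries, and compound structures), the following is a small, self-contained sketch with made-up data:

```python
# Lists are ordered and mutable; membership is tested with `in`
months = ['January', 'February', 'March']
months.append('April')
print('March' in months)                    # True

# Tuples are ordered and immutable
location = (13.4125, 103.8667)              # latitude, longitude

# Sets hold unique, unordered elements
flowers = {'rose', 'tulip', 'rose'}
print(len(flowers))                         # 2 -- the duplicate is dropped

# Dictionaries map keys to values; `get` avoids a KeyError for missing keys
elements = {'hydrogen': 1, 'helium': 2}
print(elements.get('carbon', 'not found'))  # 'not found'

# Compound data structure: a dictionary of dictionaries
elements_full = {'hydrogen': {'number': 1, 'weight': 1.008}}
print(elements_full['hydrogen']['weight'])  # 1.008
```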
-
Lesson 04: Control Flow
Build logic into your code with control flow tools! Learn about conditional statements, repeating code with loops and useful built-in functions, and list comprehensions.
- Concept 01: Introduction
- Concept 02: Conditional Statements
- Concept 03: Practice: Conditional Statements
- Concept 04: Solution: Conditional Statements
- Concept 05: Quiz: Conditional Statements
- Concept 06: Solution: Conditional Statements
- Concept 07: Boolean Expressions for Conditions
- Concept 08: Quiz: Boolean Expressions for Conditions
- Concept 09: Solution: Boolean Expressions for Conditions
- Concept 10: For Loops
- Concept 11: Practice: For Loops
- Concept 12: Solution: For Loops Practice
- Concept 13: Quiz: For Loops
- Concept 14: Solution: For Loops Quiz
- Concept 15: Quiz: Match Inputs To Outputs
- Concept 16: Building Dictionaries
- Concept 17: Iterating Through Dictionaries with For Loops
- Concept 18: Quiz: Iterating Through Dictionaries
- Concept 19: Solution: Iterating Through Dictionaries
- Concept 20: While Loops
- Concept 21: Practice: While Loops
- Concept 22: Solution: While Loops Practice
- Concept 23: Quiz: While Loops
- Concept 24: Solution: While Loops Quiz
- Concept 25: For Loops vs. While Loops
- Concept 26: Check for Understanding: For and While Loops
- Concept 27: Solution: Check for Understanding: For and While Loops
- Concept 28: Break, Continue
- Concept 29: Quiz: Break, Continue
- Concept 30: Solution: Break, Continue
- Concept 31: Practice: Loops
- Concept 32: Solution: Loops
- Concept 33: Zip and Enumerate
- Concept 34: Quiz: Zip and Enumerate
- Concept 35: Solution: Zip and Enumerate
- Concept 36: List Comprehensions
- Concept 37: Quiz: List Comprehensions
- Concept 38: Solution: List Comprehensions
- Concept 39: Practice Questions
- Concept 40: Solutions to Practice Questions
- Concept 41: Conclusion
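The sketch below strings together the main control-flow tools from this lesson (conditionals, for and while loops, break, zip, enumerate, and a list comprehension); the data is invented purely for illustration:

```python
scores = [88, 92, 47, 75]
names = ['Ana', 'Bo', 'Cy', 'Di']

# Conditional statement
if max(scores) >= 90:
    print("At least one top score")

# For loop combining zip and enumerate
for i, (name, score) in enumerate(zip(names, scores)):
    print(i, name, score)

# While loop with break
total = 0
while True:
    total += 1
    if total >= 3:
        break

# List comprehension with a condition
passed = [name for name, score in zip(names, scores) if score >= 70]
print(passed)                               # ['Ana', 'Bo', 'Di']
```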
-
Lesson 05: Functions
Learn how to use functions to improve and reuse your code! Learn about functions, variable scope, documentation, lambda expressions, iterators, and generators.
- Concept 01: Introduction
- Concept 02: Defining Functions
- Concept 03: Quiz: Defining Functions
- Concept 04: Solution: Defining Functions
- Concept 05: Check For Understanding: Functions
- Concept 06: Variable Scope
- Concept 07: Variable Scope
- Concept 08: Solution: Variable Scope
- Concept 09: Check For Understanding: Variable Scope
- Concept 10: Documentation
- Concept 11: Quiz: Documentation
- Concept 12: Solution: Documentation
- Concept 13: Lambda Expressions
- Concept 14: Quiz: Lambda Expressions
- Concept 15: Solution: Lambda Expressions
- Concept 16: Iterators and Generators
- Concept 17: Quiz: Iterators and Generators
- Concept 18: Solution: Iterators and Generators
- Concept 19: Generator Expressions
- Concept 20: Conclusion
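A compact, hedged sketch of the ideas in this lesson (function definitions with docstrings and default arguments, lambda expressions, and generators); the function names are illustrative:

```python
def cylinder_volume(height, radius=5):
    """Return the volume of a cylinder; radius defaults to 5."""
    pi = 3.14159
    return height * pi * radius ** 2

print(cylinder_volume(10))          # uses the default radius
print(cylinder_volume(10, 3))       # overrides it

# Lambda expression: a small anonymous function
double = lambda x: x * 2
print(double(4))                    # 8

# Generator function: produces values lazily with yield
def my_range(x):
    i = 0
    while i < x:
        yield i
        i += 1

print(list(my_range(4)))            # [0, 1, 2, 3]

# Generator expression
squares = (n ** 2 for n in my_range(4))
print(sum(squares))                 # 14
```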
-
Lesson 06: Scripting
Set up your own programming environment to write and run Python scripts locally! Learn good scripting practices, interact with different inputs, and discover awesome tools.
- Concept 01: Introduction
- Concept 02: Python Installation
- Concept 03: Install Python Using Anaconda
- Concept 04: [For Windows] Configuring Git Bash to Run Python
- Concept 05: Running a Python Script
- Concept 06: Programming Environment Setup
- Concept 07: Editing a Python Script
- Concept 08: Scripting with Raw Input
- Concept 09: Quiz: Scripting with Raw Input
- Concept 10: Solution: Scripting with Raw Input
- Concept 11: Errors and Exceptions
- Concept 12: Errors and Exceptions
- Concept 13: Handling Errors
- Concept 14: Practice: Handling Input Errors
- Concept 15: Solution: Handling Input Errors
- Concept 16: Accessing Error Messages
- Concept 17: Reading and Writing Files
- Concept 18: Quiz: Reading and Writing Files
- Concept 19: Solution: Reading and Writing Files
- Concept 20: Quiz: Practice Debugging
- Concept 21: Solutions for Quiz: Practice Debugging
- Concept 22: Importing Local Scripts
- Concept 23: The Standard Library
- Concept 24: Quiz: The Standard Library
- Concept 25: Solution: The Standard Library
- Concept 26: Techniques for Importing Modules
- Concept 27: Quiz: Techniques for Importing Modules
- Concept 28: Third-Party Libraries
- Concept 29: Experimenting with an Interpreter
- Concept 30: Online Resources
- Concept 31: Practice Question
- Concept 32: Solution for Practice Question
- Concept 33: Conclusion
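A small, self-contained sketch of the scripting ideas in this lesson (raw input, handling errors, reading and writing files, and importing from the standard library). It is meant to be run from a terminal, and it writes a throwaway file with a purely illustrative name:

```python
# Handling errors around raw input
try:
    count = int(input("How many tickets? "))
except ValueError as e:
    print("That wasn't a whole number:", e)
    count = 0

# Writing and then reading a file
with open("tickets.txt", "w") as f:          # illustrative filename
    f.write("tickets sold: {}\n".format(count))

with open("tickets.txt", "r") as f:
    for line in f:
        print(line.strip())

# Importing from the standard library
from math import sqrt
print(sqrt(16))                              # 4.0
```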
-
Lesson 07: Intro to Object-Oriented Programming
- Concept 01: Introduction
- Concept 02: Procedural vs. Object-Oriented Programming
- Concept 03: Class, Object, Method and Attribute
- Concept 04: OOP Syntax
- Concept 05: Exercise: OOP Syntax Practice - Part 1
- Concept 06: A Couple of Notes about OOP
- Concept 07: Exercise: OOP Syntax Practice - Part 2
- Concept 08: Commenting Object-Oriented Code
- Concept 09: A Gaussian Class
- Concept 10: How the Gaussian Class Works
- Concept 11: Exercise: Code the Gaussian Class
- Concept 12: Magic Methods
- Concept 13: Exercise: Code Magic Methods
- Concept 14: Inheritance
- Concept 15: Exercise: Inheritance with Clothing
- Concept 16: Inheritance: Probability Distribution
- Concept 17: Demo: Inheritance Probability Distributions
- Concept 18: Advanced OOP Topics
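As a simplified sketch in the spirit of this lesson's Gaussian example (not the course's exact class), the snippet below shows a class with attributes, a method, magic methods, and inheritance:

```python
import math

class Distribution:
    """A generic distribution with a mean and standard deviation."""
    def __init__(self, mu=0, sigma=1):
        self.mean = mu
        self.stdev = sigma

class Gaussian(Distribution):
    """A Gaussian distribution that inherits from Distribution."""
    def pdf(self, x):
        """Probability density function evaluated at x."""
        coeff = 1.0 / (self.stdev * math.sqrt(2 * math.pi))
        return coeff * math.exp(-0.5 * ((x - self.mean) / self.stdev) ** 2)

    def __add__(self, other):
        """Magic method: the sum of two independent Gaussians."""
        return Gaussian(self.mean + other.mean,
                        math.sqrt(self.stdev ** 2 + other.stdev ** 2))

    def __repr__(self):
        return "Gaussian(mean={}, stdev={})".format(self.mean, self.stdev)

print(Gaussian(25, 3) + Gaussian(30, 4))     # Gaussian(mean=55, stdev=5.0)
print(round(Gaussian(0, 1).pdf(0), 4))       # 0.3989
```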
-
-
Module 02: Project
-
Lesson 01: Use a Pre-trained Image Classifier to Identify Dog Breeds
Project Description - Use a Pre-trained Image Classifier to Identify Dog Breeds
Project Rubric - Use a Pre-trained Image Classifier to Identify Dog Breeds
- Concept 01: Instructor
- Concept 02: Project Description
- Concept 03: Project Instructions
- Concept 04: Workspace How-to
- Concept 05: Timing Code
- Concept 06: Project Workspace - Timing
- Concept 07: Command Line Arguments
- Concept 08: Project Workspace - Command Line Arguments
- Concept 09: Mutable Data Types and Functions
- Concept 10: Creating Pet Image Labels
- Concept 11: Project Workspace - Pet Image Labels
- Concept 12: Classifying Images
- Concept 13: Project Workspace - Classifying Images
- Concept 14: Classifying Labels as Dogs
- Concept 15: Project Workspace - Adjusting Results
- Concept 16: Calculating Results
- Concept 17: Project Workspace - Calculating Results
- Concept 18: Printing Results
- Concept 19: Project Workspace - Printing Results
- Concept 20: Classify Uploaded Images
- Concept 21: Project Workspace - Classify Uploaded Images
- Concept 22: Final Results
- Concept 23: Project Workspace - Final Results
-
Part 02 : SQL For Data Analysis
Welcome to SQL For Data Analysis
-
Module 01: Prerequisite: SQL for Data Analysis
-
Lesson 01: Basic SQL
In this lesson, you will learn the basics of SQL for working with a single table. You will learn the key commands to filter a table in many different ways.
- Concept 01: Video: SQL Introduction
- Concept 02: Video: The Parch & Posey Database
- Concept 03: Video + Text: The Parch & Posey Database
- Concept 04: Quiz: ERD Fundamentals
- Concept 05: Text: Map of SQL Content
- Concept 06: Video: Why SQL
- Concept 07: Video: How Databases Store Data
- Concept 08: Text + Quiz: Types of Databases
- Concept 09: Video: Types of Statements
- Concept 10: Statements
- Concept 11: Video: SELECT & FROM
- Concept 12: Your First Queries in SQL Workspace
- Concept 13: Solution: Your First Queries
- Concept 14: Formatting Best Practices
- Concept 15: Video: LIMIT
- Concept 16: Quiz: LIMIT
- Concept 17: Solution: LIMIT
- Concept 18: Video: ORDER BY
- Concept 19: Quiz: ORDER BY
- Concept 20: Solutions: ORDER BY
- Concept 21: Video: ORDER BY Part II
- Concept 22: Quiz: ORDER BY Part II
- Concept 23: Solutions: ORDER BY Part II
- Concept 24: Video: WHERE
- Concept 25: Quiz: WHERE
- Concept 26: Solutions: WHERE
- Concept 27: Video: WHERE with Non-Numeric Data
- Concept 28: Quiz: WHERE with Non-Numeric
- Concept 29: Solutions: WHERE with Non-Numeric
- Concept 30: Video: Arithmetic Operators
- Concept 31: Quiz: Arithmetic Operators
- Concept 32: Solutions: Arithmetic Operators
- Concept 33: Text: Introduction to Logical Operators
- Concept 34: Video: LIKE
- Concept 35: Quiz: LIKE
- Concept 36: Solutions: LIKE
- Concept 37: Video: IN
- Concept 38: Quiz: IN
- Concept 39: Solutions: IN
- Concept 40: Video: NOT
- Concept 41: Quiz: NOT
- Concept 42: Solutions: NOT
- Concept 43: Video: AND and BETWEEN
- Concept 44: Quiz: AND and BETWEEN
- Concept 45: Solutions: AND and BETWEEN
- Concept 46: Video: OR
- Concept 47: Quiz: OR
- Concept 48: Solutions: OR
- Concept 49: Text: Recap & Looking Ahead
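The course runs these queries against the Parch & Posey database in a SQL workspace. Purely as a self-contained stand-in, here is a sketch that issues the same kinds of SELECT / WHERE / ORDER BY / LIMIT statements against a tiny in-memory SQLite table (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, account TEXT, total_amt REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "Walmart", 1000.0), (2, "Apple", 250.5), (3, "IBM", 780.25)])

# SELECT, WHERE with logical operators and LIKE, ORDER BY, and LIMIT
rows = conn.execute("""
    SELECT account, total_amt
    FROM orders
    WHERE total_amt > 300 AND account LIKE '%a%'
    ORDER BY total_amt DESC
    LIMIT 2
""").fetchall()

for account, total_amt in rows:
    print(account, total_amt)    # Walmart 1000.0
conn.close()
```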
-
Lesson 02: SQL Joins
In this lesson, you will learn how to combine data from multiple tables together.
- Concept 01: Video: Motivation
- Concept 02: Video: Why Would We Want to Split Data Into Separate Tables?
- Concept 03: Video: Introduction to JOINs
- Concept 04: Text + Quiz: Your First JOIN
- Concept 05: Solution: Your First JOIN
- Concept 06: Text: ERD Reminder
- Concept 07: Text: Primary and Foreign Keys
- Concept 08: Quiz: Primary - Foreign Key Relationship
- Concept 09: Text + Quiz: JOIN Revisited
- Concept 10: Video: Alias
- Concept 11: Quiz: JOIN Questions Part I
- Concept 12: Solutions: JOIN Questions Part I
- Concept 13: Video: Motivation for Other JOINs
- Concept 14: Video: LEFT and RIGHT JOINs
- Concept 15: Text: Other JOIN Notes
- Concept 16: LEFT and RIGHT JOIN
- Concept 17: Solutions: LEFT and RIGHT JOIN
- Concept 18: Video: JOINs and Filtering
- Concept 19: Quiz: Last Check
- Concept 20: Solutions: Last Check
- Concept 21: Text: Recap & Looking Ahead
-
Lesson 03: SQL Aggregations
In this lesson, you will learn how to aggregate data using SQL functions like SUM, AVG, and COUNT. Additionally, CASE, HAVING, and DATE functions provide you with an incredible problem-solving toolkit.
- Concept 01: Video: Introduction to Aggregation
- Concept 02: Video: Introduction to NULLs
- Concept 03: Video: NULLs and Aggregation
- Concept 04: Video + Text: First Aggregation - COUNT
- Concept 05: Video: COUNT & NULLs
- Concept 06: Video: SUM
- Concept 07: Quiz: SUM
- Concept 08: Solution: SUM
- Concept 09: Video: MIN & MAX
- Concept 10: Video: AVG
- Concept 11: Quiz: MIN, MAX, & AVG
- Concept 12: Solutions: MIN, MAX, & AVG
- Concept 13: Video: GROUP BY
- Concept 14: Quiz: GROUP BY
- Concept 15: Solutions: GROUP BY
- Concept 16: Video: GROUP BY Part II
- Concept 17: Quiz: GROUP BY Part II
- Concept 18: Solutions: GROUP BY Part II
- Concept 19: Video: DISTINCT
- Concept 20: Quiz: DISTINCT
- Concept 21: Solutions: DISTINCT
- Concept 22: Video: HAVING
- Concept 23: HAVING
- Concept 24: Solutions: HAVING
- Concept 25: Video: DATE Functions
- Concept 26: Video: DATE Functions II
- Concept 27: Quiz: DATE Functions
- Concept 28: Solutions: DATE Functions
- Concept 29: Video: CASE Statements
- Concept 30: Video: CASE & Aggregations
- Concept 31: Quiz: CASE
- Concept 32: Solutions: CASE
- Concept 33: Text: Recap
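Continuing the same kind of stand-in (an in-memory SQLite table rather than the course's Parch & Posey database), this sketch shows COUNT, SUM, AVG, GROUP BY, and HAVING on invented data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (account TEXT, paper TEXT, qty INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [("Walmart", "standard", 100), ("Walmart", "gloss", 40),
                  ("Apple", "standard", 25), ("IBM", "poster", 60)])

# Aggregate per account, keeping only accounts with more than one order
rows = conn.execute("""
    SELECT account, COUNT(*) AS num_orders, SUM(qty) AS total_qty, AVG(qty) AS avg_qty
    FROM orders
    GROUP BY account
    HAVING COUNT(*) > 1
    ORDER BY total_qty DESC
""").fetchall()

print(rows)        # [('Walmart', 2, 140, 70.0)]
conn.close()
```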
-
Lesson 04: SQL Subqueries & Temporary Tables
In this lesson, you will learn to answer much more complex business questions using nested queries, also known as subqueries.
- Concept 01: Video: Introduction
- Concept 02: Video: Introduction to Subqueries
- Concept 03: Video + Quiz: Write Your First Subquery
- Concept 04: Solutions: Write Your First Subquery
- Concept 05: Text: Subquery Formatting
- Concept 06: Video: More On Subqueries
- Concept 07: Quiz: More On Subqueries
- Concept 08: Solutions: More On Subqueries
- Concept 09: Quiz: Subquery Mania
- Concept 10: Solution: Subquery Mania
- Concept 11: Video: WITH
- Concept 12: Text + Quiz: WITH vs. Subquery
- Concept 13: Quiz: WITH
- Concept 14: Solutions: WITH
- Concept 15: Video: Subquery Conclusion
-
Lesson 05: SQL Data Cleaning
Cleaning data is an important part of the data analysis process. In this lesson, you will learn how to perform data cleaning using SQL.
- Concept 01: Video: Introduction to SQL Data Cleaning
- Concept 02: Video: LEFT & RIGHT
- Concept 03: Quiz: LEFT & RIGHT
- Concept 04: Solutions: LEFT & RIGHT
- Concept 05: Video: POSITION, STRPOS, & SUBSTR
- Concept 06: Quiz: POSITION, STRPOS, & SUBSTR - SAME DATA AS QUIZ 1
- Concept 07: Solutions: POSITION, STRPOS, & SUBSTR
- Concept 08: Video: CONCAT
- Concept 09: Quiz: CONCAT
- Concept 10: Solutions: CONCAT
- Concept 11: Video: CAST
- Concept 12: Quiz: CAST
- Concept 13: Solutions: CAST
- Concept 14: Video: COALESCE
- Concept 15: Quiz: COALESCE
- Concept 16: Solutions: COALESCE
- Concept 17: Video + Text: Recap
-
Lesson 06: [Advanced] SQL Window Functions
Compare one row to another without doing any joins using one of the most powerful concepts in SQL data analysis: window functions.
- Concept 01: Video: Introduction to Window Functions
- Concept 02: Video: Window Functions 1
- Concept 03: Quiz: Window Functions 1
- Concept 04: Solutions: Window Functions 1
- Concept 05: Quiz: Window Functions 2
- Concept 06: Solutions: Window Functions 2
- Concept 07: Video: ROW_NUMBER & RANK
- Concept 08: Quiz: ROW_NUMBER & RANK
- Concept 09: Solutions: ROW_NUMBER & RANK
- Concept 10: Video: Aggregates in Window Functions
- Concept 11: Quiz: Aggregates in Window Functions
- Concept 12: Solutions: Aggregates in Window Functions
- Concept 13: Video: Aliases for Multiple Window Functions
- Concept 14: Quiz: Aliases for Multiple Window Functions
- Concept 15: Solutions: Aliases for Multiple Window Functions
- Concept 16: Video: Comparing a Row to Previous Row
- Concept 17: Quiz: Comparing a Row to Previous Row
- Concept 18: Solutions: Comparing a Row to Previous Row
- Concept 19: Video: Introduction to Percentiles
- Concept 20: Video: Percentiles
- Concept 21: Quiz: Percentiles
- Concept 22: Solutions: Percentiles
- Concept 23: Video: Recap
-
Lesson 07: [Advanced] SQL Advanced JOINs & Performance Tuning
Learn advanced joins and how to make queries that run quickly across giant datasets. Most of the examples in the lesson involve edge cases, some of which come up in interviews.
- Concept 01: Video: Introduction to Advanced SQL
- Concept 02: Text + Images: FULL OUTER JOIN
- Concept 03: Quiz: FULL OUTER JOIN
- Concept 04: Solutions: FULL OUTER JOIN
- Concept 05: Video: JOINs with Comparison Operators
- Concept 06: Quiz: JOINs with Comparison Operators
- Concept 07: Solutions: JOINs with Comparison Operators
- Concept 08: Video: Self JOINs
- Concept 09: Quiz: Self JOINs
- Concept 10: Solutions: Self JOINs
- Concept 11: Video: UNION
- Concept 12: Quiz: UNION
- Concept 13: Solutions: UNION
- Concept 14: Video: Performance Tuning Motivation
- Concept 15: Video + Quiz: Performance Tuning 1
- Concept 16: Video: Performance Tuning 2
- Concept 17: Video: Performance Tuning 3
- Concept 18: Video: JOINing Subqueries
- Concept 19: Video: SQL Completion Congratulations
-
Part 03 : Data Visualization in Python
Welcome to Data Visualization in Python.
-
Module 01: Data Visualization in Python
-
Lesson 01: Data Visualization in Data Analysis
In this lesson, see the motivations for why data visualization is an important part of the data analysis process and where it fits in.
- Concept 01: Introduction to Data Visualization
- Concept 02: Motivation for Data Visualization
- Concept 03: Further Motivation
- Concept 04: Exploratory vs. Explanatory Analyses
- Concept 05: Quiz: Exploratory vs. Explanatory
- Concept 06: Visualization in Python
- Concept 07: Course Structure
- Concept 08: Lesson Summary
-
Lesson 02: Design of Visualizations
Learn about elements of visualization design, especially to avoid those elements that can cause a visualization to fail.
- Concept 01: Introduction
- Concept 02: What Makes a Bad Visual?
- Concept 03: Levels of Measurement & Types of Data
- Concept 04: Quiz: Data Types (Quantitative vs. Categorical)
- Concept 05: Text + Quiz: Data Types (Ordinal vs. Nominal)
- Concept 06: Data Types (Continuous vs. Discrete)
- Concept 07: Identifying Data Types
- Concept 08: What Experts Say About Visual Encodings
- Concept 09: Chart Junk
- Concept 10: Data Ink Ratio
- Concept 11: Design Integrity
- Concept 12: Bad Visual Quizzes (Part I)
- Concept 13: Bad Visual Quizzes (Part II)
- Concept 14: Using Color
- Concept 15: Designing for Color Blindness
- Concept 16: Shape, Size, & Other Tools
- Concept 17: Good Visual
- Concept 18: Lesson Summary
-
Lesson 03: Univariate Exploration of Data
In this lesson, you will see how you can use matplotlib and seaborn to produce informative visualizations of single variables.
- Concept 01: Introduction
- Concept 02: Tidy Data
- Concept 03: Bar Charts
- Concept 04: Absolute vs. Relative Frequency
- Concept 05: Counting Missing Data
- Concept 06: Bar Chart Practice
- Concept 07: Pie Charts
- Concept 08: Histograms
- Concept 09: Histogram Practice
- Concept 10: Figures, Axes, and Subplots
- Concept 11: Choosing a Plot for Discrete Data
- Concept 12: Descriptive Statistics, Outliers and Axis Limits
- Concept 13: Scales and Transformations
- Concept 14: Scales and Transformations Practice
- Concept 15: Lesson Summary
- Concept 16: Extra: Kernel Density Estimation
- Concept 17: Extra: Waffle Plots
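The lesson works through its own datasets; purely as a hedged sketch of the plotting calls involved, here is a self-contained example that draws a bar chart with seaborn and a histogram with matplotlib on a toy DataFrame (the column names and values are invented):

```python
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sb

# Toy data standing in for the course datasets
df = pd.DataFrame({
    'cut': ['Ideal', 'Good', 'Ideal', 'Fair', 'Good', 'Ideal'],
    'price': [320, 410, 280, 600, 350, 500],
})

# Bar chart of a categorical variable
sb.countplot(data=df, x='cut', color=sb.color_palette()[0])
plt.show()

# Histogram of a numeric variable with explicit bin edges
bin_edges = range(200, 701, 100)
plt.hist(data=df, x='price', bins=bin_edges)
plt.xlabel('price')
plt.show()
```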
-
Lesson 04: Bivariate Exploration of Data
In this lesson, build up from your understanding of individual variables and learn how to use matplotlib and seaborn to look at relationships between two variables.
- Concept 01: Introduction
- Concept 02: Scatterplots and Correlation
- Concept 03: Overplotting, Transparency, and Jitter
- Concept 04: Heat Maps
- Concept 05: Scatterplot Practice
- Concept 06: Violin Plots
- Concept 07: Box Plots
- Concept 08: Violin and Box Plot Practice
- Concept 09: Clustered Bar Charts
- Concept 10: Categorical Plot Practice
- Concept 11: Faceting
- Concept 12: Adaptation of Univariate Plots
- Concept 13: Line Plots
- Concept 14: Additional Plot Practice
- Concept 15: Lesson Summary
- Concept 16: Extra: Q-Q Plots
- Concept 17: Extra: Swarm Plots
- Concept 18: Extra: Rug and Strip Plots
- Concept 19: Extra: Stacked Plots
- Concept 20: Extra: Ridgeline Plots
-
Lesson 05: Multivariate Exploration of Data
In this lesson, see how you can use matplotlib and seaborn to visualize relationships and interactions between three or more variables.
- Concept 01: Introduction
- Concept 02: Non-Positional Encodings for Third Variables
- Concept 03: Color Palettes
- Concept 04: Encodings Practice
- Concept 05: Faceting in Two Directions
- Concept 06: Other Adaptations of Bivariate Plots
- Concept 07: Adapted Plot Practice
- Concept 08: Plot Matrices
- Concept 09: Feature Engineering
- Concept 10: How Much is Too Much?
- Concept 11: Additional Plot Practice
- Concept 12: Lesson Summary
-
Lesson 06: Explanatory Visualizations
Previous lessons covered how you could use visualizations to learn about your data. In this lesson, see how to polish up those plots to convey your findings to others!
- Concept 01: Introduction
- Concept 02: Revisiting the Data Analysis Process
- Concept 03: Tell A Story
- Concept 04: Same Data, Different Stories
- Concept 05: Quizzes on Data Story Telling
- Concept 06: Polishing Plots
- Concept 07: Polishing Plots Practice
- Concept 08: Creating a Slide Deck with Jupyter
- Concept 09: Getting and Using Feedback
- Concept 10: Lesson Summary
-
Lesson 07: Visualization Case Study
Put into practice the concepts you've learned about exploratory and explanatory data visualization in this case study on factors that impact diamond prices.
-
Part 04 : Command Line Essentials
Welcome to Command Line Essentials.
-
Module 01: Command Line Essentials
-
Lesson 01: Shell Workshop
The Unix shell is a powerful tool for developers of all sorts. In this lesson, you'll get a quick introduction to the very basics of using it on your own computer.
- Concept 01: The Command Line
- Concept 02: Intro to the Shell
- Concept 03: Windows: Installing Git Bash
- Concept 04: Opening a terminal
- Concept 05: Your first command (echo)
- Concept 06: Navigating directories (ls, cd, ..)
- Concept 07: Current working directory (pwd)
- Concept 08: Parameters and options (ls -l)
- Concept 09: Organizing your files (mkdir, mv)
- Concept 10: Downloading (curl)
- Concept 11: Viewing files (cat, less)
- Concept 12: Removing things (rm, rmdir)
- Concept 13: Searching and pipes (grep, wc)
- Concept 14: Shell and environment variables
- Concept 15: Startup files (.bash_profile)
- Concept 16: Controlling the shell prompt ($PS1)
- Concept 17: Aliases
- Concept 18: Keep learning!
-
Part 05 : Git & GitHub
Welcome to Git & GitHub.
-
Module 01: Git and GitHub
-
Lesson 01: What is Version Control?
Version control is an incredibly important part of a professional programmer's life. In this lesson, you'll learn about the benefits of version control and install the version control tool Git!
-
Lesson 02: Create A Git Repo
Now that you've learned the benefits of Version Control and gotten Git installed, it's time you learn how to create a repository.
-
Lesson 03: Review a Repo's History
Knowing how to review an existing Git repository's history of commits is extremely important. You'll learn how to do just that in this lesson.
-
Lesson 04: Add Commits To A Repo
A repository is nothing without commits. In this lesson, you'll learn how to make commits, write descriptive commit messages, and verify the changes you're about to save to the repository.
-
Lesson 05: Tagging, Branching, and Merging
Being able to work on your project in isolation from other changes will multiply your productivity. You'll learn how to do this isolated development with Git's branches.
-
Lesson 06: Undoing Changes
Help! Disaster has struck! You don't have to worry, though, because your project is tracked in version control! You'll learn how to undo and modify changes that have been saved to the repository.
-
Lesson 07: Working With Remotes
You'll learn how to create remote repositories on GitHub and how to get and send changes to the remote repository.
-
Lesson 08: Working On Another Developer's Repository
In this lesson, you'll learn how to fork another developer's project. Collaborating with other developers can be a tricky process, so you'll learn how to contribute to a public project.
-
Lesson 09: Staying In Sync With A Remote Repository
You'll learn how to send suggested changes to another developer by using pull requests. You'll also learn how to use the powerful git rebase command to squash commits together.
-
Part 06 : Practical Statistics
Welcome to Practical Statistics.
-
Module 01: Practical Stats
-
Lesson 01: Descriptive Statistics - Part I
In this lesson, you will learn about data types, measures of center, and the basics of statistical notation.
- Concept 01: Introduce Instructors
- Concept 02: Text: Optional Lessons Note
- Concept 03: Video: Welcome!
- Concept 04: Video: What is Data? Why is it important?
- Concept 05: Video: Data Types (Quantitative vs. Categorical)
- Concept 06: Quiz: Data Types (Quantitative vs. Categorical)
- Concept 07: Video: Data Types (Ordinal vs. Nominal)
- Concept 08: Video: Data Types (Continuous vs. Discrete)
- Concept 09: Video: Data Types Summary
- Concept 10: Text + Quiz: Data Types (Ordinal vs. Nominal)
- Concept 11: Data Types (Continuous vs. Discrete)
- Concept 12: Video: Introduction to Summary Statistics
- Concept 13: Video: Measures of Center (Mean)
- Concept 14: Measures of Center (Mean)
- Concept 15: Video: Measures of Center (Median)
- Concept 16: Measures of Center (Median)
- Concept 17: Video: Measures of Center (Mode)
- Concept 18: Measures of Center (Mode)
- Concept 19: Video: What is Notation?
- Concept 20: Video: Random Variables
- Concept 21: Quiz: Variable Types
- Concept 22: Video: Capital vs. Lower
- Concept 23: Quiz: Introduction to Notation
- Concept 24: Video: Better Way?
- Concept 25: Video: Summation
- Concept 26: Video: Notation for the Mean
- Concept 27: Quiz: Summation
- Concept 28: Quiz: Notation for the Mean
- Concept 29: Text: Summary on Notation
-
Lesson 02: Descriptive Statistics - Part II
In this lesson, you will learn about measures of spread, shape, and outliers as associated with quantitative data. You will also get a first look at inferential statistics.
- Concept 01: Video: What are Measures of Spread?
- Concept 02: Video: Histograms
- Concept 03: Video: Weekdays vs. Weekends: What is the Difference
- Concept 04: Video: Introduction to Five Number Summary
- Concept 05: Quiz: 5 Number Summary Practice
- Concept 06: Video: What if We Only Want One Number?
- Concept 07: Video: Introduction to Standard Deviation and Variance
- Concept 08: Video: Standard Deviation Calculation
- Concept 09: Measures of Spread (Calculation and Units)
- Concept 10: Text: Introduction to the Standard Deviation and Variance
- Concept 11: Video: Why the Standard Deviation?
- Concept 12: Video: Important Final Points
- Concept 13: Advanced: Standard Deviation and Variance
- Concept 14: Quiz: Applied Standard Deviation and Variance
- Concept 15: Homework 1: Final Quiz on Measures of Spread
- Concept 16: Text: Measures of Center and Spread Summary
- Concept 17: Video: Shape
- Concept 18: Video: The Shape For Data In The World
- Concept 19: Quiz: Shape and Outliers (What's the Impact?)
- Concept 20: Video: Shape and Outliers
- Concept 21: Video: Working With Outliers
- Concept 22: Video: Working With Outliers My Advice
- Concept 23: Quiz: Shape and Outliers (Comparing Distributions)
- Concept 24: Quiz: Shape and Outliers (Visuals)
- Concept 25: Quiz: Shape and Outliers (Final Quiz)
- Concept 26: Text: Descriptive Statistics Summary
- Concept 27: Video: Descriptive vs. Inferential Statistics
- Concept 28: Quiz: Descriptive vs. Inferential (Udacity Students)
- Concept 29: Quiz: Descriptive vs. Inferential (Bagels)
- Concept 30: Text: Descriptive vs. Inferential Summary
- Concept 31: Video: Summary
-
Lesson 03: Admissions Case Study
Learn to ask the right questions, as you learn about Simpson's Paradox.
- Concept 01: Admissions Case Study Introduction
- Concept 02: Admissions 1
- Concept 03: Admissions 2
- Concept 04: Admissions 3
- Concept 05: Admissions 4
- Concept 06: Gender Bias
- Concept 07: Aggregation
- Concept 08: Aggregation 2
- Concept 09: Aggregation 3
- Concept 10: Gender Bias Revisited
- Concept 11: Dangers of Statistics
- Concept 12: Text: Recap + Next Steps
- Concept 13: Case Study in Python
- Concept 14: Conclusion
-
Lesson 04: Probability
Learn the basics of probability using coins and dice.
- Concept 01: Introduction to Probability
- Concept 02: Flipping Coins
- Concept 03: Fair Coin
- Concept 04: Loaded Coin 1
- Concept 05: Loaded Coin 2
- Concept 06: Loaded Coin 3
- Concept 07: Complementary Outcomes
- Concept 08: Two Flips 1
- Concept 09: Two Flips 2
- Concept 10: Two Flips 3
- Concept 11: Two Flips 4
- Concept 12: Two Flips 5
- Concept 13: One Head 1
- Concept 14: One Head 2
- Concept 15: One Of Three 1
- Concept 16: One Of Three 2
- Concept 17: Even Roll
- Concept 18: Doubles
- Concept 19: Probability Conclusion
- Concept 20: Text: Recap + Next Steps
-
Lesson 05: Binomial Distribution
Learn about one of the most popular distributions in probability - the Binomial Distribution.
- Concept 01: Binomial
- Concept 02: Heads Tails
- Concept 03: Heads Tails 2
- Concept 04: 5 Flips 1 Head
- Concept 05: 5 Flips 2 Heads
- Concept 06: 5 Flips 3 Heads
- Concept 07: 10 Flips 5 Heads
- Concept 08: Formula
- Concept 09: Arrangements
- Concept 10: Binomial 1
- Concept 11: Binomial 2
- Concept 12: Binomial 3
- Concept 13: Binomial 4
- Concept 14: Binomial 5
- Concept 15: Binomial 6
- Concept 16: Binomial Conclusion
- Concept 17: Text: Recap + Next Steps
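To make the binomial formula concrete, here is a small sketch (standard library only) that computes binomial probabilities; the numbers mirror the style of the lesson's coin-flip quizzes but are my own illustration:

```python
from math import comb

def binomial_prob(n, k, p):
    """P(exactly k successes in n independent trials with success probability p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Fair coin: probability of exactly 5 heads in 10 flips
print(round(binomial_prob(10, 5, 0.5), 4))    # 0.2461

# Loaded coin with P(heads) = 0.8: exactly 3 heads in 5 flips
print(round(binomial_prob(5, 3, 0.8), 4))     # 0.2048
```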
-
Lesson 06: Conditional Probability
Not all events are independent. Learn the probability rules for dependent events.
- Concept 01: Introduction to Conditional Probability
- Concept 02: Medical Example 1
- Concept 03: Medical Example 2
- Concept 04: Medical Example 3
- Concept 05: Medical Example 4
- Concept 06: Medical Example 5
- Concept 07: Medical Example 6
- Concept 08: Medical Example 7
- Concept 09: Medical Example 8
- Concept 10: Total Probability
- Concept 11: Two Coins 1
- Concept 12: Two Coins 2
- Concept 13: Two Coins 3
- Concept 14: Two Coins 4
- Concept 15: Summary
- Concept 16: Text: Summary
-
Lesson 07: Bayes Rule
Learn one of the most popular rules in all of statistics - Bayes rule.
- Concept 01: Bayes Rule
- Concept 02: Cancer Test
- Concept 03: Prior And Posterior
- Concept 04: Normalizing 1
- Concept 05: Normalizing 2
- Concept 06: Normalizing 3
- Concept 07: Total Probability
- Concept 08: Bayes Rule Diagram
- Concept 09: Equivalent Diagram
- Concept 10: Cancer Probabilities
- Concept 11: Probability Given Test
- Concept 12: Normalizer
- Concept 13: Normalizing Probability
- Concept 14: Disease Test 1
- Concept 15: Disease Test 2
- Concept 16: Disease Test 3
- Concept 17: Disease Test 4
- Concept 18: Disease Test 5
- Concept 19: Disease Test 6
- Concept 20: Bayes Rule Summary
- Concept 21: Robot Sensing 1
- Concept 22: Robot Sensing 2
- Concept 23: Robot Sensing 3
- Concept 24: Robot Sensing 4
- Concept 25: Robot Sensing 5
- Concept 26: Robot Sensing 6
- Concept 27: Robot Sensing 7
- Concept 28: Robot Sensing 8
- Concept 29: Generalizing
- Concept 30: Sebastian At Home
- Concept 31: Learning Objectives - Conditional Probability
- Concept 32: Reducing Uncertainty
- Concept 33: Bayes' Rule and Robotics
- Concept 34: Learning from Sensor Data
- Concept 35: Using Sensor Data
- Concept 36: Learning Objectives - Bayes' Rule
- Concept 37: Bayes Rule Conclusion
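As a worked numeric sketch of Bayes' rule in the style of the lesson's cancer-test example (the probabilities below are illustrative, not the lesson's own numbers):

```python
# Prior and test characteristics (illustrative numbers)
p_disease = 0.01            # P(D): prior probability of the disease
p_pos_given_disease = 0.9   # sensitivity, P(+ | D)
p_pos_given_healthy = 0.08  # false positive rate, P(+ | not D)

# Total probability of a positive test (the normalizer)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: posterior probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))    # 0.102 -- only about 10% despite the positive test
```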
-
Lesson 08: Python Probability Practice
Take what you have learned in the previous lessons and put it into practice in Python.
-
Lesson 09: Normal Distribution Theory
Learn the mathematics behind moving from a coin flip to a normal distribution.
- Concept 01: Maximum Probability
- Concept 02: Shape
- Concept 03: Better Formula
- Concept 04: Quadratics
- Concept 05: Quadratics 2
- Concept 06: Quadratics 3
- Concept 07: Quadratics 4
- Concept 08: Maximum
- Concept 09: Maximum Value
- Concept 10: Minimum
- Concept 11: Minimum Value
- Concept 12: Normalizer
- Concept 13: Formula Summary
- Concept 14: Central Limit Theorem
- Concept 15: Summary
-
Lesson 10: Sampling distributions and the Central Limit Theorem
Learn all about the underpinning of confidence intervals and hypothesis testing - sampling distributions.
- Concept 01: Introduction
- Concept 02: Video: Descriptive vs. Inferential Statistics
- Concept 03: Quiz: Descriptive vs. Inferential (Udacity Students)
- Concept 04: Quiz: Descriptive vs. Inferential (Bagels)
- Concept 05: Text: Descriptive vs. Inferential Statistics
- Concept 06: Video + Quiz: Introduction to Sampling Distributions Part I
- Concept 07: Video + Quiz: Introduction to Sampling Distributions Part II
- Concept 08: Video: Introduction to Sampling Distributions Part III
- Concept 09: Notebook + Quiz: Sampling Distributions & Python
- Concept 10: Text: Sampling Distribution Notes
- Concept 11: Video: Introduction to Notation
- Concept 12: Video: Notation for Parameters vs. Statistics
- Concept 13: Quiz: Notation
- Concept 14: Video: Other Sampling Distributions
- Concept 15: Video: Two Useful Theorems - Law of Large Numbers
- Concept 16: Notebook + Quiz: Law of Large Numbers
- Concept 17: Video: Two Useful Theorems - Central Limit Theorem
- Concept 18: Notebook + Quiz: Central Limit Theorem
- Concept 19: Notebook + Quiz: Central Limit Theorem - Part II
- Concept 20: Video: When Does the Central Limit Theorem Not Work?
- Concept 21: Notebook + Quiz: Central Limit Theorem - Part III
- Concept 22: Video: Bootstrapping
- Concept 23: Video: Bootstrapping & The Central Limit Theorem
- Concept 24: Notebook + Quiz: Bootstrapping
- Concept 25: Video: The Background of Bootstrapping
- Concept 26: Video: Why are Sampling Distributions Important
- Concept 27: Quiz + Text: Recap & Next Steps
-
Lesson 11: Confidence Intervals
Learn how to use sampling distributions and bootstrapping to create a confidence interval for any parameter of interest.
- Concept 01: Video: Introduction
- Concept 02: Video: From Sampling Distributions to Confidence Intervals
- Concept 03: ScreenCast: Sampling Distributions and Confidence Intervals
- Concept 04: Notebook + Quiz: Building Confidence Intervals
- Concept 05: ScreenCast: Difference In Means
- Concept 06: Notebook + Quiz: Difference in Means
- Concept 07: Video: Confidence Interval Applications
- Concept 08: Video: Statistical vs. Practical Significance
- Concept 09: Statistical vs. Practical Significance
- Concept 10: Video: Traditional Confidence Intervals
- Concept 11: ScreenCast: Traditional Confidence Interval Methods
- Concept 12: Video: Other Language Associated with Confidence Intervals
- Concept 13: Other Language Associated with Confidence Intervals
- Concept 14: Video: Correct Interpretations of Confidence Intervals
- Concept 15: Correct Interpretations of Confidence Intervals
- Concept 16: Video: Confidence Intervals & Hypothesis Tests
- Concept 17: Text: Recap + Next Steps
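A minimal sketch of the bootstrap approach this lesson builds on (NumPy only; the "heights" sample is randomly generated for illustration):

```python
import numpy as np

np.random.seed(42)
sample = np.random.normal(loc=67, scale=3, size=200)   # pretend heights

# Bootstrap the sampling distribution of the mean
boot_means = []
for _ in range(10000):
    boot = np.random.choice(sample, size=sample.size, replace=True)
    boot_means.append(boot.mean())

# A 95% confidence interval from the middle 95% of bootstrap means
lower, upper = np.percentile(boot_means, 2.5), np.percentile(boot_means, 97.5)
print(round(lower, 2), round(upper, 2))
```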
-
Lesson 12: Hypothesis Testing
Learn the necessary skills to set up hypothesis tests and analyze their results.
- Concept 01: Introduction
- Concept 02: Hypothesis Testing
- Concept 03: Setting Up Hypothesis Tests - Part I
- Concept 04: Setting Up Hypotheses
- Concept 05: Setting Up Hypothesis Tests - Part II
- Concept 06: Quiz: Setting Up Hypothesis Tests
- Concept 07: Types of Errors - Part I
- Concept 08: Quiz: Types of Errors - Part I
- Concept 09: Types of Errors - Part II
- Concept 10: Quiz: Types of Errors - Part II(a)
- Concept 11: Quiz: Types of Errors - Part II(b)
- Concept 12: Types of Errors - Part III
- Concept 13: Quiz: Types of Errors - Part III
- Concept 14: Common Types of Hypothesis Tests
- Concept 15: Quiz: More Hypothesis Testing Practice
- Concept 16: How Do We Choose Between Hypotheses?
- Concept 17: Video: Simulating from the Null
- Concept 18: Notebook + Quiz: Simulating from the Null
- Concept 19: What is a p-value Anyway?
- Concept 20: Video: Calculating the p-value
- Concept 21: Quiz: What is a p-value Anyway?
- Concept 22: Quiz: Calculating a p-value
- Concept 23: Connecting Errors and P-Values
- Concept 24: Conclusions in Hypothesis Testing
- Concept 25: Quiz: Connecting Errors and P-Values
- Concept 26: Notebook + Quiz: Drawing Conclusions
- Concept 27: Other Things to Consider - Impact of Large Sample Size
- Concept 28: Other Things to Consider - What If We Test More Than Once?
- Concept 29: Other Things to Consider - How Do CIs and HTs Compare?
- Concept 30: Notebook + Quiz: Impact of Sample Size
- Concept 31: Notebook + Quiz: Multiple Tests
- Concept 32: Hypothesis Testing Conclusion
- Concept 33: Quiz + Text: Recap
-
Lesson 13: Case Study: A/B tests
Work through a case study of how A/B testing works for an online education company called Audacity.
- Concept 01: Introduction
- Concept 02: A/B Testing
- Concept 03: A/B Testing
- Concept 04: Business Example
- Concept 05: Experiment I
- Concept 06: Quiz: Experiment I
- Concept 07: Metric - Click Through Rate
- Concept 08: Click Through Rate
- Concept 09: Experiment II
- Concept 10: Metric - Enrollment Rate
- Concept 11: Metric - Average Reading Duration
- Concept 12: Metric - Average Classroom Time
- Concept 13: Metric - Completion Rate
- Concept 14: Analyzing Multiple Metrics
- Concept 15: Quiz: Analyzing Multiple Metrics
- Concept 16: Drawing Conclusions
- Concept 17: Quiz: Difficulties in A/B Testing
- Concept 18: Conclusion
-
Lesson 14: Regression
Use Python to fit linear regression models, and understand how to interpret the results of linear models.
- Concept 01: Video: Introduction
- Concept 02: Video: Introduction to Machine Learning
- Concept 03: Quiz: Machine Learning Big Picture
- Concept 04: Video: Introduction to Linear Regression
- Concept 05: Quiz: Linear Regression Language
- Concept 06: Scatter Plots
- Concept 07: Quizzes On Scatter Plots
- Concept 08: Correlation Coefficients
- Concept 09: Correlation Coefficient Quizzes
- Concept 10: Video: What Defines A Line?
- Concept 11: Quiz: What Defines A Line? - Notation Quiz
- Concept 12: Quiz: What Defines A Line? - Line Basics Quiz
- Concept 13: Video: Fitting A Regression Line
- Concept 14: Text: The Regression Closed Form Solution
- Concept 15: Screencast: Fitting A Regression Line in Python
- Concept 16: Video: How to Interpret the Results?
- Concept 17: Video: Does the Line Fit the Data Well?
- Concept 18: Notebook + Quiz: How to Interpret the Results
- Concept 19: Notebook + Quiz: Regression - Your Turn - Part I
- Concept 20: Notebook + Quiz: Your Turn - Part II
- Concept 21: Video: Recap
- Concept 22: Text: Recap + Next Steps
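The screencasts fit the regression line in Python; one common way to do that (a sketch under the assumption that statsmodels is the library of choice, with invented housing data) looks like this:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Invented data: house area (sq ft) and price
np.random.seed(0)
area = np.random.uniform(1000, 3000, size=100)
price = 50 * area + 20000 + np.random.normal(0, 10000, size=100)
df = pd.DataFrame({'area': area, 'price': price})

# Add an intercept column, then fit ordinary least squares
X = sm.add_constant(df['area'])
model = sm.OLS(df['price'], X).fit()

print(model.params)       # intercept and slope
print(model.rsquared)     # how well the line fits the data
```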
-
Lesson 15: Multiple Linear Regression
Learn to apply multiple linear regression models in Python. Learn to interpret the results and understand whether your model fits well.
- Concept 01: Video: Introduction
- Concept 02: Video: Multiple Linear Regression
- Concept 03: Screencast: Fitting A Multiple Linear Regression Model
- Concept 04: Notebook + Quiz: Fitting A MLR Model
- Concept 05: Screencast + Text: How Does MLR Work?
- Concept 06: Video: Multiple Linear Regression Model Results
- Concept 07: Quiz: Interpreting Coefficients in MLR
- Concept 08: Video: Dummy Variables
- Concept 09: Text: Dummy Variables
- Concept 10: Dummy Variables
- Concept 11: Screencast: Dummy Variables
- Concept 12: Notebook + Quiz: Dummy Variables
- Concept 13: Video: Dummy Variables Recap
- Concept 14: [Optional] Notebook + Quiz: Other Encodings
- Concept 15: Video: Potential Problems
- Concept 16: [Optional] Text: Linear Model Assumptions
- Concept 17: Screencast: Multicollinearity & VIFs
- Concept 18: Video: Multicollinearity & VIFs
- Concept 19: Notebook + Quiz: Multicollinearity & VIFs
- Concept 20: Video: Higher Order Terms
- Concept 21: Text: Higher Order Terms
- Concept 22: Screencast: How to Add Higher Order Terms
- Concept 23: Video: Interpreting Interactions
- Concept 24: Text: Interpreting Interactions
- Concept 25: Notebook + Quiz: Interpreting Model Coefficients
- Concept 26: Video: Recap
- Concept 27: Text: Recap
-
Lesson 16: Logistic Regression
Learn to apply logistic regression models in Python. Learn to interpret the results and understand whether your model fits well.
- Concept 01: Video: Introduction
- Concept 02: Video: Fitting Logistic Regression
- Concept 03: Quiz: Logistic Regression Quick Check
- Concept 04: Video: Fitting Logistic Regression in Python
- Concept 05: Notebook + Quiz: Fitting Logistic Regression in Python
- Concept 06: Video: Interpreting Results - Part I
- Concept 07: Video (ScreenCast): Interpret Results - Part II
- Concept 08: Notebook + Quiz: Interpret Results
- Concept 09: Video: Model Diagnostics + Performance Metrics
- Concept 10: Confusion Matrices
- Concept 11: Confusion Matrix Practice 1
- Concept 12: Confusion Matrix Practice 2
- Concept 13: Filling in a Confusion Matrix
- Concept 14: Confusion Matrix: False Alarms
- Concept 15: Confusion Matrix for Eigenfaces
- Concept 16: How Many Schroeders
- Concept 17: How Many Schroeder Predictions
- Concept 18: Classifying Chavez Correctly 1
- Concept 19: Classifying Chavez Correctly 2
- Concept 20: Precision and Recall
- Concept 21: Powell Precision and Recall
- Concept 22: Bush Precision and Recall
- Concept 23: True Positives in Eigenfaces
- Concept 24: False Positives in Eigenfaces
- Concept 25: False Negatives in Eigenfaces
- Concept 26: Practicing TP, FP, FN with Rumsfeld
- Concept 27: Equation for Precision
- Concept 28: Equation for Recall
- Concept 29: Screencast: Model Diagnostics in Python - Part I
- Concept 30: Notebook + Quiz: Model Diagnostics
- Concept 31: Video: Final Thoughts On Shifting to Machine Learning
- Concept 32: Text: Recap
- Concept 33: Video: Congratulations
-
Part 07 : Numpy, Pandas, Matplotlib
Let's focus on library packages for Python, such as NumPy (which adds support for large, multi-dimensional arrays and numerical computation), Pandas (which is used for data manipulation and analysis), and Matplotlib (which is used for data visualization).
-
Module 01: Lessons
-
Lesson 01: Anaconda
Anaconda is a package and environment manager built specifically for data. Learn how to use Anaconda to improve your data analysis workflow.
-
Lesson 02: Jupyter Notebooks
Learn how to use Jupyter Notebooks to create documents combining code, text, images, and more.
- Concept 01: Instructor
- Concept 02: What are Jupyter notebooks?
- Concept 03: Installing Jupyter Notebook
- Concept 04: Launching the notebook server
- Concept 05: Notebook interface
- Concept 06: Code cells
- Concept 07: Markdown cells
- Concept 08: Keyboard shortcuts
- Concept 09: Magic keywords
- Concept 10: Converting notebooks
- Concept 11: Creating a slideshow
- Concept 12: Finishing up
-
Lesson 03: NumPy
Learn the basics of NumPy and how to use it to create and manipulate arrays.
- Concept 01: Instructors
- Concept 02: Introduction to NumPy
- Concept 03: Why Use NumPy?
- Concept 04: Creating and Saving NumPy ndarrays
- Concept 05: Using Built-in Functions to Create ndarrays
- Concept 06: Create an ndarray
- Concept 07: Accessing, Deleting, and Inserting Elements Into ndarrays
- Concept 08: Slicing ndarrays
- Concept 09: Boolean Indexing, Set Operations, and Sorting
- Concept 10: Manipulating ndarrays
- Concept 11: Arithmetic operations and Broadcasting
- Concept 12: Creating ndarrays with Broadcasting
- Concept 13: Getting Set Up for the Mini-Project
- Concept 14: Mini-Project: Mean Normalization and Data Separation
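A compact sketch of the ndarray operations listed above (creation with built-in functions, slicing, boolean indexing, broadcasting, and simple statistics), with throwaway values:

```python
import numpy as np

# Creating ndarrays with built-in functions
x = np.arange(1, 13).reshape(3, 4)      # 3x4 array of 1..12
zeros = np.zeros((2, 2))

# Slicing and boolean indexing
print(x[:2, 1:3])                       # rows 0-1, columns 1-2
print(x[x > 8])                         # [ 9 10 11 12]

# Element-wise arithmetic and broadcasting
print(x * 2)                            # every element doubled
print(x + np.array([10, 20, 30, 40]))   # row vector broadcast across all rows

# Simple statistics
print(x.mean(), x.mean(axis=0))
```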
-
Lesson 04: Pandas
Learn the basics of Pandas Series and DataFrames and how to use them to load and process data.
- Concept 01: Instructors
- Concept 02: Introduction to pandas
- Concept 03: Why Use pandas?
- Concept 04: Creating pandas Series
- Concept 05: Accessing and Deleting Elements in pandas Series
- Concept 06: Arithmetic Operations on pandas Series
- Concept 07: Manipulate a Series
- Concept 08: Creating pandas DataFrames
- Concept 09: Accessing Elements in pandas DataFrames
- Concept 10: Dealing with NaN
- Concept 11: Manipulate a DataFrame
- Concept 12: Loading Data into a pandas DataFrame
- Concept 13: Getting Set Up for the Mini-Project
- Concept 14: Mini-Project: Statistics From Stock Data
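And a matching sketch for pandas Series and DataFrames (toy data; the course mini-project loads real stock data instead, and the CSV filename at the end is purely illustrative):

```python
import pandas as pd
import numpy as np

# A Series with custom index labels
groceries = pd.Series(data=[30, 6, 'Yes'], index=['eggs', 'apples', 'milk'])
print(groceries['eggs'])

# A DataFrame from a dictionary, including a missing value
items = pd.DataFrame({'bikes': [20, 15], 'watches': [35, np.nan]},
                     index=['store 1', 'store 2'])
print(items)

# Dealing with NaN and basic statistics
print(items.isnull().sum())     # missing values per column
print(items.fillna(0))          # replace NaN with 0
print(items.describe())         # summary statistics

# Loading data from a CSV would look like (illustrative filename):
# df = pd.read_csv('GOOG.csv')
```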
-
Lesson 05: Matplotlib and Seaborn Part 1
Learn how to use matplotlib and seaborn to visualize your data. In this lesson, you will learn how to create visualizations to depict the distributions of single variables.
- Concept 01: Instructor
- Concept 02: Introduction
- Concept 03: Tidy Data
- Concept 04: Bar Charts
- Concept 05: Absolute vs. Relative Frequency
- Concept 06: Counting Missing Data
- Concept 07: Bar Chart Practice
- Concept 08: Pie Charts
- Concept 09: Histograms
- Concept 10: Histogram Practice
- Concept 11: Figures, Axes, and Subplots
- Concept 12: Choosing a Plot for Discrete Data
- Concept 13: Descriptive Statistics, Outliers and Axis Limits
- Concept 14: Scales and Transformations
- Concept 15: Scales and Transformations Practice
- Concept 16: Lesson Summary
- Concept 17: Extra: Kernel Density Estimation
-
Lesson 06: Matplotlib and Seaborn Part 2
In this lesson, you will use matplotlib and seaborn to create visualizations to depict the relationships between two variables.
- Concept 01: Introduction
- Concept 02: Scatterplots and Correlation
- Concept 03: Overplotting, Transparency, and Jitter
- Concept 04: Heat Maps
- Concept 05: Scatterplot Practice
- Concept 06: Violin Plots
- Concept 07: Box Plots
- Concept 08: Violin and Box Plot Practice
- Concept 09: Clustered Bar Charts
- Concept 10: Categorical Plot Practice
- Concept 11: Faceting
- Concept 12: Adaptation of Univariate Plots
- Concept 13: Line Plots
- Concept 14: Additional Plot Practice
- Concept 15: Lesson Summary
- Concept 16: Postscript: Multivariate Visualization
- Concept 17: Extra: Swarm Plots
- Concept 18: Extra: Rug and Strip Plots
- Concept 19: Extra: Stacked Plots
-
Part 08 : Linear Algebra Essentials
Learn the basics of the beautiful world of Linear Algebra and why it is such an important mathematical tool in the world of AI.
-
Module 01: Lessons
-
Lesson 01: Introduction
Take a sneak peek into the beautiful world of Linear Algebra and learn why it is such an important mathematical tool.
-
Lesson 02: Vectors
Learn about vectors, the basic building block of Linear Algebra.
- Concept 01: What's a Vector?
- Concept 02: Vectors, what even are they? Part 2
- Concept 03: Vectors, what even are they? Part 3
- Concept 04: Vectors - Mathematical Definition
- Concept 05: Transpose
- Concept 06: Magnitude and Direction
- Concept 07: Vectors - Quiz 1
- Concept 08: Operations in the Field
- Concept 09: Vector Addition
- Concept 10: Vectors - Quiz 2
- Concept 11: Scalar by Vector Multiplication
- Concept 12: Vectors - Quiz 3
- Concept 13: Vectors - Quiz Answers
-
Lesson 03: Linear Combination
Learn how to scale and add vectors and how to visualize the process.
- Concept 01: Linear Combination. Part 1
- Concept 02: Linear Combination. Part 2
- Concept 03: Linear Combination and Span
- Concept 04: Linear Combination - Quiz 1
- Concept 05: Linear Dependency
- Concept 06: Solving a Simplified Set of Equations
- Concept 07: Linear Combination - Quiz 2
- Concept 08: Linear Combination - Quiz 3
-
Lesson 04: Linear Transformation and Matrices
What is a linear transformation and how is it directly related to matrices? Learn how to apply the math and visualize the concept.
- Concept 01: What is a Matrix?
- Concept 02: Matrix Addition
- Concept 03: Matrix Addition Quiz
- Concept 04: Scalar Multiplication of Matrix and Quiz
- Concept 05: Multiplication of Square Matrices
- Concept 06: Square Matrix Multiplication Quiz
- Concept 07: Matrix Multiplication - General
- Concept 08: Matrix Multiplication Quiz
- Concept 09: Linear Transformation and Matrices. Part 1
- Concept 10: Linear Transformation and Matrices. Part 2
- Concept 11: Linear Transformation and Matrices. Part 3
- Concept 12: Linear Transformation Quiz Answers
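A short NumPy sketch of the vector and matrix operations from these lessons (the specific numbers are arbitrary):

```python
import numpy as np

# Vectors: addition and scalar multiplication
v = np.array([1, 2])
w = np.array([3, -1])
print(v + w)            # [4 1]
print(3 * v)            # [3 6]

# A linear combination of v and w
print(2 * v + 1 * w)    # [5 3]

# Matrices: addition, scalar multiplication, and matrix multiplication
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
print(A + B)
print(A @ B)            # matrix product
print(A @ v)            # the linear transformation A applied to vector v
```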
-
-
Module 02: Labs
-
Lesson 01: Vectors Lab
Learn how to graph 2D vectors.
-
Lesson 02: Linear Combination Lab
Learn how to computationally determine a vector's span and solve a simple system of equations.
-
Lesson 03: Linear Mapping Lab
Learn how to solve some problems computationally using vectors and matrices.
-
Lesson 04: Linear Algebra in Neural Networks
Take a peek into the world of Neural Networks and see how they relate directly to Linear Algebra!
-
Part 09 : Calculus Essentials
Welcome to Calculus Essentials.
-
Module 01: Lessons
-
Lesson 01: Calculus
- Concept 01: Our Goal
- Concept 02: Instructor
- Concept 03: Introduction Video
- Concept 04: Derivatives
- Concept 05: Derivatives Through Geometry
- Concept 06: The Chain Rule
- Concept 07: Derivatives of exponentials
- Concept 08: Implicit Differentiation
- Concept 09: Limits
- Concept 10: Integrals
- Concept 11: More on Integrals
- Concept 12: The Taylor Series (optional)
- Concept 13: Multivariable Chain Rule
-
-
Module 02: Calculus in Neural Networks
-
Part 10 : Supervised Learning
Learn to build supervised machine learning models to make data-informed decisions. Learn to evaluate and validate the quality of your models.
-
Module 01: Supervised Learning
-
Lesson 01: Machine Learning Bird's Eye View
Before diving into the many algorithms of machine learning, it is important to take a step back and understand the big picture associated with the entire field.
- Concept 01: Introduction
- Concept 02: History - A Statistician's Perspective
- Concept 03: History - A Computer Scientist's Perspective
- Concept 04: Types of Machine Learning - Supervised
- Concept 05: Types of Machine Learning - Unsupervised & Reinforcement
- Concept 06: Deep Learning
- Concept 07: Scikit Learn
- Concept 08: Ethics in Machine Learning
- Concept 09: What's Ahead
- Concept 10: Text: Recap
-
Lesson 02: Linear Regression
Linear regression is one of the most fundamental algorithms in machine learning. In this lesson, learn how linear regression works!
- Concept 01: Intro
- Concept 02: Quiz: Housing Prices
- Concept 03: Solution: Housing Prices
- Concept 04: Fitting a Line Through Data
- Concept 05: Moving a Line
- Concept 06: Absolute Trick
- Concept 07: Square Trick
- Concept 08: Quiz: Absolute and Square Trick
- Concept 09: Gradient Descent
- Concept 10: Mean Absolute Error
- Concept 11: Mean Squared Error
- Concept 12: Quiz: Mean Absolute & Squared Errors
- Concept 13: Minimizing Error Functions
- Concept 14: Mean vs Total Error
- Concept 15: Mini-batch Gradient Descent
- Concept 16: Quiz: Mini-Batch Gradient Descent
- Concept 17: Absolute Error vs Squared Error
- Concept 18: Linear Regression in scikit-learn
- Concept 19: Higher Dimensions
- Concept 20: Multiple Linear Regression
- Concept 21: Closed Form Solution
- Concept 22: (Optional) Closed form Solution Math
- Concept 23: Linear Regression Warnings
- Concept 24: Polynomial Regression
- Concept 25: Quiz: Polynomial Regression
- Concept 26: Regularization
- Concept 27: Quiz: Regularization
- Concept 28: Feature Scaling
- Concept 29: Outro
-
Lesson 03: Perceptron Algorithm
The perceptron algorithm is an algorithm for classifying data. It is the building block of neural networks.
- Concept 01: Intro
- Concept 02: Classification Problems 1
- Concept 03: Classification Problems 2
- Concept 04: Linear Boundaries
- Concept 05: Higher Dimensions
- Concept 06: Perceptrons
- Concept 07: Perceptrons as Logical Operators
- Concept 08: Perceptron Trick
- Concept 09: Perceptron Algorithm
- Concept 10: Outro
-
Lesson 04: Decision Trees
Decision trees are a structure for decision-making where each decision leads to a set of consequences or additional decisions.
- Concept 01: Intro
- Concept 02: Recommending Apps 1
- Concept 03: Recommending Apps 2
- Concept 04: Recommending Apps 3
- Concept 05: Quiz: Student Admissions
- Concept 06: Solution: Student Admissions
- Concept 07: Entropy
- Concept 08: Entropy Formula 1
- Concept 09: Entropy Formula 2
- Concept 10: Entropy Formula 3
- Concept 11: Quiz: Do You Know Your Entropy?
- Concept 12: Multiclass Entropy
- Concept 13: Quiz: Information Gain
- Concept 14: Solution: Information Gain
- Concept 15: Maximizing Information Gain
- Concept 16: Calculating Information Gain on a Dataset
- Concept 17: Hyperparameters
- Concept 18: Decision Trees in sklearn
- Concept 19: Titanic Survival Model with Decision Trees
- Concept 20: [Solution] Titanic Survival Model
- Concept 21: Outro
-
Lesson 05: Naive Bayes
Naive Bayes algorithms are powerful tools for building classifiers from labeled data. Specifically, Naive Bayes is frequently used with text data and classification problems.
- Concept 01: Intro
- Concept 02: Guess the Person
- Concept 03: Known and Inferred
- Concept 04: Guess the Person Now
- Concept 05: Bayes Theorem
- Concept 06: Quiz: False Positives
- Concept 07: Solution: False Positives
- Concept 08: Bayesian Learning 1
- Concept 09: Bayesian Learning 2
- Concept 10: Bayesian Learning 3
- Concept 11: Naive Bayes Algorithm 1
- Concept 12: Naive Bayes Algorithm 2
- Concept 13: Quiz: Bayes Rule
- Concept 14: Building a Spam Classifier
- Concept 15: Spam Classifier - Workspace
- Concept 16: Outro
-
Lesson 06: Support Vector Machines
Support vector machines are a common method used for classification problems. They have been proven effective using what is known as the 'kernel' trick!
- Concept 01: Intro
- Concept 02: Which line is better?
- Concept 03: Minimizing Distances
- Concept 04: Error Function Intuition
- Concept 05: Perceptron Algorithm
- Concept 06: Classification Error
- Concept 07: Margin Error
- Concept 08: (Optional) Margin Error Calculation
- Concept 09: Error Function
- Concept 10: The C Parameter
- Concept 11: Polynomial Kernel 1
- Concept 12: Polynomial Kernel 2
- Concept 13: Polynomial Kernel 3
- Concept 14: RBF Kernel 1
- Concept 15: RBF Kernel 2
- Concept 16: RBF Kernel 3
- Concept 17: SVMs in sklearn
- Concept 18: Recap & Additional Resources
-
Lesson 07: Ensemble Methods
Bagging and boosting are two common ensemble methods for combining simple algorithms to make more advanced models that work better than the simple algorithms would on their own.
- Concept 01: Intro
- Concept 02: Ensembles
- Concept 03: Random Forests
- Concept 04: Bagging
- Concept 05: AdaBoost
- Concept 06: Weighting the Data
- Concept 07: Weighting the Models 1
- Concept 08: Weighting the Models 2
- Concept 09: Weighting the Models 3
- Concept 10: Combining the Models
- Concept 11: AdaBoost in sklearn
- Concept 12: More Spam Classifying
- Concept 13: Recap & Additional Resources
-
Lesson 08: Model Evaluation Metrics
Learn the main metrics to evaluate models, such as accuracy, precision, recall, and more!
- Concept 01: Intro
- Concept 02: Outline
- Concept 03: Testing your models
- Concept 04: Confusion Matrix
- Concept 05: Confusion Matrix 2
- Concept 06: Accuracy
- Concept 07: Accuracy 2
- Concept 08: When accuracy won't work
- Concept 09: False Negatives and Positives
- Concept 10: Precision and Recall
- Concept 11: Precision
- Concept 12: Recall
- Concept 13: F1 Score
- Concept 14: F-beta Score
- Concept 15: ROC Curve
- Concept 16: Sklearn Practice (Classification)
- Concept 17: Regression Metrics
- Concept 18: Sklearn Practice (Regression)
- Concept 19: Text: Recap
- Concept 20: Summary
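A hedged sketch of computing these metrics with scikit-learn on a tiny set of made-up labels and predictions:

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, fbeta_score, confusion_matrix)

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

print(confusion_matrix(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("F1:       ", f1_score(y_true, y_pred))
print("F0.5:     ", fbeta_score(y_true, y_pred, beta=0.5))
```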
-
Lesson 09: Training and Tuning
Learn the main types of errors that can occur during training, and several methods to deal with them and optimize your machine learning models.
- Concept 01: Types of Errors
- Concept 02: Model Complexity Graph
- Concept 03: Cross Validation
- Concept 04: K-Fold Cross Validation
- Concept 05: Learning Curves
- Concept 06: Detecting Overfitting and Underfitting with Learning Curves
- Concept 07: Solution: Detecting Overfitting and Underfitting
- Concept 08: Grid Search
- Concept 09: Grid Search in sklearn
- Concept 10: Grid Search Lab
- Concept 11: [Solution] Grid Search Lab
- Concept 12: Putting It All Together
- Concept 13: Outro
-
Lesson 10: Finding Donors Project
You've covered a wide variety of methods for performing supervised learning -- now it's time to put those into action!
-
Part 11 : Deep Learning
Gain a solid foundation in neural networks, deep learning, and PyTorch.
-
Module 01: Deep Learning
-
Lesson 01: Introduction to Neural Networks
In this lesson, Luis will give you a solid foundation in deep learning and neural networks. You'll also implement gradient descent and backpropagation in Python right here in the classroom.
- Concept 01: Instructor
- Concept 02: Introduction
- Concept 03: Classification Problems 1
- Concept 04: Classification Problems 2
- Concept 05: Linear Boundaries
- Concept 06: Higher Dimensions
- Concept 07: Perceptrons
- Concept 08: Why "Neural Networks"?
- Concept 09: Perceptrons as Logical Operators
- Concept 10: Perceptron Trick
- Concept 11: Perceptron Algorithm
- Concept 12: Non-Linear Regions
- Concept 13: Error Functions
- Concept 14: Log-loss Error Function
- Concept 15: Discrete vs Continuous
- Concept 16: Softmax
- Concept 17: One-Hot Encoding
- Concept 18: Maximum Likelihood
- Concept 19: Maximizing Probabilities
- Concept 20: Cross-Entropy 1
- Concept 21: Cross-Entropy 2
- Concept 22: Multi-Class Cross Entropy
- Concept 23: Logistic Regression
- Concept 24: Gradient Descent
- Concept 25: Logistic Regression Algorithm
- Concept 26: Pre-Lab: Gradient Descent
- Concept 27: Notebook: Gradient Descent
- Concept 28: Perceptron vs Gradient Descent
- Concept 29: Continuous Perceptrons
- Concept 30: Non-linear Data
- Concept 31: Non-Linear Models
- Concept 32: Neural Network Architecture
- Concept 33: Feedforward
- Concept 34: Backpropagation
- Concept 35: Pre-Lab: Analyzing Student Data
- Concept 36: Notebook: Analyzing Student Data
- Concept 37: Outro
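Two of the building blocks in this lesson, softmax and multi-class cross-entropy, in a minimal NumPy sketch (the scores and labels are made up):

    import numpy as np

    def softmax(scores):
        exp = np.exp(scores - np.max(scores))   # subtract max for numerical stability
        return exp / exp.sum()

    def cross_entropy(y, p):
        """y: one-hot labels, p: predicted probabilities."""
        return -np.sum(y * np.log(p))

    scores = np.array([2.0, 1.0, 0.1])
    probs = softmax(scores)
    print("Probabilities:", probs.round(3))
    print("Cross-entropy:", cross_entropy(np.array([1, 0, 0]), probs))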
-
Lesson 02: Implementing Gradient Descent
Mat will introduce you to a different error function and guide you through implementing gradient descent using numpy matrix multiplication.
- Concept 01: Mean Squared Error Function
- Concept 02: Gradient Descent
- Concept 03: Gradient Descent: The Math
- Concept 04: Gradient Descent: The Code
- Concept 05: Implementing Gradient Descent
- Concept 06: Multilayer Perceptrons
- Concept 07: Backpropagation
- Concept 08: Implementing Backpropagation
- Concept 09: Further Reading
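A minimal NumPy sketch of a single gradient descent weight update for a sigmoid unit with squared error, in the spirit of this lesson (the input values and learning rate are made up):

    import numpy as np

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    x = np.array([0.5, -0.2, 0.1])       # single input record (illustrative values)
    y = 1.0                              # target
    weights = np.array([0.1, 0.2, -0.1])
    learnrate = 0.5

    output = sigmoid(np.dot(x, weights))          # forward pass
    error = y - output                            # prediction error
    error_term = error * output * (1 - output)    # error times derivative of sigmoid
    weights += learnrate * error_term * x         # one gradient descent step

    print("Updated weights:", weights)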
-
Lesson 03: Training Neural Networks
Now that you know what neural networks are, in this lesson you will learn several techniques to improve their training.
- Concept 01: Instructor
- Concept 02: Training Optimization
- Concept 03: Testing
- Concept 04: Overfitting and Underfitting
- Concept 05: Early Stopping
- Concept 06: Regularization
- Concept 07: Regularization 2
- Concept 08: Dropout
- Concept 09: Local Minima
- Concept 10: Random Restart
- Concept 11: Vanishing Gradient
- Concept 12: Other Activation Functions
- Concept 13: Batch vs Stochastic Gradient Descent
- Concept 14: Learning Rate Decay
- Concept 15: Momentum
- Concept 16: Error Functions Around the World
-
Lesson 04: Keras
In this section you'll get a hands-on introduction to Keras. You'll learn to apply it to analyze movie reviews.
-
Lesson 05: Deep Learning with PyTorch (updated version)
Learn how to use PyTorch for building deep learning models.
- Concept 01: This is a New Updated Version of this Lesson
- Concept 02: Welcome
- Concept 03: Pre-Notebook
- Concept 04: Notebook Workspace
- Concept 05: Single layer neural networks
- Concept 06: Single layer neural networks solution
- Concept 07: Networks Using Matrix Multiplication
- Concept 08: Multilayer Networks Solution
- Concept 09: Neural Networks in PyTorch
- Concept 10: Neural Networks Solution
- Concept 11: Implementing Softmax Solution
- Concept 12: Network Architectures in PyTorch
- Concept 13: Network Architectures Solution
- Concept 14: Training a Network Solution
- Concept 15: Classifying Fashion-MNIST
- Concept 16: Fashion-MNIST Solution
- Concept 17: Inference and Validation
- Concept 18: Validation Solution
- Concept 19: Dropout Solution
- Concept 20: Saving and Loading Models
- Concept 21: Loading Image Data
- Concept 22: Loading Image Data Solution
- Concept 23: Pre-Notebook with GPU
- Concept 24: Notebook Workspace w/ GPU
- Concept 25: Transfer Learning
- Concept 26: Transfer Learning Solution
- Concept 27: Tips, Tricks, and Other Notes
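A minimal sketch of defining and training a small fully connected classifier in PyTorch; random tensors stand in for Fashion-MNIST, and the layer sizes are illustrative:

    import torch
    from torch import nn, optim

    model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(),
                          nn.Linear(128, 64), nn.ReLU(),
                          nn.Linear(64, 10))

    criterion = nn.CrossEntropyLoss()
    optimizer = optim.SGD(model.parameters(), lr=0.01)

    images = torch.randn(32, 784)            # fake batch of flattened 28x28 images
    labels = torch.randint(0, 10, (32,))     # fake class labels

    optimizer.zero_grad()
    loss = criterion(model(images), labels)  # forward pass and loss
    loss.backward()                          # backpropagation
    optimizer.step()                         # one gradient descent update
    print("Batch loss:", loss.item())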
-
Lesson 06: Deep Learning with PyTorch
Learn how to use PyTorch for building deep learning models.
- Concept 01: Instructor
- Concept 02: Introducing PyTorch
- Concept 03: PyTorch Tensors
- Concept 04: Defining Networks
- Concept 05: Training Networks
- Concept 06: Fashion-MNIST Exercise
- Concept 07: Inference & Validation
- Concept 08: Saving and Loading Trained Networks
- Concept 09: Loading Data Sets with Torchvision
- Concept 10: Transfer Learning
- Concept 11: Transfer Learning Solution
-
Lesson 07: Image Classifier Project
In this project, you'll build a Python application that can train an image classifier on a dataset, then predict new images using the trained model.
-
Part 12 : Unsupervised Learning
Learn to build unsupervised machine learning models, and use essential data processing techniques like scaling and PCA.
-
Module 01: Unsupervised Learning
-
Lesson 01: Clustering
Clustering is one of the most common methods of unsupervised learning. Here, we'll discuss the K-means clustering algorithm.
- Concept 01: Video: Introduction
- Concept 02: Text: Course Outline
- Concept 03: Video: Two Types of Unsupervised Learning
- Concept 04: Video: K-Means Use Cases
- Concept 05: Video: K-Means
- Concept 06: Quiz: Identifying Clusters
- Concept 07: Video: Changing K
- Concept 08: Video: Elbow Method
- Concept 09: Screencast: K-Means in Scikit Learn
- Concept 10: Notebook: Your Turn
- Concept 11: Screencast: Solution
- Concept 12: Video: How Does K-Means Work?
- Concept 13: Screencast + Text: How Does K-Means Work?
- Concept 14: How Does K-Means Work?
- Concept 15: Video: Is that the Optimal Solution?
- Concept 16: Video: Feature Scaling
- Concept 17: Video: Feature Scaling Example
- Concept 18: Notebook: Feature Scaling Example
- Concept 19: Notebook: Feature Scaling
- Concept 20: Screencast: Solution
- Concept 21: Video: Outro
- Concept 22: Text: Recap
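A minimal sketch of K-means and the elbow method with scikit-learn, including the feature scaling step this lesson emphasizes (synthetic data):

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.preprocessing import StandardScaler

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
    X = StandardScaler().fit_transform(X)    # scaling matters for distance-based methods

    for k in range(1, 7):
        model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(k, round(model.inertia_, 1))   # look for the "elbow" in these scores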
-
Lesson 02: Hierarchical and Density Based Clustering
We continue to look at clustering methods. Here, we'll discuss hierarchical clustering and density-based clustering (DBSCAN).
- Concept 01: K-means considerations
- Concept 02: Overview of other clustering methods
- Concept 03: Hierarchical clustering: single-link
- Concept 04: Examining single-link clustering
- Concept 05: Complete-link, average-link, Ward
- Concept 06: Hierarchical clustering implementation
- Concept 07: [Lab] Hierarchical clustering
- Concept 08: [Lab Solution] Hierarchical Clustering
- Concept 09: HC examples and applications
- Concept 10: [Quiz] Hierarchical clustering
- Concept 11: DBSCAN
- Concept 12: DBSCAN implementation
- Concept 13: [Lab] DBSCAN
- Concept 14: [Lab Solution] DBSCAN
- Concept 15: DBSCAN examples & applications
- Concept 16: [Quiz] DBSCAN
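A minimal sketch of complete-link hierarchical clustering and DBSCAN with scikit-learn (eps and min_samples are illustrative):

    from sklearn.cluster import AgglomerativeClustering, DBSCAN
    from sklearn.datasets import make_moons

    X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

    hier = AgglomerativeClustering(n_clusters=2, linkage="complete").fit(X)
    dbscan = DBSCAN(eps=0.2, min_samples=5).fit(X)

    print("Hierarchical labels:", set(hier.labels_))
    print("DBSCAN labels      :", set(dbscan.labels_))  # -1 marks points treated as noise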
-
Lesson 03: Gaussian Mixture Models and Cluster Validation
In this lesson, we discuss Gaussian mixture model clustering. We then talk about the cluster analysis process and how to validate clustering results.
- Concept 01: Intro
- Concept 02: Gaussian Mixture Model (GMM) Clustering
- Concept 03: Gaussian Distribution in One Dimension
- Concept 04: GMM Clustering in One Dimension
- Concept 05: Gaussian Distribution in 2D
- Concept 06: GMM in 2D
- Concept 07: Quiz: Gaussian Mixtures
- Concept 08: Overview of The Expectation Maximization (EM) Algorithm
- Concept 09: Expectation Maximization Part 1
- Concept 10: Expectation Maximization Part 2
- Concept 11: Visual Example of EM Progress
- Concept 12: Expectation Maximization
- Concept 13: GMM Implementation
- Concept 14: GMM Examples & Applications
- Concept 15: Cluster Analysis Process
- Concept 16: Cluster Validation
- Concept 17: External Validation Indices
- Concept 18: Quiz: Adjusted Rand Index
- Concept 19: Internal Validation Indices
- Concept 20: Silhouette Coefficient
- Concept 21: GMM & Cluster Validation Lab
- Concept 22: GMM & Cluster Validation Lab Solution
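A minimal sketch of GMM clustering with scikit-learn plus one internal validation index, the silhouette coefficient (synthetic data):

    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score
    from sklearn.mixture import GaussianMixture

    X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

    gmm = GaussianMixture(n_components=3, random_state=0)   # EM runs under the hood
    labels = gmm.fit_predict(X)

    print("Silhouette coefficient:", silhouette_score(X, labels))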
-
Lesson 04: Dimensionality Reduction and PCA
Often we need to reduce a large number of features in our data to a smaller, more relevant set. Principal Component Analysis, or PCA, is a method of feature extraction and dimensionality reduction.
- Concept 01: Video: Introduction
- Concept 02: Video: Lesson Topics
- Concept 03: Text: Lesson Topics
- Concept 04: Video: Latent Features
- Concept 05: Latent Features
- Concept 06: Video: How to Reduce Features?
- Concept 07: Video: Dimensionality Reduction
- Concept 08: Video: PCA Properties
- Concept 09: Quiz: How Does PCA Work?
- Concept 10: Screencast: PCA
- Concept 11: Notebook: PCA - Your Turn
- Concept 12: Screencast: PCA Solution
- Concept 13: Screencast: Interpret PCA Results
- Concept 14: Notebook: Interpretation
- Concept 15: Screencast: Interpretation Solution
- Concept 16: Text: What Are EigenValues & EigenVectors?
- Concept 17: Video: When to Use PCA?
- Concept 18: Video: Recap
- Concept 19: Notebook: Mini-Project
- Concept 20: Mini-Project Solution
- Concept 21: Video: Outro
- Concept 22: Text: Recap
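A minimal sketch of PCA in scikit-learn, inspecting explained variance and component makeup (the Iris data is just a convenient stand-in):

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    X = StandardScaler().fit_transform(load_iris().data)

    pca = PCA(n_components=2)
    X_reduced = pca.fit_transform(X)

    print("Explained variance ratio:", pca.explained_variance_ratio_)
    print("Component makeup:\n", pca.components_)   # weights of each original feature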
-
Lesson 05: Random Projection and ICA
In this lesson, we will look at two other methods for feature extraction and dimensionality reduction: Random Projection and Independent Component Analysis (ICA).
- Concept 01: Random Projection
- Concept 02: Random Projection
- Concept 03: Random Projection in sklearn
- Concept 04: Independent Component Analysis (ICA)
- Concept 05: FastICA Algorithm
- Concept 06: ICA
- Concept 07: ICA in sklearn
- Concept 08: [Lab] Independent Component Analysis
- Concept 09: [Solution] Independent Component Analysis
- Concept 10: ICA Applications
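A minimal sketch of both techniques with scikit-learn: a Gaussian random projection of high-dimensional data, and FastICA separating two mixed toy signals (all data is synthetic):

    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.random_projection import GaussianRandomProjection

    rng = np.random.default_rng(0)

    # Random projection: map 100 features down to a much smaller random subspace.
    X_high = rng.normal(size=(200, 100))
    X_low = GaussianRandomProjection(n_components=10, random_state=0).fit_transform(X_high)
    print("Projected shape:", X_low.shape)

    # ICA: recover independent source signals from their observed mixtures.
    t = np.linspace(0, 8, 500)
    sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]     # two independent signals
    mixed = sources @ np.array([[1.0, 0.5], [0.5, 1.0]])       # observed mixtures
    recovered = FastICA(n_components=2, random_state=0).fit_transform(mixed)
    print("Recovered sources shape:", recovered.shape)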
-
Lesson 06: Project: Identify Customer Segments
In this project, you'll apply your unsupervised learning skills to two demographic datasets to identify segments and clusters in the population, and see how customers of a company map to them.
Project Description - Creating Customer Segments with Arvato
-
Part 13 : Convolutional Neural Networks
Learn how to build convolutional networks and use them to classify images (faces, melanomas, etc.) based on patterns and objects that appear in them. Use these networks to learn data compression and image de-noising.
-
Module 01: Convolutional Neural Networks
-
Lesson 01: Convolutional Neural Networks
Convolutional Neural Networks allow for spatial pattern recognition. Alexis and Cezanne go over how they help us dramatically improve performance in image classification.
- Concept 01: Introducing Alexis
- Concept 02: Applications of CNNs
- Concept 03: Lesson Outline
- Concept 04: MNIST Dataset
- Concept 05: How Computers Interpret Images
- Concept 06: MLP Structure & Class Scores
- Concept 07: Do Your Research
- Concept 08: Loss & Optimization
- Concept 09: Defining a Network in PyTorch
- Concept 10: Training the Network
- Concept 11: Pre-Notebook: MLP Classification, Exercise
- Concept 12: Notebook: MLP Classification, MNIST
- Concept 13: One Solution
- Concept 14: Model Validation
- Concept 15: Validation Loss
- Concept 16: Image Classification Steps
- Concept 17: MLPs vs CNNs
- Concept 18: Local Connectivity
- Concept 19: Filters and the Convolutional Layer
- Concept 20: Filters & Edges
- Concept 21: Frequency in Images
- Concept 22: High-pass Filters
- Concept 23: Quiz: Kernels
- Concept 24: OpenCV & Creating Custom Filters
- Concept 25: Notebook: Finding Edges
- Concept 26: Convolutional Layer
- Concept 27: Convolutional Layers (Part 2)
- Concept 28: Stride and Padding
- Concept 29: Pooling Layers
- Concept 30: Notebook: Layer Visualization
- Concept 31: Capsule Networks
- Concept 32: Increasing Depth
- Concept 33: CNNs for Image Classification
- Concept 34: Convolutional Layers in PyTorch
- Concept 35: Feature Vector
- Concept 36: Pre-Notebook: CNN Classification
- Concept 37: Notebook: CNNs for CIFAR Image Classification
- Concept 38: CIFAR Classification Example
- Concept 39: CNNs in PyTorch
- Concept 40: Image Augmentation
- Concept 41: Augmentation Using Transformations
- Concept 42: Groundbreaking CNN Architectures
- Concept 43: Visualizing CNNs (Part 1)
- Concept 44: Visualizing CNNs (Part 2)
- Concept 45: Summary of CNNs
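A minimal sketch of a small CNN in PyTorch with the pieces this lesson introduces (convolutional layers, pooling, and a flattened feature vector); the architecture is illustrative, not the course solution:

    import torch
    from torch import nn
    import torch.nn.functional as F

    class SimpleCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)   # 3 -> 16 feature maps
            self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)  # 16 -> 32 feature maps
            self.pool = nn.MaxPool2d(2, 2)                            # halves spatial size
            self.fc = nn.Linear(32 * 8 * 8, 10)                       # final class scores

        def forward(self, x):
            x = self.pool(F.relu(self.conv1(x)))
            x = self.pool(F.relu(self.conv2(x)))
            x = x.view(x.size(0), -1)        # flatten into a feature vector
            return self.fc(x)

    print(SimpleCNN()(torch.randn(4, 3, 32, 32)).shape)   # torch.Size([4, 10])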
-
Lesson 02: Cloud Computing
Take advantage of Amazon's GPUs to train your neural network faster. In this lesson, you'll set up an instance on AWS and train a neural network on a GPU.
-
Lesson 03: Transfer Learning
Learn how to apply a pre-trained network to a new problem with transfer learning.
-
Lesson 04: Weight Initialization
In this lesson, you'll learn how to find good initial weights for a neural network. Having good initial weights can place the neural network closer to the optimal solution.
- Concept 01: Weight Initialization
- Concept 02: Constant Weights
- Concept 03: Random Uniform
- Concept 04: General Rule
- Concept 05: Normal Distribution
- Concept 06: Pre-Notebook: Weight Initialization, Normal Distribution
- Concept 07: Notebook: Normal & No Initialization
- Concept 08: Solution and Default Initialization
- Concept 09: Additional Material
-
Lesson 05: Autoencoders
Autoencoders are neural networks used for data compression, image de-noising, and dimensionality reduction. Here, you'll build autoencoders using PyTorch.
- Concept 01: Autoencoders
- Concept 02: A Linear Autoencoder
- Concept 03: Pre-Notebook: Linear Autoencoder
- Concept 04: Notebook: Linear Autoencoder
- Concept 05: Defining & Training an Autoencoder
- Concept 06: A Simple Solution
- Concept 07: Learnable Upsampling
- Concept 08: Transpose Convolutions
- Concept 09: Convolutional Autoencoder
- Concept 10: Pre-Notebook: Convolutional Autoencoder
- Concept 11: Notebook: Convolutional Autoencoder
- Concept 12: Convolutional Solution
- Concept 13: Upsampling & Denoising
- Concept 14: De-noising
- Concept 15: Pre-Notebook: De-noising Autoencoder
- Concept 16: Notebook: De-noising Autoencoder
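A minimal sketch of a linear autoencoder in PyTorch, compressing 784-pixel inputs to a small code and reconstructing them (the sizes and the fake batch are illustrative):

    import torch
    from torch import nn, optim

    class Autoencoder(nn.Module):
        def __init__(self, encoding_dim=32):
            super().__init__()
            self.encoder = nn.Linear(784, encoding_dim)   # compress
            self.decoder = nn.Linear(encoding_dim, 784)   # reconstruct

        def forward(self, x):
            return torch.sigmoid(self.decoder(torch.relu(self.encoder(x))))

    model = Autoencoder()
    criterion = nn.MSELoss()                      # compare reconstruction to the input
    optimizer = optim.Adam(model.parameters(), lr=1e-3)

    images = torch.rand(16, 784)                  # fake batch standing in for MNIST
    loss = criterion(model(images), images)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print("Reconstruction loss:", loss.item())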
-
-
Module 02: Style Transfer
-
Lesson 01: Style Transfer
Learn how to use a pre-trained network to extract content and style features from an image. Implement style transfer with your own images!
- Concept 01: Style Transfer
- Concept 02: Separating Style & Content
- Concept 03: VGG19 & Content Loss
- Concept 04: Gram Matrix
- Concept 05: Style Loss
- Concept 06: Loss Weights
- Concept 07: VGG Features
- Concept 08: Pre-Notebook: Style Transfer
- Concept 09: Notebook: Style Transfer
- Concept 10: Features & Gram Matrix
- Concept 11: Gram Matrix Solution
- Concept 12: Defining the Loss
- Concept 13: Total Loss & Complete Solution
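A minimal sketch of the Gram matrix computation at the heart of the style representation, in PyTorch (the feature map is a random stand-in for a VGG layer output):

    import torch

    def gram_matrix(tensor):
        # tensor shape: (channels, height, width) for a single image's feature map
        channels, h, w = tensor.size()
        features = tensor.view(channels, h * w)       # flatten spatial dimensions
        return features @ features.t()                # channel-by-channel correlations

    features = torch.randn(64, 32, 32)                # hypothetical feature map
    print(gram_matrix(features).shape)                # torch.Size([64, 64])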
-
-
Module 03: Project: Dog-Breed Classifier
-
Lesson 01: Project: Dog-Breed Classifier
In this project, you will learn how to build a pipeline to process real-world, user-supplied images. Given an image of a dog, your algorithm will identify an estimate of the canine’s breed.
-
-
Module 04: Deep Learning for Cancer Detection
-
Lesson 01: Deep Learning for Cancer Detection
In this lesson, Sebastian Thrun teaches us about his groundbreaking work detecting skin cancer with convolutional neural networks.
- Concept 01: Intro
- Concept 02: Skin Cancer
- Concept 03: Survival Probability of Skin Cancer
- Concept 04: Medical Classification
- Concept 05: The data
- Concept 06: Image Challenges
- Concept 07: Quiz: Data Challenges
- Concept 08: Solution: Data Challenges
- Concept 09: Training the Neural Network
- Concept 10: Quiz: Random vs Pre-initialized Weights
- Concept 11: Solution: Random vs Pre-initialized Weight
- Concept 12: Validating the Training
- Concept 13: Quiz: Sensitivity and Specificity
- Concept 14: Solution: Sensitivity and Specificity
- Concept 15: More on Sensitivity and Specificity
- Concept 16: Quiz: Diagnosing Cancer
- Concept 17: Solution: Diagnosing Cancer
- Concept 18: Refresh on ROC Curves
- Concept 19: Quiz: ROC Curve
- Concept 20: Solution: ROC Curve
- Concept 21: Comparing our Results with Doctors
- Concept 22: Visualization
- Concept 23: What is the network looking at?
- Concept 24: Refresh on Confusion Matrices
- Concept 25: Confusion Matrix
- Concept 26: Conclusion
- Concept 27: Useful Resources
- Concept 28: Mini Project Introduction
- Concept 29: Mini Project: Dermatologist AI
-
-
Module 05: Jobs in Deep Learning
-
Lesson 01: Jobs in Deep Learning
To kick off your industry research, learn about real world applications of Deep Learning and common questions about jobs in this field.
-
Lesson 02: Optimize Your GitHub Profile
Other professionals are collaborating on GitHub and growing their network. Submit your profile to ensure it is on par with leaders in your field.
- Concept 01: Prove Your Skills With GitHub
- Concept 02: Introduction
- Concept 03: GitHub profile important items
- Concept 04: Good GitHub repository
- Concept 05: Interview with Art - Part 1
- Concept 06: Identify fixes for example “bad” profile
- Concept 07: Quick Fixes #1
- Concept 08: Quick Fixes #2
- Concept 09: Writing READMEs with Walter
- Concept 10: Interview with Art - Part 2
- Concept 11: Commit messages best practices
- Concept 12: Reflect on your commit messages
- Concept 13: Participating in open source projects
- Concept 14: Interview with Art - Part 3
- Concept 15: Participating in open source projects 2
- Concept 16: Starring interesting repositories
- Concept 17: Next Steps
-
Part 14 : Recurrent Neural Networks
Build your own recurrent networks and long short-term memory networks with PyTorch; perform sentiment analysis and use recurrent networks to generate new text from TV scripts.
-
Module 01: Recurrent Neural Networks
-
Lesson 01: Recurrent Neural Networks
Explore how memory can be incorporated into a deep learning model using recurrent neural networks (RNNs). Learn how RNNs can learn from and generate ordered sequences of data.
- Concept 01: RNN Examples
- Concept 02: RNN Introduction
- Concept 03: RNN History
- Concept 04: RNN Applications
- Concept 05: Feedforward Neural Network-Reminder
- Concept 06: The Feedforward Process
- Concept 07: Feedforward Quiz
- Concept 08: Backpropagation- Theory
- Concept 09: Backpropagation - Example (part a)
- Concept 10: Backpropagation- Example (part b)
- Concept 11: Backpropagation Quiz
- Concept 12: RNN (part a)
- Concept 13: RNN (part b)
- Concept 14: RNN- Unfolded Model
- Concept 15: Unfolded Model Quiz
- Concept 16: RNN- Example
- Concept 17: Backpropagation Through Time (part a)
- Concept 18: Backpropagation Through Time (part b)
- Concept 19: Backpropagation Through Time (part c)
- Concept 20: BPTT Quiz 1
- Concept 21: BPTT Quiz 2
- Concept 22: BPTT Quiz 3
- Concept 23: Some more math
- Concept 24: RNN Summary
- Concept 25: From RNN to LSTM
- Concept 26: Wrap Up
-
Lesson 02: Long Short-Term Memory Networks (LSTMs)
Luis explains Long Short-Term Memory Networks (LSTMs) and similar architectures that have the benefit of preserving long-term memory.
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: The Learn Gate
- Concept 06: The Forget Gate
- Concept 07: The Remember Gate
- Concept 08: The Use Gate
- Concept 09: Putting it All Together
- Concept 10: Quiz
- Concept 11: Other architectures
-
Lesson 03: Implementation of RNN & LSTM
Learn how to represent memory in code. Then define and train RNNs in PyTorch and apply them to tasks that involve sequential data.
- Concept 01: Implementing RNNs
- Concept 02: Time-Series Prediction
- Concept 03: Training & Memory
- Concept 04: Character-wise RNNs
- Concept 05: Sequence Batching
- Concept 06: Pre-Notebook: Character-Level RNN
- Concept 07: Notebook: Character-Level RNN
- Concept 08: Implementing a Char-RNN
- Concept 09: Batching Data, Solution
- Concept 10: Defining the Model
- Concept 11: Char-RNN, Solution
- Concept 12: Making Predictions
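A minimal sketch of an LSTM layer processing a batch of sequences in PyTorch; a real char-RNN adds one-hot or embedded inputs, batching, and sampling, so treat the sizes here as illustrative:

    import torch
    from torch import nn

    batch, seq_len, n_chars, hidden = 4, 10, 50, 64

    lstm = nn.LSTM(input_size=n_chars, hidden_size=hidden, num_layers=2,
                   batch_first=True, dropout=0.3)
    fc = nn.Linear(hidden, n_chars)                 # map hidden states back to character scores

    x = torch.randn(batch, seq_len, n_chars)        # stand-in for one-hot encoded characters
    out, (h, c) = lstm(x)                           # out: (batch, seq_len, hidden)
    scores = fc(out)                                # (batch, seq_len, n_chars)
    print(scores.shape)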
-
Lesson 04: Hyperparameters
Learn about a number of different hyperparameters that are used in defining and training deep learning models. We'll discuss starting values and intuitions for tuning each hyperparameter.
- Concept 01: Introducing Jay
- Concept 02: Introduction
- Concept 03: Learning Rate
- Concept 04: Learning Rate
- Concept 05: Minibatch Size
- Concept 06: Number of Training Iterations / Epochs
- Concept 07: Number of Hidden Units / Layers
- Concept 08: RNN Hyperparameters
- Concept 09: RNN Hyperparameters
- Concept 10: Sources & References
-
Lesson 05: Embeddings & Word2Vec
In this lesson, you'll learn about embeddings in neural networks by implementing the Word2Vec model.
- Concept 01: Word Embeddings
- Concept 02: Embedding Weight Matrix/Lookup Table
- Concept 03: Word2Vec Notebook
- Concept 04: Pre-Notebook: Word2Vec, SkipGram
- Concept 05: Notebook: Word2Vec, SkipGram
- Concept 06: Data & Subsampling
- Concept 07: Subsampling Solution
- Concept 08: Context Word Targets
- Concept 09: Batching Data, Solution
- Concept 10: Word2Vec Model
- Concept 11: Model & Validations
- Concept 12: Negative Sampling
- Concept 13: Pre-Notebook: Negative Sampling
- Concept 14: Notebook: Negative Sampling
- Concept 15: SkipGramNeg, Model Definition
- Concept 16: Complete Model & Custom Loss
-
Lesson 06: Sentiment Prediction RNN
Implement a sentiment prediction RNN for predicting whether a movie review is positive or negative!
- Concept 01: Sentiment RNN, Introduction
- Concept 02: Pre-Notebook: Sentiment RNN
- Concept 03: Notebook: Sentiment RNN
- Concept 04: Data Pre-Processing
- Concept 05: Encoding Words, Solution
- Concept 06: Getting Rid of Zero-Length
- Concept 07: Cleaning & Padding Data
- Concept 08: Padded Features, Solution
- Concept 09: TensorDataset & Batching Data
- Concept 10: Defining the Model
- Concept 11: Complete Sentiment RNN
- Concept 12: Training the Model
- Concept 13: Testing
- Concept 14: Inference, Solution
-
-
Module 02: Project: Generate TV Scripts
-
Lesson 01: Project: Generate TV Scripts
Generate a TV script by defining and training a recurrent neural network.
-
-
Module 03: Attention
-
Lesson 01: Attention
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention models work and go over a basic code implementation.
- Concept 01: Introduction to Attention
- Concept 02: Encoders and Decoders
- Concept 03: Sequence to Sequence Recap
- Concept 04: Encoding -- Attention Overview
- Concept 05: Decoding -- Attention Overview
- Concept 06: Attention Overview
- Concept 07: Attention Encoder
- Concept 08: Attention Decoder
- Concept 09: Attention Encoder & Decoder
- Concept 10: Bahdanau and Luong Attention
- Concept 11: Multiplicative Attention
- Concept 12: Additive Attention
- Concept 13: Additive and Multiplicative Attention
- Concept 14: Computer Vision Applications
- Concept 15: Other Attention Methods
- Concept 16: The Transformer and Self-Attention
- Concept 17: Notebook: Attention Basics
- Concept 18: [SOLUTION]: Attention Basics
- Concept 19: Outro
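A minimal NumPy sketch of basic multiplicative (dot-product) attention: score the query against each annotation, softmax the scores, and form the context vector (the values are made up):

    import numpy as np

    def softmax(x):
        e = np.exp(x - np.max(x))
        return e / e.sum()

    query = np.array([1.0, 0.0, 2.0])                 # decoder hidden state (illustrative)
    annotations = np.array([[1.0, 0.5, 1.0],          # encoder hidden states
                            [0.0, 1.0, 0.0],
                            [2.0, 0.1, 1.5]])

    scores = annotations @ query                      # multiplicative (dot-product) scoring
    weights = softmax(scores)                         # attention weights sum to 1
    context = weights @ annotations                   # weighted sum of annotations

    print("Attention weights:", weights)
    print("Context vector   :", context)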
-
Part 15 : Generative Adversarial Networks
Understand and implement a Deep Convolutional GAN (generative adversarial network) to generate realistic images, with Ian Goodfellow, the inventor of GANs, and Jun-Yan Zhu, the creator of CycleGANs.
-
Module 01: Generative Adversarial Networks
-
Lesson 01: Generative Adversarial Networks
Ian Goodfellow, the inventor of GANs, introduces you to these exciting models. You'll also implement your own GAN on the MNIST dataset.
- Concept 01: Introducing Ian Goodfellow
- Concept 02: Applications of GANs
- Concept 03: How GANs work
- Concept 04: Games and Equilibria
- Concept 05: Tips for Training GANs
- Concept 06: Generating Fake Images
- Concept 07: MNIST GAN
- Concept 08: GAN Notebook & Data
- Concept 09: Pre-Notebook: MNIST GAN
- Concept 10: Notebook: MNIST GAN
- Concept 11: The Complete Model
- Concept 12: Generator & Discriminator
- Concept 13: Hyperparameters
- Concept 14: Fake and Real Losses
- Concept 15: Optimization Strategy, Solution
- Concept 16: Training Two Networks
- Concept 17: Training Solution
-
Lesson 02: Deep Convolutional GANs
In this lesson you'll implement a Deep Convolutional GAN to generate complex color images of house numbers.
- Concept 01: Deep Convolutional GANs
- Concept 02: DCGAN, Discriminator
- Concept 03: DCGAN Generator
- Concept 04: What is Batch Normalization?
- Concept 05: Pre-Notebook: Batch Norm
- Concept 06: Notebook: Batch Norm
- Concept 07: Benefits of Batch Normalization
- Concept 08: DCGAN Notebook & Data
- Concept 09: Pre-Notebook: DCGAN, SVHN
- Concept 10: Notebook: DCGAN, SVHN
- Concept 11: Scaling, Solution
- Concept 12: Discriminator
- Concept 13: Discriminator, Solution
- Concept 14: Generator
- Concept 15: Generator, Solution
- Concept 16: Optimization Strategy
- Concept 17: Optimization Solution & Samples
- Concept 18: Other Applications of GANs
-
-
Module 02: Pix2Pix & CycleGAN
-
Lesson 01: Pix2Pix & CycleGAN
Jun-Yan Zhu, one of the creators of the CycleGAN, will lead you through Pix2Pix and CycleGAN formulations that learn to do image-to-image translation tasks!
- Concept 01: Introducing Jun-Yan Zhu
- Concept 02: Image to Image Translation
- Concept 03: Designing Loss Functions
- Concept 04: GANs, a Recap
- Concept 05: Pix2Pix Generator
- Concept 06: Pix2Pix Discriminator
- Concept 07: CycleGANs & Unpaired Data
- Concept 08: Cycle Consistency Loss
- Concept 09: Why Does This Work?
- Concept 10: Beyond CycleGANs
-
Lesson 02: Implementing a CycleGAN
Cezanne will show you how to implement a CycleGAN in PyTorch and translate images from the summer to winter domains.
- Concept 01: CycleGAN Notebook & Data
- Concept 02: Pre-Notebook: CycleGAN
- Concept 03: Notebook: CycleGAN
- Concept 04: DC Discriminator
- Concept 05: DC Discriminator, Solution
- Concept 06: Generator & Residual Blocks
- Concept 07: CycleGAN Generator
- Concept 08: Blocks & Generator, Solution
- Concept 09: Adversarial & Cycle Consistency Losses
- Concept 10: Loss & Optimization, Solution
- Concept 11: Training Exercise
- Concept 12: Training Solution & Generated Samples
-
-
Module 03: Project: Generate Faces
-
Lesson 01: Project: Generate Faces
Define two adversarial networks, a generator and discriminator, and train them until you can generate realistic faces.
-
Part 16 : Deploying a Model
Train and deploy your own sentiment analysis model using Amazon's SageMaker. Deployment gives you the ability to use a trained model to analyze new user input. Build a model, deploy it, and create a gateway for accessing it from a website.
-
Module 01: Deployment
-
Lesson 01: Introduction to Deployment
This lesson familiarizes the student with cloud and deployment terminology and demonstrates how deployment fits within the machine learning workflow.
- Concept 01: Welcome!
- Concept 02: What's Ahead?
- Concept 03: Problem Introduction
- Concept 04: Machine Learning Workflow
- Concept 05: Machine Learning Workflow
- Concept 06: What is Cloud Computing & Why Would We Use It?
- Concept 07: Why Cloud Computing?
- Concept 08: Machine Learning Applications
- Concept 09: Machine Learning Applications
- Concept 10: Paths to Deployment
- Concept 11: Paths to Deployment
- Concept 12: Production Environments
- Concept 13: Production Environments
- Concept 14: Endpoints & REST APIs
- Concept 15: Endpoints & REST APIs
- Concept 16: Containers
- Concept 17: Containers
- Concept 18: Containers - Straight From the Experts
- Concept 19: Characteristics of Modeling & Deployment
- Concept 20: Characteristics of Modeling & Deployment
- Concept 21: Comparing Cloud Providers
- Concept 22: Comparing Cloud Providers
- Concept 23: Closing Statements
- Concept 24: Summary
- Concept 25: [Optional] Cloud Computing Defined
- Concept 26: [Optional] Cloud Computing Explained
-
Lesson 02: Building a Model using SageMaker
Learn how to use Amazon’s SageMaker service to predict Boston housing prices using SageMaker’s built-in XGBoost algorithm.
- Concept 01: Introduction to Amazon SageMaker
- Concept 02: Create an AWS Account
- Concept 03: Checking GPU Access
- Concept 04: Setting up a Notebook Instance
- Concept 05: Cloning the Deployment Notebooks
- Concept 06: Is Everything Set Up?
- Concept 07: Boston Housing Example - Getting the Data Ready
- Concept 08: Boston Housing Example - Training the Model
- Concept 09: Boston Housing Example - Testing the Model
- Concept 10: Mini-Project: Building Your First Model
- Concept 11: Mini-Project: Solution
- Concept 12: Boston Housing In-Depth - Data Preparation
- Concept 13: Boston Housing In-Depth - Creating a Training Job
- Concept 14: Boston Housing In-Depth - Building a Model
- Concept 15: Boston Housing In-Depth - Creating a Batch Transform Job
- Concept 16: Summary
-
Lesson 03: Deploying and Using a Model
In this lesson students will learn how to deploy a model using SageMaker and how to make use of their deployed model with a simple web application.
- Concept 01: Deploying a Model in SageMaker
- Concept 02: Boston Housing Example - Deploying the Model
- Concept 03: Boston Housing In-Depth - Deploying the Model
- Concept 04: Deploying and Using a Sentiment Analysis Model
- Concept 05: Text Processing, Bag of Words
- Concept 06: Building and Deploying the Model
- Concept 07: How to Use a Deployed Model
- Concept 08: Creating and Using an Endpoint
- Concept 09: Building a Lambda Function
- Concept 10: Building an API
- Concept 11: Using the Final Web Application
- Concept 12: Summary
-
Lesson 04: Hyperparameter Tuning
In this lesson students will see how to use SageMaker's automatic hyperparameter tuning tools on the Boston housing prices model from Lesson 2 and on a sentiment analysis model.
- Concept 01: Hyperparameter Tuning
- Concept 02: Introduction to Hyperparameter Tuning
- Concept 03: Boston Housing Example - Tuning the Model
- Concept 04: Mini-Project: Tuning the Sentiment Analysis Model
- Concept 05: Mini-Project: Solution - Tuning the Model
- Concept 06: Mini-Project: Solution - Fixing the Error and Testing
- Concept 07: Boston Housing In-Depth - Creating a Tuning Job
- Concept 08: Boston Housing In-Depth - Monitoring the Tuning Job
- Concept 09: Boston Housing In-Depth - Building and Testing the Model
- Concept 10: Summary
-
Lesson 05: Updating a Model
In this lesson students will learn how to update their model to account for changes in the underlying data used to train their model.
- Concept 01: Updating a Model
- Concept 02: Building a Sentiment Analysis Model (XGBoost)
- Concept 03: Building a Sentiment Analysis Model (Linear Learner)
- Concept 04: Combining the Models
- Concept 05: Mini-Project: Updating a Sentiment Analysis Model
- Concept 06: Loading and Testing the New Data
- Concept 07: Exploring the New Data
- Concept 08: Building a New Model
- Concept 09: SageMaker Retrospective
- Concept 10: Cleaning Up Your AWS Account
- Concept 11: SageMaker Tips and Tricks
-
-
Module 02: Project: Deploying a Sentiment Analysis Model
-
Lesson 01: Project: Deploying a Sentiment Analysis Model
In this project, you will build and deploy a neural network which predicts the sentiment of a user-provided movie review. In addition, you will create a simple web app that uses your deployed model.
-
Part 17 : Machine Learning, Case Studies
Welcome to the Machine Learning Case Studies course.
-
Lesson 01: Population Segmentation
Train and deploy unsupervised models (PCA and k-means clustering) to group US counties by similarities and differences. Visualize the trained model attributes and interpret the results.
- Concept 01: Introducing Cezanne & Dan
- Concept 02: Interview Segment: What is SageMaker and Why Learn It?
- Concept 03: Course Outline, Case Studies
- Concept 04: Unsupervised v Supervised Learning
- Concept 05: Model Design
- Concept 06: Population Segmentation
- Concept 07: K-means, Overview
- Concept 08: Creating a Notebook Instance
- Concept 09: Create a SageMaker Notebook Instance
- Concept 10: Pre-Notebook: Population Segmentation
- Concept 11: Exercise: Data Loading & Processing
- Concept 12: Solution: Data Pre-Processing
- Concept 13: Exercise: Normalization
- Concept 14: Solution: Normalization
- Concept 15: PCA, Overview
- Concept 16: PCA Estimator & Training
- Concept 17: Exercise: PCA Model Attributes & Variance
- Concept 18: Solution: Variance
- Concept 19: Component Makeup
- Concept 20: Exercise: PCA Deployment & Data Transformation
- Concept 21: Solution: Creating Transformed Data
- Concept 22: Exercise: K-means Estimator & Selecting K
- Concept 23: Exercise: K-means Predictions (clusters)
- Concept 24: Solution: K-means Predictor
- Concept 25: Exercise: Get the Model Attributes
- Concept 26: Solution: Model Attributes
- Concept 27: Clean Up: All Resources
- Concept 28: AWS Workflow & Summary
-
Lesson 02: Payment Fraud Detection
Train a linear model to do credit card fraud detection. Improve the model by accounting for class imbalance in the training data and tuning for a specific performance metric.
- Concept 01: Fraud Detection
- Concept 02: Pre-Notebook: Payment Fraud Detection
- Concept 03: Exercise: Payment Transaction Data
- Concept 04: Solution: Data Distribution & Splitting
- Concept 05: LinearLearner & Class Imbalance
- Concept 06: Exercise: Define a LinearLearner
- Concept 07: Solution: Default LinearLearner
- Concept 08: Exercise: Format Data & Train the LinearLearner
- Concept 09: Solution: Training Job
- Concept 10: Precision & Recall, Overview
- Concept 11: Exercise: Deploy Estimator
- Concept 12: Solution: Deployment & Evaluation
- Concept 13: Model Improvements
- Concept 14: Improvement, Model Tuning
- Concept 15: Exercise: Improvement, Class Imbalance
- Concept 16: Solution: Accounting for Class Imbalance
- Concept 17: Exercise: Define a Model w/ Specifications
- Concept 18: One Solution: Tuned and Balanced LinearLearner
- Concept 19: Summary and Improvements
-
Lesson 03: Interview Segment: SageMaker as a Tool & the Future of ML
If you're interested in how SageMaker has developed to serve businesses and learners, take a look at this short interview segment with Dan Mbanga.
-
Lesson 04: Deploying Custom Models
Design and train a custom PyTorch classifier by writing a training script. This is an especially useful skill for tasks that cannot be easily solved by built-in algorithms.
- Concept 01: Pre-Notebook: Custom Models & Moon Data
- Concept 02: Moon Data & Custom Models
- Concept 03: Upload Data to S3
- Concept 04: Exercise: Custom PyTorch Classifier
- Concept 05: Solution: Simple Neural Network
- Concept 06: Exercise: Training Script
- Concept 07: Solution: Complete Training Script
- Concept 08: Custom SKLearn Model
- Concept 09: PyTorch Estimator
- Concept 10: Exercise: Create a PyTorchModel & Endpoint
- Concept 11: Solution: PyTorchModel & Evaluation
- Concept 12: Clean Up: All Resources
- Concept 13: Summary of Skills
-
Lesson 05: Time-Series Forecasting
Learn how to format time series data into context (input) and prediction (output) data, and train the built-in DeepAR algorithm, which uses an RNN to find recurring patterns in time-series data.
- Concept 01: Time-Series Forecasting
- Concept 02: Forecasting Energy Consumption, Notebook
- Concept 03: Pre-Notebook: Time-Series Forecasting
- Concept 04: Processing Energy Data
- Concept 05: Exercise: Creating Time Series
- Concept 06: Solution: Split Data
- Concept 07: Exercise: Convert to JSON
- Concept 08: Solution: Formatting JSON Lines & DeepAR Estimator
- Concept 09: Exercise: DeepAR Estimator
- Concept 10: Solution: Complete Estimator & Hyperparameters
- Concept 11: Making Predictions
- Concept 12: Exercise: Predicting the Future
- Concept 13: Solution: Predicting Future Data
- Concept 14: Clean Up: All Resources
-
Part 18 : Software Engineering
Software engineering skills are increasingly important for data scientists. In this course, you'll learn best practices for writing software. Then you'll work on your software skills by coding a Python package and a web data dashboard.
-
Module 01: Software Engineering
-
Lesson 01: Introduction to Software Engineering
Welcome to Software Engineering for Data Scientists! Learn about the course and meet your instructors.
-
Lesson 02: Software Engineering Practices Pt I
Learn software engineering practices and how they apply in data science. Part one covers clean and modular code, code efficiency, refactoring, documentation, and version control.
- Concept 01: Introduction
- Concept 02: Clean and Modular Code
- Concept 03: Refactoring Code
- Concept 04: Writing Clean Code
- Concept 05: Quiz: Clean Code
- Concept 06: Writing Modular Code
- Concept 07: Quiz: Refactoring - Wine Quality
- Concept 08: Solution: Refactoring - Wine Quality
- Concept 09: Efficient Code
- Concept 10: Optimizing - Common Books
- Concept 11: Quiz: Optimizing - Common Books
- Concept 12: Solution: Optimizing - Common Books
- Concept 13: Quiz: Optimizing - Holiday Gifts
- Concept 14: Solution: Optimizing - Holiday Gifts
- Concept 15: Documentation
- Concept 16: In-line Comments
- Concept 17: Docstrings
- Concept 18: Project Documentation
- Concept 19: Documentation
- Concept 20: Version Control in Data Science
- Concept 21: Scenario #1
- Concept 22: Scenario #2
- Concept 23: Scenario #3
- Concept 24: Model Versioning
- Concept 25: Conclusion
-
Lesson 03: Software Engineering Practices Pt II
Learn software engineering practices and how they apply in data science. Part two covers testing code, logging, and conducting code reviews.
- Concept 01: Introduction
- Concept 02: Testing
- Concept 03: Testing and Data Science
- Concept 04: Unit Tests
- Concept 05: Unit Testing Tools
- Concept 06: Quiz: Unit Tests
- Concept 07: Test Driven Development and Data Science
- Concept 08: Logging
- Concept 09: Log Messages
- Concept 10: Logging
- Concept 11: Code Review
- Concept 12: Questions to Ask Yourself When Conducting a Code Review
- Concept 13: Tips for Conducting a Code Review
- Concept 14: Conclusion
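A minimal sketch of two practices from this lesson, a unit test that pytest can discover and a log message; the function and file name are hypothetical:

    # test_metrics.py -- pytest discovers files and functions prefixed with "test_".
    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger(__name__)

    def nearest_square(num):
        """Return the largest perfect square less than or equal to num."""
        root = 0
        while (root + 1) ** 2 <= num:
            root += 1
        return root ** 2

    def test_nearest_square():
        assert nearest_square(12) == 9
        assert nearest_square(9) == 9
        logger.info("nearest_square tests passed")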
-
Lesson 04: Introduction to Object-Oriented Programming
Learn the basics of object-oriented programming so that you can build your own Python package.
- Concept 01: Introduction
- Concept 02: Procedural vs. Object-Oriented Programming
- Concept 03: Class, Object, Method and Attribute
- Concept 04: OOP Syntax
- Concept 05: Exercise: OOP Syntax Practice - Part 1
- Concept 06: A Couple of Notes about OOP
- Concept 07: Exercise: OOP Syntax Practice - Part 2
- Concept 08: Commenting Object-Oriented Code
- Concept 09: A Gaussian Class
- Concept 10: How the Gaussian Class Works
- Concept 11: Exercise: Code the Gaussian Class
- Concept 12: Magic Methods
- Concept 13: Exercise: Code Magic Methods
- Concept 14: Inheritance
- Concept 15: Exercise: Inheritance with Clothing
- Concept 16: Inheritance: Probability Distribution
- Concept 17: Demo: Inheritance Probability Distributions
- Concept 18: Advanced OOP Topics
- Concept 19: Organizing into Modules
- Concept 20: Demo: Modularized Code
- Concept 21: Making a Package
- Concept 22: Virtual Environments
- Concept 23: Exercise: Making a Package and Pip Installing
- Concept 24: Binomial Class
- Concept 25: Exercise: Binomial Class
- Concept 26: Scikit-learn Source Code
- Concept 27: Putting Code on PyPi
- Concept 28: Exercise: Upload to PyPi
- Concept 29: Lesson Summary
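A minimal sketch of the OOP ideas in this lesson (attributes, methods, inheritance, and magic methods), in the spirit of the Gaussian exercises but simplified and not the course solution:

    import math

    class Distribution:
        def __init__(self, mean=0, stdev=1):
            self.mean = mean
            self.stdev = stdev

    class Gaussian(Distribution):
        def pdf(self, x):
            """Probability density function at x."""
            coeff = 1 / (self.stdev * math.sqrt(2 * math.pi))
            return coeff * math.exp(-0.5 * ((x - self.mean) / self.stdev) ** 2)

        def __add__(self, other):
            """Adding two independent Gaussians adds means and variances."""
            return Gaussian(self.mean + other.mean,
                            math.sqrt(self.stdev ** 2 + other.stdev ** 2))

        def __repr__(self):
            return f"Gaussian(mean={self.mean}, stdev={self.stdev})"

    print(Gaussian(10, 3) + Gaussian(20, 4))   # Gaussian(mean=30, stdev=5.0)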
-
Lesson 05: Portfolio Exercise: Upload a Package to PyPi
Create your own Python package and upload your package to PyPi.
-
Lesson 06: Web Development
Develop a data dashboard using Flask, Bootstrap, Plotly and Pandas.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: The Web
- Concept 04: Components of a Web App
- Concept 05: The Front-End
- Concept 06: HTML
- Concept 07: Exercise: HTML
- Concept 08: Div and Span
- Concept 09: IDs and Classes
- Concept 10: Exercise: HTML Div, Span, IDs, Classes
- Concept 11: CSS
- Concept 12: Exercise: CSS
- Concept 13: Bootstrap Library
- Concept 14: Exercise: Bootstrap
- Concept 15: JavaScript
- Concept 16: Exercise: JavaScript
- Concept 17: Plotly
- Concept 18: Exercise: Plotly
- Concept 19: The Backend
- Concept 20: Flask
- Concept 21: Exercise: Flask
- Concept 22: Flask + Pandas
- Concept 23: Example: Flask + Pandas
- Concept 24: Flask+Plotly+Pandas Part 1
- Concept 25: Flask+Plotly+Pandas Part 2
- Concept 26: Flask+Plotly+Pandas Part 3
- Concept 27: Flask+Plotly+Pandas Part 4
- Concept 28: Example: Flask + Plotly + Pandas
- Concept 29: Exercise: Flask + Plotly + Pandas
- Concept 30: Deployment
- Concept 31: Exercise: Deployment
- Concept 32: Lesson Summary
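A minimal sketch of a Flask back end that hands a Pandas summary to the front end as JSON; the route names and data are hypothetical:

    import pandas as pd
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "Data dashboard back end is running."

    @app.route("/data")
    def data():
        df = pd.DataFrame({"year": [2015, 2016, 2017], "value": [4.2, 4.8, 5.1]})
        # Front-end JavaScript/Plotly code would fetch and plot this JSON.
        return jsonify({"records": df.to_dict(orient="records")})

    if __name__ == "__main__":
        app.run(debug=True)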
-
Lesson 07: Portfolio Exercise: Deploy a Data Dashboard
Customize the data dashboard from the previous lesson to make it your own. Upload the dashboard to the web.
- Concept 01: Introduction
- Concept 02: Workspace Portfolio Exercise
- Concept 03: Troubleshooting Possible Errors
- Concept 04: Congratulations
- Concept 05: APIs [advanced version]
- Concept 06: World Bank API [advanced version]
- Concept 07: Python and APIs [advanced version]
- Concept 08: World Bank Data Dashboard [advanced version]
-
Part 19 : Data Engineering
In data engineering for data scientists, you will practice building ETL, NLP, and machine learning pipelines.
-
Module 01: Data Engineering
-
Lesson 01: Introduction to Data Engineering
You will get an introduction to the Data Engineering for Data Scientists course and project. The lessons include ETL pipelines, natural language processing pipelines, and machine learning pipelines.
-
Lesson 02: ETL Pipelines
ETL stands for extract, transform, and load. This is the most common type of data pipeline, and you will practice each step in this lesson.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: World Bank Datasets
- Concept 04: How to Tackle the Exercises
- Concept 05: Extract
- Concept 06: Exercise: CSV
- Concept 07: Exercise: JSON and XML
- Concept 08: Exercise: SQL Databases
- Concept 09: Extracting Text Data
- Concept 10: Exercise: APIs
- Concept 11: Transform
- Concept 12: Combining Data
- Concept 13: Exercise: Combining Data
- Concept 14: Cleaning Data
- Concept 15: Exercise: Cleaning Data
- Concept 16: Exercise: Data Types
- Concept 17: Exercise: Parsing Dates
- Concept 18: Matching Encodings
- Concept 19: Exercise: Matching Encodings
- Concept 20: Missing Data - Overview
- Concept 21: Missing Data - Delete
- Concept 22: Missing Data - Impute
- Concept 23: Exercise: Imputation
- Concept 24: SQL, optimization, and ETL - Robert Chang Airbnb
- Concept 25: Duplicate Data
- Concept 26: Exercise: Duplicate Data
- Concept 27: Dummy Variables
- Concept 28: Exercise: Dummy Variables
- Concept 29: Outliers - How to Find Them
- Concept 30: Exercise: Outliers Part 1
- Concept 31: Outliers - What to do
- Concept 32: Exercise: Outliers - Part 2
- Concept 33: AI and Data Engineering - Robert Chang Airbnb
- Concept 34: Scaling Data
- Concept 35: Exercise: Scaling Data
- Concept 36: Feature Engineering
- Concept 37: Exercise: Feature Engineering
- Concept 38: Bloopers
- Concept 39: Load
- Concept 40: Exercise: Load
- Concept 41: Putting It All Together
- Concept 42: Exercise: Putting It All Together
- Concept 43: Lesson Summary
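A minimal sketch of an extract-transform-load pass with Pandas and SQLite; the tiny inline DataFrame stands in for the World Bank data used in the exercises:

    import sqlite3
    import pandas as pd

    # Extract: in practice pd.read_csv / read_json / read_sql; a tiny inline frame here.
    df = pd.DataFrame({"country": ["Aruba", "Aruba", "Chad"],
                       "year":    [2016, 2016, 2016],
                       "gdp":     [2.58e9, 2.58e9, None]})

    # Transform: drop duplicate rows, impute missing values, engineer a feature.
    df = df.drop_duplicates()
    df["gdp"] = df["gdp"].fillna(df["gdp"].mean())
    df["gdp_billions"] = df["gdp"] / 1e9

    # Load: write the cleaned table to a SQLite database.
    conn = sqlite3.connect("worldbank.db")
    df.to_sql("gdp", conn, if_exists="replace", index=False)
    conn.close()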
-
Lesson 03: NLP Pipelines
In order to complete the project at the end of the course, you will need some natural language processing skills. Here you will practice engineering machine learning features from text data.
- Concept 01: NLP and Pipelines
- Concept 02: How NLP Pipelines Work
- Concept 03: Text Processing
- Concept 04: Cleaning
- Concept 05: Notebook: Cleaning
- Concept 06: Normalization
- Concept 07: Notebook: Normalization
- Concept 08: Tokenization
- Concept 09: Notebook: Tokenization
- Concept 10: Stop Word Removal
- Concept 11: Notebook: Stop Words
- Concept 12: Part-of-Speech Tagging
- Concept 13: Named Entity Recognition
- Concept 14: Notebook: POS and NER
- Concept 15: Stemming and Lemmatization
- Concept 16: Notebook: Stemming and Lemmatization
- Concept 17: Text Processing Summary
- Concept 18: Feature Extraction
- Concept 19: Bag of Words
- Concept 20: TF-IDF
- Concept 21: Notebook: Bag of Words and TF-IDF
- Concept 22: One-Hot Encoding
- Concept 23: Word Embeddings
- Concept 24: Modeling
- Concept 25: [OPTIONAL] Word2Vec
- Concept 26: [OPTIONAL] GloVe
- Concept 27: [OPTIONAL] Embeddings for Deep Learning
- Concept 28: [OPTIONAL] t-SNE
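A minimal sketch of text processing (normalize and tokenize with a simple regex) followed by feature extraction (bag of words and TF-IDF) with scikit-learn; the documents are illustrative:

    import re
    from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer

    docs = ["The first time you see the Second Renaissance it may look boring.",
            "Is it the Matrix?"]

    def tokenize(text):
        text = re.sub(r"[^a-zA-Z0-9]", " ", text.lower())   # normalize
        return text.split()                                  # simple whitespace tokenizer

    vect = CountVectorizer(tokenizer=tokenize)               # bag of words
    counts = vect.fit_transform(docs)
    tfidf = TfidfTransformer().fit_transform(counts)         # re-weight counts by TF-IDF

    print(vect.get_feature_names_out())
    print(tfidf.toarray().round(2))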
-
Lesson 04: Machine Learning Pipelines
You'll use the Scikit-Learn package to code a machine learning pipeline. With these skills, you can ingest data, create features, and train a machine learning algorithm in just one step.
- Concept 01: Introduction
- Concept 02: Corporate Messaging Case Study
- Concept 03: Case Study: Clean and Tokenize
- Concept 04: Solution: Clean and Tokenize
- Concept 05: Machine Learning Workflow
- Concept 06: Case Study: Machine Learning Workflow
- Concept 07: Solution: Machine Learning Workflow
- Concept 08: Using Pipeline
- Concept 09: Advantages of Using Pipeline
- Concept 10: Case Study: Build Pipeline
- Concept 11: Solution: Build Pipeline
- Concept 12: Pipelines and Feature Unions
- Concept 13: Using Feature Union
- Concept 14: Case Study: Add Feature Union
- Concept 15: Solution: Add Feature Union
- Concept 16: Creating Custom Transformers
- Concept 17: Case Study: Create Custom Transformer
- Concept 18: Solution: Create Custom Transformer
- Concept 19: Pipelines and Grid Search
- Concept 20: Using Grid Search with Pipelines
- Concept 21: Case Study: Grid Search Pipeline
- Concept 22: Solution: Grid Search Pipeline
- Concept 23: Conclusion
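A minimal sketch of chaining feature extraction and a classifier in a scikit-learn Pipeline, then tuning it with grid search; the tiny corpus and parameter grid are illustrative:

    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline

    X = ["free offer click now", "meeting at noon", "win money today", "lunch tomorrow?"] * 10
    y = [1, 0, 1, 0] * 10

    pipeline = Pipeline([
        ("vect", CountVectorizer()),
        ("tfidf", TfidfTransformer()),
        ("clf", RandomForestClassifier(random_state=0)),
    ])

    # Parameters of any step can be tuned via the step name plus a double underscore.
    params = {"clf__n_estimators": [10, 50]}
    cv = GridSearchCV(pipeline, param_grid=params, cv=3)
    cv.fit(X, y)
    print(cv.best_params_)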
-
-
Module 02: Project
-
Lesson 01: Project: Disaster Response Pipeline
You’ll build a machine learning pipeline to categorize emergency messages based on the needs communicated by the sender.
-
-
Module 03: Career Support
-
Lesson 01: Take 30 Min to Improve your LinkedIn
Find your next job or connect with industry peers on LinkedIn. Ensure your profile attracts relevant leads that will grow your professional network.
- Concept 01: Get Opportunities with LinkedIn
- Concept 02: Use Your Story to Stand Out
- Concept 03: Why Use an Elevator Pitch
- Concept 04: Create Your Elevator Pitch
- Concept 05: Use Your Elevator Pitch on LinkedIn
- Concept 06: Create Your Profile With SEO In Mind
- Concept 07: Profile Essentials
- Concept 08: Work Experiences & Accomplishments
- Concept 09: Build and Strengthen Your Network
- Concept 10: Reaching Out on LinkedIn
- Concept 11: Boost Your Visibility
- Concept 12: Up Next
-
Part 20 : Experimental Design & Recommendations
Learn to design experiments and analyze A/B test results. Explore approaches for building recommendation systems.
-
Module 01: Intro to Experiment Design and Recommendation Engines
-
Lesson 01: Intro to Experiment Design and Recommendation Engines
Why do we care about experiment design and recommendation engines? In this lesson, you'll get an overview of the topics you'll learn in this course.
-
-
Module 02: Experiment Design & A/B Testing
-
Lesson 01: Concepts in Experiment Design
In this lesson, you will learn about conceptual topics that must be considered when designing and running an experiment, in order to ensure good, interpretable results.
- Concept 01: Lesson Introduction
- Concept 02: What is an Experiment?
- Concept 03: Types of Experiment
- Concept 04: Types of Sampling
- Concept 05: Measuring Outcomes
- Concept 06: Creating Metrics
- Concept 07: Controlling Variables
- Concept 08: Checking Validity
- Concept 09: Checking Bias
- Concept 10: Ethics in Experimentation
- Concept 11: A SMART Mnemonic for Experiment Design
- Concept 12: Lesson Conclusion
-
Lesson 02: Statistical Considerations in Testing
In this lesson, you will learn how statistics can be used to benefit the design of an experiment, as well as additional statistical tests that can be used to analyze results.
- Concept 01: Lesson Introduction
- Concept 02: Practice: Statistical Significance
- Concept 03: Statistical Significance - Solution
- Concept 04: Practical Significance
- Concept 05: Experiment Size
- Concept 06: Experiment Size - Solution
- Concept 07: Using Dummy Tests
- Concept 08: Non-Parametric Tests Part I
- Concept 09: Non-Parametric Tests Part I - Solution
- Concept 10: Non-Parametric Tests Part II
- Concept 11: Non-Parametric Tests Part II - Solution
- Concept 12: Analyzing Multiple Metrics
- Concept 13: Early Stopping
- Concept 14: Early Stopping - Solution
- Concept 15: Lesson Conclusion
-
Lesson 03: A/B Testing Case Study
In this lesson, you will go through an A/B Testing case study to see how the conceptual and statistical concepts covered in the previous lessons can be applied in experiment designs.
- Concept 01: Lesson Introduction
- Concept 02: Scenario Description
- Concept 03: Building a Funnel
- Concept 04: Building a Funnel - Discussion
- Concept 05: Deciding on Metrics - Part I
- Concept 06: Deciding on Metrics - Part II
- Concept 07: Deciding on Metrics - Discussion
- Concept 08: Experiment Sizing
- Concept 09: Experiment Sizing - Discussion
- Concept 10: Validity, Bias, and Ethics - Discussion
- Concept 11: Analyze Data
- Concept 12: Draw Conclusions
- Concept 13: Draw Conclusions - Discussion
- Concept 14: Lesson Conclusion
-
Lesson 04: Portfolio Exercise: Starbucks
In this lesson, you will analyze data that was originally used in screening interviews for data scientists at Starbucks.
-
-
Module 03: Recommendation Engines
-
Lesson 01: Introduction to Recommendation Engines
In this lesson, you will learn about the different methods used to create recommendation engines.
- Concept 01: Video: Intro
- Concept 02: Video + Text: Example Recommendation Engines
- Concept 03: Text: What's Ahead?
- Concept 04: Video: Introduction to MovieTweetings
- Concept 05: Notebook: MovieTweeting Data
- Concept 06: Screencast: Solution MovieTweeting Data
- Concept 07: Video: Ways to Recommend: Knowledge Based
- Concept 08: Notebook: Knowledge Based
- Concept 09: Screencast: Solution Knowledge Based
- Concept 10: Video: More Personalized Recommendations
- Concept 11: Video: Ways to Recommend: Collaborative Filtering
- Concept 12: Video + Quiz: Collaborative Filtering & Content Based Recs
- Concept 13: Video + Text: Measuring Similarity
- Concept 14: Notebook: Measuring Similarity
- Concept 15: Screencast: Solution Measuring Similarity
- Concept 16: Video: Identifying Recommendations
- Concept 17: Notebook: Collaborative Filtering
- Concept 18: Screencast: Solution Collaborative Filtering
- Concept 19: Screencast: Solutions for Collaborative Filtering
- Concept 20: Video: Ways to Recommend: Content Based
- Concept 21: Notebook: Content Based
- Concept 22: Screencast: Solution Content Based
- Concept 23: Video: Three Types of Recommendation Systems
- Concept 24: Text: More Recommendation Techniques
- Concept 25: Quiz: Recommendation Methods
- Concept 26: Video: Types of Ratings
- Concept 27: Video: Goals of Recommendation Systems
- Concept 28: Quiz: Types of Ratings & Goals of Recommendation Systems
- Concept 29: Video: Outro
- Concept 30: Text: Recap
-
Lesson 02: Matrix Factorization for Recommendations
In this lesson, you will learn how machine learning is being used to make recommendations.
- Concept 01: Video: Intro
- Concept 02: Text: What's Ahead
- Concept 03: Video: How Do We Know Our Recommendations Are Good?
- Concept 04: Text: Validating Your Recommendations
- Concept 05: Quiz: Regression Metrics
- Concept 06: Video: Why SVD?
- Concept 07: Video: Latent Factors
- Concept 08: Quiz: Latent Factors
- Concept 09: Video: Singular Value Decomposition
- Concept 10: Notebook: SVD Practice
- Concept 11: Screencast: SVD Practice Solution
- Concept 12: Video: SVD Practice Takeaways
- Concept 13: Text: SVD Closed Form Solution
- Concept 14: Video: FunkSVD
- Concept 15: Notebook: Implementing FunkSVD
- Concept 16: Screencast: Implementing FunkSVD
- Concept 17: Video: FunkSVD Review
- Concept 18: Notebook: How Are We Doing?
- Concept 19: Screencast: How Are We Doing?
- Concept 20: Video: The Cold Start Problem
- Concept 21: Notebook: The Cold Start Problem
- Concept 22: Screencast: The Cold Start Problem
- Concept 23: Video: Putting It All Together
- Concept 24: Screencast: Code Walkthrough
- Concept 25: Workspace: Recommender Module
- Concept 26: Video: Conclusion
- Concept 27: Text: Review
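A minimal NumPy sketch of SVD on a small, fully filled user-item rating matrix; real rating matrices have missing values, which is why the lesson moves on to FunkSVD:

    import numpy as np

    ratings = np.array([[5.0, 4.0, 1.0],
                        [4.0, 5.0, 2.0],
                        [1.0, 2.0, 5.0]])     # users x movies, hypothetical ratings

    u, s, vt = np.linalg.svd(ratings)
    k = 2                                     # keep two latent factors
    approx = u[:, :k] @ np.diag(s[:k]) @ vt[:k, :]

    print("Rank-2 reconstruction:\n", approx.round(2))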
-
-
Module 04: Project: Recommendations with IBM
-
Lesson 01: Recommendation Engines
Put your skills to work to make recommendations for IBM Watson Studio's data platform.
-
Part 21 : Data Modeling
Learn to create relational and NoSQL data models to fit the diverse needs of data consumers. Use ETL to build databases in PostgreSQL and Apache Cassandra.
-
Module 01: Data Modeling
-
Lesson 01: Introduction to Data Modeling
In this lesson, students will learn the basic difference between relational and non-relational databases, and how each type of database fits the diverse needs of data consumers.
- Concept 01: Introduction to the Course
- Concept 02: What is Data Modeling?
- Concept 03: Why is Data Modeling Important?
- Concept 04: Who does this type of work?
- Concept 05: Intro to Relational Databases
- Concept 06: Relational Databases
- Concept 07: When to use a relational database?
- Concept 08: ACID Transactions
- Concept 09: When Not to Use a Relational Database
- Concept 10: What is PostgreSQL?
- Concept 11: Demos: Creating a Postgres Table
- Concept 12: Exercise 1: Creating a Table with Postgres
- Concept 13: Solution for Exercise 1: Create a Table with Postgres
- Concept 14: NoSQL Databases
- Concept 15: What is Apache Cassandra?
- Concept 16: When to Use a NoSql Database
- Concept 17: When Not to Use a NoSql Database
- Concept 18: Demo 2: Creating table with Cassandra
- Concept 19: Exercise 2: Create table with Cassandra
- Concept 20: Solution for Exercise 2: Create table with Cassandra
- Concept 21: Conclusion
-
Lesson 02: Relational Data Models
In this lesson, students learn the purpose of data modeling and the strengths and weaknesses of relational databases, and create schemas and tables in Postgres.
- Concept 01: Learning Objective
- Concept 02: Databases
- Concept 03: Importance of Relational Databases
- Concept 04: OLAP vs OLTP
- Concept 05: Quiz 1
- Concept 06: Structuring the Database: Normalization
- Concept 07: Objectives of Normal Form
- Concept 08: Normal Forms
- Concept 09: Demo 1: Creating Normalized Tables
- Concept 10: Exercise 1: Creating Normalized Tables
- Concept 11: Solution: Exercise 1: Creating Normalized Tables
- Concept 12: Denormalization
- Concept 13: Demo 2: Creating Denormalized Tables
- Concept 14: Denormalization Vs. Normalization
- Concept 15: Exercise 2: Creating Denormalized Tables
- Concept 16: Solution: Exercise 2: Creating Denormalized Tables
- Concept 17: Fact and Dimension Tables
- Concept 18: Star Schemas
- Concept 19: Benefits of Star Schemas
- Concept 20: Snowflake Schemas
- Concept 21: Demo 3: Creating Fact and Dimension Tables
- Concept 22: Exercise 3: Creating Fact and Dimension Tables
- Concept 23: Solution: Exercise 3: Creating Fact and Dimension Tables
- Concept 24: Data Definition and Constraints
- Concept 25: Upsert
- Concept 26: Conclusion
-
Lesson 03: Project: Data Modeling with Postgres
Students will model user activity data to create a database and ETL pipeline in Postgres for a music streaming app. They will define Fact and Dimension tables and insert data into new tables.
-
Lesson 04: NoSQL Data Models
Students will understand when to use non-relational databases based on the data and business needs, their strengths and weaknesses, and how to create tables in Apache Cassandra.
- Concept 01: Learning Objectives
- Concept 02: Non-Relational Databases
- Concept 03: Distributed Databases
- Concept 04: CAP Theorem
- Concept 05: Quiz 1
- Concept 06: Denormalization in Apache Cassandra
- Concept 07: CQL
- Concept 08: Demo 1
- Concept 09: Exercise 1
- Concept 10: Exercise 1 Solution
- Concept 11: Primary Key
- Concept 12: Primary Key
- Concept 13: Demo 2
- Concept 14: Exercise 2
- Concept 15: Exercise 2: Solution
- Concept 16: Clustering Columns
- Concept 17: Demo 3
- Concept 18: Exercise 3
- Concept 19: Exercise 3: Solution
- Concept 20: WHERE Clause
- Concept 21: Demo 4
- Concept 22: Exercise 4
- Concept 23: Lesson Wrap Up
- Concept 24: Course Wrap Up
-
Lesson 05: Project: Data Modeling with Apache Cassandra
Students will model event data to create a non-relational database and ETL pipeline for a music streaming app. They will define queries and tables for a database built using Apache Cassandra.
-
Part 22 : Cloud Data Warehouses
Welcome to the Cloud Data Warehouses course.
-
Module 01: Cloud Data Warehouses
-
Lesson 01: Introduction to Data Warehouses
In this lesson, you'll be introduced to data warehouses, the Cloud and AWS.
- Concept 01: Course Introduction
- Concept 02: Lesson Introduction
- Concept 03: Data Warehouse: Business Perspective
- Concept 04: Operational vs. Analytical Processes
- Concept 05: Data Warehouse: Technical Perspective
- Concept 06: Dimensional Modeling
- Concept 07: ETL Demo: Step 1 & 2
- Concept 08: Exercise 1: Step 1 & 2
- Concept 09: ETL Demo: Step 3
- Concept 10: Exercise 1: Step 3
- Concept 11: ETL Demo: Step 4
- Concept 12: Exercise 1: Step 4
- Concept 13: ETL Demo: Step 5
- Concept 14: Exercise 1: Step 5
- Concept 15: ETL Demo: Step 6
- Concept 16: Exercise 1: Step 6
- Concept 17: Exercise Solution 1: 3NF to Star Schema
- Concept 18: DWH Architecture: Kimball's Bus Architecture
- Concept 19: DWH Architecture: Independent Data Marts
- Concept 20: DWH Architecture: CIF
- Concept 21: DWH Architecture: Hybrid Bus & CIF
- Concept 22: OLAP Cubes
- Concept 23: OLAP Cubes: Roll-Up and Drill Down
- Concept 24: OLAP Cubes: Slice and Dice
- Concept 25: OLAP Cubes: Query Optimization
- Concept 26: OLAP Cubes Demo: Slicing & Dicing
- Concept 27: Exercise 2: Slicing & Dicing
- Concept 28: OLAP Cubes Demo: Roll-Up
- Concept 29: Exercise 2: Roll-Up & Drill Down
- Concept 30: OLAP Cubes Demo: Grouping Sets
- Concept 31: Exercise 2: Grouping Sets
- Concept 32: OLAP Cubes Demo: CUBE
- Concept 33: Exercise 2: CUBE
- Concept 34: Data Warehouse Technologies
- Concept 35: Demo: Column format in ROLAP
- Concept 36: Exercise 3: Column format in ROLAP
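The slicing, roll-up, and drill-down operations above can be mimicked on a small in-memory "cube" with pandas; the sales data below is invented purely for illustration.

    # Tiny OLAP-style cube with pandas (data is invented for illustration).
    import pandas as pd

    sales = pd.DataFrame({
        "month":   ["Jan", "Jan", "Jan", "Feb", "Feb", "Feb"],
        "city":    ["NYC", "SF",  "NYC", "SF",  "NYC", "SF"],
        "movie":   ["Avatar", "Avatar", "Up", "Up", "Avatar", "Up"],
        "revenue": [120, 90, 60, 75, 110, 80],
    })

    # The full cube: revenue by (month, city, movie).
    cube = sales.pivot_table(values="revenue", index="month",
                             columns=["city", "movie"], aggfunc="sum")

    # Slice: fix one dimension (month == "Jan").
    jan_slice = sales[sales["month"] == "Jan"]

    # Roll-up: aggregate away the movie dimension.
    rollup = sales.groupby(["month", "city"])["revenue"].sum()

    # Drill-down is the reverse: add the movie dimension back.
    drill_down = sales.groupby(["month", "city", "movie"])["revenue"].sum()
    print(rollup)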
-
Lesson 02: Introduction to Cloud Computing and AWS
In this lesson, you'll be offered an introduction to cloud computing, and guided in setting up an AWS account and credits.
- Concept 01: Lesson Introduction
- Concept 02: Cloud Computing
- Concept 03: Amazon Web Services
- Concept 04: AWS Setup Instructions for Regular account
- Concept 05: Monitoring your AWS costs and credits
- Concept 06: Create an IAM Role
- Concept 07: Create Security Group
- Concept 08: Launch a Redshift Cluster
- Concept 09: Create an IAM User
- Concept 10: Delete a Redshift Cluster
- Concept 11: Create an S3 Bucket
- Concept 12: Upload to S3 Bucket
- Concept 13: Create PostgreSQL RDS
- Concept 14: Avoid Paying Unexpected Costs for AWS
-
Lesson 03: Implementing Data Warehouses on AWS
In this lesson, you'll learn to implement a data warehouse on AWS; a compact infrastructure-as-code sketch appears after the concept list.
- Concept 01: Lesson Introduction
- Concept 02: Data Warehouse: A Closer Look
- Concept 03: Choices for Implementing a Data Warehouse
- Concept 04: DWH Dimensional Model Storage on AWS
- Concept 05: Amazon Redshift Technology
- Concept 06: Amazon Redshift Architecture
- Concept 07: Redshift Architecture Example
- Concept 08: SQL to SQL ETL
- Concept 09: SQL to SQL ETL - AWS Case
- Concept 10: Redshift & ETL in Context
- Concept 11: Ingesting at Scale
- Concept 12: Redshift ETL Examples
- Concept 13: Redshift ETL Continued
- Concept 14: Redshift Cluster Quick Launcher
- Concept 15: Exercise 1: Launch Redshift Cluster
- Concept 16: Problems with the Quick Launcher
- Concept 17: Infrastructure as Code on AWS
- Concept 18: Enabling Programmatic Access for IaC
- Concept 19: Demo: Infrastructure as Code
- Concept 20: Exercise 2: Infrastructure as Code
- Concept 21: Exercise Solution 2: Infrastructure as Code
- Concept 22: Demo: Parallel ETL
- Concept 23: Exercise 3: Parallel ETL
- Concept 24: Exercise Solution 3: Parallel ETL
- Concept 25: Optimizing Table Design
- Concept 26: Distribution Style: Even
- Concept 27: Distribution Style: All
- Concept 28: Distribution Style: Auto
- Concept 29: Distribution Style: Key
- Concept 30: Sorting Key
- Concept 31: Sorting Key Example
- Concept 32: Demo: Table Design
- Concept 33: Exercise 4: Table Design
- Concept 34: Exercise Solution 4: Table Design
- Concept 35: Conclusion
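Infrastructure as code for Redshift, as in the demos above, comes down to a few boto3 calls; this sketch assumes AWS credentials are configured elsewhere, and the cluster identifier, password, and IAM role ARN are placeholders.

    # Launching (and later deleting) a Redshift cluster with boto3.
    # Credentials and region are assumed to be configured elsewhere; identifiers are placeholders.
    import boto3

    redshift = boto3.client("redshift", region_name="us-west-2")

    redshift.create_cluster(
        ClusterIdentifier="dwh-cluster",        # placeholder name
        ClusterType="multi-node",
        NodeType="dc2.large",
        NumberOfNodes=4,
        DBName="dwh",
        MasterUsername="dwhuser",
        MasterUserPassword="Passw0rd",          # never hard-code this in real code
        IamRoles=["arn:aws:iam::123456789012:role/dwhRole"],  # placeholder ARN
    )

    # Check the cluster status (poll until it becomes available).
    props = redshift.describe_clusters(ClusterIdentifier="dwh-cluster")["Clusters"][0]
    print(props["ClusterStatus"], props.get("Endpoint"))

    # Tear it down when finished to avoid unexpected costs.
    redshift.delete_cluster(ClusterIdentifier="dwh-cluster", SkipFinalClusterSnapshot=True)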
-
Lesson 04: Project: Data Warehouse
Students will build an ETL pipeline that extracts data from S3, stages them in Redshift, and transforms data into a set of dimensional tables for their analytics team.
-
Part 23 : Data Lakes with Spark
Welcome to Data Lakes with Spark.
-
Module 01: Data Lakes with Spark
-
Lesson 01: The Power of Spark
In this lesson, you will learn about the problems that Apache Spark is designed to solve. You'll also learn about the greater Big Data ecosystem and how Spark fits into it.
- Concept 01: Introduction
- Concept 02: What is Big Data?
- Concept 03: Numbers Everyone Should Know
- Concept 04: Hardware: CPU
- Concept 05: Hardware: Memory
- Concept 06: Hardware: Storage
- Concept 07: Hardware: Network
- Concept 08: Hardware: Key Ratios
- Concept 09: Small Data Numbers
- Concept 10: Big Data Numbers
- Concept 11: Medium Data Numbers
- Concept 12: History of Distributed Computing
- Concept 13: The Hadoop Ecosystem
- Concept 14: MapReduce
- Concept 15: Hadoop MapReduce [Demo]
- Concept 16: The Spark Cluster
- Concept 17: Spark Use Cases
- Concept 18: Summary
-
Lesson 02: Data Wrangling with Spark
In this lesson, we'll dive into how to use Spark for cleaning and aggregating data; a short PySpark sketch follows the concept list.
- Concept 01: Introduction
- Concept 02: Functional Programming
- Concept 03: Why Use Functional Programming
- Concept 04: Procedural Example
- Concept 05: Procedural [Example Code]
- Concept 06: Pure Functions in the Bread Factory
- Concept 07: The Spark DAGs: Recipe for Data
- Concept 08: Maps and Lambda Functions
- Concept 09: Maps and Lambda Functions [Example Code]
- Concept 10: Data Formats
- Concept 11: Distributed Data Stores
- Concept 12: SparkSession
- Concept 13: Reading and Writing Data into Spark Data Frames
- Concept 14: Read and Write Data into Spark Data Frames [example code]
- Concept 15: Imperative vs Declarative programming
- Concept 16: Data Wrangling with DataFrames
- Concept 17: Data Wrangling with DataFrames Extra Tips
- Concept 18: Data Wrangling with Spark [Example Code]
- Concept 19: Quiz - Data Wrangling with DataFrames
- Concept 20: Quiz - Data Wrangling with DataFrames Jupyter Notebook
- Concept 21: Quiz [Solution Code]
- Concept 22: Spark SQL
- Concept 23: Example Spark SQL
- Concept 24: Example Spark SQL [Example Code]
- Concept 25: Quiz - Data Wrangling with SparkSQL
- Concept 26: Quiz [Spark SQL Solution Code]
- Concept 27: RDDs
- Concept 28: Summary
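A condensed PySpark sketch of the DataFrame and Spark SQL ideas above; the log file path and column names are assumptions made for the example.

    # Data wrangling with PySpark DataFrames and Spark SQL (path and columns are assumptions).
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, desc, udf
    from pyspark.sql.types import IntegerType

    spark = SparkSession.builder.appName("wrangling_sketch").getOrCreate()

    logs = spark.read.json("data/log_small.json")   # placeholder path

    # Declarative DataFrame API: filter, group, aggregate.
    top_pages = (logs.filter(col("userId") != "")
                     .groupBy("page")
                     .count()
                     .orderBy(desc("count")))

    # A user-defined function, applied lazily like the built-ins.
    get_hour = udf(lambda ts: int(ts / 1000 / 3600 % 24), IntegerType())
    logs = logs.withColumn("hour", get_hour(col("ts")))

    # The same question expressed in Spark SQL.
    logs.createOrReplaceTempView("log_table")
    spark.sql("SELECT page, COUNT(*) AS views FROM log_table GROUP BY page ORDER BY views DESC").show(5)

    top_pages.show(5)
    spark.stop()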
-
Lesson 03: Setting up Spark Clusters with AWS
In this lesson, you will learn to run Spark on a distributed cluster using the AWS UI and the AWS CLI.
- Concept 01: Introduction
- Concept 02: From Local to Standalone Mode
- Concept 03: Setup Instructions AWS
- Concept 04: Alternate ways to connect to AWS using CLI
- Concept 05: Create EMR Using AWS CLI
- Concept 06: Using Notebooks on Your Cluster
- Concept 07: Spark Scripts
- Concept 08: Submitting Spark Scripts
- Concept 09: Storing and Retrieving Data on the Cloud
- Concept 10: Reading and Writing to Amazon S3
- Concept 11: Understanding the difference between HDFS and AWS S3
- Concept 12: Reading and Writing Data to HDFS
- Concept 13: Recap Local Mode to Cluster Mode
-
Lesson 04: Debugging and Optimization
In this lesson, you will learn best practices for debugging and optimizing your Spark applications.
- Concept 01: Debugging is Hard
- Concept 02: Intro: Syntax Errors
- Concept 03: Code Errors
- Concept 04: Data Errors
- Concept 05: Debugging your Code
- Concept 06: How to Use Accumulators
- Concept 07: Spark Broadcast
- Concept 08: Spark WebUI
- Concept 09: Connecting to the Spark Web UI
- Concept 10: Different types of Spark Functions
- Concept 11: Getting Familiar with the Spark UI
- Concept 12: Review of the Log Data
- Concept 13: Intro: Code Optimization
- Concept 14: Understanding Data Skewness
- Concept 15: Optimizing for Data Skewness
- Concept 16: Other Issues and How to Address Them
- Concept 17: Lesson Summary
-
Lesson 05: Introduction to Data Lakes
In this lesson, you'll learn about the need for a data lake, how it differs from a data warehouse, and various options for implementing one on AWS.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: Why Data Lakes: Evolution of the Data Warehouse
- Concept 04: Why Data Lakes: Unstructured & Big Data
- Concept 05: Why Data Lakes: New Roles & Advanced Analytics
- Concept 06: Big Data Effects: Low Costs, ETL Offloading
- Concept 07: Big Data Effects: Schema-on-Read
- Concept 08: Big Data Effects: (Un-/Semi-)Structured support
- Concept 09: Demo: Schema On Read Pt 1
- Concept 10: Demo: Schema On Read Pt 2
- Concept 11: Demo: Schema On Read Pt 3
- Concept 12: Demo: Schema On Read Pt 4
- Concept 13: Exercise 1: Schema On Read
- Concept 14: Demo: Advanced Analytics NLP Pt 1
- Concept 15: Demo: Advanced Analytics NLP Pt 2
- Concept 16: Demo: Advanced Analytics NLP Pt 3
- Concept 17: Exercise 2: Advanced Analytics NLP
- Concept 18: Data Lake Implementation Introduction
- Concept 19: Data Lake Concepts
- Concept 20: Data Lake vs Data Warehouse
- Concept 21: AWS Setup
- Concept 22: Data Lake Options on AWS
- Concept 23: AWS Options: EMR (HDFS + Spark)
- Concept 24: AWS Options: EMR: S3 + Spark
- Concept 25: AWS Options: Athena
- Concept 26: Demo: Data Lake on S3 Pt 1
- Concept 27: Demo: Data Lake on S3 Pt 2
- Concept 28: Exercise 3: Data Lake on S3
- Concept 29: Demo: Data Lake on EMR Pt 1
- Concept 30: Demo: Data Lake on EMR Pt 2
- Concept 31: Demo: Data Lake on Athena Pt 1
- Concept 32: Demo: Data Lake on Athena Pt 2
- Concept 33: Data Lake Issues
- Concept 34: [AWS] Launch EMR Cluster and Notebook
- Concept 35: [AWS] Avoid Paying Unexpected Costs
-
Lesson 06: Project: Data Lake
Students will build a data lake and an ETL pipeline in Spark that loads data from S3, processes the data into analytics tables, and loads them back into S3.
-
Lesson 07: Optimize Your GitHub Profile
Other professionals are collaborating on GitHub and growing their networks. Submit your profile to ensure it is on par with leaders in your field.
- Concept 01: Prove Your Skills With GitHub
- Concept 02: Introduction
- Concept 03: GitHub profile important items
- Concept 04: Good GitHub repository
- Concept 05: Interview with Art - Part 1
- Concept 06: Identify fixes for example “bad” profile
- Concept 07: Quick Fixes #1
- Concept 08: Quick Fixes #2
- Concept 09: Writing READMEs with Walter
- Concept 10: Interview with Art - Part 2
- Concept 11: Commit messages best practices
- Concept 12: Reflect on your commit messages
- Concept 13: Participating in open source projects
- Concept 14: Interview with Art - Part 3
- Concept 15: Participating in open source projects 2
- Concept 16: Starring interesting repositories
- Concept 17: Next Steps
-
Part 24 : Data Pipelines with Airflow
Welcome to Data Pipelines with Airflow.
-
Module 01: Data Pipelines with Airflow
-
Lesson 01: Data Pipelines
Students will get an introduction to data pipelines and to Apache Airflow as a data pipeline solution: how Airflow works, how to configure and schedule data pipelines with Airflow, and how to debug a pipeline job. A minimal DAG sketch follows the concept list.
- Concept 01: Welcome
- Concept 02: AWS Account and Credits
- Concept 03: What is a Data Pipeline?
- Concept 04: Data Validation
- Concept 05: DAGs and Data Pipelines
- Concept 06: Bikeshare DAG
- Concept 07: Introduction to Apache Airflow
- Concept 08: Demo 1: Airflow DAGs
- Concept 09: Workspace Instructions
- Concept 10: Exercise 1: Airflow DAGs
- Concept 11: Solution 1: Airflow DAGs
- Concept 12: How Airflow Works
- Concept 13: Airflow Runtime Architecture
- Concept 14: Building a Data Pipeline
- Concept 15: Demo 2: Run the Schedules
- Concept 16: Exercise 2: Run the Schedules
- Concept 17: Solution 2: Run the Schedules
- Concept 18: Operators and Tasks
- Concept 19: Demo 3: Task Dependencies
- Concept 20: Exercise 3: Task Dependencies
- Concept 21: Solution: Task Dependencies
- Concept 22: Airflow Hooks
- Concept 23: Demo 4: Connections and Hooks
- Concept 24: Exercise 4: Connections and Hooks
- Concept 25: Solution 4: Connections and Hooks
- Concept 26: Demo 5: Context and Templating
- Concept 27: Exercise 5: Context and Templating
- Concept 28: Solution 5: Context and Templating
- Concept 29: Quiz: Review of Pipeline Components
- Concept 30: Demo: Exercise 6: Building the S3 to Redshift DAG
- Concept 31: Exercise 6: Build the S3 to Redshift DAG
- Concept 32: Solution 6: Build the S3 to Redshift DAG
- Concept 33: Conclusion
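A minimal DAG along the lines of the demos above; the DAG id and schedule are illustrative, and the imports assume Airflow 2.x (they differ slightly in older versions).

    # Minimal Airflow 2.x DAG: two tasks with a dependency and a daily schedule.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pretend to pull data from S3")

    def load():
        print("pretend to copy data into Redshift")

    with DAG(
        dag_id="s3_to_redshift_sketch",        # illustrative name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task              # task dependency: extract before load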
-
Lesson 02: Data Quality
Students will learn how to track data lineage, set up data pipeline schedules, partition data to optimize pipelines, investigate data quality issues, and write tests to ensure data quality.
- Concept 01: What we are going to learn?
- Concept 02: What is Data Lineage?
- Concept 03: Visualizing Data Lineage
- Concept 04: Demo 1: Data Lineage in Airflow
- Concept 05: Exercise 1: Data Lineage in Airflow
- Concept 06: Solution 1: Data Lineage in Airflow
- Concept 07: Data Pipeline Schedules
- Concept 08: Scheduling in Airflow
- Concept 09: Updating DAGs
- Concept 10: Demo 2: Schedules and Backfills in Airflow
- Concept 11: Exercise 2: Schedules and Backfills in Airflow
- Concept 12: Solution 2: Schedules and Backfills in Airflow
- Concept 13: Data Partitioning
- Concept 14: Goals of Data Partitioning
- Concept 15: Demo 3: Data Partitioning
- Concept 16: Exercise 3: Data Partitioning
- Concept 17: Solution 3: Data Partitioning
- Concept 18: Data Quality
- Concept 19: Demo 4: Data Quality
- Concept 20: Exercise 4: Data Quality
- Concept 21: Solution 4: Data Quality
- Concept 22: Conclusion
-
Lesson 03: Production Data Pipelines
In this last lesson, students will learn how to build pipelines with maintainability and reusability in mind. They will also learn about pipeline monitoring.
- Concept 01: Lesson Introduction
- Concept 02: Extending Airflow with Plugins
- Concept 03: Extending Airflow Hooks & Contrib
- Concept 04: Demo 1: Operator Plugins
- Concept 05: Exercise 1: Operator Plugins
- Concept 06: Solution 1: Operator Plugins
- Concept 07: Best Practices for Data Pipeline Steps - Task Boundaries
- Concept 08: Demo 2: Task Boundaries
- Concept 09: Exercise 2: Refactor a DAG
- Concept 10: Solution 2: Refactor a DAG
- Concept 11: Subdags: Introduction and When to Use Them
- Concept 12: SubDAGs: Drawbacks of SubDAGs
- Concept 13: Quiz: Subdags
- Concept 14: Demo 3: SubDAGs
- Concept 15: Exercise 3: SubDAGs
- Concept 16: Solution 3: Subdags
- Concept 17: Monitoring
- Concept 18: Monitoring
- Concept 19: Exercise 4: Building a Full DAG
- Concept 20: Solution 4: Building a Full Pipeline
- Concept 21: Conclusion
- Concept 22: Additional Resources: Data Pipeline Orchestrators
-
Lesson 04: Project: Data Pipelines
Students continue to work on the music streaming company’s data infrastructure by creating and automating a set of data pipelines with Airflow, and by monitoring and debugging production pipelines.
-
Part 25 : C++ Programming
Welcome to C++ Programming.
-
Module 01: C++ Programming
-
Lesson 01: C++ Getting Started
Learn the differences between C++ and Python and how to write C++ code.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: Elecia White
- Concept 04: Why C++
- Concept 05: Python and C++ Comparison
- Concept 06: Static vs Dynamic Typing
- Concept 07: C++ - A Statically Typed Language
- Concept 08: Basic Data Types
- Concept 09: Floating versus Double [demonstration]
- Concept 10: Doubles are Bigger
- Concept 11: Common Errors and Error Messages
- Concept 12: C++ Functions
- Concept 13: Anatomy of a Function
- Concept 14: Multiple Outputs
- Concept 15: Two Functions Same Name
- Concept 16: Function Signatures 1
- Concept 17: Function Signatures 2
- Concept 18: If and Boolean Logic
- Concept 19: While and For Loops
- Concept 20: Switch Statement
- Concept 21: Libraries
- Concept 22: Forge on!
-
Lesson 02: C++ Vectors
To program matrix algebra operations and translate your Python code, you will need to use C++ Vectors. These vectors are similar to Python lists, but the syntax can be somewhat tricky.
- Concept 01: C++ Vectors
- Concept 02: Namespaces
- Concept 03: Python Lists vs. C++ Vectors
- Concept 04: Initializing Vector Values
- Concept 05: Vector Methods
- Concept 06: Vectors and For Loops
- Concept 07: Math and Vectors
- Concept 08: 1D Vector Playground
- Concept 09: 2D Vectors
- Concept 10: 2D Vectors and For Loops
- Concept 11: 2D Vector Playground
- Concept 12: Next Lesson
-
Lesson 03: Practical C++
Learn how to write C++ code on your own computer and compile it into an executable program without running into too many compilation errors.
-
Lesson 04: C++ Object Oriented Programming
Learn the syntax of C++ object oriented programming as well as some of the additional OOP features provided by the language.
- Concept 01: Introduction
- Concept 02: Python vs. C++
- Concept 03: Why use Object Oriented Programming?
- Concept 04: Using a Class in C++ [Demo]
- Concept 05: Explanation of the Main.cpp File
- Concept 06: Practice Using a Class
- Concept 07: Review: Anatomy of a Class
- Concept 08: Other Facets of C++ Classes
- Concept 09: Private and Public
- Concept 10: Header Files
- Concept 11: Inclusion Guards
- Concept 12: Implement a Class
- Concept 13: Class Variables
- Concept 14: Class Function Declarations
- Concept 15: Constructor Functions
- Concept 16: Set and Get Functions
- Concept 17: Matrix Functions
- Concept 18: Use an Inclusion Guard
- Concept 19: Instantiate an Object
- Concept 20: Running your Program Locally
-
Lesson 05: Python and C++ Speed
In this lesson, we'll compare the execution times of C++ and Python programs.
-
Lesson 06: C++ Intro to Optimization
Optimizing C++ involves understanding how a computer actually runs your programs. You'll learn how C++ uses the CPU and RAM to execute your code and get a sense for what can slow things down.
- Concept 01: Course Introduction
- Concept 02: Empathize with the Computer
- Concept 03: Intro to Computer Hardware
- Concept 04: Embedded Terminal Explanation
- Concept 05: Demo: Machine Code
- Concept 06: Assembly Language
- Concept 07: Binary
- Concept 08: Demo: Binary
- Concept 09: Demo: Binary Floats
- Concept 10: Memory and the CPU
- Concept 11: Demo: Stack vs Heap
- Concept 12: Outro
-
Lesson 07: C++ Optimization Practice
Now that you understand how C++ programs execute, it's time to learn specific optimization techniques and put them into practice. This lesson will prepare you for the code optimization project.
- Concept 01: Introduction
- Concept 02: Software Development and Optimization
- Concept 03: Optimization Techniques
- Concept 04: Dead Code
- Concept 05: Exercise: Remove Dead Code
- Concept 06: If Statements
- Concept 07: Exercise: If Statements
- Concept 08: For Loops
- Concept 09: Exercise: For Loops
- Concept 10: Intermediate Variables
- Concept 11: Exercise: Intermediate Variables
- Concept 12: Vector Storage
- Concept 13: Exercise: Vector Storage
- Concept 14: References
- Concept 15: Exercise: References
- Concept 16: Sebastian's Synchronization Story
- Concept 17: Static Keyword
- Concept 18: Exercise: Static Keyword
- Concept 19: Speed Challenge
-
Lesson 08: Project: Optimize Histogram Filter
Get ready to optimize some C++ code. You are provided with a working 2-dimensional histogram filter; your job is to get the histogram filter code to run faster!
-
Part 26 : Computer Vision
Welcome to Computer Vision.
-
Lesson 01: Welcome to Computer Vision
Welcome to the Computer Vision Nanodegree program!
- Concept 01: Welcome to the Computer Vision Nanodegree Program
- Concept 02: Computer Vision in Industry
- Concept 03: Course Outline
- Concept 04: Projects and Topics
- Concept 05: Partnership with Industry
- Concept 06: Learning in the Classroom
- Concept 07: Community Guidelines
- Concept 08: Moving Forward!
- Concept 09: Career Services
-
Lesson 04: Image Representation & Classification
Learn how images are represented numerically and implement image processing techniques, such as color masking and binary classification. A short color-masking sketch follows the concept list.
- Concept 01: Intro to Pattern Recognition
- Concept 02: Emotional Intelligence
- Concept 03: Computer Vision Pipeline
- Concept 04: Training a Model
- Concept 05: Separating Data
- Concept 06: AffdexMe Demo
- Concept 07: Image Formation
- Concept 08: Images as Grids of Pixels
- Concept 09: Notebook: Images as Numerical Data
- Concept 10: Color Images
- Concept 11: Color or Grayscale?
- Concept 12: Notebook: Visualizing RGB Channels
- Concept 13: Color Thresholds
- Concept 14: Coding a Blue Screen
- Concept 15: Notebook: Blue Screen
- Concept 16: Notebook: Green Screen
- Concept 17: Color Spaces and Transforms
- Concept 18: Notebook: Color Conversion
- Concept 19: Day and Night Classification Challenge
- Concept 20: Notebook: Load and Visualize the Data
- Concept 21: Labeled Data and Accuracy
- Concept 22: Distinguishing Traits
- Concept 23: Features
- Concept 24: Standardizing Output
- Concept 25: Notebook: Standardizing Day and Night Images
- Concept 26: Average Brightness
- Concept 27: Notebook: Average Brightness Feature Extraction
- Concept 28: Classification
- Concept 29: Notebook: Classification
- Concept 30: Evaluation Metrics
- Concept 31: Notebook: Accuracy and Misclassification
- Concept 32: Review and the Computer Vision Pipeline
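The blue-screen idea above in a few lines of OpenCV and NumPy; the image paths and threshold values are assumptions you would tune for your own images.

    # Color masking / "blue screen" sketch with OpenCV and NumPy.
    import cv2
    import numpy as np

    image = cv2.imread("images/blue_screen.jpg")         # placeholder path (BGR order)

    # Thresholds for "blue enough" pixels; tune these for your image.
    lower_blue = np.array([180, 0, 0])
    upper_blue = np.array([255, 120, 120])

    # Mask is 255 where the pixel falls inside the blue range.
    mask = cv2.inRange(image, lower_blue, upper_blue)

    # Black out the blue-screen area, then drop in a background of the same size.
    masked = image.copy()
    masked[mask != 0] = [0, 0, 0]

    background = cv2.imread("images/space.jpg")           # placeholder path
    background = cv2.resize(background, (image.shape[1], image.shape[0]))
    background[mask == 0] = [0, 0, 0]

    composite = masked + background
    cv2.imwrite("composite.jpg", composite)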
-
Lesson 05: Convolutional Filters and Edge Detection
Learn about frequency in images and implement your own image filters for detecting edges and shapes in an image. Use a computer vision library to perform face detection. A brief edge-detection sketch appears after the concept list.
- Concept 01: Filters and Finding Edges
- Concept 02: Frequency in Images
- Concept 03: Notebook: Fourier Transforms
- Concept 04: Quiz: Fourier Transform Image
- Concept 05: High-pass Filters
- Concept 06: Quiz: Kernels
- Concept 07: Creating a Filter
- Concept 08: Gradients and Sobel Filters
- Concept 09: Notebook: Finding Edges
- Concept 10: Low-pass Filters
- Concept 11: Gaussian Blur
- Concept 12: Notebook: Gaussian Blur
- Concept 13: Notebook: Fourier Transforms of Filters
- Concept 14: Convolutional Layer
- Concept 15: Canny Edge Detector
- Concept 16: Notebook: Canny Edge Detection
- Concept 17: Shape Detection
- Concept 18: Hough Transform
- Concept 19: Quiz: Hough Space
- Concept 20: Hough Line Detection
- Concept 21: Notebook: Hough Detections
- Concept 22: Object Recognition & Introducing Haar Cascades
- Concept 23: Haar Cascades
- Concept 24: Notebook: Haar Cascade Face Detection
- Concept 25: Face Recognition and the Dangers of Bias
- Concept 26: Beyond Edges, Selecting Different Features
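High-pass filtering and Canny edge detection, as covered above, reduce to a few OpenCV calls; the kernel values and thresholds here are common starting points, not the lesson's exact numbers.

    # Sobel-style high-pass filtering and Canny edge detection with OpenCV.
    import cv2
    import numpy as np

    gray = cv2.imread("images/city.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder path

    # A 3x3 Sobel-x kernel applied as a custom high-pass filter.
    sobel_x = np.array([[-1, 0, 1],
                        [-2, 0, 2],
                        [-1, 0, 1]])
    filtered = cv2.filter2D(gray, -1, sobel_x)

    # Gaussian blur (low-pass) before edge detection to suppress noise.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Canny keeps edges whose gradients pass the two thresholds (hysteresis).
    edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

    cv2.imwrite("sobel_x.jpg", filtered)
    cv2.imwrite("edges.jpg", edges)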
-
Lesson 06: Types of Features & Image Segmentation
Program a corner detector and learn techniques, like k-means clustering, for segmenting an image into unique parts.
- Concept 01: Types of Features
- Concept 02: Corner Detectors
- Concept 03: Notebook: Find the Corners
- Concept 04: Dilation and Erosion
- Concept 05: Image Segmentation
- Concept 06: Image Contours
- Concept 07: Notebook: Find Contours and Features
- Concept 08: Solution: Find Contours and Features
- Concept 09: K-means Clustering
- Concept 10: K-means Implementation
- Concept 11: Notebook: K-means Clustering
-
Lesson 07: Feature Vectors
Learn how to describe objects and images using feature vectors.
- Concept 01: Corners and Object Detection
- Concept 02: Feature Vectors
- Concept 03: Real-Time Feature Detection
- Concept 04: Introduction to ORB
- Concept 05: FAST
- Concept 06: Quiz: FAST Keypoints
- Concept 07: BRIEF
- Concept 08: Scale and Rotation Invariance
- Concept 09: Notebook: Image Pyramids
- Concept 10: Feature Matching
- Concept 11: ORB in Video
- Concept 12: Notebook: Implementing ORB
- Concept 13: HOG
- Concept 14: Notebook: Implementing HOG
- Concept 15: Learning to Find Features
-
Lesson 08: CNN Layers and Feature Visualization
Define and train your own convolutional neural network for clothing recognition. Use feature visualization techniques to see what a network has learned. A small PyTorch layer sketch follows the concept list.
- Concept 01: Introduction to CNN Layers
- Concept 02: Review: Training a Neural Network
- Concept 03: Lesson Outline and Data
- Concept 04: CNN Architecture, VGG-16
- Concept 05: Convolutional Layers
- Concept 06: Defining Layers in PyTorch
- Concept 07: Notebook: Visualizing a Convolutional Layer
- Concept 08: Pooling, VGG-16 Architecture
- Concept 09: Pooling Layers
- Concept 10: Notebook: Visualizing a Pooling Layer
- Concept 11: Fully-Connected Layers, VGG-16
- Concept 12: Notebook: Visualizing FashionMNIST
- Concept 13: Training in PyTorch
- Concept 14: Notebook: Fashion MNIST Training Exercise
- Concept 15: Notebook: FashionMNIST, Solution 1
- Concept 16: Review: Dropout
- Concept 17: Notebook: FashionMNIST, Solution 2
- Concept 18: Network Structure
- Concept 19: Feature Visualization
- Concept 20: Feature Maps
- Concept 21: First Convolutional Layer
- Concept 22: Visualizing CNNs (Part 2)
- Concept 23: Visualizing Activations
- Concept 24: Notebook: Feature Viz for FashionMNIST
- Concept 25: Last Feature Vector and t-SNE
- Concept 26: Occlusion, Saliency, and Guided Backpropagation
- Concept 27: Summary of Feature Viz
- Concept 28: Image Classification & Regression Challenges
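A compact PyTorch module using the layer types discussed above (convolutional, pooling, fully-connected, dropout); the sizes assume 28x28 single-channel FashionMNIST-style inputs and ten classes.

    # Small CNN for 28x28 grayscale images (FashionMNIST-style), in PyTorch.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SmallCNN(nn.Module):
        def __init__(self, n_classes=10):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 10, kernel_size=3, padding=1)   # 1x28x28 -> 10x28x28
            self.conv2 = nn.Conv2d(10, 20, kernel_size=3, padding=1)  # -> 20x14x14 before pooling
            self.pool = nn.MaxPool2d(2, 2)
            self.dropout = nn.Dropout(p=0.25)
            self.fc = nn.Linear(20 * 7 * 7, n_classes)

        def forward(self, x):
            x = self.pool(F.relu(self.conv1(x)))   # -> 10x14x14
            x = self.pool(F.relu(self.conv2(x)))   # -> 20x7x7
            x = self.dropout(x.flatten(start_dim=1))
            return self.fc(x)

    # One forward pass on a random batch, just to check the shapes.
    model = SmallCNN()
    logits = model(torch.randn(4, 1, 28, 28))
    print(logits.shape)   # torch.Size([4, 10])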
-
Lesson 09: Project: Facial Keypoint Detection
Apply your knowledge of image processing and deep learning to create a CNN for facial keypoint (eyes, mouth, nose, etc.) detection.
-
Part 27 : Cloud Computing
Learn how to build and train your models using cloud computing.
-
Module 01: Optional: Cloud Computing
-
Lesson 01: Cloud Computing with Google Cloud
Learn how to leverage GPUs on Google Cloud for machine learning and scientific computing.
-
Part 28 : Advanced Computer Vision & Deep Learning
Welcome to Advanced Computer Vision & Deep Learning.
-
Module 01: Advanced Computer Vision & Deep Learning
-
Lesson 01: Advanced CNN Architectures
Learn about advances in CNN architectures and see how region-based CNNs, like Faster R-CNN, have allowed for fast, localized object recognition in images.
- Concept 01: CNN's and Scene Understanding
- Concept 02: More than Classification
- Concept 03: Classification and Localization
- Concept 04: Bounding Boxes and Regression
- Concept 05: Quiz: Loss Values
- Concept 06: Region Proposals
- Concept 07: R-CNN
- Concept 08: Fast R-CNN
- Concept 09: Faster R-CNN
- Concept 10: Detection With and Without Proposals
-
Lesson 02: YOLO
Learn about the YOLO (You Only Look Once) multi-object detection model and work with a YOLO implementation. A short IoU and non-maximal suppression sketch follows the concept list.
- Concept 01: Introduction to YOLO
- Concept 02: YOLO Output
- Concept 03: Sliding Windows, Revisited
- Concept 04: CNN & Sliding Windows
- Concept 05: Using a Grid
- Concept 06: Training on a Grid
- Concept 07: Generating Bounding Boxes
- Concept 08: Quiz: Generating Boxes and Detecting Objects
- Concept 09: Too Many Boxes
- Concept 10: Intersection over Union (IoU)
- Concept 11: Quiz: IoU and Overlap Limits
- Concept 12: Non-Maximal Suppression
- Concept 13: Anchor Boxes
- Concept 14: YOLO Algorithm
- Concept 15: Notebook: YOLO Implementation
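Intersection over Union and non-maximal suppression, the two bookkeeping steps above, in plain NumPy; boxes are assumed to be (x1, y1, x2, y2) with one confidence score each.

    # IoU and a simple non-maximal suppression, assuming boxes as (x1, y1, x2, y2).
    import numpy as np

    def iou(a, b):
        # Overlap rectangle between the two boxes.
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter)

    def nms(boxes, scores, iou_limit=0.5):
        order = np.argsort(scores)[::-1]          # highest-confidence first
        keep = []
        while len(order) > 0:
            best = order[0]
            keep.append(best)
            # Drop every remaining box that overlaps the kept box too much.
            order = np.array([i for i in order[1:] if iou(boxes[best], boxes[i]) < iou_limit])
        return keep

    boxes = np.array([[10, 10, 50, 50], [12, 12, 48, 52], [100, 100, 140, 150]], dtype=float)
    scores = np.array([0.9, 0.75, 0.8])
    print(nms(boxes, scores))   # [0, 2]: the near-duplicate of box 0 is suppressed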
-
Lesson 03: RNN's
Explore how memory can be incorporated into a deep learning model using recurrent neural networks (RNNs). Learn how RNNs can learn from and generate ordered sequences of data.
- Concept 01: RNN's in Computer Vision
- Concept 02: RNN Introduction
- Concept 03: RNN History
- Concept 04: RNN Applications
- Concept 05: Feedforward Neural Network-Reminder
- Concept 06: The Feedforward Process
- Concept 07: Feedforward Quiz
- Concept 08: Backpropagation- Theory
- Concept 09: Backpropagation - Example (part a)
- Concept 10: Backpropagation- Example (part b)
- Concept 11: Backpropagation Quiz
- Concept 12: RNN (part a)
- Concept 13: RNN (part b)
- Concept 14: RNN- Unfolded Model
- Concept 15: Unfolded Model Quiz
- Concept 16: RNN- Example
- Concept 17: Backpropagation Through Time (part a)
- Concept 18: Backpropagation Through Time (part b)
- Concept 19: Backpropagation Through Time (part c)
- Concept 20: BPTT Quiz 1
- Concept 21: BPTT Quiz 2
- Concept 22: BPTT Quiz 3
- Concept 23: Some more math
- Concept 24: RNN Summary
- Concept 25: From RNN to LSTM
- Concept 26: Wrap Up
-
Lesson 04: Long Short-Term Memory Networks (LSTMs)
Luis explains Long Short-Term Memory networks (LSTMs) and similar architectures, which have the benefit of preserving long-term memory.
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: Notebook: LSTM Structure and Hidden State, PyTorch
- Concept 06: The Learn Gate
- Concept 07: The Forget Gate
- Concept 08: The Remember Gate
- Concept 09: The Use Gate
- Concept 10: Putting it All Together
- Concept 11: Quiz
- Concept 12: Notebook: LSTM for Part of Speech Tagging
- Concept 13: Character-Level RNN
- Concept 14: Sequence Batching
- Concept 15: Notebook: Character-Level LSTM
- Concept 16: Other architectures
-
Lesson 05: Hyperparameters
Learn about a number of different hyperparameters that are used in defining and training deep learning models. We'll discuss starting values and intuitions for tuning each hyperparameter.
- Concept 01: Introducing Jay
- Concept 02: Introduction
- Concept 03: Learning Rate
- Concept 04: Learning Rate
- Concept 05: Minibatch Size
- Concept 06: Number of Training Iterations / Epochs
- Concept 07: Number of Hidden Units / Layers
- Concept 08: RNN Hyperparameters
- Concept 09: RNN Hyperparameters
- Concept 10: Sources & References
-
Lesson 06: Optional: Attention Mechanisms
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention models work and go over a basic code implementation. A tiny dot-product attention sketch follows the concept list.
- Concept 01: Introduction to Attention
- Concept 02: Encoders and Decoders
- Concept 03: Elective: Text Sentiment Analysis
- Concept 04: Sequence to Sequence Recap
- Concept 05: Encoding -- Attention Overview
- Concept 06: Decoding -- Attention Overview
- Concept 07: Attention Overview
- Concept 08: Attention Encoder
- Concept 09: Attention Decoder
- Concept 10: Attention Encoder & Decoder
- Concept 11: Bahdanau and Luong Attention
- Concept 12: Multiplicative Attention
- Concept 13: Additive Attention
- Concept 14: Additive and Multiplicative Attention
- Concept 15: Computer Vision Applications
- Concept 16: Other Attention Methods
- Concept 17: The Transformer and Self-Attention
- Concept 18: Notebook: Attention Basics
- Concept 19: [SOLUTION]: Attention Basics
- Concept 20: Outro
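The score-then-weight pattern described above, shown as simple dot-product (multiplicative) attention in NumPy; the tiny vectors are invented, and real models apply this to learned embeddings.

    # Dot-product attention over a toy set of encoder states (values are invented).
    import numpy as np

    def softmax(x):
        e = np.exp(x - np.max(x))
        return e / e.sum()

    # Three encoder hidden states and one decoder query, dimension 4.
    encoder_states = np.array([[1.0, 0.0, 0.0, 1.0],
                               [0.0, 1.0, 1.0, 0.0],
                               [1.0, 1.0, 0.0, 0.0]])
    query = np.array([1.0, 0.5, 0.0, 0.5])

    scores = encoder_states @ query          # one score per encoder state
    weights = softmax(scores)                # attention weights sum to 1
    context = weights @ encoder_states       # weighted sum = context vector

    print(weights.round(3), context.round(3))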
-
Lesson 07: Image Captioning
Learn how to combine CNNs and RNNs to build a complex, automatic image captioning model.
- Concept 01: Introduction to Image Captioning
- Concept 02: Leveraging Neural Networks
- Concept 03: Captions and the COCO Dataset
- Concept 04: Visualize the Dataset
- Concept 05: CNN-RNN Model
- Concept 06: The Glue, Feature Vector
- Concept 07: Tokenizing Captions
- Concept 08: Tokenizing Words
- Concept 09: RNN Training
- Concept 10: Video Captioning
- Concept 11: On to the Project!
-
Lesson 08: Project: Image Captioning
Train a CNN-RNN model to predict captions for a given image. Your main task will be to implement an effective RNN decoder for a CNN encoder.
-
Lesson 09: Take 30 Min to Improve your LinkedIn
Find your next job or connect with industry peers on LinkedIn. Ensure your profile attracts relevant leads that will grow your professional network.
- Concept 01: Get Opportunities with LinkedIn
- Concept 02: Use Your Story to Stand Out
- Concept 03: Why Use an Elevator Pitch
- Concept 04: Create Your Elevator Pitch
- Concept 05: Use Your Elevator Pitch on LinkedIn
- Concept 06: Create Your Profile With SEO In Mind
- Concept 07: Profile Essentials
- Concept 08: Work Experiences & Accomplishments
- Concept 09: Build and Strengthen Your Network
- Concept 10: Reaching Out on LinkedIn
- Concept 11: Boost Your Visibility
- Concept 12: Up Next
-
Part 29 : Object Tracking and Localization
Welcome to Object Tracking and Localization.
-
Module 01: Object Tracking and Localization
-
Lesson 01: Introduction to Motion
This lesson introduces a way to represent motion mathematically, outlines what you'll learn in this section, and introduces optical flow.
-
Lesson 02: Robot Localization
Learn to implement a Bayesian filter to locate a robot in space and represent uncertainty in robot motion. A small sense-and-move sketch appears after the concept list.
- Concept 01: Probability Review
- Concept 02: Uncertainty and Bayes' Rule
- Concept 03: Reducing Uncertainty
- Concept 04: Probability Distributions
- Concept 05: Localization
- Concept 06: Total Probability
- Concept 07: Notebook: 1D Robot World
- Concept 08: Probability After Sense
- Concept 09: Notebook: Probability After Sense
- Concept 10: Normalize Distribution
- Concept 11: Sense Function
- Concept 12: Notebook: Sense Function
- Concept 13: Answer: Sense Function
- Concept 14: Normalized Sense Function
- Concept 15: Notebook: Normalized Sense Function
- Concept 16: Answer: Normalized Sense Function
- Concept 17: Test Sense Function
- Concept 18: Multiple Measurements
- Concept 19: Notebook: Multiple Measurements
- Concept 20: Answer: Multiple Measurements
- Concept 21: Exact Motion
- Concept 22: Move Function
- Concept 23: Notebook: Move Function
- Concept 24: Answer: Move Function
- Concept 25: Inexact Motion
- Concept 26: Inexact Move Function
- Concept 27: Notebook: Inexact Move Function
- Concept 28: Answer: Inexact Move Function
- Concept 29: Limit Distribution
- Concept 30: Move Twice
- Concept 31: Move 1000
- Concept 32: Notebook: Multiple Moves
- Concept 33: Sense and Move
- Concept 34: Notebook: Sense and Move Cycle
- Concept 35: Answer: Sense and Move
- Concept 36: Sense and Move 2
- Concept 37: Localization Summary
- Concept 38: C++ Elective & Implementation
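The sense-and-move cycle above, written out for a cyclic 1D world; the world colors, measurement weights, and exact motion are toy assumptions in the spirit of the exercises.

    # 1D histogram filter: sense (measurement update) and move (exact motion), toy values.
    world = ["green", "red", "red", "green", "green"]
    p = [0.2] * 5                    # start with a uniform belief

    p_hit, p_miss = 0.6, 0.2         # weight for matching / non-matching cells

    def sense(p, measurement):
        q = [p[i] * (p_hit if world[i] == measurement else p_miss) for i in range(len(p))]
        total = sum(q)
        return [x / total for x in q]        # normalize so the belief sums to 1

    def move(p, step):
        # Exact cyclic shift: probability mass moves with the robot.
        return [p[(i - step) % len(p)] for i in range(len(p))]

    for measurement, step in [("red", 1), ("green", 1)]:
        p = sense(p, measurement)
        p = move(p, step)

    print([round(x, 3) for x in p])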
-
Lesson 03: Mini-project: 2D Histogram Filter
Write sense and move functions (and debug) a 2D histogram filter!
-
Lesson 04: Introduction to Kalman Filters
Learn the intuition behind the Kalman filter, a vehicle tracking algorithm, and implement a one-dimensional tracker of your own; a short 1D Kalman sketch follows the concept list.
- Concept 01: Kalman Filters and Linear Algebra
- Concept 02: Introduction
- Concept 03: Tracking Intro
- Concept 04: Answer: Tracking Intro
- Concept 05: Gaussian Intro
- Concept 06: Answer: Gaussian Intro
- Concept 07: Quiz: Variance and Preferred Gaussian
- Concept 08: Answer: Variance and Preferred Gaussian
- Concept 09: Gaussian Function and Maximum
- Concept 10: Quiz: Shifting the Mean
- Concept 11: Answer: Shifting the Mean
- Concept 12: Quiz: Predicting the Peak
- Concept 13: Answer: Predicting the Peak
- Concept 14: Quiz: Parameter Update
- Concept 15: Answer: Parameter Update
- Concept 16: Notebook: New Mean and Variance
- Concept 17: Solution: New Mean and Variance
- Concept 18: Quiz: Gaussian Motion
- Concept 19: Answer: Gaussian Motion
- Concept 20: Predict Function
- Concept 21: Notebook: Predict Function
- Concept 22: Answer: Predict Function
- Concept 23: Kalman Filter Code
- Concept 24: Notebook: 1D Kalman Filter
- Concept 25: Answer: 1D Kalman Filter
- Concept 26: Kalman Prediction
- Concept 27: Next: Motion Models and State
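The measurement-update and motion-prediction steps above for a one-dimensional Gaussian; the measurement and motion sequences are toy numbers, not anything specific to the project.

    # 1D Kalman filter: alternate measurement updates and motion predictions.
    def update(mean1, var1, mean2, var2):
        # Combining two Gaussians (prior and measurement) gives a narrower Gaussian.
        new_mean = (var2 * mean1 + var1 * mean2) / (var1 + var2)
        new_var = 1.0 / (1.0 / var1 + 1.0 / var2)
        return new_mean, new_var

    def predict(mean1, var1, mean2, var2):
        # Motion adds the means and (because of added uncertainty) the variances.
        return mean1 + mean2, var1 + var2

    measurements = [5.0, 6.0, 7.0, 9.0, 10.0]
    motions = [1.0, 1.0, 2.0, 1.0, 1.0]
    measurement_var, motion_var = 4.0, 2.0
    mu, sigma2 = 0.0, 10000.0        # very uncertain initial belief

    for z, u in zip(measurements, motions):
        mu, sigma2 = update(mu, sigma2, z, measurement_var)
        mu, sigma2 = predict(mu, sigma2, u, motion_var)

    print(round(mu, 3), round(sigma2, 3))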
-
Lesson 05: Representing State and Motion
Learn about representing the state of a car in a vector that can be modified using linear algebra.
- Concept 01: Localization Steps
- Concept 02: Intro to State
- Concept 03: Motion Models
- Concept 04: Quiz: Predicting State
- Concept 05: A Different Model
- Concept 06: Kinematics
- Concept 07: Quantifying State
- Concept 08: Lesson Outline
- Concept 09: Always Moving
- Concept 10: Car Object
- Concept 11: Interacting with a Car Object
- Concept 12: Look at the Class Code
- Concept 13: Turn Right
- Concept 14: Adding Color
- Concept 15: Instantiate Multiple Cars
- Concept 16: Color Class
- Concept 17: Overloading Functions
- Concept 18: Overloading Color Addition
- Concept 19: State Vector
- Concept 20: State Transformation Matrix
- Concept 21: Matrix Multiplication
- Concept 22: 1D State Vector and More Multiplication
- Concept 23: Modify Predict State
- Concept 24: Working with Matrices
-
Lesson 06: Matrices and Transformation of State
Linear algebra is a rich branch of math and a useful tool. In this lesson you'll learn about the matrix operations that underlie multidimensional Kalman filters.
- Concept 01: Kalman Filter Land
- Concept 02: Kalman Filter Prediction
- Concept 03: Another Prediction
- Concept 04: More Kalman Filters
- Concept 05: A Note on Notation
- Concept 06: Kalman Filter Design
- Concept 07: Let's Look at Where We Are
- Concept 08: The Kalman Filter Equations
- Concept 09: Simplifying the Kalman Filter Equations
- Concept 10: The Rest of the Lesson
- Concept 11: Representing State with Matrices
- Concept 12: Kalman Equation Reference
- Concept 13: What is a vector?
- Concept 14: Vectors in Python
- Concept 15: Coding Vectors
- Concept 16: Coding Vectors (solution)
- Concept 17: Guide to Mathematical Notation
- Concept 18: Matrices in Python
- Concept 19: Coding Matrices
- Concept 20: Coding Matrices (Solution)
- Concept 21: Matrix Addition
- Concept 22: Coding Matrix Addition
- Concept 23: Matrix Multiplication
- Concept 24: Coding Matrix Multiplication
- Concept 25: Transpose of a Matrix
- Concept 26: Coding the Transpose
- Concept 27: The Identity Matrix
- Concept 28: Coding Identity Matrix
- Concept 29: Matrix Inverse
- Concept 30: Coding Matrix Inverse
- Concept 31: What to Take Away from this Lesson
-
Lesson 07: Simultaneous Localization and Mapping
Learn how to implement SLAM: simultaneously localize an autonomous vehicle and create a map of landmarks in an environment.
- Concept 01: Introduction to SLAM
- Concept 02: Quiz: Graph SLAM
- Concept 03: Answer: Graph SLAM
- Concept 04: Quiz: Implementing Constraints
- Concept 05: Answer: Implementing Constraints
- Concept 06: Quiz: Adding Landmarks
- Concept 07: Answer: Adding Landmarks
- Concept 08: Quiz: Matrix Modification
- Concept 09: Answer: Matrix Modification
- Concept 10: Quiz: Untouched Fields
- Concept 11: Answer: Untouched Fields
- Concept 12: Quiz: Omega and Xi
- Concept 13: Notebook: Omega and Xi
- Concept 14: Quiz: Landmark Position
- Concept 15: Answer: Landmark Position
- Concept 16: Notebook: Including Sensor Measurements
- Concept 17: Quiz: Introducing Noise
- Concept 18: Answer: Introducing Noise
- Concept 19: Confident Measurements
- Concept 20: Notebook: Confident Measurements
- Concept 21: SLAM Summary
-
Lesson 08: Optional: Vehicle Motion and Calculus
Review the basics of calculus and see how to derive the x and y components of a self-driving car's motion from sensor measurements and other data.
- Concept 01: Introduction to Odometry
- Concept 02: Inertial Navigation Sensors
- Concept 03: Plotting Position vs. Time
- Concept 04: Interpreting Position vs. Time Graphs
- Concept 05: A "Typical" Calculus Problem
- Concept 06: How Odometers Work
- Concept 07: Speed from Position Data
- Concept 08: Position, Velocity, and Acceleration
- Concept 09: Implement an Accelerometer
- Concept 10: Differentiation Recap
- Concept 11: Acceleration Basics
- Concept 12: Plotting Elevator Acceleration
- Concept 13: Reasoning About Two Peaks
- Concept 14: The Integral: Area Under a Curve
- Concept 15: Approximating the Integral
- Concept 16: Approximating Integrals with Code
- Concept 17: Integrating Accelerometer Data
- Concept 18: Rate Gyros
- Concept 19: Integrating Rate Gyro Data
- Concept 20: Working with Real Data
- Concept 21: Accumulating Errors
- Concept 22: Sensor Strengths and Weaknesses
- Concept 23: Summary and Back to Trigonometry
- Concept 24: Trigonometry and Vehicle Motion
- Concept 25: Solving Trig Problems
- Concept 26: Keeping Track of x and y
- Concept 27: Keeping Track of x and y (solution)
- Concept 28: Conclusion
- Concept 29: Project Overview
- Concept 30: Lab - Reconstructing Trajectories
-
Lesson 09: Project: Landmark Detection & Tracking (SLAM)
Implement SLAM, a robust method for tracking an object over time and mapping out its surrounding environment, using elements of probability, motion models, and linear algebra.
-
Part 30 : Natural Language Processing
This section provides an overview of the program and introduces the fundamentals of Natural Language Processing through symbolic manipulation, including text cleaning, normalization, and tokenization. You'll then build a part of speech tagger using hidden Markov models.
-
Module 01: Introduction to Natural Language Processing
-
Lesson 01: Welcome to Natural Language Processing
Welcome to the Natural Language Processing Nanodegree program!
-
Lesson 04: Intro to NLP
Arpan will give you an overview of how to build a Natural Language Processing pipeline.
- Concept 01: Introducing Arpan
- Concept 02: NLP Overview
- Concept 03: Structured Languages
- Concept 04: Grammar
- Concept 05: Unstructured Text
- Concept 06: Counting Words
- Concept 07: Context Is Everything
- Concept 08: NLP and Pipelines
- Concept 09: How NLP Pipelines Work
- Concept 10: Text Processing
- Concept 11: Feature Extraction
- Concept 12: Modeling
-
Lesson 05: Text Processing
Learn to prepare text obtained from different sources for further processing, by cleaning, normalizing and splitting it into individual words or tokens.
- Concept 01: Text Processing
- Concept 02: Coding Exercises
- Concept 03: Introduction to GPU Workspaces
- Concept 04: Workspaces: Best Practices
- Concept 05: Text Processing Coding Examples
- Concept 06: Capturing Text Data
- Concept 07: Cleaning
- Concept 08: Normalization
- Concept 09: Tokenization
- Concept 10: Stop Word Removal
- Concept 11: Part-of-Speech Tagging
- Concept 12: Named Entity Recognition
- Concept 13: Stemming and Lemmatization
- Concept 14: Summary
-
Lesson 06: Spam Classifier with Naive Bayes
In this section, you'll learn how to build a spam e-mail classifier using the naive Bayes algorithm. A short scikit-learn sketch follows the concept list.
- Concept 01: Intro
- Concept 02: Guess the Person
- Concept 03: Known and Inferred
- Concept 04: Guess the Person Now
- Concept 05: Bayes Theorem
- Concept 06: Quiz: False Positives
- Concept 07: Solution: False Positives
- Concept 08: Bayesian Learning 1
- Concept 09: Bayesian Learning 2
- Concept 10: Bayesian Learning 3
- Concept 11: Naive Bayes Algorithm 1
- Concept 12: Naive Bayes Algorithm 2
- Concept 13: Building a Spam Classifier
- Concept 14: Project
- Concept 15: Spam Classifier - Workspace
- Concept 16: Outro
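The bag-of-words plus naive Bayes recipe above, sketched with scikit-learn; the four training messages are invented stand-ins for a real labeled dataset.

    # Spam classifier sketch: bag-of-words features + multinomial naive Bayes.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    messages = ["win money now", "free prize click here",
                "are we still meeting today", "see you at lunch tomorrow"]
    labels = [1, 1, 0, 0]            # 1 = spam, 0 = ham (invented examples)

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(messages)   # word counts per message

    clf = MultinomialNB()
    clf.fit(X, labels)

    test = vectorizer.transform(["free money prize", "lunch today?"])
    print(clf.predict(test))         # expected: [1 0]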
-
Lesson 07: Part of Speech Tagging with HMMs
Luis will give you an overview of several approaches to part-of-speech tagging, including a deeper dive on hidden Markov models.
- Concept 01: Intro
- Concept 02: Part of Speech Tagging
- Concept 03: Lookup Table
- Concept 04: Bigrams
- Concept 05: When bigrams won't work
- Concept 06: Hidden Markov Models
- Concept 07: Quiz: How many paths?
- Concept 08: Solution: How many paths
- Concept 09: Quiz: How many paths now?
- Concept 10: Quiz: Which path is more likely?
- Concept 11: Solution: Which path is more likely?
- Concept 12: Viterbi Algorithm Idea
- Concept 13: Viterbi Algorithm
- Concept 14: Further Reading
- Concept 15: Outro
-
Lesson 08: Project: Part of Speech Tagging
In this project, you'll build a hidden Markov model for part of speech tagging with a universal tagset.
-
Part 31 : Computing with Natural Language
Welcome to Computing with Natural Language.
-
Module 01: Computing with Natural Language
-
Lesson 01: Feature extraction and embeddings
Transform text using methods like Bag-of-Words, TF-IDF, Word2Vec, and GloVe to extract features that you can use in machine learning models.
-
Lesson 02: Topic Modeling
In this section, you'll learn to split a collection of documents into topics using Latent Dirichlet Allocation (LDA). In the lab, you'll be able to apply this model to a dataset of news articles. A small scikit-learn LDA sketch appears after the concept list.
- Concept 01: Intro
- Concept 02: References
- Concept 03: Bag of Words
- Concept 04: Latent Variables
- Concept 05: Matrix Multiplication
- Concept 06: Matrices
- Concept 07: Quiz: Picking Topics
- Concept 08: Solution: Picking Topics
- Concept 09: Beta Distributions
- Concept 10: Dirichlet Distributions
- Concept 11: Latent Dirichlet Allocation
- Concept 12: Sample a Topic
- Concept 13: Sample a Word
- Concept 14: Combining the Models
- Concept 15: Outro
- Concept 16: Notebook: Topic Modeling
- Concept 17: [SOLUTION] Topic Modeling
- Concept 18: Next Steps
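The LDA workflow above, sketched with scikit-learn on a handful of invented "documents"; the lab applies the same steps to the news-article dataset. (Assumes scikit-learn 1.x for get_feature_names_out.)

    # Topic modeling sketch: bag-of-words counts fed into LDA (documents are invented).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["the team won the football match",
            "the player scored a late goal",
            "the election results were announced",
            "voters went to the polls yesterday"]

    vectorizer = CountVectorizer(stop_words="english")
    counts = vectorizer.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(counts)

    # Show the top words for each latent topic.
    words = vectorizer.get_feature_names_out()
    for topic_idx, topic in enumerate(lda.components_):
        top = [words[i] for i in topic.argsort()[-4:][::-1]]
        print(f"topic {topic_idx}:", top)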
-
Lesson 03: Sentiment Analysis
Learn about using several machine learning classifiers, including Recurrent Neural Networks, to predict the sentiment in text. Apply this to a dataset of movie reviews.
- Concept 01: Intro
- Concept 02: Sentiment Analysis with a Regular Classifier
- Concept 03: Notebook: Sentiment Analysis with a regular classifier
- Concept 04: [SOLUTION]: Sentiment Analysis with a regular classifier
- Concept 05: Sentiment Analysis with RNN
- Concept 06: Notebook: Sentiment Analysis with an RNN
- Concept 07: [SOLUTION]: Sentiment Analysis with an RNN
- Concept 08: Optional Material
- Concept 09: Outro
-
Lesson 04: Sequence to Sequence
Here you'll learn about a specific architecture of RNNs for generating one sequence from another sequence. These RNNs are useful for chatbots, machine translation, and more!
-
Lesson 05: Deep Learning Attention
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention works, and you'll go over a basic implementation of it in the lab.
- Concept 01: Introduction to Attention
- Concept 02: Sequence to Sequence Recap
- Concept 03: Encoding -- Attention Overview
- Concept 04: Decoding -- Attention Overview
- Concept 05: Attention Overview
- Concept 06: Attention Encoder
- Concept 07: Attention Decoder
- Concept 08: Attention Encoder & Decoder
- Concept 09: Bahdanau and Luong Attention
- Concept 10: Multiplicative Attention
- Concept 11: Additive Attention
- Concept 12: Additive and Multiplicative Attention
- Concept 13: Computer Vision Applications
- Concept 14: NLP Application: Google Neural Machine Translation
- Concept 15: Other Attention Methods
- Concept 16: The Transformer and Self-Attention
- Concept 17: Notebook: Attention Basics
- Concept 18: [SOLUTION]: Attention Basics
- Concept 19: Outro
-
Lesson 06: RNN Keras Lab
This section will prepare you for the Machine Translation project. Here you will get hands-on practice with RNNs in Keras.
-
Lesson 07: Project: Machine Translation
Apply the skills you've learnt in Natural Language Processing to the challenging and extremely rewarding task of Machine Translation. Bonne chance!
-
Part 32 : Communicating with Natural Language
Welcome to Communicating with Natural Language.
-
Module 01: Communicating with Natural Language
-
Lesson 01: Intro to Voice User Interfaces
Get acquainted with the principles and applications of VUI, and get introduced to Alexa skills.
-
Lesson 02: (Optional) Alexa History Skill
Build your own Alexa skill and deploy it!
-
Lesson 03: Speech Recognition
Learn how an ASR pipeline works.
- Concept 01: Intro
- Concept 02: Challenges in ASR
- Concept 03: Signal Analysis
- Concept 04: References: Signal Analysis
- Concept 05: Quiz: FFT
- Concept 06: Feature Extraction with MFCC
- Concept 07: References: Feature Extraction
- Concept 08: Quiz: MFCC
- Concept 09: Phonetics
- Concept 10: References: Phonetics
- Concept 11: Quiz: Phonetics
- Concept 12: Voice Data Lab Introduction
- Concept 13: Lab: Voice Data
- Concept 14: Acoustic Models and the Trouble with Time
- Concept 15: HMMs in Speech Recognition
- Concept 16: Language Models
- Concept 17: N-Grams
- Concept 18: Quiz: N-Grams
- Concept 19: References: Traditional ASR
- Concept 20: A New Paradigm
- Concept 21: Deep Neural Networks as Speech Models
- Concept 22: Connectionist Temporal Classification (CTC)
- Concept 23: References: Deep Neural Network ASR
- Concept 24: Outro
-
Lesson 04: Project: DNN Speech Recognizer
Build a deep neural network that functions as part of an end-to-end automatic speech recognition pipeline.
-
Part 33 : Introduction to Deep Reinforcement Learning
Welcome to the Introduction to Deep Reinforcement Learning.
-
Module 01: Introduction to Deep Reinforcement Learning
-
Lesson 05: Introduction to RL
Reinforcement learning is a type of machine learning where the machine or software agent learns how to maximize its performance at a task.
-
Lesson 06: The RL Framework: The Problem
Learn how to mathematically formulate tasks as Markov Decision Processes.
- Concept 01: Introduction
- Concept 02: The Setting, Revisited
- Concept 03: Episodic vs. Continuing Tasks
- Concept 04: Quiz: Test Your Intuition
- Concept 05: Quiz: Episodic or Continuing?
- Concept 06: The Reward Hypothesis
- Concept 07: Goals and Rewards, Part 1
- Concept 08: Goals and Rewards, Part 2
- Concept 09: Quiz: Goals and Rewards
- Concept 10: Cumulative Reward
- Concept 11: Discounted Return
- Concept 12: Quiz: Pole-Balancing
- Concept 13: MDPs, Part 1
- Concept 14: MDPs, Part 2
- Concept 15: Quiz: One-Step Dynamics, Part 1
- Concept 16: Quiz: One-Step Dynamics, Part 2
- Concept 17: MDPs, Part 3
- Concept 18: Finite MDPs
- Concept 19: Summary
-
Lesson 07: The RL Framework: The Solution
In reinforcement learning, agents learn to prioritize different decisions based on the rewards and punishments associated with different outcomes.
- Concept 01: Introduction
- Concept 02: Policies
- Concept 03: Quiz: Interpret the Policy
- Concept 04: Gridworld Example
- Concept 05: State-Value Functions
- Concept 06: Bellman Equations
- Concept 07: Quiz: State-Value Functions
- Concept 08: Optimality
- Concept 09: Action-Value Functions
- Concept 10: Quiz: Action-Value Functions
- Concept 11: Optimal Policies
- Concept 12: Quiz: Optimal Policies
- Concept 13: Summary
-
Lesson 08: Monte Carlo Methods
Write your own implementation of Monte Carlo control to teach an agent to play Blackjack!
- Concept 01: Review
- Concept 02: Gridworld Example
- Concept 03: Monte Carlo Methods
- Concept 04: MC Prediction - Part 1
- Concept 05: MC Prediction - Part 2
- Concept 06: MC Prediction - Part 3
- Concept 07: OpenAI Gym: BlackJackEnv
- Concept 08: Workspace - Introduction
- Concept 09: Coding Exercise
- Concept 10: Workspace
- Concept 11: Greedy Policies
- Concept 12: Epsilon-Greedy Policies
- Concept 13: MC Control
- Concept 14: Exploration vs. Exploitation
- Concept 15: Incremental Mean
- Concept 16: Constant-alpha
- Concept 17: Coding Exercise
- Concept 18: Workspace
- Concept 19: Summary
-
Lesson 09: Temporal-Difference Methods
Learn how to apply temporal-difference methods such as SARSA, Q-Learning, and Expected SARSA to solve both episodic and continuing tasks. A tiny tabular Q-learning sketch follows the concept list.
- Concept 01: Introduction
- Concept 02: Review: MC Control Methods
- Concept 03: Quiz: MC Control Methods
- Concept 04: TD Control: Sarsa
- Concept 05: Quiz: Sarsa
- Concept 06: TD Control: Q-Learning
- Concept 07: Quiz: Q-Learning
- Concept 08: TD Control: Expected Sarsa
- Concept 09: Quiz: Expected Sarsa
- Concept 10: TD Control: Theory and Practice
- Concept 11: OpenAI Gym: CliffWalkingEnv
- Concept 12: Workspace - Introduction
- Concept 13: Coding Exercise
- Concept 14: Workspace
- Concept 15: Analyzing Performance
- Concept 16: Quiz: Check Your Understanding
- Concept 17: Summary
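The heart of Q-learning is a one-line update; this sketch runs it on a tiny hand-made corridor MDP with an epsilon-greedy policy, so the environment and hyperparameters are toy assumptions rather than the CliffWalking setup used above.

    # Tabular Q-learning on a tiny corridor MDP (5 states, move left/right, reward at the end).
    import random
    from collections import defaultdict

    N_STATES, ACTIONS = 5, [0, 1]          # action 0 = left, 1 = right
    alpha, gamma, epsilon = 0.1, 0.9, 0.1

    def step(state, action):
        next_state = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        done = next_state == N_STATES - 1
        return next_state, reward, done

    Q = defaultdict(lambda: [0.0, 0.0])

    for episode in range(500):
        state, done = 0, False
        while not done:
            # Epsilon-greedy action selection.
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: Q[state][a])
            next_state, reward, done = step(state, action)
            # Q-learning update: bootstrap off the greedy value of the next state.
            target = reward + (0.0 if done else gamma * max(Q[next_state]))
            Q[state][action] += alpha * (target - Q[state][action])
            state = next_state

    print({s: [round(v, 2) for v in Q[s]] for s in range(N_STATES)})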
-
Lesson 10: Solve OpenAI Gym's Taxi-v2 Task
With reinforcement learning now in your toolbox, you're ready to explore a mini project using OpenAI Gym!
-
Lesson 11: RL in Continuous Spaces
Learn how to adapt traditional algorithms to work with continuous spaces.
- Concept 01: Introducing Arpan
- Concept 02: Lesson Overview
- Concept 03: Discrete vs. Continuous Spaces
- Concept 04: Quiz: Space Representations
- Concept 05: Discretization
- Concept 06: Exercise: Discretization
- Concept 07: Workspace: Discretization
- Concept 08: Tile Coding
- Concept 09: Exercise: Tile Coding
- Concept 10: Workspace: Tile Coding
- Concept 11: Coarse Coding
- Concept 12: Function Approximation
- Concept 13: Linear Function Approximation
- Concept 14: Kernel Functions
- Concept 15: Non-Linear Function Approximation
- Concept 16: Summary
-
Part 34 : Value-Based Methods
Learn how value-based methods work.
-
Module 01: Value-Based Methods
-
Lesson 01: Study Plan
Obtain helpful resources to accelerate your learning in the second part of the Nanodegree program.
-
Lesson 02: Deep Q-Networks
Extend value-based reinforcement learning methods to complex problems using deep neural networks.
- Concept 01: From RL to Deep RL
- Concept 02: Deep Q-Networks
- Concept 03: Experience Replay
- Concept 04: Fixed Q-Targets
- Concept 05: Deep Q-Learning Algorithm
- Concept 06: Coding Exercise
- Concept 07: Workspace
- Concept 08: Deep Q-Learning Improvements
- Concept 09: Double DQN
- Concept 10: Prioritized Experience Replay
- Concept 11: Dueling DQN
- Concept 12: Rainbow
- Concept 13: Summary
-
Lesson 03: Navigation
Train an agent to navigate a large world and collect yellow bananas, while avoiding blue bananas.
- Concept 01: Unity ML-Agents
- Concept 02: The Environment - Introduction
- Concept 03: The Environment - Play
- Concept 04: The Environment - Explore
- Concept 05: Project Instructions
- Concept 06: Benchmark Implementation
- Concept 07: Not sure where to start?
- Concept 08: Collaborate!
- Concept 09: Workspace
- Concept 10: (Optional) Challenge: Learning from Pixels
-
Lesson 04: Opportunities in Deep Reinforcement Learning
Learn about common career opportunities in deep reinforcement learning, and get tips on how to stay active in the community.
-
Part 35 : Policy-Based Methods
Learn how policy-based methods work.
-
Module 01: Policy-Based Methods
-
Lesson 01: Study Plan
Obtain helpful resources to accelerate your learning in the third part of the Nanodegree program.
-
Lesson 02: Introduction to Policy-Based Methods
Policy-based methods search directly for the optimal policy.
- Concept 01: Policy-Based Methods
- Concept 02: Policy Function Approximation
- Concept 03: More on the Policy
- Concept 04: Hill Climbing
- Concept 05: Hill Climbing Pseudocode
- Concept 06: Beyond Hill Climbing
- Concept 07: More Black-Box Optimization
- Concept 08: Coding Exercise
- Concept 09: Workspace
- Concept 10: OpenAI Request for Research
- Concept 11: Why Policy-Based Methods?
- Concept 12: Summary
-
Lesson 03: Policy Gradient Methods
Policy gradient methods search for the optimal policy through gradient ascent.
-
Lesson 04: Proximal Policy Optimization
Learn what Proximal Policy Optimization (PPO) is and how it can improve policy gradients. Also learn how to implement the algorithm by training a computer to play the Atari Pong game.
- Concept 01: Instructor Introduction
- Concept 02: Lesson Preview
- Concept 03: Beyond REINFORCE
- Concept 04: Noise Reduction
- Concept 05: Credit Assignment
- Concept 06: Policy Gradient Quiz
- Concept 07: pong with REINFORCE (code walkthrough)
- Concept 08: pong with REINFORCE (workspace)
- Concept 09: Importance Sampling
- Concept 10: PPO part 1- The Surrogate Function
- Concept 11: PPO part 2- Clipping Policy Updates
- Concept 12: PPO summary
- Concept 13: pong with PPO (code walkthrough)
- Concept 14: pong with PPO (workspace)
-
Lesson 05: Actor-Critic Methods
Miguel Morales explains how to combine value-based and policy-based methods, bringing together the best of both worlds, to solve challenging reinforcement learning problems.
- Concept 01: Introduction
- Concept 02: Motivation
- Concept 03: Bias and Variance
- Concept 04: Two Ways for Estimating Expected Returns
- Concept 05: Baselines and Critics
- Concept 06: Policy-based, Value-Based, and Actor-Critic
- Concept 07: A Basic Actor-Critic Agent
- Concept 08: A3C: Asynchronous Advantage Actor-Critic, N-step
- Concept 09: A3C: Asynchronous Advantage Actor-Critic, Parallel Training
- Concept 10: A3C: Asynchronous Advantage Actor-Critic, Off- vs On-policy
- Concept 11: A2C: Advantage Actor-Critic
- Concept 12: A2C Code Walk-through
- Concept 13: GAE: Generalized Advantage Estimation
- Concept 14: DDPG: Deep Deterministic Policy Gradient, Continuous Actions
- Concept 15: DDPG: Deep Deterministic Policy Gradient, Soft Updates
- Concept 16: DDPG Code Walk-through
- Concept 17: Summary
-
Lesson 06: Deep RL for Finance (Optional)
Learn how to apply deep reinforcement learning techniques for optimal execution of portfolio transactions.
- Concept 01: Introduction
- Concept 02: High Frequency Trading
- Concept 03: Challenges of Supervised Learning
- Concept 04: Advantages of RL for Trading
- Concept 05: Optimal Liquidation Problem - Part 1 - Introduction
- Concept 06: Optimal Liquidation Problem - Part 2 - Market Impact
- Concept 07: Optimal Liquidation Problem - Part 3 - Price Model
- Concept 08: Optimal Liquidation Problem - Part 4 - Expected Shortfall
- Concept 09: Almgren and Chriss Model
- Concept 10: Trading Lists
- Concept 11: The Efficient Frontier
- Concept 12: DRL for Optimal Execution of Portfolio Transactions
-
Lesson 07: Continuous Control
Train a double-jointed arm to reach target locations.
- Concept 01: Unity ML-Agents
- Concept 02: The Environment - Introduction
- Concept 03: The Environment - Real World
- Concept 04: The Environment - Explore
- Concept 05: Project Instructions
- Concept 06: Benchmark Implementation
- Concept 07: Not sure where to start?
- Concept 08: General Advice
- Concept 09: Collaborate!
- Concept 10: Workspace
- Concept 11: (Optional) Challenge: Crawl
-
Part 36 : Introduction to Artificial Intelligence
Welcome to the Introduction to Artificial Intelligence.
-
Module 01: Introduction to Artificial Intelligence
-
Lesson 04: Intro to Artificial Intelligence
An introduction to basic AI concepts and the challenge of answering "what is AI?"
- Concept 01: Welcome to AI!
- Concept 02: Navigation
- Concept 03: Game Playing
- Concept 04: Quiz: Tic Tac Toe
- Concept 05: Tic Tac Toe: Heuristics
- Concept 06: Quiz: Monty Hall Problem
- Concept 07: Monty Hall Problem: Explained
- Concept 08: Quiz: What is Intelligence?
- Concept 09: Defining Intelligence
- Concept 10: Agent, Environment And State
- Concept 11: Perception, Action and Cognition
- Concept 12: Quiz: Types of AI Problems
- Concept 13: Rational Behavior And Bounded Optimality
-
Lesson 05: Solving Sudoku With AI
In this lesson, you'll dive right in and apply Artificial Intelligence to solve every Sudoku puzzle.
- Concept 01: Intro
- Concept 02: Solving a Sudoku
- Concept 03: Setting up the Board
- Concept 04: Encoding the Board
- Concept 05: Strategy 1: Elimination
- Concept 06: Strategy 2: Only Choice
- Concept 07: Constraint Propagation
- Concept 08: Harder Sudoku
- Concept 09: Strategy 3: Search
- Concept 10: Coding the Solution
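To give a concrete feel for the "Elimination" strategy, here is a minimal Python sketch assuming the board is encoded as a dictionary mapping box labels to candidate-digit strings, as the lesson describes; the peers mapping and the two-box example are made up for illustration.
```python
def eliminate(values, peers):
    """For every solved box, remove its digit from the candidate strings of its peers.

    values -- dict mapping box labels (e.g. 'A1') to candidate-digit strings
    peers  -- dict mapping each box to the set of boxes sharing a row/column/square
    """
    solved = [box for box, digits in values.items() if len(digits) == 1]
    for box in solved:
        digit = values[box]
        for peer in peers[box]:
            values[peer] = values[peer].replace(digit, '')
    return values

# toy two-box example just to show the data flow (not a real Sudoku grid)
values = {'A1': '5', 'A2': '1235'}
peers = {'A1': {'A2'}, 'A2': {'A1'}}
print(eliminate(values, peers))   # {'A1': '5', 'A2': '123'}
```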
-
Lesson 08: Build a Sudoku Solver
Use constraint propagation and search to build an agent that reasons like a human would to efficiently solve any Sudoku puzzle.
-
Lesson 09: Jobs in AI
Learn about common jobs in artificial intelligence, and get tips on how to stay active in the community.
-
Part 37 : Constraint Satisfaction Problems
Take a deep dive into the constraint satisfaction problem framework and further explore constraint propagation, backtracking search, and other CSP techniques. Complete a classroom exercise using a powerful CSP solver on a variety of problems to gain experience framing new problems as CSPs.
-
Module 01: Constraint Satisfaction Problems
-
Lesson 01: Constraint Satisfaction Problems
Expand from the constraint propagation technique used in the Sudoku project to the Constraint Satisfaction Problem framework that can be used to solve a wide range of general problems.
- Concept 01: Lesson Plan - Week 2
- Concept 02: Introduction
- Concept 03: CSP Examples
- Concept 04: Map Coloring
- Concept 05: Constraint Graph
- Concept 06: Map Coloring Quiz
- Concept 07: Constraint Types
- Concept 08: Backtracking Search
- Concept 09: Why Backtracking?
- Concept 10: Improving Backtracking Efficiency
- Concept 11: Backtracking Optimization Quiz
- Concept 12: Forward Checking
- Concept 13: Constraint Propagation and Arc Consistency
- Concept 14: Constraint Propagation Quiz
- Concept 15: Structured CSPs
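Here is a minimal sketch of plain backtracking search on a tiny map-coloring CSP, with no forward checking or ordering heuristics; the region names, domains, and adjacency are invented for illustration rather than taken from the lesson.
```python
def backtrack(assignment, variables, domains, neighbors):
    """Depth-first assignment of colors, undoing any choice that breaks a constraint."""
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for color in domains[var]:
        # constraint: adjacent regions must not share a color
        if all(assignment.get(n) != color for n in neighbors[var]):
            assignment[var] = color
            result = backtrack(assignment, variables, domains, neighbors)
            if result is not None:
                return result
            del assignment[var]
    return None

# three mutually adjacent regions need three different colors
variables = ['WA', 'NT', 'SA']
domains = {v: ['red', 'green', 'blue'] for v in variables}
neighbors = {'WA': ['NT', 'SA'], 'NT': ['WA', 'SA'], 'SA': ['WA', 'NT']}
print(backtrack({}, variables, domains, neighbors))
```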
-
Lesson 02: CSP Coding Exercise
Practice formulating some classical example problems as CSPs, then explore using a powerful open-source constraint satisfaction tool called Z3 from Microsoft Research to solve them.
-
Lesson 03: Additional Readings
Reading list of applications and additional topics related to CSPs.
-
Part 38 : Classical Search
Learn classical graph search algorithms, including uninformed search techniques like breadth-first and depth-first search, and informed search with heuristics such as A*. These algorithms are at the heart of many classical AI techniques, and have been used for planning, optimization, problem solving, and more. Complete the lesson by teaching PacMan to search with these techniques to solve increasingly complex domains.
-
Module 01: Classical Search
-
Lesson 01: Introduction
Peter Norvig, co-author of Artificial Intelligence: A Modern Approach, explains a framework for search problems, and introduces uninformed & informed search strategies to solve them.
-
Lesson 02: Uninformed Search
Peter introduces uninformed search strategies—which can only solve problems by generating successor states and distinguishing between goal and non-goal states.
- Concept 01: Intro to Uninformed Search
- Concept 02: Example: Route Finding
- Concept 03: Quiz: Tree Search
- Concept 04: Tree Search Continued
- Concept 05: Quiz: Graph Search
- Concept 06: Quiz: Breadth First Search 1
- Concept 07: Breadth First Search 2
- Concept 08: Quiz: Breadth First Search 3
- Concept 09: Breadth First Search 4
- Concept 10: Breadth First Search 5
- Concept 11: Uniform Cost Search
- Concept 12: Uniform Cost Search 1
- Concept 13: Uniform Cost Search 2
- Concept 14: Uniform Cost Search 3
- Concept 15: Uniform Cost Search 4
- Concept 16: Uniform Cost Search 5
- Concept 17: Quiz: Search Comparison
- Concept 18: Search Comparison 1
- Concept 19: Quiz: Search Comparison 2
- Concept 20: Search Comparison 3
- Concept 21: Recap
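As a pocket reference for this lesson, here is a minimal sketch of breadth-first graph search on a small route-finding graph; the adjacency dictionary is made up, and the function returns the shortest path by number of hops.
```python
from collections import deque

def breadth_first_search(graph, start, goal):
    """Expand the shallowest frontier node first; track explored states (graph search)."""
    frontier = deque([[start]])          # queue of paths
    explored = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph[node]:
            if neighbor not in explored:
                explored.add(neighbor)
                frontier.append(path + [neighbor])
    return None

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': ['E'], 'E': []}
print(breadth_first_search(graph, 'A', 'E'))   # ['A', 'B', 'D', 'E']
```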
-
Lesson 03: Informed Search
Peter introduces informed search strategies, which use problem-specific knowledge to find solutions more efficiently than uninformed search.
- Concept 01: Intro to Informed Search
- Concept 02: On Uniform Cost
- Concept 03: A* Search
- Concept 04: A* Search 1
- Concept 05: A* Search 2
- Concept 06: A* Search 3
- Concept 07: A* Search 4
- Concept 08: A* Search 5
- Concept 09: Optimistic Heuristic
- Concept 10: Quiz: State Spaces
- Concept 11: State Spaces 1
- Concept 12: Quiz: State Spaces 2
- Concept 13: State Spaces 3
- Concept 14: Quiz: Sliding Blocks Puzzle
- Concept 15: Sliding Blocks Puzzle 1
- Concept 16: Sliding Blocks Puzzle 2
- Concept 17: A Note on Implementation
- Concept 18: Recap
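Here is a minimal sketch of A* search expanding the node with the lowest f = g + h; the graph, step costs, and (admissible) heuristic values are invented for illustration.
```python
import heapq

def a_star(graph, heuristic, start, goal):
    """graph: {node: [(neighbor, step_cost), ...]}; heuristic: {node: estimated cost to goal}."""
    frontier = [(heuristic[start], 0, start, [start])]   # (f, g, node, path)
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for neighbor, cost in graph[node]:
            new_g = g + cost
            if new_g < best_g.get(neighbor, float('inf')):
                best_g[neighbor] = new_g
                heapq.heappush(frontier,
                               (new_g + heuristic[neighbor], new_g, neighbor, path + [neighbor]))
    return None, float('inf')

graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
heuristic = {'S': 3, 'A': 4, 'B': 1, 'G': 0}
print(a_star(graph, heuristic, 'S', 'G'))   # (['S', 'B', 'G'], 5)
```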
-
Lesson 04: Classroom Exercise: Search
Complete a practice exercise where you'll implement informed and uninformed search strategies for the game PacMan.
-
Lesson 05: Additional Search Topics
References to additional readings on search.
-
Part 39 : Automated Planning
Learn to represent general problem domains with symbolic logic and use search to find optimal plans for achieving your agent’s goals. Planning & scheduling systems power modern automation & logistics operations, and aerospace applications like the Hubble telescope & NASA Mars rovers.
-
Module 01: Automated Planning
-
Lesson 01: Symbolic Logic & Reasoning
Peter Norvig returns to explain propositional logic and first-order logic, which provide a symbolic logic framework that enables AI agents to reason about their actions.
- Concept 01: Lesson Plan - Week 4
- Concept 02: Introduction
- Concept 03: Background and Expert Systems
- Concept 04: Propositional Logic
- Concept 05: Truth Tables
- Concept 06: Truth Table Question
- Concept 07: Propositional Logic Question
- Concept 08: Terminology
- Concept 09: Propositional Logic Limitations
- Concept 10: First Order Logic
- Concept 11: Models
- Concept 12: Syntax
- Concept 13: Vacuum World
- Concept 14: FOL Question
- Concept 15: FOL Question 2
- Concept 16: Recap
-
Lesson 02: Introduction to Planning
Peter Norvig defines automated planning problems in comparison to more general problem solving techniques to set the stage for classical planning algorithms in the next lesson.
- Concept 01: Problem Solving vs Planning
- Concept 02: Planning vs Execution
- Concept 03: Vacuum Cleaner Example
- Concept 04: Quiz: Sensorless Vacuum Cleaner Problem
- Concept 05: Partially Observable Vacuum Cleaner Example
- Concept 06: Quiz: Stochastic Environment Problem
- Concept 07: Infinite Sequences
- Concept 08: Finding a Successful Plan
- Concept 09: Quiz: Finding a Successful Plan Question
- Concept 10: Problem Solving via Mathematical Notation
- Concept 11: Tracking the Predict-Update Cycle
-
Lesson 03: Classical Planning
Peter presents a survey of Classical Planning techniques: forward planning (progression search) & backward planning (regression search).
-
Lesson 04: Build a Forward-Planning Agent
In this project you'll experiment with search and symbolic logic to build an agent that automatically develops and executes plans to achieve its goals.
-
Lesson 05: Additional Planning Topics
Peter discusses plan space search & situational calculus. Finish the lesson with readings on advanced planning topics & modern applications of automated planning.
-
Part 40 : Optimization Problems
Learn about iterative improvement optimization problems and classical algorithms emphasizing gradient-free methods for solving them. These techniques can often be used on intractable problems to find solutions that are "good enough" for practical purposes, and have been used extensively in fields like Operations Research & logistics. Finish the lesson by completing a classroom exercise comparing the different algorithms' performance on a variety of problems.
-
Module 01: Optimization Problems
-
Lesson 01: Introduction
Thad Starner introduces the concept of iterative improvement problems, a class of optimization problems that can be solved with global optimization or local search techniques covered in this lesson.
-
Lesson 02: Hill Climbing
Thad introduces Hill Climbing, a very simple local search optimization technique that works well on many iterative improvement problems.
-
Lesson 03: Simulated Annealing
Thad explains Simulated Annealing, a classical technique for global optimization.
-
Lesson 04: Genetic Algorithms
Thad introduces another optimization technique: Genetic Algorithms, which use a population of samples to make iterative improvements toward the goal.
-
Lesson 05: Optimization Exercise
Complete a classroom exercise implementing simulated annealing to solve the traveling salesman problem.
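To show the shape of the exercise, here is a minimal simulated-annealing sketch on a tiny traveling-salesman instance (not the classroom's starter code): propose a random swap of two cities, always accept improvements, and accept worse tours with probability exp(-delta/T) while the temperature decays. The distance matrix, temperature, and cooling schedule are arbitrary assumptions.
```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, temp=10.0, cooling=0.995, steps=5000, seed=0):
    random.seed(seed)
    tour = list(range(len(dist)))
    best = tour[:]
    for _ in range(steps):
        i, j = random.sample(range(len(tour)), 2)
        candidate = tour[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]   # swap two cities
        delta = tour_length(candidate, dist) - tour_length(tour, dist)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            tour = candidate
            if tour_length(tour, dist) < tour_length(best, dist):
                best = tour[:]
        temp *= cooling                                           # cool down over time
    return best, tour_length(best, dist)

# symmetric distance matrix for four made-up cities
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(simulated_annealing(dist))
```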
-
Lesson 06: Additional Optimization Topics
Review the similarities among the techniques introduced in this lesson, with links to readings on advanced optimization topics, then complete an optimization exercise in the classroom.
-
Part 41 : Adversarial Search
Learn how to search in multi-agent environments (including decision making in competitive environments) using the minimax theorem from game theory. Then build an agent that can play games better than any human.
-
Module 01: Adversarial Search
-
Lesson 01: Search in Multiagent Domains
Thad returns to teach search in multi-agent domains, using the Minimax theorem to solve adversarial problems and build agents that make better decisions than humans.
- Concept 01: Lesson Plan - Week 8
- Concept 02: Overview
- Concept 03: The Minimax Algorithm
- Concept 04: Isolation
- Concept 05: Building a Game Tree
- Concept 06: Coding: Building a Game Class
- Concept 07: Which of These Are Valid Moves?
- Concept 08: Coding: Game Class Functionality
- Concept 09: Building a Game Tree (Contd.)
- Concept 10: Isolation Game Tree with Leaf Values
- Concept 11: How Do We Tell the Computer Not to Lose?
- Concept 12: MIN and MAX Levels
- Concept 13: Coding: Scoring Min & Max Levels
- Concept 14: Propagating Values Up the Tree
- Concept 15: Computing MIN MAX Values
- Concept 16: Computing MIN MAX Solution
- Concept 17: Choosing the Best Branch
- Concept 18: Coding: Minimax Search
- Concept 19: Max Number of Nodes Visited
- Concept 20: Max Moves
- Concept 21: The Branching Factor
- Concept 22: Number of Nodes in a Game Tree
- Concept 23: The Branching Factor (Contd.)
- Concept 24: Max Number of Nodes
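For reference alongside this lesson, here is a minimal sketch of plain minimax over an abstract game object; the `game` interface (`get_legal_moves`, `forecast_move`, `is_terminal`, `utility`) is an assumed placeholder in the spirit of the game class built in the lesson, not the classroom's exact API.
```python
def minimax(game, maximizing=True):
    """Return (best_value, best_move) by searching the full game tree."""
    if game.is_terminal():
        return game.utility(), None
    best_move = None
    best_value = float('-inf') if maximizing else float('inf')
    for move in game.get_legal_moves():
        value, _ = minimax(game.forecast_move(move), not maximizing)
        if maximizing and value > best_value:
            best_value, best_move = value, move
        if not maximizing and value < best_value:
            best_value, best_move = value, move
    return best_value, best_move

# usage would look like: value, move = minimax(isolation_game) on a game object
# exposing the four methods assumed above.
```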
-
Lesson 02: Optimizing Minimax Search
Thad explains some of the limitations of minimax search and introduces optimizations & changes that make it practical in more complex domains.
- Concept 01: Lesson Plan - Week 9
- Concept 02: Minimax Quiz
- Concept 03: Depth-Limited Search
- Concept 04: Coding: Depth-Limited Search
- Concept 05: Evaluation Function Intro
- Concept 06: Testing the Evaluation Function
- Concept 07: Testing the Evaluation Function Part 2
- Concept 08: Testing Evaluation Functions
- Concept 09: Testing the Evaluation Function Part 3
- Concept 10: Coding: #my_moves Heuristic
- Concept 11: Quiescent Search
- Concept 12: A Problem
- Concept 13: Iterative Deepening
- Concept 14: Understanding Exponential Time
- Concept 15: Exponential b=3
- Concept 16: Varying the Branching Factor
- Concept 17: Coding: Iterative Deepening
- Concept 18: Horizon Effect
- Concept 19: Horizon Effect (Contd.)
- Concept 20: Good Evaluation Functions
- Concept 21: Evaluating Evaluation Functions
- Concept 22: Alpha-Beta Pruning
- Concept 23: Alpha-Beta Pruning Quiz 1
- Concept 24: Alpha-Beta Pruning Quiz 2
- Concept 25: Coding: Alpha-Beta Pruning
- Concept 26: Solving 5x5 Isolation
- Concept 27: Coding: Opening Book
- Concept 28: Thad’s Asides
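Here is a minimal sketch combining two of this lesson's optimizations, depth-limited search and alpha-beta pruning; as before, the `game` interface and the `score()` evaluation function are assumed placeholders rather than the classroom's code.
```python
def alpha_beta(game, depth, alpha=float('-inf'), beta=float('inf'), maximizing=True):
    """Depth-limited minimax that prunes branches outside the (alpha, beta) window."""
    if depth == 0 or game.is_terminal():
        return game.score(), None            # evaluation function at the search horizon
    best_move = None
    if maximizing:
        value = float('-inf')
        for move in game.get_legal_moves():
            child_value, _ = alpha_beta(game.forecast_move(move), depth - 1, alpha, beta, False)
            if child_value > value:
                value, best_move = child_value, move
            alpha = max(alpha, value)
            if alpha >= beta:                # MIN will never allow this branch
                break
        return value, best_move
    value = float('inf')
    for move in game.get_legal_moves():
        child_value, _ = alpha_beta(game.forecast_move(move), depth - 1, alpha, beta, True)
        if child_value < value:
            value, best_move = child_value, move
        beta = min(beta, value)
        if beta <= alpha:                    # MAX will never allow this branch
            break
    return value, best_move
```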
-
Lesson 03: Build an Adversarial Game Playing Agent
Extend classical search to adversarial domains, to build agents that make good decisions without any human intervention—such as the DeepMind AlphaGo agent.
Project Description - Build an Adversarial Game Playing Agent
-
Lesson 04: Extending Minimax Search
Thad introduces extensions to minimax search to support more than two players and non-deterministic domains.
- Concept 01: Introduction
- Concept 02: 3-Player Games
- Concept 03: 3-Player Games Quiz
- Concept 04: 3-Player Alpha-Beta Pruning
- Concept 05: Multi-player Alpha-Beta Pruning Reading
- Concept 06: Probabilistic Games
- Concept 07: Sloppy Isolation
- Concept 08: Sloppy Isolation Expectimax
- Concept 09: Expectimax Alpha-Beta Pruning
- Concept 10: Probabilistic Alpha-Beta Pruning
-
Lesson 05: Additional Adversarial Search Topics
An introduction to Monte Carlo Tree Search, a highly successful search technique in game domains, along with a reading list of other advanced adversarial search topics.
-
Part 42 : Probabilistic Models
Learn to use Bayes Nets to represent complex probability distributions, and algorithms for sampling from those distributions. Then learn the algorithms used to train, predict, and evaluate Hidden Markov Models for pattern recognition. HMMs have been used for gesture recognition in computer vision, gene sequence identification in bioinformatics, speech generation & part of speech tagging in natural language processing, and more.
-
Module 01: Probabilistic Models
-
Lesson 01: Probability
Sebastian Thrun briefly reviews basic probability theory including discrete distributions, independence, joint probabilities, and conditional distributions to model uncertainty in the real world.
- Concept 01: Lesson Plan - Week 10
- Concept 02: Intro to Probability and Bayes Nets
- Concept 03: Quiz: Probability / Coin Flip
- Concept 04: Quiz: Coin Flip 2
- Concept 05: Quiz: Coin Flip 3
- Concept 06: Quiz: Coin Flip 4
- Concept 07: Quiz: Coin Flip 5
- Concept 08: Probability Summary
- Concept 09: Quiz: Dependence
- Concept 10: What We Learned
- Concept 11: Quiz: Weather
- Concept 12: Quiz: Weather 2
- Concept 13: Quiz: Weather 3
- Concept 14: Quiz: Cancer
- Concept 15: Quiz: Cancer 2
- Concept 16: Quiz: Cancer 3
- Concept 17: Quiz: Cancer 4
- Concept 18: Bayes Rule
-
Lesson 02: Naive Bayes
In this section, you'll learn how to build a spam e-mail classifier using the naive Bayes algorithm.
- Concept 01: Intro
- Concept 02: Guess the Person
- Concept 03: Known and Inferred
- Concept 04: Guess the Person Now
- Concept 05: Bayes Theorem
- Concept 06: Quiz: False Positives
- Concept 07: Solution: False Positives
- Concept 08: Bayesian Learning 1
- Concept 09: Bayesian Learning 2
- Concept 10: Bayesian Learning 3
- Concept 11: Naive Bayes Algorithm 1
- Concept 12: Naive Bayes Algorithm 2
- Concept 13: Building a Spam Classifier
- Concept 14: Exercise: Building a Spam Classifier
- Concept 15: Outro
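To show the end-to-end flow of the spam classifier idea, here is a minimal sketch using scikit-learn's CountVectorizer and MultinomialNB; the classroom exercise may use a different dataset and code, and the tiny message list here is made up.
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# tiny made-up corpus: 1 = spam, 0 = ham
messages = ["win money now", "lowest price guaranteed", "are we still meeting", "see you at lunch"]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(messages)    # bag-of-words word counts

classifier = MultinomialNB()
classifier.fit(counts, labels)                 # learns P(word | class) with Laplace smoothing

test = vectorizer.transform(["win a free lunch"])
print(classifier.predict(test))                # predicted class for the new message
```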
-
Lesson 03: Bayes Nets
Sebastian explains using Bayes Nets as a compact graphical model to encode probability distributions for efficient analysis.
- Concept 01: Lesson Plan - Week 11
- Concept 02: Introduction
- Concept 03: Quiz: Bayes Network
- Concept 04: Computing Bayes Rule
- Concept 05: Quiz: Two Test Cancer
- Concept 06: Quiz: Two Test Cancer 2
- Concept 07: Quiz: Conditional Independence
- Concept 08: Quiz: Conditional Independence 2
- Concept 09: Quiz: Absolute And Conditional
- Concept 10: Quiz: Confounding Cause
- Concept 11: Quiz: Explaining Away
- Concept 12: Quiz: Explaining Away 2
- Concept 13: Quiz: Explaining Away 3
- Concept 14: Conditional Dependence
- Concept 15: Quiz: General Bayes Net
- Concept 16: Quiz: General Bayes Net 2
- Concept 17: Quiz: General Bayes Net 3
- Concept 18: Value Of A Network
- Concept 19: Quiz: D Separation
- Concept 20: Quiz: D Separation 2
- Concept 21: Quiz: D Separation 3
-
Lesson 04: Inference in Bayes Nets
Sebastian explains probabilistic inference using Bayes Nets, i.e. how to use evidence to calculate probabilities from the network.
- Concept 01: Probabilistic Inference
- Concept 02: Quiz: Overview and Example
- Concept 03: Quiz: Enumeration
- Concept 04: Quiz: Speeding Up Enumeration
- Concept 05: Quiz: Speeding Up Enumeration 2
- Concept 06: Quiz: Speeding Up Enumeration 3
- Concept 07: Quiz: Speeding Up Enumeration 4
- Concept 08: Causal Direction
- Concept 09: Quiz: Variable Elimination
- Concept 10: Quiz: Variable Elimination 2
- Concept 11: Quiz: Variable Elimination 3
- Concept 12: Variable Elimination 4
- Concept 13: Approximate Inference
- Concept 14: Quiz: Sampling Example
- Concept 15: Approximate Inference 2
- Concept 16: Rejection Sampling
- Concept 17: Quiz: Likelihood Weighting
- Concept 18: Likelihood Weighting 1
- Concept 19: Likelihood Weighting 2
- Concept 20: Gibbs Sampling
- Concept 21: Quiz: Monty Hall Problem
- Concept 22: Monty Hall Letter
-
Lesson 05: Hidden Markov Models
Learn Hidden Markov Models, and apply them to part-of-speech tagging, a very popular problem in Natural Language Processing.
- Concept 01: Lesson Plan - Week 12
- Concept 02: Intro
- Concept 03: Part of Speech Tagging
- Concept 04: Lookup Table
- Concept 05: Bigrams
- Concept 06: When bigrams won't work
- Concept 07: Hidden Markov Models
- Concept 08: Quiz: How many paths?
- Concept 09: Solution: How many paths
- Concept 10: Quiz: How many paths now?
- Concept 11: Quiz: Which path is more likely?
- Concept 12: Solution: Which path is more likely?
- Concept 13: Viterbi Algorithm Idea
- Concept 14: Viterbi Algorithm
- Concept 15: Further Reading
- Concept 16: Outro
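Here is a minimal sketch of the Viterbi algorithm for a tiny two-tag HMM; the states, transition, and emission probabilities are invented for illustration, not taken from the lesson's dataset.
```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return (probability, state sequence) of the most likely path (max-product DP)."""
    trellis = [{s: (start_p[s] * emit_p[s][observations[0]], [s]) for s in states}]
    for obs in observations[1:]:
        column = {}
        for s in states:
            prob, path = max(
                (trellis[-1][prev][0] * trans_p[prev][s] * emit_p[s][obs],
                 trellis[-1][prev][1] + [s])
                for prev in states)
            column[s] = (prob, path)
        trellis.append(column)
    return max(trellis[-1].values())

states = ['Noun', 'Verb']
start_p = {'Noun': 0.6, 'Verb': 0.4}
trans_p = {'Noun': {'Noun': 0.3, 'Verb': 0.7}, 'Verb': {'Noun': 0.8, 'Verb': 0.2}}
emit_p = {'Noun': {'dogs': 0.6, 'bark': 0.1}, 'Verb': {'dogs': 0.1, 'bark': 0.7}}
print(viterbi(['dogs', 'bark'], states, start_p, trans_p, emit_p))   # tags 'dogs bark' as Noun, Verb
```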
-
Lesson 06: Part of Speech Tagging
In this project you will build a hidden Markov model (HMM) to perform part of speech tagging, a common pre-processing step in Natural Language Processing.
-
Lesson 07: Dynamic Time Warping
Thad explains the Dynamic Time Warping technique for working with time-series data.
-
Part 43 : Intro to Self-Driving Cars
Welcome! Learn how self-driving cars work.
-
Lesson 04: Introduction to Self-Driving Cars - The Carla Chronicles: Back on Track
Work through the readiness assessment with Carla and her friends to make sure you are ready to begin your own personal adventure with self-driving cars!
- Concept 01: The Carla Chronicles: Back on Track
- Concept 02: Meet the Crew
- Concept 03: Wheel Size Matters
- Concept 04: Challenge: Rotation
- Concept 05: Lost in Space
- Concept 06: Challenge: Localization
- Concept 07: It's Getting Hot in Here
- Concept 08: Challenge: Changing Tire Size
- Concept 09: What's the Plan, Stan?
- Concept 10: Challenge: Planning
- Concept 11: The Split Decision
- Concept 12: Challenge: Shortest Path
- Concept 13: The End?
- Concept 14: Thanks for Helping Carla!
-
Part 44 : Bayesian Thinking
Learn the framework that underlies a self-driving car’s understanding of itself and the world around it, and to see the world the way a self-driving car does.
-
Module 01: Bayesian Thinking
-
Lesson 01: Introduction
A brief introduction to Bayesian Thinking from Sebastian.
-
Lesson 02: Joy Ride
A quick introduction to controlling a (simulated) car with code. Parts 1 and 2 will show you how to control gas and steering, and in part 3 you'll program a car to parallel park.
-
Lesson 03: Probability
Learn the basics of probability - the language of robotics. This lesson will focus on the math. In later lessons you'll apply this math in Python code.
- Concept 01: Uncertainty in Driving
- Concept 02: Uncertainty in Robotics
- Concept 03: Learning Objectives Explained
- Concept 04: Learning Objectives - Probability
- Concept 05: Probability
- Concept 06: Flipping Coins
- Concept 07: Fair Coin
- Concept 08: Loaded Coin 1
- Concept 09: Loaded Coin 2
- Concept 10: Loaded Coin 3
- Concept 11: Complementary Outcomes
- Concept 12: Probability in Robotics
- Concept 13: Two Flips 1
- Concept 14: Two Flips 2
- Concept 15: Two Flips 3
- Concept 16: Two Flips 4
- Concept 17: Two Flips 5
- Concept 18: Two Cars 1-5
- Concept 19: One Head 1
- Concept 20: One Head 2
- Concept 21: One Of Three 1
- Concept 22: One Of Three 2
- Concept 23: Even Roll
- Concept 24: Doubles
- Concept 25: Summary
- Concept 26: [Optional] Cars and Probability
-
Lesson 04: Conditional Probability
In order to infer meaning from noisy sensor measurements, a self-driving car needs to use the math of Conditional Probability. Learn this math from Sebastian (and then apply it in the next lesson).
- Concept 01: Conditional Probability
- Concept 02: Intro to Conditional Probability
- Concept 03: Estimating Based on Conditions
- Concept 04: Dependent Events and Conditional Probability
- Concept 05: Learning Objective Recap Explained
- Concept 06: Learning Objectives - Probability
- Concept 07: Learning Objectives - Conditional Probability
- Concept 08: Dependent Things
- Concept 09: Notation Note
- Concept 10: Medical Example 1
- Concept 11: Medical Example 2
- Concept 12: Medical Example 3
- Concept 13: Medical Example 4
- Concept 14: Medical Example 5
- Concept 15: Medical Example 6
- Concept 16: Medical Example 7
- Concept 17: Medical Example 8
- Concept 18: Total Probability
- Concept 19: Two Coins 1
- Concept 20: Two Coins 2
- Concept 21: Two Coins 3
- Concept 22: Two Coins 4
- Concept 23: Summary
- Concept 24: [Optional] Cars and Conditional Probability
-
Lesson 05: Programming Probability in Python
Your chance to learn basic Python syntax while applying what you learned about probability and conditional probability in the last two lessons.
- Concept 01: Learn by Doing
- Concept 02: Your First Programming Practice!
- Concept 03: Python Variables [demonstration]
- Concept 04: Data Types [demonstration]
- Concept 05: Python Control Flow [demonstration]
- Concept 06: For Loops [demonstration]
- Concept 07: Lists and Loops [demonstration]
- Concept 08: List Comprehensions [demonstration]
- Concept 09: Python's random Library [demonstration]
- Concept 10: Learning with Playgrounds
- Concept 11: Simulating Coin Flips [playground]
- Concept 12: Functions [demonstration]
- Concept 13: Simulating Probabilities [demonstration]
- Concept 14: Exercises
- Concept 15: Probability of Collision [exercise]
- Concept 16: Probability of Collision [solution]
-
Lesson 06: Bayes' Rule
Learn about Bayes' Rule from Sebastian and get your first peek at how a self-driving car uses Bayes' Rule to understand where in the world it is.
- Concept 01: Reducing Uncertainty
- Concept 02: Bayes' Rule and Robotics
- Concept 03: Learning from Sensor Data
- Concept 04: Using Sensor Data
- Concept 05: Learning Objectives - Conditional Probability
- Concept 06: Learning Objectives - Bayes' Rule
- Concept 07: Bayes Rule
- Concept 08: Cancer Test
- Concept 09: Prior And Posterior
- Concept 10: Normalizing 1
- Concept 11: Normalizing 2
- Concept 12: Normalizing 3
- Concept 13: Total Probability
- Concept 14: Bayes Rule Diagram
- Concept 15: Equivalent Diagram
- Concept 16: Cancer Probabilities
- Concept 17: Probability Given Test
- Concept 18: Normalizer
- Concept 19: Normalizing Probability
- Concept 20: Disease Test 1
- Concept 21: Disease Test 2
- Concept 22: Disease Test 3
- Concept 23: Disease Test 4
- Concept 24: Disease Test 5
- Concept 25: Disease Test 6
- Concept 26: Bayes Rule Summary
- Concept 27: Robot Sensing 1
- Concept 28: Robot Sensing 2
- Concept 29: Robot Sensing 3
- Concept 30: Robot Sensing 4
- Concept 31: Robot Sensing 5
- Concept 32: Robot Sensing 6
- Concept 33: Robot Sensing 7
- Concept 34: Robot Sensing 8
- Concept 35: Generalizing
- Concept 36: Sebastian At Home
-
Lesson 07: Programming Bayes' Rule and World Representations
In this lesson, you can expect a lot of hands-on practice programming Bayesian probability in Python and representing the 2D world in which you'll need to localize a car.
- Concept 01: Bayes' Rule Steps
- Concept 02: Programming Probabilities [exercise]
- Concept 03: Total Probability [exercise]
- Concept 04: Testing the Total
- Concept 05: Programming Bayes' Rule [exercise]
- Concept 06: Testing Bayes' Rule
- Concept 07: Arrays [demonstration]
- Concept 08: Array Iteration and Stopping [exercise]
- Concept 09: 2D Arrays and the Robot World [demonstration]
- Concept 10: 2D Iteration [demonstration]
- Concept 11: Pattern Matching [exercise]
- Concept 12: Why use Numpy Arrays [demonstration]
-
Lesson 08: Probability Distributions
Learn how a robot represents its belief about uncertain quantities using something known as a probability distribution.
- Concept 01: Probability Distributions
- Concept 02: Intro to Probability Distributions Part One
- Concept 03: Intro to Probability Distributions Part Two
- Concept 04: Learning Objectives - Bayes' Rule
- Concept 05: Discrete vs. Continuous Variables
- Concept 06: Discrete Probability Distributions
- Concept 07: Discrete Probability [Exercise]
- Concept 08: Discrete Probability [Exercise] Solution
- Concept 09: Continuous Variables
- Concept 10: Landing Probability
- Concept 11: Spinning Probability
- Concept 12: Stops Nowhere
- Concept 13: Range Probability
- Concept 14: Range Probability 2
- Concept 15: Range Probability 3
- Concept 16: Continuous Probability Distributions
- Concept 17: Density
- Concept 18: Birth Time Density
- Concept 19: Changing Density
- Concept 20: Changing Density 2
- Concept 21: Check Density
- Concept 22: Calculate Density
- Concept 23: Density Properties
- Concept 24: Summary
-
Lesson 09: Programming Probability Distributions
Apply what you've learned in this course by programming and visualizing probability distributions.
- Concept 01: Prepare for a Challenge
- Concept 02: Programming Probability Distributions
- Concept 03: Math in Python [demonstration]
- Concept 04: Uniform Distribution [exercise]
- Concept 05: Uniform Distribution Solution
- Concept 06: Function Improvements [exercise]
- Concept 07: Function Improvements Solution
- Concept 08: Plotting in Python [demonstration]
- Concept 09: Visualizing Uniform Distributions [exercise]
- Concept 10: Visualizing Uniform Distributions Solution
- Concept 11: Visualizing Piece-wise Uniform Distributions
- Concept 12: Visualizing Piece-Wise Distributions [exercise]
- Concept 13: Visualizing Piece-Wise Distributions Solution
- Concept 14: 1-D Car World [exercise]
- Concept 15: 1-D Car World Solution
- Concept 16: 2-D Car World [demonstration]
- Concept 17: 2-D Car World [exercise]
- Concept 18: 2-D Car World Solution
- Concept 19: Conclusion
-
Lesson 10: Gaussian Distributions
You will work with a specific continuous probability distribution called the Gaussian distribution. A Gaussian distribution helps describe uncertainty in sensor measurements and a vehicle's location.
- Concept 01: Introduction
- Concept 02: Continuous Distributions
- Concept 03: Gaussian Distributions
- Concept 04: Gaussian Equation
- Concept 05: Mean
- Concept 06: Standard Deviation
- Concept 07: Plotting Gaussians in Python [exercise]
- Concept 08: Plotting Gaussians in Python [Solution]
- Concept 09: Area Under the Curve
- Concept 10: Calculating Area Under the Curve in Python
- Concept 11: Calculating Area Under the Curve [Solution]
- Concept 12: Central Limit Theorem [Optional]
- Concept 13: Central Limit Theorem [Optional Demo]
- Concept 14: Conclusion
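To pin down the two ideas at the center of this lesson, here is a minimal sketch of the Gaussian density and an area-under-the-curve approximation using thin rectangles; the mean, standard deviation, and interval are arbitrary.
```python
import math

def gaussian_density(x, mu, sigma):
    """f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)"""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def area_under_curve(mu, sigma, low, high, num_rectangles=10000):
    """Approximate P(low <= X <= high) with a simple Riemann sum of thin rectangles."""
    width = (high - low) / num_rectangles
    return sum(gaussian_density(low + (i + 0.5) * width, mu, sigma) * width
               for i in range(num_rectangles))

# roughly 0.68 of the probability lies within one standard deviation of the mean
print(area_under_curve(mu=10.0, sigma=2.0, low=8.0, high=12.0))
```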
-
Lesson 11: Robot Localization
Sebastian Thrun will give you an overview of the theory behind localization!
- Concept 01: About this Lesson
- Concept 02: Introduction
- Concept 03: Localization
- Concept 04: Total Probability
- Concept 05: Uniform Probability Quiz
- Concept 06: Uniform Distribution
- Concept 07: Generalized Uniform Distribution
- Concept 08: Probability After Sense
- Concept 09: Compute Sum
- Concept 10: Normalize Distribution
- Concept 11: pHit and pMiss
- Concept 12: Sum of Probabilities
- Concept 13: Sense Function
- Concept 14: Normalized Sense Function
- Concept 15: Test Sense Function
- Concept 16: Multiple Measurements
- Concept 17: Exact Motion
- Concept 18: Move Function
- Concept 19: Inexact Motion 1
- Concept 20: Inexact Motion 2
- Concept 21: Inexact Motion 3
- Concept 22: Inexact Move Function
- Concept 23: Limit Distribution Quiz
- Concept 24: Move Twice
- Concept 25: Move 1000
- Concept 26: Sense and Move
- Concept 27: Sense and Move 2
- Concept 28: Localization Summary
- Concept 29: Nanodegree Note
- Concept 30: Formal Definition of Probability 1
- Concept 31: Formal Definition of Probability 2
- Concept 32: Formal Definition of Probability 3
- Concept 33: Bayes' Rule
- Concept 34: Cancer Test
- Concept 35: Theorem of Total Probability
- Concept 36: Coin Flip Quiz
- Concept 37: Two Coin Quiz
-
Lesson 12: Histogram Filter in Python
Write the sense and move functions for a 2-dimensional histogram filter in Python.
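As a reference point, here is a minimal sketch of the 1-D versions of sense and move (the project asks you to generalize them to 2-D); the pHit/pMiss values, motion-noise values, and world are invented for illustration.
```python
def sense(prior, world, measurement, p_hit=0.6, p_miss=0.2):
    """Multiply each cell's belief by p_hit where the measurement matches, then normalize."""
    posterior = [p * (p_hit if cell == measurement else p_miss)
                 for p, cell in zip(prior, world)]
    total = sum(posterior)
    return [p / total for p in posterior]

def move(belief, step, p_exact=0.8, p_undershoot=0.1, p_overshoot=0.1):
    """Shift the belief by `step` cells in a cyclic world, blurring it to model inexact motion."""
    n = len(belief)
    return [p_exact * belief[(i - step) % n]
            + p_undershoot * belief[(i - step + 1) % n]
            + p_overshoot * belief[(i - step - 1) % n]
            for i in range(n)]

world = ['green', 'red', 'red', 'green', 'green']
belief = [0.2] * 5                       # start with a uniform prior
belief = sense(belief, world, 'red')     # measurement update
belief = move(belief, 1)                 # motion update
print(belief)
```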
-
Part 45 : Working with Matrices
This course will focus on two tools which are vital to self-driving car engineers: object oriented programming and linear algebra.
-
Module 01: Working with Matrices
-
Lesson 01: Section Overview
An introduction to the amazing tools and algorithms you'll learn in this lesson.
-
Lesson 02: Introduction to Kalman Filters
Learn the intuition behind the Kalman Filter, a vehicle tracking algorithm and implement a one-dimensional tracker of your own.
- Concept 01: Introduction
- Concept 02: Tracking Intro
- Concept 03: Gaussian Intro
- Concept 04: Variance Comparison
- Concept 05: Preferred Gaussian
- Concept 06: Evaluate Gaussian
- Concept 07: Maximize Gaussian
- Concept 08: Measurement and Motion 1
- Concept 09: Measurement and Motion 2
- Concept 10: Shifting the Mean
- Concept 11: Predicting the Peak
- Concept 12: Parameter Update
- Concept 13: Parameter Update 2
- Concept 14: Separated Gaussians
- Concept 15: Separated Gaussians 2
- Concept 16: New Mean and Variance
- Concept 17: Gaussian Motion
- Concept 18: Predict Function
- Concept 19: Kalman Filter Code
- Concept 20: Kalman Prediction
- Concept 21: A Break from Kalman Filters
-
Lesson 03: State and Object-Oriented Programming
In this lesson, you'll learn to represent the state of a car programmatically as classes and objects, and mathematically as vectors that can be changed with linear algebra!
- Concept 01: Localization Steps
- Concept 02: Intro to State
- Concept 03: Motion Models
- Concept 04: Quiz: Predicting State
- Concept 05: A Different Model
- Concept 06: Kinematics
- Concept 07: Quantifying State
- Concept 08: Lesson Outline
- Concept 09: Always Moving
- Concept 10: Objects
- Concept 11: Car Object
- Concept 12: Interacting with a Car Object
- Concept 13: Car Class
- Concept 14: Car Class File
- Concept 15: Look at the Class Code
- Concept 16: Turn Right
- Concept 17: Adding Color
- Concept 18: Instantiate Multiple Cars
- Concept 19: Color Class
- Concept 20: Overloading Functions
- Concept 21: Overloading Color Addition
- Concept 22: State Vector
- Concept 23: State Transformation Matrix
- Concept 24: Matrix Multiplication
- Concept 25: 1D State Vector and More Multiplication
- Concept 26: Modify Predict State
- Concept 27: Working with Matrices
-
Lesson 04: Matrices and Transformation of State
Linear Algebra is a rich branch of math and a useful tool. In this lesson you'll learn about the matrix operations that underlie multidimensional Kalman Filters.
- Concept 01: Connection to Kalman Filters
- Concept 02: Kalman Prediction
- Concept 03: Kalman Filter Land
- Concept 04: Kalman Filter Prediction
- Concept 05: Another Prediction
- Concept 06: More Kalman Filters
- Concept 07: A Note on Notation
- Concept 08: Kalman Filter Design
- Concept 09: Let's Look at Where we Are
- Concept 10: The Kalman Filter Equations
- Concept 11: Simplifying the Kalman Filter Equations
- Concept 12: The Rest of the Lesson
- Concept 13: Representing State with Matrices
- Concept 14: Kalman Equation Reference
- Concept 15: What is a vector?
- Concept 16: Vectors in Python
- Concept 17: Coding Vectors
- Concept 18: Coding Vectors (solution)
- Concept 19: Guide to Mathematical Notation
- Concept 20: Matrices in Python
- Concept 21: Coding Matrices
- Concept 22: Coding Matrices (Solution)
- Concept 23: Matrix Addition
- Concept 24: Coding Matrix Addition
- Concept 25: Coding Matrix Addition (Solution)
- Concept 26: Matrix Multiplication
- Concept 27: Coding Matrix Multiplication
- Concept 28: Coding Matrix Multiplication (Solution)
- Concept 29: Transpose of a Matrix
- Concept 30: Coding the Transpose
- Concept 31: Coding the Transpose (Solution)
- Concept 32: The Identity Matrix
- Concept 33: Coding Identity Matrix
- Concept 34: Coding Identity Matrix (Solution)
- Concept 35: Matrix Inverse
- Concept 36: Coding Matrix Inverse
- Concept 37: Coding Matrix Inverse (Solution)
- Concept 38: What to Take Away from this Lesson
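To show the flavor of the matrix routines this lesson builds in pure Python (lists of rows) before they reappear in the Kalman filter equations, here is a minimal sketch; the function names are illustrative, not the classroom's.
```python
def matrix_multiply(A, B):
    """Return A x B for matrices stored as lists of rows."""
    rows_a, cols_a, cols_b = len(A), len(A[0]), len(B[0])
    assert cols_a == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(cols_a)) for j in range(cols_b)]
            for i in range(rows_a)]

def transpose(A):
    """Swap rows and columns."""
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_multiply(A, B))   # [[19, 22], [43, 50]]
print(transpose(A))            # [[1, 3], [2, 4]]
```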
-
Lesson 05: Implement Matrix Class
Practice using your object-oriented programming and matrix math skills by filling out the methods in a partially-completed Matrix class.
-
Part 46 : C++ Basics
This course is the first step in a rewarding journey toward C++ expertise. The goal is translation: take a program written in Python and translate it into C++.
-
Module 01: C++ Basics
-
Lesson 01: C++ Getting Started
The differences between C++ and Python and how to write C++ code.
- Concept 01: Introduction
- Concept 02: Lesson Overview
- Concept 03: Elecia White
- Concept 04: Why C++
- Concept 05: Python and C++ Comparison
- Concept 06: Static vs Dynamic Typing
- Concept 07: C++ - A Statically Typed Language
- Concept 08: Basic Data Types
- Concept 09: Floating versus Double [demonstration]
- Concept 10: Doubles are Bigger
- Concept 11: Common Errors and Error Messages
- Concept 12: C++ Functions
- Concept 13: Anatomy of a Function
- Concept 14: Multiple Outputs
- Concept 15: Two Functions Same Name
- Concept 16: Function Signatures 1
- Concept 17: Function Signatures 2
- Concept 18: If and Boolean Logic
- Concept 19: While and For Loops
- Concept 20: Switch Statement
- Concept 21: Libraries
- Concept 22: Forge on!
-
Lesson 02: C++ Vectors
To program matrix algebra operations and translate your Python code, you will need to use C++ Vectors. These vectors are similar to Python lists, but the syntax can be somewhat tricky.
- Concept 01: C++ Vectors
- Concept 02: Namespaces
- Concept 03: Python Lists vs. C++ Vectors
- Concept 04: Initializing Vector Values
- Concept 05: Vector Methods
- Concept 06: Vectors and For Loops
- Concept 07: Math and Vectors
- Concept 08: 1D Vector Playground
- Concept 09: 2D Vectors
- Concept 10: 2D Vectors and For Loops
- Concept 11: 2D Vector Playground
- Concept 12: Next Lesson
-
Lesson 03: Practical C++
Learn how to write C++ code on your own computer and compile it into an executable program without running into too many compilation errors.
-
Lesson 04: C++ Object Oriented Programming
Learn the syntax of C++ object oriented programming as well as some of the additional OOP features provided by the language.
- Concept 01: Introduction
- Concept 02: Python vs. C++
- Concept 03: Why use Object Oriented Programming?
- Concept 04: Using a Class in C++ [Demo]
- Concept 05: Explanation of the Main.cpp File
- Concept 06: Practice Using a Class
- Concept 07: Review: Anatomy of a Class
- Concept 08: Other Facets of C++ Classes
- Concept 09: Private and Public
- Concept 10: Header Files
- Concept 11: Inclusion Guards
- Concept 12: Implement a Class
- Concept 13: Class Variables
- Concept 14: Class Function Declarations
- Concept 15: Constructor Functions
- Concept 16: Set and Get Functions
- Concept 17: Matrix Functions
- Concept 18: Use an Inclusion Guard
- Concept 19: Instantiate an Object
- Concept 20: Running your Program Locally
-
Lesson 05: Python and C++ Speed
In this lesson, we'll compare the execution times of C++ and Python programs.
-
Lesson 06: Translate Python to C++
Apply your knowledge of C++ syntax by translating the Histogram Filter code from the first course into C++.
-
Part 47 : Performance Programming in C++
Explore how to write good code that runs correctly. We'll focus primarily on low-level features of C++, but we'll discuss other best practices as well.
-
Module 01: Performance Programming in C++
-
Lesson 01: C++ Intro to Optimization
Optimizing C++ involves understanding how a computer actually runs your programs. You'll learn how C++ uses the CPU and RAM to execute your code and get a sense for what can slow things down.
- Concept 01: Course Introduction
- Concept 02: Empathize with the Computer
- Concept 03: Intro to Computer Hardware
- Concept 04: Embedded Terminal Explanation
- Concept 05: Demo: Machine Code
- Concept 06: Assembly Language
- Concept 07: Binary
- Concept 08: Demo: Binary
- Concept 09: Demo: Binary Floats
- Concept 10: Memory and the CPU
- Concept 11: Demo: Stack vs Heap
- Concept 12: Outro
-
Lesson 02: C++ Optimization Practice
Now that you understand how C++ programs execute, it's time to learn specific optimization techniques and put them into practice. This lesson will prepare you for the code optimization project.
- Concept 01: Introduction
- Concept 02: Software Development and Optimization
- Concept 03: Optimization Techniques
- Concept 04: Dead Code
- Concept 05: Exercise: Remove Dead Code
- Concept 06: If Statements
- Concept 07: Exercise: If Statements
- Concept 08: For Loops
- Concept 09: Exercise: For Loops
- Concept 10: Intermediate Variables
- Concept 11: Exercise: Intermediate Variables
- Concept 12: Vector Storage
- Concept 13: Exercise: Vector Storage
- Concept 14: References
- Concept 15: Exercise: References
- Concept 16: Sebastian's Synchronization Story
- Concept 17: Static Keyword
- Concept 18: Exercise: Static Keyword
- Concept 19: Speed Challenge
-
Lesson 03: Project: Optimize Histogram Filter
Get ready to optimize some C++ code. You are provided with a working 2-dimensional histogram filter; your job is to get the histogram filter code to run faster!
-
Part 48 : Navigating Data Structures
Algorithmic thinking is a skill you’ll refine throughout your career. In this course you’ll focus on frequently used data structures and algorithms.
-
Module 01: Navigating Data Structures
-
Lesson 01: How to Solve Problems
In this lesson you'll solve a hard problem with the help of Dave Evans and you'll learn a systematic approach to solving hard computer programming problems as you do.
- Concept 01: Course Overview
- Concept 02: About this Lesson
- Concept 03: How to Solve Problems
- Concept 04: Days Between Dates
- Concept 05: Attempting the Problem [workspace]
- Concept 06: First Step
- Concept 07: Understanding a Problem
- Concept 08: The First Rule
- Concept 09: What Are the Inputs
- Concept 10: How Are Inputs Represented
- Concept 11: What Are the Outputs
- Concept 12: Obey the Rules
- Concept 13: Next Step
- Concept 14: The Expected Output
- Concept 15: Take the Next Step
- Concept 16: Try an Example
- Concept 17: Harder Example
- Concept 18: Algorithm Pseudocode
- Concept 19: Should We Implement It
- Concept 20: Different Approach
- Concept 21: Simple Mechanical Algorithm
- Concept 22: Don't Optimize Prematurely
- Concept 23: What Should We Write First
- Concept 24: Define Simple nextDay
- Concept 25: Making Progress Is Good
- Concept 26: What Should We Do Next
- Concept 27: Define daysBetweenDates
- Concept 28: Step One Pseudocode
- Concept 29: Step Two Helper Function
- Concept 30: Step Three daysBetweenDates
- Concept 31: Test for Valid Inputs
- Concept 32: Real World Problem
- Concept 33: Best Strategy
- Concept 34: Completing the Problem
- Concept 35: Finish daysBetweenDates
- Concept 36: Solution Step I
- Concept 37: Solution Step II
- Concept 38: Solution Step III
- Concept 39: Solution Step IV
- Concept 40: Conclusion
-
Lesson 02: Data Structures
The list isn't the only structure for storing data! In this lesson you'll learn about sets, dictionaries and other Python data structures.
- Concept 01: Lesson Overview
- Concept 02: Tracking Tickets
- Concept 03: Design Tradeoffs
- Concept 04: Three Approaches
- Concept 05: Representing a Single Ticket
- Concept 06: The problems with strings and lists
- Concept 07: Intro to Dictionaries
- Concept 08: Intro to Dictionaries 2
- Concept 09: Keys and Values
- Concept 10: Adding Labels
- Concept 11: Implementing Labels (and Introducing Sets)
- Concept 12: Performance Considerations
- Concept 13: Lists, Timing, and Performance
- Concept 14: How Lists Work
- Concept 15: Performance of Sets and Dictionaries
- Concept 16: How Sets and Dictionaries Work
- Concept 17: Other Data Structures [optional]
- Concept 18: Choosing good data structures
- Concept 19: Conclusion
-
Lesson 03: The Search Problem
When programming a car to drive itself, you run into many problems that turn out to be "search" problems. In this lesson you'll learn what search problems are and several algorithms for solving them.
- Concept 01: Lesson Overview
- Concept 02: Introduction
- Concept 03: What Is A Problem?
- Concept 04: Example: Route Finding
- Concept 05: Quiz: Tree Search
- Concept 06: Tree Search Continued
- Concept 07: Quiz: Graph Search
- Concept 08: Quiz: Breadth First Search 1
- Concept 09: Breadth First Search 2
- Concept 10: Quiz: Breadth First Search 3
- Concept 11: Breadth First Search 4
- Concept 12: Breadth First Search 5
- Concept 13: Uniform Cost Search
- Concept 14: Uniform Cost Search 1
- Concept 15: Uniform Cost Search 2
- Concept 16: Uniform Cost Search 3
- Concept 17: Uniform Cost Search 4
- Concept 18: Uniform Cost Search 5
- Concept 19: Quiz: Search Comparison
- Concept 20: Search Comparison 1
- Concept 21: Quiz: Search Comparison 2
- Concept 22: Search Comparison 3
- Concept 23: On Uniform Cost
- Concept 24: A* Search
- Concept 25: A* Search 1
- Concept 26: A* Search 2
- Concept 27: A* Search 3
- Concept 28: A* Search 4
- Concept 29: A* Search 5
- Concept 30: Optimistic Heuristic
- Concept 31: Quiz: Sliding Blocks Puzzle
- Concept 32: Sliding Blocks Puzzle 1
- Concept 33: Sliding Blocks Puzzle 2
- Concept 34: Problems with Search
- Concept 35: A Note on Implementation
-
Lesson 04: Implement Route Planner
In this lesson you will implement a Google Maps-style routing algorithm using A* search.
-
Part 49 : Vehicle Motion and Control
This course is a crash course in two branches of mathematics which are crucial to self-driving cars: calculus and trigonometry. You will learn how a self-driving car uses various motion sensors to help it understand its own motion. At the end of this course you will use raw sensor data (which gives information about distance driven, acceleration, and rotation rates) to reconstruct a vehicle's trajectory through space.
-
Module 01: Vehicle Motion and Control
-
Lesson 01: Odometers, Speedometers and Derivatives
Gain a conceptual understanding of the derivative and basic calculus by plotting points and finding slopes.
- Concept 01: Teleoperation at Phantom Auto
- Concept 02: Inertial Navigation
- Concept 03: Course Overview
- Concept 04: Inertial Navigation Sensors
- Concept 05: Afternoon Drive
- Concept 06: Delta x over Delta t
- Concept 07: Reducing Delta t
- Concept 08: Plotting Position vs. Time
- Concept 09: Interpreting Position vs. Time Graphs
- Concept 10: Average vs. Instantaneous Speed
- Concept 11: Defining the Derivative
- Concept 12: Understanding the Derivative
- Concept 13: Differential Notation
- Concept 14: A "Typical" Calculus Problem
- Concept 15: How Odometers Work
- Concept 16: Speed from Position Data
- Concept 17: Position, Velocity, and Acceleration
- Concept 18: Implement an Accelerometer
- Concept 19: Summary
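Here is a minimal sketch of the "speed from position data" idea: approximate the derivative dx/dt with finite differences between consecutive timestamped odometer readings. The data points are made up.
```python
def speed_estimates(timestamps, displacements):
    """Approximate dx/dt with the slope between each pair of consecutive samples."""
    return [(displacements[i + 1] - displacements[i]) / (timestamps[i + 1] - timestamps[i])
            for i in range(len(timestamps) - 1)]

timestamps = [0.0, 1.0, 2.0, 3.0]        # seconds
displacements = [0.0, 2.0, 6.0, 12.0]    # meters driven (from the odometer)
print(speed_estimates(timestamps, displacements))   # [2.0, 4.0, 6.0] m/s
```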
-
Lesson 02: Accelerometers, Rate Gyros and Integrals
Learn how integrals can be used to calculate accumulated changes by finding the area under a curve.
- Concept 01: Lesson Introduction
- Concept 02: Differentiation Recap
- Concept 03: Acceleration Basics
- Concept 04: Plotting Elevator Acceleration
- Concept 05: Reasoning About Two Peaks
- Concept 06: The Integral: Area Under a Curve
- Concept 07: Approximating the Integral
- Concept 08: Approximating Integrals with Code
- Concept 09: Integrating Accelerometer Data
- Concept 10: Rate Gyros
- Concept 11: Integrating Rate Gyro Data
- Concept 12: Working with Real Data
- Concept 13: Accumulating Errors
- Concept 14: Sensor Strengths and Weaknesses
- Concept 15: Lesson Summary
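And the complementary sketch for this lesson: approximate the integral of accelerometer data with thin rectangles to recover a running velocity estimate. The constant timestep and sample values are invented.
```python
def integrate(samples, dt):
    """Accumulate area under the curve: running sum of sample * dt (rectangle rule)."""
    total, running = 0.0, []
    for value in samples:
        total += value * dt
        running.append(total)
    return running

accelerations = [0.0, 1.0, 1.0, 0.5, 0.0, -0.5]   # m/s^2, sampled every 0.5 s
print(integrate(accelerations, dt=0.5))            # velocity estimates over time
```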
-
Lesson 03: Two Dimensional Robot Motion and Trigonometry
Learn the basics of trigonometry and how to decompose a self-driving car's motion into X and Y components.
- Concept 01: Lesson Introduction
- Concept 02: Plotting Robot Motion (right angles only)
- Concept 03: Plotting Robot Motion Solution
- Concept 04: Moving at an Angle
- Concept 05: Moving at 53.13 Degrees
- Concept 06: Who Cares About 53.13 Degrees?
- Concept 07: The Power of Trigonometry
- Concept 08: Opposite, Adjacent, Hypotenuse
- Concept 09: Trigonometric Ratios
- Concept 10: Looking up Sin, Cos, and Tan
- Concept 11: Trigonometry and Vehicle Motion
- Concept 12: Solving Trig Problems
- Concept 13: Keeping Track of x and y
- Concept 14: Keeping Track of x and y (solution)
- Concept 15: Conclusion
-
Lesson 04: Reconstructing Trajectories from Sensor Data
Use raw acceleration, displacement, and angular rotation data from a vehicle's accelerometer, odometer, and rate gyros to reconstruct a vehicle's X, Y trajectory.
-
Part 50 : Computer Vision, Deep Learning, and Sensor Fusion
Here, you'll first become an expert in applying Computer Vision and Deep Learning to automotive problems. You will teach the car to detect lane lines, predict steering angles, and more, all based on camera data alone, before working with lidar and radar data later on.
-
Module 01: Computer Vision, Deep Learning, and Sensor Fusion
-
Lesson 04: Computer Vision Fundamentals
Get a taste of some basic computer vision techniques to find lane markings on the road.
- Concept 01: Power of Cameras
- Concept 02: Setting up the Problem
- Concept 03: Color Selection
- Concept 04: Color Selection Code Example
- Concept 05: Color Selection
- Concept 06: Region Masking
- Concept 07: Color and Region Combined
- Concept 08: Color Region
- Concept 09: Finding Lines of Any Color
- Concept 10: What is Computer Vision?
- Concept 11: Canny Edge Detection
- Concept 12: Canny to Detect Lane Lines
- Concept 13: Canny Edges
- Concept 14: Hough Transform
- Concept 15: Hough Transform to Find Lane Lines
- Concept 16: Hough Transform
- Concept 17: Parameter Tuning
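Here is a minimal sketch of the Canny-plus-Hough pipeline from this lesson using OpenCV; the image path is a placeholder and the thresholds are tuning knobs you would adjust for your own road images.
```python
import cv2
import numpy as np

# placeholder path -- substitute one of your own road images
image = cv2.imread('test_road_image.jpg')
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress noise before edge detection
edges = cv2.Canny(blurred, 50, 150)           # low/high thresholds are tuning knobs

# probabilistic Hough transform: returns line segments as (x1, y1, x2, y2)
lines = cv2.HoughLinesP(edges, rho=2, theta=np.pi / 180, threshold=15,
                        minLineLength=40, maxLineGap=20)

if lines is not None:
    for x1, y1, x2, y2 in lines.reshape(-1, 4):
        # draw candidate lane-line segments on the original image
        cv2.line(image, (int(x1), int(y1)), (int(x2), int(y2)), (0, 0, 255), 3)
cv2.imwrite('lines_detected.jpg', image)
```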
-
Lesson 05: Project: Finding Lane Lines
Write code to identify lane lines on the road, first in an image, and later in a video stream.
-
Lesson 06: Camera Calibration
Learn how to calibrate your camera to remove inherent distortions that can affect its perception of the world.
- Concept 01: The Challenges with Cameras
- Concept 02: Welcome to Computer Vision
- Concept 03: Overview
- Concept 04: Getting Started
- Concept 05: Distortion Correction
- Concept 06: Effects of Distortion
- Concept 07: Pinhole Camera Model
- Concept 08: Image Formation
- Concept 09: Measuring Distortion
- Concept 10: Finding Corners
- Concept 11: Calibrating Your Camera
- Concept 12: Correcting for Distortion
- Concept 13: Lane Curvature
- Concept 14: Perspective Transform
- Concept 15: Curvature and Perspective
- Concept 16: Transform a Stop Sign
- Concept 17: Intuitions
- Concept 18: Undistort and Transform
- Concept 19: How I Did It
- Concept 20: Coming Up
-
Lesson 07: Gradients and Color Spaces
Learn how to use gradient thresholds and different color spaces to more easily identify lane markings on the road.
- Concept 01: Gradient Threshold
- Concept 02: Sobel Operator
- Concept 03: Applying Sobel
- Concept 04: Magnitude of the Gradient
- Concept 05: Direction of the Gradient
- Concept 06: Combining Thresholds
- Concept 07: Color Spaces
- Concept 08: Color Thresholding
- Concept 09: HLS intuitions
- Concept 10: HLS and Color Thresholds
- Concept 11: HLS Quiz
- Concept 12: Color and Gradient
-
Lesson 08: Advanced Computer Vision
Discover more advanced computer vision techniques to improve upon your lane lines algorithm!
- Concept 01: Reviewing Steps
- Concept 02: Processing Each Image
- Concept 03: Finding the Lines: Histogram Peaks
- Concept 04: Finding the Lines: Sliding Window
- Concept 05: Finding the Lines: Search from Prior
- Concept 06: Measuring Curvature I
- Concept 07: Measuring Curvature II
- Concept 08: Bonus Round: Computer Vision [Optional]
-
Lesson 09: Project: Advanced Lane Finding
Write a software pipeline to identify the lane boundaries in a video from a front-facing camera on a car.
-
Lesson 10: Neural Networks
Build and train neural networks from linear and logistic regression to backpropagation and multilayer perceptron networks.
- Concept 01: Neural Network Intuition
- Concept 02: Introduction to Deep Learning
- Concept 03: Starting Machine Learning
- Concept 04: A Note on Deep Learning
- Concept 05: Quiz: Housing Prices
- Concept 06: Solution: Housing Prices
- Concept 07: Linear to Logistic Regression
- Concept 08: Classification Problems 1
- Concept 09: Classification Problems 2
- Concept 10: Linear Boundaries
- Concept 11: Higher Dimensions
- Concept 12: Perceptrons
- Concept 13: Perceptrons II
- Concept 14: Why "Neural Networks"?
- Concept 15: Perceptrons as Logical Operators
- Concept 16: Perceptron Trick
- Concept 17: Perceptron Algorithm
- Concept 18: Error Functions
- Concept 19: Log-loss Error Function
- Concept 20: Discrete vs Continuous
- Concept 21: Softmax
- Concept 22: One-Hot Encoding
- Concept 23: Maximum Likelihood
- Concept 24: Maximizing Probabilities
- Concept 25: Cross-Entropy 1
- Concept 26: Cross-Entropy 2
- Concept 27: Multi-Class Cross Entropy
- Concept 28: Logistic Regression
- Concept 29: Gradient Descent
- Concept 30: Gradient Descent: The Code
- Concept 31: Perceptron vs Gradient Descent
- Concept 32: Continuous Perceptrons
- Concept 33: Non-linear Data
- Concept 34: Non-Linear Models
- Concept 35: Neural Network Architecture
- Concept 36: Feedforward
- Concept 37: Multilayer Perceptrons
- Concept 38: Backpropagation
- Concept 39: Further Reading
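To ground the "linear to logistic regression, trained with gradient descent" thread of this lesson, here is a minimal NumPy sketch of logistic regression on a made-up 2-D dataset; the data, learning rate, and epoch count are arbitrary.
```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy 2-D points: class 1 sits roughly above the line x1 + x2 = 1
X = np.array([[0.2, 0.1], [0.4, 0.3], [0.8, 0.9], [0.7, 0.6]])
y = np.array([0, 0, 1, 1])

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=2)
bias = 0.0
learning_rate = 0.5

for epoch in range(2000):
    predictions = sigmoid(X @ weights + bias)     # forward pass
    error = predictions - y                       # gradient of cross-entropy w.r.t. the logits
    weights -= learning_rate * X.T @ error / len(y)
    bias -= learning_rate * error.mean()

print(np.round(sigmoid(X @ weights + bias), 2))   # should approach [0, 0, 1, 1]
```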
-
Lesson 11: TensorFlow
The Principal Scientist at Google Brain introduces you to deep learning and TensorFlow, Google's deep learning framework.
- Concept 01: Deep Learning Frameworks
- Concept 02: Introduction to Deep Neural Networks
- Concept 03: What is Deep Learning?
- Concept 04: Solving Problems - Big and Small
- Concept 05: Let's Get Started!
- Concept 06: Installing TensorFlow
- Concept 07: Hello, Tensor World!
- Concept 08: Quiz: Tensorflow Input
- Concept 09: Quiz: Tensorflow Math
- Concept 10: Transition to Classification
- Concept 11: Supervised Classification
- Concept 12: Let's make a deal
- Concept 13: Training Your Logistic Classifier
- Concept 14: TensorFlow Linear Function
- Concept 15: Quiz: Linear Function
- Concept 16: Linear Update
- Concept 17: Quiz: Softmax
- Concept 18: Quiz: TensorFlow Softmax Workspaces
- Concept 19: One-Hot Encoding
- Concept 20: Quiz: One-Hot Encoding
- Concept 21: Cross Entropy
- Concept 22: Minimizing Cross Entropy
- Concept 23: Practical Aspects of Learning
- Concept 24: Quiz: Numerical Stability
- Concept 25: Normalized Inputs and Initial Weights
- Concept 26: Measuring Performance
- Concept 27: Transition: Overfitting -> Dataset Size
- Concept 28: Validation and Test Set Size
- Concept 29: Validation Set Size
- Concept 30: Validation Test Set Size Continued
- Concept 31: Optimizing a Logistic Classifier
- Concept 32: Stochastic Gradient Descent
- Concept 33: Momentum and Learning Rate Decay
- Concept 34: Parameter Hyperspace!
- Concept 35: Mini-batch
- Concept 36: Quiz 2: Mini-batch
- Concept 37: Epochs
- Concept 38: Intro TensorFlow Neural Network
- Concept 39: Lab: Neural Network Workspaces
-
Lesson 12: Deep Neural Networks
Learn how to go from a simple neural network to a deep neural network.
- Concept 01: Let's Go Deeper
- Concept 02: Intro to Deep Neural Networks
- Concept 03: Number of Parameters
- Concept 04: Linear Models are Limited
- Concept 05: Rectified Linear Units
- Concept 06: Network of ReLUs
- Concept 07: 2-Layer Neural Network
- Concept 08: Quiz: TensorFlow ReLu
- Concept 09: No Neurons
- Concept 10: The Chain Rule
- Concept 11: Backprop
- Concept 12: Deep Neural Network in TensorFlow
- Concept 13: Training a Deep Learning Network
- Concept 14: Save and Restore TensorFlow Models
- Concept 15: Finetuning
- Concept 16: Regularization Intro
- Concept 17: Regularization
- Concept 18: Regularization Quiz
- Concept 19: Dropout
- Concept 20: Dropout Pt. 2
- Concept 21: Quiz: TensorFlow Dropout
- Concept 22: Quiz 2: TensorFlow Dropout
-
Lesson 13: Convolutional Neural Networks
Learn the theory behind Convolutional Neural Networks and how they help us dramatically improve performance in image classification.
- Concept 01: CNNs Have Taken Over
- Concept 02: Intro To CNNs
- Concept 03: Color
- Concept 04: Statistical Invariance
- Concept 05: Convolutional Networks
- Concept 06: Intuition
- Concept 07: Filters
- Concept 08: Feature Map Sizes
- Concept 09: Convolutions continued
- Concept 10: Parameters
- Concept 11: Quiz: Convolution Output Shape
- Concept 12: Solution: Convolution Output Shape
- Concept 13: Quiz: Number of Parameters
- Concept 14: Solution: Number of Parameters
- Concept 15: Quiz: Parameter Sharing
- Concept 16: Solution: Parameter Sharing
- Concept 17: Visualizing CNNs
- Concept 18: TensorFlow Convolution Layer
- Concept 19: Explore The Design Space
- Concept 20: TensorFlow Max Pooling
- Concept 21: Quiz: Pooling Intuition
- Concept 22: Solution: Pooling Intuition
- Concept 23: Quiz: Pooling Mechanics
- Concept 24: Solution: Pooling Mechanics
- Concept 25: Quiz: Pooling Practice
- Concept 26: Solution: Pooling Practice
- Concept 27: Quiz: Average Pooling
- Concept 28: Solution: Average Pooling
- Concept 29: 1x1 Convolutions
- Concept 30: Inception Module
- Concept 31: Convolutional Network in TensorFlow
- Concept 32: TensorFlow Convolutional Layer Workspaces
- Concept 33: Solution: TensorFlow Convolution Layer
- Concept 34: TensorFlow Pooling Layer Workspaces
- Concept 35: Solution: TensorFlow Pooling Layer
- Concept 36: Lab: LeNet in TensorFlow
- Concept 37: LeNet Lab Workspace
- Concept 38: CNNs - Additional Resources
-
Lesson 14: LeNet for Traffic Signs
Using the famous LeNet neural network architecture, take your first steps toward building a Traffic Sign classifier!
- Concept 01: Introduction
- Concept 02: LeNet Architecture
- Concept 03: LeNet Data
- Concept 04: LeNet Implementation
- Concept 05: LeNet Training Pipeline
- Concept 06: LeNet Evaluation Pipeline
- Concept 07: LeNet Training the Model
- Concept 08: LeNet Testing
- Concept 09: LeNet for Traffic Signs
- Concept 10: Visualizing Layers
-
Lesson 15: Project: Traffic Sign Classifier
Put your skills to the test by using deep learning to classify different traffic signs!
-
Lesson 16: Keras
Take on the neural network framework, Keras! Build and train neural networks more easily.
- Concept 01: Deep Learning Breakthroughs
- Concept 02: Introduction
- Concept 03: Deep Learning Frameworks
- Concept 04: High Level Frameworks
- Concept 05: Keras Overview
- Concept 06: Neural Networks in Keras
- Concept 07: Convolutions in Keras
- Concept 08: Pooling in Keras
- Concept 09: Dropout in Keras
- Concept 10: Testing in Keras
- Concept 11: Conclusion
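To show how compactly Keras expresses the same ideas, here is a minimal sketch of the Sequential/compile/fit flow (assuming TensorFlow 2.x is installed); the layer sizes, input shape, and random stand-in data are arbitrary.
```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

# random stand-in data: 100 RGB images, 32x32, with 10 made-up class labels
x_train = np.random.rand(100, 32, 32, 3).astype('float32')
y_train = np.random.randint(0, 10, size=100)

model = Sequential([
    Conv2D(16, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    MaxPooling2D((2, 2)),
    Dropout(0.25),
    Flatten(),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=3, batch_size=32, validation_split=0.2)
```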
-
Lesson 17: Transfer Learning
Learn about some of the most famous neural network architectures, and how you can use them to create new models by leveraging existing canonical networks.
- Concept 01: Introduction
- Concept 02: Bryan Catanzaro
- Concept 03: GPU vs. CPU
- Concept 04: Transfer Learning
- Concept 05: Deep Learning History
- Concept 06: ImageNet
- Concept 07: AlexNet
- Concept 08: AlexNet Today
- Concept 09: VGG
- Concept 10: Empirics
- Concept 11: GoogLeNet
- Concept 12: ResNet
- Concept 13: Without Pre-trained Weights
- Concept 14: Lab: Transfer Learning
- Concept 15: Outro
- Concept 16: Bonus Round: Deep Learning [Optional]
-
Lesson 18: Project: Behavioral Cloning
Train a deep neural network to drive a car like you!
- Concept 01: Vehicle Simulator
- Concept 02: Intro to Behavioral Cloning Project
- Concept 03: Project Resources
- Concept 04: Running the Simulator
- Concept 05: Data Collection Tactics
- Concept 06: Data Collection Strategies
- Concept 07: Data Visualization
- Concept 08: Training Your Network
- Concept 09: Running Your Network
- Concept 10: Data Preprocessing
- Concept 11: More Networks
- Concept 12: Data Augmentation
- Concept 13: Using Multiple Cameras
- Concept 14: Cropping Images in Keras
- Concept 15: Even More Powerful Network
- Concept 16: More Data Collection
- Concept 17: Visualizing Loss
- Concept 18: Generators
- Concept 19: Recording Video in Autonomous Mode
- Concept 20: Project Workspace Instructions
- Concept 21: Project Behavioral Cloning
- Concept 22: Share your success - Behavioral Cloning
-
Lesson 20: Sensors
Meet the team at Mercedes who will help you track objects in real-time with Sensor Fusion.
-
Lesson 21: Kalman Filters
Sebastian Thrun will walk you through the usage and concepts of a Kalman Filter using Python.
- Concept 01: Introduction
- Concept 02: Tracking Intro
- Concept 03: Gaussian Intro
- Concept 04: Variance Comparison
- Concept 05: Preferred Gaussian
- Concept 06: Evaluate Gaussian
- Concept 07: Maximize Gaussian
- Concept 08: Measurement and Motion
- Concept 09: Shifting the Mean
- Concept 10: Predicting the Peak
- Concept 11: Parameter Update
- Concept 12: Parameter Update 2
- Concept 13: Separated Gaussians
- Concept 14: Separated Gaussians 2
- Concept 15: New Mean and Variance
- Concept 16: Gaussian Motion
- Concept 17: Predict Function
- Concept 18: Kalman Filter Code
- Concept 19: Kalman Prediction
- Concept 20: Kalman Filter Land
- Concept 21: Kalman Filter Prediction
- Concept 22: Another Prediction
- Concept 23: More Kalman Filters
- Concept 24: Kalman Filter Design
- Concept 25: Kalman Matrices
- Concept 26: Conclusion
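As a preview of the measurement-update and motion-update steps listed above, here is a minimal 1D Kalman filter sketch in Python; the measurements, motions, and noise values are made up for illustration.

```python
# Minimal 1D Kalman filter sketch: Gaussian measurement update and motion update.
def update(mean1, var1, mean2, var2):
    """Combine a prior Gaussian with a measurement Gaussian."""
    new_mean = (var2 * mean1 + var1 * mean2) / (var1 + var2)
    new_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return new_mean, new_var

def predict(mean1, var1, mean2, var2):
    """Shift the belief by a motion Gaussian (uncertainty adds)."""
    return mean1 + mean2, var1 + var2

measurements = [5.0, 6.0, 7.0, 9.0, 10.0]   # illustrative values
motions = [1.0, 1.0, 2.0, 1.0, 1.0]
measurement_sig, motion_sig = 4.0, 2.0
mu, sig = 0.0, 10000.0                       # start with a very uncertain prior

for z, u in zip(measurements, motions):
    mu, sig = update(mu, sig, z, measurement_sig)
    mu, sig = predict(mu, sig, u, motion_sig)
print(mu, sig)
```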
-
Lesson 22: C++ Checkpoint
Are you ready to build Kalman Filters with C++? Take these quizzes to find out!
- Concept 01: High Performance Computing
- Concept 02: [Optional] Beta Test our Upcoming C++ Course
- Concept 03: Challenge 1
- Concept 04: Challenge 1 Solution
- Concept 05: Challenge 2
- Concept 06: Challenge 2 Solution
- Concept 07: Challenge 3
- Concept 08: Challenge 3 Solution
- Concept 09: Challenge 4
- Concept 10: Challenge 4 Solution
- Concept 11: Challenge 5
- Concept 12: Challenge 5 Solution
- Concept 13: Outro and Advice
-
Lesson 23: Geometry and Trigonometry Refresher
This optional content is designed to refresh knowledge of trigonometry and geometry in support of Term 1 objectives.
- Concept 01: About this Lesson
- Concept 02: Lesson Introduction
- Concept 03: Plotting Robot Motion (right angles only)
- Concept 04: Plotting Robot Motion Solution
- Concept 05: Moving at an Angle
- Concept 06: Moving at 53.13 Degrees
- Concept 07: Who Cares About 53.13 Degrees?
- Concept 08: The Power of Trigonometry
- Concept 09: Opposite, Adjacent, Hypotenuse
- Concept 10: Trigonometric Ratios
- Concept 11: Looking up Sin, Cos, and Tan
- Concept 12: Trigonometry and Vehicle Motion
- Concept 13: Solving Trig Problems
- Concept 14: Keeping Track of x and y
- Concept 15: Keeping Track of x and y (solution)
- Concept 16: Conclusion
-
Lesson 24: Extended Kalman Filters
Build a Kalman Filter in C++ that's capable of handling data from multiple sources.
- Concept 01: Kalman Filters in C++
- Concept 02: Intro
- Concept 03: Lesson Map and Fusion Flow
- Concept 04: Lesson Variables and Equations
- Concept 05: Estimation Problem Refresh
- Concept 06: Kalman Filter Intuition
- Concept 07: Kalman Filter Equations in C++ Part 1
- Concept 08: Kalman Filter Equations in C++ Part 2
- Concept 09: State Prediction
- Concept 10: Process Covariance Matrix
- Concept 11: Laser Measurements Part 1
- Concept 12: Laser Measurements Part 2
- Concept 13: Laser Measurements Part 3
- Concept 14: Laser Measurements Part 4
- Concept 15: Radar Measurements
- Concept 16: Mapping with a Nonlinear Function
- Concept 17: Extended Kalman Filter
- Concept 18: Multivariate Taylor Series Expansion
- Concept 19: Jacobian Matrix Part 1
- Concept 20: Jacobian Matrix Part 2
- Concept 21: EKF Algorithm Generalization
- Concept 22: Sensor Fusion General Flow
- Concept 23: Evaluating KF Performance Part 1
- Concept 24: Evaluating KF Performance 2
- Concept 25: Outro
- Concept 26: Bonus Round: Sensor Fusion [Optional]
-
Lesson 25: Extended Kalman Filters Project
Apply everything you've learned about Sensor Fusion by implementing an Extended Kalman Filter in C++!
- Concept 01: Back to Bayes Theorem
- Concept 02: Intro to Extended Kalman Filter Project
- Concept 03: Data File for EKF project
- Concept 04: File Structure
- Concept 05: Main.cpp
- Concept 06: Project Code
- Concept 07: Tips and Tricks
- Concept 08: Project Resources
- Concept 09: Project Instructions for workspaces
- Concept 10: Project Extended Kalman Filter Workspace
- Concept 11: Project Instructions for local setup
- Concept 12: uWebSocketIO Starter Guide
- Concept 13: Environment Setup (Windows)
- Concept 14: Environment Setup (Linux)
- Concept 15: Environment Setup (Mac)
- Concept 16: Compiling and Running the Project
- Concept 17: Share your success - EKF
-
Part 51 : Localization, Path Planning, Control, and System Integration
Here, you'll expand on your sensor knowledge to localize and control the vehicle. You'll evaluate sensor data from camera, radar, lidar, and GPS, and use these in closed-loop controllers that actuate the vehicle, finishing by combining all your skills on a real self-driving car!
-
Module 01: Localization, Path Planning, Control, and System Integration
-
Lesson 01: Introduction to Localization
Meet the team that will guide you through the localization lessons! A short sense/move histogram-filter sketch follows the concept list below.
- Concept 01: Localization Overview
- Concept 02: Lesson Introduction
- Concept 03: Localization Intuition Quiz
- Concept 04: Localization Intuition Explanation
- Concept 05: Localizing a Self Driving Car
- Concept 06: Overview of the Lessons
- Concept 07: Localization
- Concept 08: Uniform Probability Quiz
- Concept 09: Uniform Distribution
- Concept 10: Generalized Uniform Distribution
- Concept 11: Probability After Sense
- Concept 12: Compute Sum
- Concept 13: Normalize Distribution
- Concept 14: pHit and pMiss
- Concept 15: Sum of Probabilities
- Concept 16: Sense Function
- Concept 17: Normalized Sense Function
- Concept 18: Test Sense Function
- Concept 19: Multiple Measurements
- Concept 20: Lesson Breakpoint
- Concept 21: Exact Motion
- Concept 22: Move Function
- Concept 23: Inexact Motion 1
- Concept 24: Inexact Motion 2
- Concept 25: Inexact Motion 3
- Concept 26: Inexact Move Function
- Concept 27: Limit Distribution Quiz
- Concept 28: Move Twice
- Concept 29: Move 1000
- Concept 30: Sense and Move
- Concept 31: Sense and Move 2
- Concept 32: Localization Summary
- Concept 33: Formal Definition of Probability 1
- Concept 34: Formal Definition of Probability 2
- Concept 35: Formal Definition of Probability 3
- Concept 36: Bayes' Rule
- Concept 37: Cancer Test
- Concept 38: Theorem of Total Probability
- Concept 39: Coin Flip Quiz
- Concept 40: Two Coin Quiz
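The sense-and-move cycle above boils down to a short histogram filter. Here is a minimal Python sketch; the world, sensor model, and motion model values are illustrative assumptions.

```python
# Minimal histogram-filter sketch: sense (measurement update) and move (convolution).
world = ['green', 'red', 'red', 'green', 'green']
p = [0.2] * 5                 # uniform prior over 5 cells
pHit, pMiss = 0.6, 0.2        # sensor model (illustrative)
pExact, pUndershoot, pOvershoot = 0.8, 0.1, 0.1   # motion model (illustrative)

def sense(p, Z):
    q = [p[i] * (pHit if world[i] == Z else pMiss) for i in range(len(p))]
    s = sum(q)
    return [x / s for x in q]        # normalize so probabilities sum to 1

def move(p, U):
    q = []
    for i in range(len(p)):
        s = pExact * p[(i - U) % len(p)]
        s += pOvershoot * p[(i - U - 1) % len(p)]
        s += pUndershoot * p[(i - U + 1) % len(p)]
        q.append(s)
    return q

for Z, U in zip(['red', 'green'], [1, 1]):
    p = sense(p, Z)
    p = move(p, U)
print(p)
```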
-
Lesson 02: Markov Localization
Learn the math behind localization as well as how to implement Markov localization in C++.
- Concept 01: Return to Bayes' Rule
- Concept 02: Overview
- Concept 03: Localization Posterior: Introduction
- Concept 04: Localization Posterior Explanation and Implementation
- Concept 05: Bayes' Rule
- Concept 06: Bayes' Filter For Localization
- Concept 07: Calculate Localization Posterior
- Concept 08: Initialize Belief State
- Concept 09: Initialize Priors Function
- Concept 10: Solution: Initialize Priors Function
- Concept 11: Quiz: How Much Data?
- Concept 12: How Much Data: Explanation
- Concept 13: Derivation Outline
- Concept 14: Apply Bayes Rule with Additional Conditions
- Concept 15: Bayes Rule and Law of Total Probability
- Concept 16: Total Probability and Markov Assumption
- Concept 17: Markov Assumption for Motion Model: Quiz
- Concept 18: Markov Assumption for Motion Model: Explanation
- Concept 19: After Applying Markov Assumption: Quiz
- Concept 20: Recursive Structure
- Concept 21: Lesson Breakpoint
- Concept 22: Implementation Details for Motion Model
- Concept 23: Noise in Motion Model: Quiz
- Concept 24: Noise in Motion Model: Solution
- Concept 25: Determine Probabilities
- Concept 26: Motion Model Probability I
- Concept 27: Motion Model Probability II
- Concept 28: Coding the Motion Model
- Concept 29: Solution: Coding the Motion Model
- Concept 30: Observation Model Introduction
- Concept 31: Markov Assumption for Observation Model
- Concept 32: Finalize the Bayes Localization Filter
- Concept 33: Bayes Filter Theory Summary
- Concept 34: Observation Model Probability
- Concept 35: Get Pseudo Ranges
- Concept 36: Solution: Get Pseudo Ranges
- Concept 37: Coding the Observation Model
- Concept 38: Solution: Coding the Observation Model
- Concept 39: Coding the Full Filter
- Concept 40: Solution: Coding the Full Filter
- Concept 41: Conclusion
-
Lesson 03: Motion Models
Learn about vehicle movement and motion models to predict where your car will be at a future time. A minimal bicycle-model sketch follows the concept list below.
- Concept 01: Motion in Autonomy
- Concept 02: Lesson Introduction
- Concept 03: Motion Models: Bicycle Model
- Concept 04: Yaw Rate and Velocity
- Concept 05: Note on Frames of Reference
- Concept 06: Roll, Pitch and Yaw: Quiz
- Concept 07: Odometry
- Concept 08: Odometry Errors: Quiz
- Concept 09: Odometry Errors: Solution
- Concept 10: Conclusion
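To make the bicycle-model prediction step concrete, here is a minimal Python sketch that predicts the next pose from velocity and yaw rate; the values in the example call are illustrative.

```python
# Minimal bicycle/CTRV-style motion-model sketch: predict the next pose.
from math import sin, cos

def predict_pose(x, y, yaw, v, yaw_rate, dt):
    if abs(yaw_rate) > 1e-5:
        x_new = x + v / yaw_rate * (sin(yaw + yaw_rate * dt) - sin(yaw))
        y_new = y + v / yaw_rate * (cos(yaw) - cos(yaw + yaw_rate * dt))
    else:  # driving (almost) straight
        x_new = x + v * dt * cos(yaw)
        y_new = y + v * dt * sin(yaw)
    return x_new, y_new, yaw + yaw_rate * dt

print(predict_pose(x=0.0, y=0.0, yaw=0.0, v=5.0, yaw_rate=0.1, dt=0.1))
```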
-
Lesson 04: Particle Filters
Sebastian will teach you what a particle filter is, as well as the theory and math behind the filter. A resampling-wheel sketch follows the concept list below.
- Concept 01: Field Trip
- Concept 02: State Space
- Concept 03: Belief Modality
- Concept 04: Efficiency
- Concept 05: Exact or Approximate
- Concept 06: Particle Filters
- Concept 07: Using Robot Class
- Concept 08: Robot Class Details
- Concept 09: Moving Robot
- Concept 10: Add Noise
- Concept 11: Robot World
- Concept 12: Creating Particles
- Concept 13: Robot Particles
- Concept 14: Importance Weight
- Concept 15: Resampling
- Concept 16: Never Sampled 1
- Concept 17: Never Sampled 2
- Concept 18: Never Sampled 3
- Concept 19: New Particle
- Concept 20: Resampling Wheel
- Concept 21: Orientation 1
- Concept 22: Orientation 2
- Concept 23: Error
- Concept 24: You and Sebastian
- Concept 25: Filters
- Concept 26: 2012
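The resampling wheel described above fits in a few lines of Python; the particles and weights below are placeholders.

```python
# Minimal resampling-wheel sketch: draw N new particles in proportion to weights.
import random

def resample(particles, weights):
    N = len(particles)
    new_particles = []
    index = random.randrange(N)
    beta = 0.0
    max_w = max(weights)
    for _ in range(N):
        beta += random.uniform(0, 2.0 * max_w)
        while beta > weights[index]:
            beta -= weights[index]
            index = (index + 1) % N
        new_particles.append(particles[index])
    return new_particles

print(resample(['p0', 'p1', 'p2', 'p3'], [0.1, 0.4, 0.4, 0.1]))
```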
-
Lesson 05: Implementation of a Particle Filter
Now that you know the theory, learn how to code a particle filter!
- Concept 01: Particle Filters in C++
- Concept 02: Introduction
- Concept 03: Pseudocode
- Concept 04: Initialization
- Concept 05: Program Gaussian Sampling: Code
- Concept 06: Program Gaussian Sampling: Code Solution
- Concept 07: Prediction Step
- Concept 08: Calculate Prediction Step: Quiz
- Concept 09: Calculate Prediction Step Quiz Explanation
- Concept 10: Data Association: Nearest Neighbor
- Concept 11: Nearest Neighbor Advantages and Disadvantages
- Concept 12: Update Step
- Concept 13: Calculating Error
- Concept 14: Transformations and Associations
- Concept 15: Converting Landmark Observations
- Concept 16: Quiz: Landmarks
- Concept 17: Landmarks Quiz Solution
- Concept 18: Quiz: Association
- Concept 19: Quiz: Particle Weights
- Concept 20: Particle Weights Solution
- Concept 21: Explanation of Project Code
- Concept 22: Bonus Round: Localization [Optional]
-
Lesson 06: Kidnapped Vehicle Project
In this project, you'll build a particle filter and combine it with a real map to localize a vehicle!
-
Lesson 07: Search
Learn about discrete path planning and algorithms for solving the path planning problem. A minimal A* sketch follows the concept list below.
- Concept 01: Motion Planning
- Concept 02: Introduction to Path Planning
- Concept 03: About this Lesson
- Concept 04: Motion Planning
- Concept 05: Compute Cost
- Concept 06: Compute Cost 2
- Concept 07: Optimal Path
- Concept 08: Optimal Path 2
- Concept 09: Maze
- Concept 10: Maze 2
- Concept 11: First Search Program
- Concept 12: Expansion Grid
- Concept 13: Print Path
- Concept 14: A*
- Concept 15: Implement A*
- Concept 16: A* in Action
- Concept 17: Dynamic Programming
- Concept 18: Computing Value
- Concept 19: Computing Value 2
- Concept 20: Value Program
- Concept 21: Optimum Policy
- Concept 22: Left Turn Policy
- Concept 23: Planning Conclusion
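Here is a minimal A* sketch on a small occupancy grid with a Manhattan-distance heuristic; the grid, start, and goal are illustrative, and the lesson's exercises use their own grid format.

```python
# Minimal A* sketch on a 2D grid (0 = free, 1 = obstacle), Manhattan heuristic.
import heapq

def a_star(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_list = [(heuristic(start), 0, start)]   # (f = g + h, g, cell)
    visited = set()
    while open_list:
        f, g, cell = heapq.heappop(open_list)
        if cell == goal:
            return g                             # cost of the optimal path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_list, (g + 1 + heuristic((nr, nc)), g + 1, (nr, nc)))
    return None                                  # no path found

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(a_star(grid, start=(0, 0), goal=(0, 2)))   # -> 6
```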
-
Lesson 08: Prediction
Use data from sensor fusion to generate predictions about the likely behavior of moving objects.
- Concept 01: Introduction and Overview
- Concept 02: I/O Recap
- Concept 03: Model-Based vs Data-Driven Approaches
- Concept 04: Which is Best?
- Concept 05: Data Driven Example - Trajectory Clustering
- Concept 06: Trajectory Clustering 2 - Online Prediction
- Concept 07: Thinking about Model Based Approaches
- Concept 08: Frenet Coordinates
- Concept 09: Process Models
- Concept 10: More on Process Models
- Concept 11: Multimodal Estimation
- Concept 12: Summary of Data Driven and Model Based Approaches
- Concept 13: Overview of Hybrid Approaches
- Concept 14: Intro to Naive Bayes
- Concept 15: Naive Bayes Quiz
- Concept 16: Implement Naive Bayes C++
- Concept 17: Implement Naive Bayes C++ (solution)
- Concept 18: Conclusion
-
Lesson 09: Behavior Planning
Learn how to think about high-level behavior planning in a self-driving car.
- Concept 01: Where To
- Concept 02: Lesson Outline
- Concept 03: Understanding Output
- Concept 04: The Behavior Problem
- Concept 05: Finite State Machines
- Concept 06: Formalizing Finite State Machines
- Concept 07: FSM Intuition
- Concept 08: States for Self Driving Cars
- Concept 09: The States We'll Use
- Concept 10: Inputs to Transition Functions
- Concept 11: Behavior Planning Pseudocode
- Concept 12: Create a Cost Function - Speed Penalty
- Concept 13: Example Cost Function - Lane Change Penalty
- Concept 14: Implement a Cost Function in C++
- Concept 15: Implement a Cost Function in C++ (solution)
- Concept 16: Implement a Second Cost Function in C++
- Concept 17: Implement a Second Cost Function in C++ (solution)
- Concept 18: Cost Function Design and Weight Tweaking
- Concept 19: Cost Function Matching
- Concept 20: Scheduling Compute Time
- Concept 21: Implement Behavior Planner in C++
- Concept 22: Implement Behavior Planner in C++ (solution)
- Concept 23: Conclusion
-
Lesson 10: Trajectory Generation
Use C++ and the Eigen linear algebra library to build candidate trajectories for the vehicle to follow. A jerk-minimizing polynomial sketch follows the concept list below.
- Concept 01: From Behavior to Trajectory
- Concept 02: Lesson Overview
- Concept 03: The Motion Planning Problem
- Concept 04: Properties of Motion Planning Algorithms
- Concept 05: Types of Motion Planning Algorithms
- Concept 06: A* Reminder
- Concept 07: A* Reminder Solution
- Concept 08: Hybrid A* Introduction
- Concept 09: Hybrid A* Tradeoffs
- Concept 10: Hybrid A* Tradeoffs Solution
- Concept 11: Hybrid A* in Practice
- Concept 12: Hybrid A* Heuristics
- Concept 13: Hybrid A* Pseudocode
- Concept 14: Implement Hybrid A* in C++
- Concept 15: Implement Hybrid A* in C++ (solution)
- Concept 16: Environment Classification
- Concept 17: Frenet Reminder
- Concept 18: The Need for Time
- Concept 19: s, d, and t
- Concept 20: Trajectory Matching
- Concept 21: Structured Trajectory Generation Overview
- Concept 22: Trajectories with Boundary Conditions
- Concept 23: Jerk Minimizing Trajectories
- Concept 24: Derivation Overview
- Concept 25: Derivation Details 2
- Concept 26: Polynomial Trajectory Generation
- Concept 27: Implement Quintic Polynomial Solver C++
- Concept 28: Implement Quintic Polynomial Solver Solution
- Concept 29: What should be checked?
- Concept 30: Implementing Feasibility
- Concept 31: Putting it All Together
- Concept 32: Polynomial Trajectory Reading (optional)
- Concept 33: Polynomial Trajectory Generation Playground
- Concept 34: Conclusion
- Concept 35: Bonus Round: Path Planning [Optional]
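Here is a jerk-minimizing-trajectory sketch in Python/NumPy; the lesson itself works in C++ with Eigen, but the underlying 3x3 linear system for the quintic coefficients is the same. The boundary conditions in the example are illustrative.

```python
# Minimal jerk-minimizing-trajectory (JMT) sketch: solve for the quintic
# polynomial coefficients given start/end state [s, s_dot, s_ddot] and time T.
import numpy as np

def jmt(start, end, T):
    a0, a1, a2 = start[0], start[1], 0.5 * start[2]
    A = np.array([[T**3,      T**4,       T**5],
                  [3 * T**2,  4 * T**3,   5 * T**4],
                  [6 * T,     12 * T**2,  20 * T**3]])
    b = np.array([end[0] - (a0 + a1 * T + a2 * T**2),
                  end[1] - (a1 + 2 * a2 * T),
                  end[2] - 2 * a2])
    a3, a4, a5 = np.linalg.solve(A, b)
    return [a0, a1, a2, a3, a4, a5]

# Illustrative boundary conditions: start at s=0 at 10 m/s, stop 20 m ahead in 4 s.
print(jmt(start=[0.0, 10.0, 0.0], end=[20.0, 0.0, 0.0], T=4.0))
```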
-
Lesson 11: Project: Highway Driving
Drive a car down a highway with other cars using your own path planner!
-
Lesson 12: PID Control
Learn what PID controllers are and how to use them with Sebastian! A short PID sketch follows the concept list below.
- Concept 01: Intro
- Concept 02: PID Control
- Concept 03: Proportional Control
- Concept 04: Implement P Controller
- Concept 05: P Controller Solution
- Concept 06: Oscillations
- Concept 07: PD Controller
- Concept 08: PD Controller Solution
- Concept 09: Systematic Bias
- Concept 10: Is PD Enough
- Concept 11: PID implementation
- Concept 12: PID Implementation Solution
- Concept 13: Twiddle
- Concept 14: Parameter Optimization
- Concept 15: Parameter Optimization Solution
- Concept 16: Outro
- Concept 17: Bonus Round: Control [Optional]
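Here is a minimal PID steering sketch driven by cross-track error (CTE); the gains and CTE values are illustrative, not tuned values from the lesson.

```python
# Minimal PID controller sketch for steering from cross-track error (CTE).
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.prev_cte = 0.0
        self.int_cte = 0.0

    def control(self, cte, dt=1.0):
        diff_cte = (cte - self.prev_cte) / dt
        self.prev_cte = cte
        self.int_cte += cte * dt
        # steering = -Kp*cte - Kd*d(cte)/dt - Ki*sum(cte)
        return -self.kp * cte - self.kd * diff_cte - self.ki * self.int_cte

pid = PID(kp=0.2, ki=0.004, kd=3.0)          # illustrative gains
for cte in [1.0, 0.8, 0.5, 0.2]:             # made-up cross-track errors
    print(pid.control(cte))
```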
-
Lesson 13: Project: PID Controller
Implement a PID controller in C++ to maneuver the vehicle around the lake race track!
-
Lesson 16: Autonomous Vehicle Architecture
Learn about the system architecture for Carla, Udacity's autonomous vehicle!
- Concept 01: Putting Everything Together
- Concept 02: Introduction
- Concept 03: The Sensor Subsystem
- Concept 04: The Perception Subsystem
- Concept 05: Perception Subsystem Components
- Concept 06: The Planning Subsystem
- Concept 07: Planning Subsystem Components
- Concept 08: The Control Subsystem
- Concept 09: On to the code!
-
Lesson 17: Introduction to ROS
Obtain an architectural overview of the ROS Framework and set up your own ROS environment on your computer.
- Concept 01: Communications Between Systems
- Concept 02: Introduction
- Concept 03: Welcome to ROS Essentials
- Concept 04: Build Robots with ROS
- Concept 05: Brief History of ROS
- Concept 06: Nodes and Topics
- Concept 07: Message Passing
- Concept 08: Services
- Concept 09: Compute Graph
- Concept 10: Turtlesim Overview
- Concept 11: ROS Workspace Instructions
- Concept 12: ROS Workspace
- Concept 13: Your Virtual Machine
- Concept 14: Source the ROS Environment
- Concept 15: Run Turtlesim
- Concept 16: Turtlesim Comms: List Nodes
- Concept 17: Turtlesim Comms: List Topics
- Concept 18: Turtlesim Comms: Get Topic Info
- Concept 19: Turtlesim Comms: Message Information
- Concept 20: Turtlesim Comms: Echo a Topic
- Concept 21: Recap
-
Lesson 18: Packages & Catkin Workspaces
Learn about ROS workspace structure, essential command line utilities, and how to manage software packages within a project.
-
Lesson 19: Writing ROS Nodes
Learn to write ROS Nodes in Python. A minimal publisher-node sketch follows the concept list below.
- Concept 01: Closing In
- Concept 02: Overview
- Concept 03: ROS Publishers
- Concept 04: Simple Mover
- Concept 05: Simple Mover: The Code
- Concept 06: ROS Services
- Concept 07: Arm Mover
- Concept 08: Arm Mover: The Code
- Concept 09: Arm Mover: Launch and Interact
- Concept 10: ROS Subscribers
- Concept 11: Look Away
- Concept 12: Look Away: The Code
- Concept 13: Look Away: Launch and Interact
- Concept 14: Logging
- Concept 15: Recap
- Concept 16: Outro
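Here is a minimal rospy publisher sketch in the spirit of this lesson. It assumes a sourced ROS environment; the "chatter" topic name and 10 Hz rate are illustrative (the lesson's own nodes publish joint commands instead).

```python
#!/usr/bin/env python
# Minimal rospy publisher sketch: publish a string to a hypothetical /chatter topic.
import rospy
from std_msgs.msg import String

def talker():
    rospy.init_node('talker', anonymous=True)
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(10)                 # 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello from a ROS node'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```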
-
Lesson 20: Project: Program an Autonomous Vehicle
Run your code on Carla, Udacity's own autonomous vehicle!
- Concept 01: Have Fun!
- Concept 02: Introduction
- Concept 03: Getting Started
- Concept 04: Project Overview
- Concept 05: Waypoint Updater Node (Partial)
- Concept 06: Waypoint Updater Partial Walkthrough
- Concept 07: DBW Node
- Concept 08: DBW Walkthrough
- Concept 09: Traffic Light Detection Node
- Concept 10: Object Detection Lab
- Concept 11: Detection Walkthrough
- Concept 12: Waypoint Updater Node (Full)
- Concept 13: Full Waypoint Walkthrough
- Concept 14: Project Submission and Getting Feedback
- Concept 15: Project Workspace Instructions
- Concept 16: Capstone Project Workspace
-
Lesson 21: Completing the Program
Congratulations! You've reached the end of the program! Learn how to officially complete the program and graduate.
-
Part 52 : Unscented Kalman Filters, Model Predictive Control, Advanced Deep Learning / Semantic Segmentation, and Functional Safety
Here you'll learn additional topics: Unscented Kalman Filters, Model Predictive Control, Advanced Deep Learning / Semantic Segmentation, and Functional Safety.
-
Module 01: Additional Content
-
Lesson 01: Object Detection
In this lesson, you'll learn how to detect and track vehicles using color and gradient features and a support vector machine classifier. A minimal HOG-plus-SVM sketch follows the concept list below.
- Concept 01: Intro to Vehicle Tracking
- Concept 02: Arpan and Drew
- Concept 03: Finding Cars
- Concept 04: Object Detection Overview
- Concept 05: Manual Vehicle Detection
- Concept 06: Features
- Concept 07: Feature Intuition
- Concept 08: Color Features
- Concept 09: Template Matching
- Concept 10: Template Matching Quiz
- Concept 11: Color Histogram Features
- Concept 12: Histograms of Color
- Concept 13: Histogram Comparison
- Concept 14: Color Spaces
- Concept 15: Explore Color Spaces
- Concept 16: Spatial Binning of Color
- Concept 17: Gradient Features
- Concept 18: HOG Features
- Concept 19: Data Exploration
- Concept 20: scikit-image HOG
- Concept 21: Combining Features
- Concept 22: Combine and Normalize Features
- Concept 23: Build a Classifier
- Concept 24: Labeled Data
- Concept 25: Data Preparation
- Concept 26: Train a Classifier
- Concept 27: Parameter Tuning
- Concept 28: Color Classify
- Concept 29: HOG Classify
- Concept 30: Sliding Windows
- Concept 31: How many windows?
- Concept 32: Sliding Window Implementation
- Concept 33: Multi-scale Windows
- Concept 34: Search and Classify
- Concept 35: Hog Sub-sampling Window Search
- Concept 36: False Positives
- Concept 37: Multiple Detections & False Positives
- Concept 38: Tracking Pipeline
- Concept 39: Summary
- Concept 40: Traditional vs. Deep Learning Approach
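Here is a minimal sketch of the HOG-features-plus-linear-SVM pipeline outlined above, using scikit-image and scikit-learn. The HOG parameters are typical choices rather than the lesson's exact settings, and X_images / y stand in for your own labeled grayscale training patches.

```python
# Minimal HOG-feature + linear SVM sketch for vehicle / non-vehicle classification.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler

def extract_hog(gray_image):
    # Orientation / cell / block sizes are common defaults, not the lesson's exact ones.
    return hog(gray_image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def train_classifier(X_images, y):
    features = np.array([extract_hog(img) for img in X_images])
    scaler = StandardScaler().fit(features)        # normalize the feature vectors
    clf = LinearSVC()
    clf.fit(scaler.transform(features), y)
    return clf, scaler

# clf, scaler = train_classifier(X_images, y)  # call with your own labeled data
```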
-
Lesson 02: Unscented Kalman Filters
While Extended Kalman Filters work great for linear motion, real objects rarely move linearly. With Unscented Kalman Filters, you'll be able to accurately track non-linear motion!
- Concept 01: Introduction
- Concept 02: The CTRV Model
- Concept 03: The CTRV Model State Vector
- Concept 04: CTRV Differential Equation
- Concept 05: CTRV Integral 1
- Concept 06: CTRV Integral 2
- Concept 07: CTRV Zero Yaw Rate
- Concept 08: CTRV Process Noise Vector
- Concept 09: CTRV Process Noise Position
- Concept 10: UKF Process Chain
- Concept 11: What Problem Does the UKF Solve?
- Concept 12: UKF Basics Unscented Transformation
- Concept 13: Generating Sigma Points
- Concept 14: Generating Sigma Points Assignment 1
- Concept 15: Generating Sigma Points Assignment 2
- Concept 16: UKF Augmentation
- Concept 17: Augmentation Assignment 1
- Concept 18: Augmentation Assignment 2
- Concept 19: Sigma Point Prediction
- Concept 20: Sigma Point Prediction Assignment 1
- Concept 21: Sigma Point Prediction Assignment 2
- Concept 22: Predicted Mean and Covariance
- Concept 23: Predicted Mean and Covariance Assignment 1
- Concept 24: Predicted Mean and Covariance Assignment 2
- Concept 25: Measurement Prediction
- Concept 26: Predict Radar Measurement Assignment 1
- Concept 27: Predict Radar Measurement Assignment 2
- Concept 28: UKF Update
- Concept 29: UKF Update Assignment 1
- Concept 30: UKF Update Assignment 2
- Concept 31: Parameters and Consistency
- Concept 32: What to Expect from the Project
- Concept 33: Story Time
- Concept 34: Outro
- Concept 35: Bonus Round: Sensor Fusion [Optional]
-
Lesson 03: Vehicle Models
In this lesson, you'll learn about kinematic and dynamic vehicle models. We'll use these later with Model Predictive Control.
- Concept 01: Intro
- Concept 02: Vehicle Models
- Concept 03: State
- Concept 04: Building a Kinematic Model
- Concept 05: Global Kinematic Model
- Concept 06: Solution: Global Kinematic Model
- Concept 07: Following Trajectories
- Concept 08: Fitting Polynomials
- Concept 09: Solution: Fitting Polynomials
- Concept 10: Errors
- Concept 11: Dynamic Models
- Concept 12: Dynamic Models - Forces
- Concept 13: Dynamic Models - Slip Angle
- Concept 14: Dynamic Models - Slip Ratio
- Concept 15: Dynamic Models - Tire Models
- Concept 16: Actuator Constraints
- Concept 17: Outro
-
Lesson 04: Model Predictive Control
In this lesson, you'll learn how to frame the control problem as an optimization problem over time horizons. This is Model Predictive Control!
- Concept 01: Intro
- Concept 02: Reference State
- Concept 03: Dealing With Stopping
- Concept 04: Additional Cost Considerations
- Concept 05: Length and Duration
- Concept 06: Putting It All Together
- Concept 07: Latency
- Concept 08: Mind The Line
- Concept 09: Solution: Mind The Line
- Concept 10: Tuning MPC
- Concept 11: Outro
- Concept 12: Bonus Round: Control [Optional]
-
Lesson 05: Fully Convolutional Networks
In this lesson you'll learn the motivation for Fully Convolutional Networks and how they are structured.
- Concept 01: Intro
- Concept 02: Why Fully Convolutional Networks (FCNs) ?
- Concept 03: Fully Convolutional Networks
- Concept 04: Fully Connected to 1x1 Convolution
- Concept 05: 1x1 Convolution Quiz
- Concept 06: 1x1 Convolution Quiz Solution
- Concept 07: Transposed Convolutions
- Concept 08: Transposed Convolution Quiz
- Concept 09: Transposed Convolution Quiz Solution
- Concept 10: Skip Connections
- Concept 11: FCNs In The Wild
- Concept 12: Outro
-
Lesson 06: Scene Understanding
In this lesson you'll be introduced to the problem of Scene Understanding and the role FCNs play. A minimal IoU sketch follows the concept list below.
- Concept 01: Intro
- Concept 02: Bounding Boxes
- Concept 03: Semantic Segmentation
- Concept 04: Scene Understanding
- Concept 05: IoU
- Concept 06: IOU Example
- Concept 07: IoU Quiz
- Concept 08: IOU Solution
- Concept 09: FCN-8 - Encoder
- Concept 10: FCN-8 - Decoder
- Concept 11: FCN-8 - Classification & Loss
- Concept 12: Object Detection Lab
- Concept 13: Outro
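Here is a minimal pixel-wise IoU sketch for comparing a predicted segmentation mask against ground truth; the tiny masks are illustrative.

```python
# Minimal pixel-wise mean IoU sketch for semantic segmentation.
import numpy as np

def mean_iou(pred, truth, num_classes):
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, truth == c).sum()
        union = np.logical_or(pred == c, truth == c).sum()
        if union > 0:
            ious.append(inter / union)
    return np.mean(ious)

pred  = np.array([[0, 0, 1], [0, 1, 1]])
truth = np.array([[0, 1, 1], [0, 1, 1]])
print(mean_iou(pred, truth, num_classes=2))   # class 0: 2/3, class 1: 3/4 -> ~0.708
```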
-
Lesson 07: Inference Performance
In this lesson you'll become familiar with various optimizations in an effort to squeeze every last bit of performance at inference.
- Concept 01: Intro
- Concept 02: Why Bother With Performance
- Concept 03: Semantic Segmentation Revisited
- Concept 04: Interlude: Using The AMI
- Concept 05: Freezing Graphs
- Concept 06: Graph Transforms
- Concept 07: Fusion
- Concept 08: Optimizing For Inference
- Concept 09: Reducing Precision
- Concept 10: Quantization Quiz
- Concept 11: Quantization Conversion
- Concept 12: 8-bit Calculations
- Concept 13: Compilation
- Concept 14: AOT & JIT
- Concept 15: Reusing The Graph
- Concept 16: Outro
-
Part 53 : Functional Safety
Learn to make safer vehicles using risk evaluation and systems engineering.
-
Module 01: Additional Content
-
Lesson 08: Introduction to Functional Safety
You will learn to make safer vehicles using risk evaluation and systems engineering.
- Concept 01: Introduction to the Functional Safety Module
- Concept 02: Introduction to the Lesson
- Concept 03: What is Safety?
- Concept 04: What is Functional Safety?
- Concept 05: Introduction to Identifying Hazards
- Concept 06: Sebastian and Technology Errors
- Concept 07: Introduction to Evaluating Risks
- Concept 08: Reducing Risk with Systems Engineering
- Concept 09: Introduction to ISO 26262
- Concept 10: The Full V Model
- Concept 11: Summary
- Concept 12: Sebastian Talks About Self-Driving Car Risks
-
Lesson 09: Functional Safety: Safety Plan
A functional safety plan is critical to any functional safety project. Here you will learn what goes into a safety plan so that you can document your own.
-
Lesson 10: Functional Safety: Hazard Analysis and Risk Assessment
In a hazard analysis and risk assessment, you will identify vehicular malfunctions and evaluate their risk levels. You can then derive safety goals defining how your vehicle will remain safe.
- Concept 01: Introduction
- Concept 02: Advanced Driver Assistance System
- Concept 03: Item Definition
- Concept 04: Introduction to Hazard Analysis and Risk Assessment
- Concept 05: Sebastian Discussing Hazards
- Concept 06: Situational Analysis
- Concept 07: Identification of Hazards
- Concept 08: Risk Assessment, Severity, Exposure, Controllability
- Concept 09: Automotive Safety Integrity Levels (ASIL)
- Concept 10: Safety Goals
- Concept 11: Lesson Summary
-
Lesson 11: Functional Safety: Functional Safety Concept
You will derive functional safety requirements from the safety goals and then add extra functionality to the system diagram. Finally, you'll document your work, an essential part of functional safety.
- Concept 01: Introduction
- Concept 02: Functional Safety Analysis
- Concept 03: Functional Safety Requirements
- Concept 04: Allocation to the Architecture
- Concept 05: Architecture Refinement
- Concept 06: ASIL Inheritance
- Concept 07: ASIL Decomposition
- Concept 08: Fault Tolerant Time Interval
- Concept 09: Warning and Degradation Concept
- Concept 10: Verification and Validation Acceptance Criteria
- Concept 11: Sebastian On Requirements and Testing
- Concept 12: Summary
-
Lesson 12: Functional Safety: Technical Safety Concept
Once you have derived functional safety requirements, you drill down into more detail. In the technical safety concept, you refine your requirements into technical safety requirements.
- Concept 01: Introduction
- Concept 02: Deriving Technical Safety Requirements
- Concept 03: Other Types of Technical Safety Requirements
- Concept 04: Technical Safety Requirement Attributes
- Concept 05: Allocation of Requirements to System Architecture Elements
- Concept 06: Sebastian on Testing
- Concept 07: Summary
-
Lesson 13: Functional Safety at the Software and Hardware Levels
The last step in the vehicle safety design phase is to derive hardware and software safety requirements. In this lesson, you will derive these requirements and refine a software system architecture.
- Concept 01: Introduction
- Concept 02: V model
- Concept 03: Hardware Failure Metrics
- Concept 04: Programming Languages
- Concept 05: Software Safety Life-cycle
- Concept 06: Software Safety Requirements Lane Departure Warning
- Concept 07: Other Sources of Software Safety Requirements
- Concept 08: Freedom from Interference - Spatial
- Concept 09: Freedom from Interference - Temporal
- Concept 10: Freedom from Interference - Temporal Part 2
- Concept 11: Sebastian and Temporal Interference
- Concept 12: Freedom from Interference - Communication
- Concept 13: System Architecture Safety Design Patterns
- Concept 14: Lesson Summary
- Concept 15: Module Summary
-
Part 54 : Technical Interview Prep
Learn the skills technical interviewers expect you to know—efficiency, common algorithms, manipulating popular data structures, and how to explain a solution.
-
Lesson 01: Introduction and Efficiency
Begin the section on data structures and algorithms, including Python and efficiency practice.
- Concept 01: Course Introduction
- Concept 02: Course Outline
- Concept 03: Course Expectations
- Concept 04: Syntax
- Concept 05: Python Practice
- Concept 06: Python: The Basics
- Concept 07: Efficiency
- Concept 08: Notation Intro
- Concept 09: Notation Continued
- Concept 10: Worst Case and Approximation
- Concept 11: Efficiency Practice
-
Lesson 02: List-Based Collections
Learn the definition of a list in computer science, and see definitions and examples of list-based data structures, arrays, linked lists, stacks, and queues.
- Concept 01: Welcome to Collections
- Concept 02: Lists
- Concept 03: Arrays
- Concept 04: Python Lists
- Concept 05: Linked Lists
- Concept 06: Linked Lists in Depth
- Concept 07: Linked List Practice
- Concept 08: Stacks
- Concept 09: Stacks Details
- Concept 10: Stack Practice
- Concept 11: Queues
- Concept 12: Queue Practice
-
Lesson 03: Searching and Sorting
Explore how to search and sort with list-based data structures, including binary search and bubble, merge, and quick sort. Learn how to use recursion. A binary-search sketch follows the concept list below.
- Concept 01: Binary Search
- Concept 02: Efficiency of Binary Search
- Concept 03: Binary Search Practice
- Concept 04: Recursion
- Concept 05: Recursion Practice
- Concept 06: Intro to Sorting
- Concept 07: Bubble Sort
- Concept 08: Efficiency of Bubble Sort
- Concept 09: Bubble Sort Practice
- Concept 10: Merge Sort
- Concept 11: Efficiency of Merge Sort
- Concept 12: Merge Sort Practice
- Concept 13: Quick Sort
- Concept 14: Efficiency of Quick Sort
- Concept 15: Quick Sort Practice
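Here is a minimal binary search sketch over a sorted Python list.

```python
# Minimal binary search sketch: return the index of target in a sorted list, or -1.
def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([1, 3, 9, 11, 15, 19, 29], 15))   # -> 4
```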
-
Lesson 04: Maps and Hashing
Understand the concepts of sets, maps (dictionaries), and hashing. Examine common problems and approaches to hashing, and practice with examples.
-
Lesson 05: Trees
Learn the concepts and terminology associated with tree data structures. Investigate tree types, such as binary search trees, heaps, and self-balancing trees.
- Concept 01: Trees
- Concept 02: Tree Basics
- Concept 03: Tree Terminology
- Concept 04: Tree Practice
- Concept 05: Tree Traversal
- Concept 06: Depth-First Traversals
- Concept 07: Tree Traversal Practice
- Concept 08: Search and Delete
- Concept 09: Insert
- Concept 10: Binary Search Trees
- Concept 11: Binary Tree Practice
- Concept 12: BSTs
- Concept 13: BST Complications
- Concept 14: BST Practice
- Concept 15: Heaps
- Concept 16: Heapify
- Concept 17: Heap Implementation
- Concept 18: Self-Balancing Trees
- Concept 19: Red-Black Trees - Insertion
- Concept 20: Tree Rotations
-
Lesson 06: Graphs
Examine the theoretical concept of a graph and understand common graph terms, coded representations, properties, traversals, and paths.
- Concept 01: Graph Introduction
- Concept 02: What Is a Graph?
- Concept 03: Directions and Cycles
- Concept 04: Connectivity
- Concept 05: Graph Practice
- Concept 06: Graph Representations
- Concept 07: Adjacency Matrices
- Concept 08: Graph Representation Practice
- Concept 09: Graph Traversal
- Concept 10: DFS
- Concept 11: BFS
- Concept 12: Graph Traversal Practice
- Concept 13: Eulerian Path
-
Lesson 07: Case Studies in Algorithms
Explore famous computer science problems, specifically the Shortest Path Problem, the Knapsack Problem, and the Traveling Salesman Problem.
-
Lesson 08: Technical Interviewing Techniques
Learn about the “algorithm” for answering common technical interviewing questions. Practice and get tips for giving interviewers what they’re looking for.
- Concept 01: Interview Introduction
- Concept 02: Clarifying the Question
- Concept 03: Confirming Inputs
- Concept 04: Test Cases
- Concept 05: Brainstorming
- Concept 06: Runtime Analysis
- Concept 07: Coding
- Concept 08: Debugging
- Concept 09: Interview Wrap-Up
- Concept 10: Time for Live Practice with Pramp
- Concept 11: Next Steps
-
Lesson 09: Practice Behavioral Questions
Practice answering behavioral questions and evaluate sample responses.
- Concept 01: Introduction
- Concept 02: Self-Practice: Behavioral Questions
- Concept 03: Analyzing Behavioral Answers
- Concept 04: Time When You Showed Initiative?
- Concept 05: What Motivates You at the Workplace?
- Concept 06: A Problem and How You Dealt With It?
- Concept 07: What Do You Know About the Company?
- Concept 08: Time When You Dealt With Failure?
Part 55 : Introduction to Autonomous Flight
In this course, you will get an introduction to flight history, challenges, and vehicles. You will learn about our quadrotor test platform, work in our custom simulator, and build your first project—getting a quadrotor to take off and fly around a backyard!
-
Module 01: Introduction to Autonomous Flight
-
Lesson 02: Autonomous Flight
In this lesson you'll get a high level overview of the concepts underlying autonomous flight and the physical components from which flying vehicles are made.
- Concept 01: Overview
- Concept 02: History of Autonomous Flight
- Concept 03: Vehicle Morphologies
- Concept 04: Why Quadrotors?
- Concept 05: Quadrotor Components
- Concept 06: Airframe
- Concept 07: Motors / Speed Controllers
- Concept 08: Propellers
- Concept 09: Batteries
- Concept 10: Driving a Quad
- Concept 11: Attitude Control
- Concept 12: Autopilot
- Concept 13: IMU Gyros
- Concept 14: IMU Accelerometers
- Concept 15: GPS
- Concept 16: Flight Computer
- Concept 17: Summary
-
Lesson 03: Backyard Flyer
In this lesson you'll write the "Hello, world!" of drone programming as you write event-driven code that causes a quadrotor to take off, fly in a square, and land.
- Concept 01: Lesson Introduction
- Concept 02: Lesson Overview
- Concept 03: Simulator Demonstration
- Concept 04: Simulator Exploration - Manual Flight
- Concept 05: Flight Computer Programming
- Concept 06: Environment Setup
- Concept 07: Simulator Exploration - Programmatic Flight
- Concept 08: The Problems with Sequential Execution
- Concept 09: Event Driven Programming
- Concept 10: Event Driven Programming Explained
- Concept 11: A Simple Flight Plan
- Concept 12: Phases of Flight
- Concept 13: Project Development Workflow Options
- Concept 14: Backyard Flyer (local development)
- Concept 15: Virtual Machine Intro
- Concept 16: Backyard Flyer (X-Windows)
- Concept 17: Project Cheat Sheet
-
Lesson 04: Drone Integration
Walk through the steps you need to take to get your code running on an actual drone! We'll show you the steps for the "Intel Aero", but a lot of what you'll learn applies to other drones as well.
- Concept 01: Drone Integration Introduction
- Concept 02: Intel Aero Unboxing
- Concept 03: Intel Aero First Boot
- Concept 04: A Note on Safety
- Concept 05: Intel Aero Setup
- Concept 06: Getting Familiar with QGroundControl
- Concept 07: Configure PX4
- Concept 08: Modifying Backyard Flyer
- Concept 09: Let's go Fly!
- Concept 10: Crazyflie Introduction
- Concept 11: Crazyflie Backyard Flyer
- Concept 12: Crazyflie Keyboard Control
-
Part 56 : Planning
Flying robots must traverse complex, dynamic environments. Wind, obstacles, unreliable sensor data, and other randomness all present significant challenges. In this course, you will learn the fundamentals of aerial path planning. You will begin with 2D problems, optimize your solutions using waypoints, and then scale your solutions to three dimensions. You will apply these skills in your second project—autonomously navigating your drone through a dense urban environment.
-
Lesson 01: Planning as Search
Solving the planning problem really comes down to performing a search through a state space to find a path from a start state to a goal state, and here you'll get a chance to do just that!
- Concept 01: Sebastian Introduction
- Concept 02: Transition to Planning
- Concept 03: The Planning Problem
- Concept 04: Search Space
- Concept 05: Grid Representation
- Concept 06: Search
- Concept 07: Partial Plans
- Concept 08: Breadth vs Depth
- Concept 09: Jupyter Notebooks
- Concept 10: Breadth-First Exercise
- Concept 11: Cost
- Concept 12: Cost Exercise
- Concept 13: Heuristics
- Concept 14: A*
- Concept 15: A* Exercise
- Concept 16: Summary
-
Lesson 02: Flying Car Representation
Your vehicle has a physical size and orientation in the world, and here you'll learn how to think about position and orientation as part of your planning solution.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction
- Concept 03: Geodetic Frame
- Concept 04: ECEF Frame
- Concept 05: Geodetic to NED Exercise
- Concept 06: Body Frame
- Concept 07: Euler Angles
- Concept 08: Gimbal Lock
- Concept 09: Rotation Matrices
- Concept 10: Euler Rotations Exercise
- Concept 11: Quaternions
- Concept 12: Quaternion Exercise
- Concept 13: Motions as Transformations
- Concept 14: Configuration Space
- Concept 15: Configuration Space Exercise
- Concept 16: Summary
-
Lesson 03: From Grids to Graphs
Graphs are really just a way of describing how your search space is connected. Here you'll learn about the tradeoffs between grids and graphs and how each can be used in your planning representation.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction to Graphs
- Concept 03: Waypoint Extraction
- Concept 04: Collinearity
- Concept 05: Collinearity Exercise
- Concept 06: Ray Tracing
- Concept 07: Bresenham
- Concept 08: Bresenham Exercise
- Concept 09: Putting it Together Exercise
- Concept 10: Grids to Graphs
- Concept 11: Graph Tradeoffs
- Concept 12: Generating Graphs
- Concept 13: Medial Axis Exercise
- Concept 14: Voronoi Graph Exercise
- Concept 15: Graph Search Exercise
- Concept 16: Deadbands
- Concept 17: Summary
-
Lesson 04: Moving into 3D
Here you'll make the leap from two dimensions to three dimensions and discover how you can use different representations of your search space to optimize your planning solution.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction
- Concept 03: 3D Grids
- Concept 04: Voxel Map Exercise
- Concept 05: 2.5D Maps
- Concept 06: Random Sampling
- Concept 07: Random Sampling Exercise
- Concept 08: Probabilistic Roadmap
- Concept 09: Probabilistic Roadmap Exercise
- Concept 10: Local Planning
- Concept 11: Receding Horizon
- Concept 12: Receding Horizon Exercise
- Concept 13: Replanning
- Concept 14: Summary
-
Lesson 05: Real World Planning
In this lesson, you'll dive deep into some advanced concepts that are crucial to motion planning in the real world, where accounting for physics and being prepared for the unexpected are essential.
- Concept 01: Sebastian Introduction
- Concept 02: Intro
- Concept 03: Constraints
- Concept 04: Modelling Dynamics
- Concept 05: Modeling Dynamics Exercise
- Concept 06: Dubins Car
- Concept 07: Dubins Car Exercise
- Concept 08: Steering
- Concept 09: Steering Exercise
- Concept 10: RRT
- Concept 11: RRT Exercise
- Concept 12: Adding Obstacles
- Concept 13: Potential Field Planning
- Concept 14: Potential Field Exercise
- Concept 15: Summary
-
Lesson 06: Project: 3D Motion Planning
In this project, you'll get a chance to apply what you've learned about 3D motion planning from the last several lessons to plan and execute a mission in a complex urban environment!
Part 57 : Controls
In the previous course, we implemented 3D path planning but assumed a solution for actually following paths. In reality, moving a flying vehicle requires determining appropriate low-level motor controls. In this course, you will build a nonlinear cascaded controller and incorporate it into your project software.
-
Lesson 01: Vehicle Dynamics
Learn how flying vehicles move in one and two dimensions by understanding how propellers create forces and moments which cause accelerations and rotations.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction to Vehicle Dynamics
- Concept 03: The Forces on A Quadrotor
- Concept 04: Force and Translational Equilibrium
- Concept 05: Rotational Equilibrium 1
- Concept 06: Rotational Equilibrium 2
- Concept 07: Rotor Physics
- Concept 08: Unbalanced Forces Cause Linear Acceleration
- Concept 09: F Equals MA
- Concept 10: Unbalanced Moments cause Rotational Acceleration
- Concept 11: Coaxial Drone Dynamics Exercise
- Concept 12: Coaxial Dynamics Explained
- Concept 13: Tracking Changes to State
- Concept 14: Second Order Systems
- Concept 15: Tracking Changes to State Exercise
- Concept 16: Compact Representations of State
- Concept 17: Uncontrolled Drone Exercise
- Concept 18: Motion in Two Dimensions
- Concept 19: Decomposing Thrust Vectors
- Concept 20: Calculating Moments
- Concept 21: Rotation Rates to Moments and Thrusts
- Concept 22: Calculating Accelerations in 2D Exercise
- Concept 23: Controlling a 2D Quad
- Concept 24: Controlling a 2D Drone Exercise
- Concept 25: Summary
-
Lesson 02: Introduction to Vehicle Control
Learn how to control a drone moving in one dimension using Proportional Integral Derivative (PID) Control.
- Concept 01: Sebastian Introduction
- Concept 02: Lesson Introduction
- Concept 03: Perfect Control
- Concept 04: The problem with open loop control Exercise
- Concept 05: Perfect is Impossible
- Concept 06: Simple Control Diagrams
- Concept 07: P Controllers
- Concept 08: Implement P Controller Exercise
- Concept 09: Two Problems with P Control
- Concept 10: P Controller Tuning Exercise
- Concept 11: Limitations of P Control
- Concept 12: PD Control
- Concept 13: Implement PD Controller Exercise
- Concept 14: PD Incorporating Feedforward Control
- Concept 15: Implement PD with Feedforward Exercise
- Concept 16: PD Math and Reparametrization
- Concept 17: Overshoot, Rise Time, Settling Time
- Concept 18: More Model Errors
- Concept 19: PID Control
- Concept 20: PID Control Exercise
- Concept 21: PID Control Example
- Concept 22: PID controller in practice
- Concept 23: Summary
-
Lesson 03: Control Architecture
The controls problem becomes more difficult in two dimensions. Learn how to use a cascaded PID control architecture to control a flying vehicle that moves in two dimensions.
- Concept 01: Sebastian Introduction
- Concept 02: Intro to 2D Dynamics
- Concept 03: Underactuation
- Concept 04: Coupling
- Concept 05: Lesson Overview
- Concept 06: Motivation for Linearization
- Concept 07: Linearization Math
- Concept 08: Linearization Intuition 1
- Concept 09: Linearization Intuition 2
- Concept 10: Linearization Intuition 3
- Concept 11: Linearization Exploration Exercise
- Concept 12: Controlling Motion Near Hover
- Concept 13: Intro to Cascaded Control
- Concept 14: Implement Linear Controller Exercise
- Concept 15: Separation of Time Scales
- Concept 16: Non-Linear Control
- Concept 17: Implement Non-Linear Controller Exercise
- Concept 18: Comparing Trajectories Exercise
- Concept 19: Summary
-
Lesson 04: Full 3D Control
In this lesson you'll take everything you've learned so far about vehicle dynamics and control and put it together to control a quadrotor that moves in three dimensions.
- Concept 01: Sebastian Introduction
- Concept 02: Lesson Overview
- Concept 03: Review of 2D Dynamics
- Concept 04: World vs Body Frames
- Concept 05: Tracking 3D Dynamics Overview
- Concept 06: Notebook Walkthrough
- Concept 07: 3D Drone Part 1 Exercise
- Concept 08: Tracking Rotations in 3D
- Concept 09: Euler's Equations in a Rotating Frame
- Concept 10: 3D Drone Part 2 Exercise
- Concept 11: Integrating PQR Into the World Frame
- Concept 12: 3D Drone Part 3 Exercise
- Concept 13: Summary of 3D Dynamics
- Concept 14: "Control Knobs" for a 3D Quadrotor
- Concept 15: 3D Control Architecture
- Concept 16: First vs Second Order Systems
- Concept 17: Understanding Attitude Control Equations
- Concept 18: 3D Drone Part 4 Exercise
- Concept 19: Controller Design
- Concept 20: Controller Design 2
- Concept 21: 3D Drone Part 5 Exercise
- Concept 22: Practical Considerations
- Concept 23: From Path Planning to Control
- Concept 24: Trajectory Generation Exercise
- Concept 25: Polynomial Segmentation Exercise
- Concept 26: Conclusion
-
Lesson 05: Project: Building a Controller
In this project you'll implement a controller for a quadrotor in C++.
-
Lesson 06: Drone Integration
Walk through the steps you need to take to get a version of your controls project running on a Crazyflie!
Part 58 : Estimation
In this course, we will finish peeling back the layers of your autonomous flight solution. Instead of assuming perfect sensor readings, you will utilize sensor fusion and filtering. You will design an Extended Kalman Filter (EKF) to estimate attitude and position from IMU and GPS data of a flying robot.
-
Lesson 01: Introduction to Estimation
Review basic probability and learn three approaches to state estimation for a stationary vehicle.
- Concept 01: Sebastian Introduction
- Concept 02: Welcome Back
- Concept 03: Intro to Estimation
- Concept 04: Review of Discrete Probability
- Concept 05: Expected Value
- Concept 06: Variance
- Concept 07: Playing with Probabilities Notebook
- Concept 08: Probability Density Functions
- Concept 09: Uniform Distribution Notebook
- Concept 10: Uniform and Gaussian Distributions
- Concept 11: Estimating Parameters from Data
- Concept 12: Multivariate Distributions
- Concept 13: 2D Gaussian Notebook
- Concept 14: Joint and Marginal Distributions
- Concept 15: Correlation and Independence
- Concept 16: Conditional Distributions
- Concept 17: Applying Bayes' Rule
- Concept 18: Approaches to Estimation
- Concept 19: Intro to Least Squares
- Concept 20: Deriving the Maximum Likelihood Estimator
- Concept 21: Fitting a Line with Linear Least Squares
- Concept 22: Least Squares Notebook
- Concept 23: Recursive Estimation
- Concept 24: Recursive Estimation Notebook
- Concept 25: The Problem with non-Linearities
- Concept 26: Calculating the Jacobian
- Concept 27: Non-Linear Least Squares Notebook
- Concept 28: Conclusion
-
Lesson 02: Introduction to Sensors
In this lesson you'll learn about the sensors a drone uses to localize itself in the world. You'll implement sensor models, analyze sources of error, and perform calibration of various sensors.
- Concept 01: Sebastian Introduction
- Concept 02: Welcome Back
- Concept 03: Introduction
- Concept 04: Complementary Sensors
- Concept 05: Inertial Sensors
- Concept 06: Rate Gyro Physics and Implementation
- Concept 07: Gyro Measurement Model
- Concept 08: Gyroscope Measurements
- Concept 09: Dead Reckoning Uncertainty
- Concept 10: Full 3D Attitude Update
- Concept 11: Accelerometers
- Concept 12: Dead Reckoning 3D
- Concept 13: Two Things Accelerometers Measure
- Concept 14: Inertial Navigation vs Position Fixing
- Concept 15: Reading an IMU Spec Sheet
- Concept 16: Three Sources of Error
- Concept 17: Calibration
- Concept 18: IMU Calibration
- Concept 19: Magnetometer Intuition
- Concept 20: Magnetometer Errors and Calibration
- Concept 21: Magnetometer Calibration
- Concept 22: GPS Overview
- Concept 23: GPS Math
- Concept 24: GPS Errors, Initialization, and Calibration
- Concept 25: The Barometer
- Concept 26: Barometer Errors and Calibration
- Concept 27: Barometer and GPS integration
- Concept 28: Summary
-
Lesson 03: Extended Kalman Filters
In this lesson you'll learn how to estimate the state of a drone that's actually moving! You'll implement a Kalman Filter for a 1D drone and an Extended Kalman Filter for a non-linear 2D drone.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction
- Concept 03: 1D PID Control
- Concept 04: Controlling with Noisy Measurements
- Concept 05: Averaging Measurements
- Concept 06: Recursive Averaging
- Concept 07: Averaging Exercise
- Concept 08: The Need for Control
- Concept 09: Estimation Filters
- Concept 10: The Bayes Filter
- Concept 11: The Kalman Filter
- Concept 12: Kalman Predict
- Concept 13: The Measurement Function
- Concept 14: Kalman Update
- Concept 15: Kalman Filter Exercise
- Concept 16: Nonlinear Drone
- Concept 17: EKF Predict
- Concept 18: Non-linear Measurement Model
- Concept 19: EKF Update
- Concept 20: EKF Exercise
- Concept 21: Summary
-
Lesson 04: The 3D EKF and UKF
Take what you learned in the previous lesson and generalize to three dimensions. After learning about the 3D EKF you'll also learn another estimation algorithm called the Unscented Kalman Filter.
- Concept 01: Sebastian Introduction
- Concept 02: Welcome Back
- Concept 03: 3D Estimation Overview
- Concept 04: EKF Tradeoffs 1 - State
- Concept 05: EKF Tradeoffs 2 - Control
- Concept 06: Attitude Estimation
- Concept 07: Complementary Filter Math
- Concept 08: Attitude Estimation Exercise
- Concept 09: EKF Implementation 1 - Overview
- Concept 10: EKF Implementation 2 - Predict
- Concept 11: EKF Implementation 3 - Update
- Concept 12: Kalman Recap
- Concept 13: Drone in 3D Exercise
- Concept 14: The Unscented Kalman Filter
- Concept 15: UKF Sigma Points
- Concept 16: UKF Predict
- Concept 17: UKF Update
- Concept 18: UKF Exercise
- Concept 19: Conclusion
-
Lesson 05: Project: Estimation
In this project you'll implement an estimator to track the position and attitude of a quadrotor moving in three dimensions.
-
Lesson 06: GPS Denied Navigation
How do you estimate vehicle state when you don't have GPS? In this lesson you'll learn about optical flow and particle filters as two approaches to solving this problem.
- Concept 01: Sebastian Introduction
- Concept 02: Introduction
- Concept 03: Optical Flow Estimation Overview
- Concept 04: Good Features to Track
- Concept 05: Feature Tracker Exercise
- Concept 06: Tracking a Single Pixel
- Concept 07: Lucas Kanade Optical Flow
- Concept 08: Optical Flow Exercise
- Concept 09: Translating Optical Flow to Vehicle Velocity
- Concept 10: Intro to Particle Filters
- Concept 11: Sampled Distributions
- Concept 12: Propagating Samples
- Concept 13: Numerical Estimation Exercise
- Concept 14: Sampling from Arbitrary Distributions
- Concept 15: Sensor Modeling
- Concept 16: Monte Carlo Sampling for Sensor Fusion
- Concept 17: Sensor Fusion Exercise
- Concept 18: Putting it All Together
- Concept 19: Particle Filter Exercise
- Concept 20: Particle Filter Pros and Cons
- Concept 21: Conclusion
Part 59 : Fixed Wing
While quadrotors are the ideal test platform for aerial robotics, flying cars and other long-range aircraft leverage the aerodynamic efficiencies of fixed-wing flight. In this course, you will learn how to adapt the concepts you’ve learned so far and successfully fly a fixed-wing aircraft in simulation.
-
Lesson 01: Introduction to Fixed-Wing Flight
This lesson provides a brief introduction to Fixed Wing Vehicles, flying cars, and the components of typical fixed-wing aircraft.
- Concept 01: What's a Flying Car?
- Concept 02: History of Hybrid Vehicles
- Concept 03: Fixed Wing vs. Rotary Wing Aircraft
- Concept 04: Components of a Fixed Wing Aircraft
- Concept 05: Components of a Wing
- Concept 06: Installing the Fixed Wing Simulator
- Concept 07: Fixed Wing Control Surfaces
- Concept 08: Summary
-
Lesson 02: Lift and Drag
Build mathematical models for lift and drag, the aerodynamic forces that make fixed wing flight possible (and difficult). A short lift/drag sketch follows the concept list below.
- Concept 01: Introduction to Lift and Drag
- Concept 02: Physics Review
- Concept 03: Fixed Wing Dynamics: Longitudinal vs Lateral/Directional
- Concept 04: Longitudinal Analysis 1
- Concept 05: Longitudinal Analysis 2
- Concept 06: Frames of Reference Summary
- Concept 07: Rotation Matrices Exercise
- Concept 08: Lift and Stall
- Concept 09: Calculating Lift
- Concept 10: Drag
- Concept 11: Pitching
- Concept 12: Trim States and Simplified Models
- Concept 13: Straight and Level Flight
- Concept 14: Climbing Flight
- Concept 15: Fixed Wing Cheat Sheet
- Concept 16: Lift and Drag Exercise
- Concept 17: Conclusion
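Here is a short sketch of the standard lift and drag equations, L = 1/2 * rho * V^2 * S * C_L and D = 1/2 * rho * V^2 * S * C_D; the air density, airspeed, wing area, and coefficients in the example are illustrative.

```python
# Minimal lift/drag sketch using the standard aerodynamic equations.
def lift_and_drag(rho, v, s, c_l, c_d):
    q = 0.5 * rho * v**2          # dynamic pressure
    return q * s * c_l, q * s * c_d

# Illustrative numbers: sea-level air density, 30 m/s airspeed, 0.5 m^2 wing area.
lift, drag = lift_and_drag(rho=1.225, v=30.0, s=0.5, c_l=0.8, c_d=0.05)
print(lift, drag)                  # forces in newtons
```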
-
Lesson 03: Longitudinal Model
Analyze both non-linear and linear models of a fixed-wing aircraft's motion in the x-z plane and use linear algebra to identify two oscillatory "modes" of motion.
- Concept 01: Lesson Introduction
- Concept 02: Force-Free Motion
- Concept 03: Characterizing State Variables
- Concept 04: Full Longitudinal Dynamics
- Concept 05: Exploring Longitudinal Dynamics Exercise
- Concept 06: Understanding Oscillations
- Concept 07: The Rest of the Lesson
- Concept 08: Linearized Model
- Concept 09: Eigenvalues and Eigenvectors
- Concept 10: Exploring Complex Exponentials Notebook
- Concept 11: Exponentials and Stability
- Concept 12: Modes of Motion
- Concept 13: Identifying Eigenvalues Exercise
- Concept 14: Short Period Response and Phugoid
- Concept 15: Conclusion
-
Lesson 04: Lateral-Directional Model
Understand the lateral-directional dynamics of fixed wing vehicles by looking at aircraft from above and behind.
- Concept 01: Introduction to Lateral-Directional Dynamics
- Concept 02: Force-Free Motion
- Concept 03: Incorporating Forces
- Concept 04: Coordinated Turns
- Concept 05: Roll-Yaw Coupling
- Concept 06: Static Stability
- Concept 07: The Rest of the Lesson
- Concept 08: Linearized Model
- Concept 09: Identifying Dynamic Modes
- Concept 10: Stability Analysis 1: Roll Mode
- Concept 11: Stability Analysis 2: Spiral Mode
- Concept 12: Stability Analysis 3: Dutch Roll
- Concept 13: Conclusion
-
Lesson 05: Fixed-Wing Autopilot
Apply the concepts of PID control by implementing an autopilot for fixed wing flight.
- Concept 01: Lesson Introduction
- Concept 02: Fixed Wing Trajectories
- Concept 03: System Architecture
- Concept 04: Controller Design Principles
- Concept 05: Lateral Autopilot
- Concept 06: Course Hold
- Concept 07: Inner Loops: Roll and Sideslip Hold
- Concept 08: Longitudinal Autopilot
- Concept 09: Longitudinal Control Loops
- Concept 10: Autopilot Tuning
- Concept 11: Integrator Windup
- Concept 12: Conclusion
-
Lesson 06: Optional Project: Fixed-Wing Control
In this optional project you will control a simulated fixed-wing aircraft by implementing and tuning your own autopilot in Python.