– Part 01-Module 01-Lesson 01_Welcome

Learn how to simulate your first robotic environment with Gazebo, the most common simulation engine used by roboticists around the world.

– Part 01-Module 01-Lesson 02_What is a Robot

– Part 01-Module 01-Lesson 03_Search and Sample Return

– Part 01-Module 01-Lesson 04_Career Support Overview

– Part 01-Module 01-Lesson 05_Get Help from Peers and Mentors

– Part 01-Module 04-Lesson 01_Explores – Biologically Inspired Robots

– Part 01-Module 04-Lesson 02_6 Questions on Robotics Careers

– Part 01-Module 05-Lesson 01_Intro to Kinematics

– Part 01-Module 05-Lesson 02_Forward and Inverse Kinematics

– Part 01-Module 06-Lesson 01_Explores – Human Robot Interaction Robot Ethics

– Part 01-Module 06-Lesson 02_Product Pitch

– Part 01-Module 07-Lesson 01_Perception Overview

– Part 01-Module 07-Lesson 02_Introduction to 3D Perception

– Part 01-Module 07-Lesson 03_Calibration, Filtering, and Segmentation

– Part 01-Module 07-Lesson 04_Clustering for Segmentation

– Part 01-Module 07-Lesson 05_Object Recognition

– Part 01-Module 07-Lesson 06_3D Perception Project

– Part 01-Module 08-Lesson 01_Explores – Soft Robotics

– Part 01-Module 09-Lesson 01_Explores – Robot Grasping

– Part 01-Module 10-Lesson 01_Introduction to Controls

– Part 01-Module 10-Lesson 02_Quadrotor Control using PID

– Part 01-Module 11-Lesson 01_ Explores Swarm Robotics

– Part 01-Module 11-Lesson 02_Networking in Robotics

– Part 01-Module 12-Lesson 01_Intro to Neural Networks

– Part 01-Module 12-Lesson 02_TensorFlow for Deep Learning

– Part 01-Module 12-Lesson 03_Deep Neural Networks

– Part 01-Module 12-Lesson 04_Convolutional Neural Networks

– Part 01-Module 12-Lesson 05_Fully Convolutional Networks

– Part 01-Module 12-Lesson 06_Lab Semantic Segmentation

– Part 01-Module 12-Lesson 07_Project Follow Me

– Part 01-Module 12-Lesson 08_Term 1 Outro

– Part 01-Module 13-Lesson 01_Introduction to C++ for Robotics

Discover how ROS provides a flexible and unified software environment for developing robots in a modular and reusable manner. Learn how to manage existing ROS packages within a project, and how to write ROS Nodes of your own in C++.

– Part 02-Module 01-Lesson 01_Introduction to Term 2

– Part 02-Module 01-Lesson 02_The Jetson TX2

– Part 02-Module 01-Lesson 03_Interacting with Robotics Hardware

– Part 02-Module 01-Lesson 04_Lab Hardware Hello World

– Part 02-Module 01-Lesson 05_Robotics Sensor Options

– Part 02-Module 02-Lesson 01_Inference Development

– Part 02-Module 02-Lesson 02_Inference Applications in Robotics

– Part 02-Module 02-Lesson 03_Project Robotic Inference

– Part 02-Module 03-Lesson 01_Introduction to Localization

Learn how Gaussian filters can be used to estimate noisy sensor readings, and how to estimate a robot’s position relative to a known map of the environment with Monte Carlo Localization (MCL).
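As a taste of the Gaussian-filter idea, here is a minimal one-dimensional sketch (plain Python written for this outline, not course code): the measurement update fuses a prior belief with a noisy sensor reading, and the prediction step shifts the belief and inflates its uncertainty.

```python
def gaussian_update(mean, var, z_mean, z_var):
    """Measurement update: fuse the prior (mean, var) with a
    measurement (z_mean, z_var). The result lies between the two
    means, and the variance always shrinks."""
    new_mean = (z_var * mean + var * z_mean) / (var + z_var)
    new_var = (var * z_var) / (var + z_var)
    return new_mean, new_var

def gaussian_predict(mean, var, motion, motion_var):
    """Prediction step: shift the belief by the commanded motion
    and add the motion noise to the variance."""
    return mean + motion, var + motion_var

# Prior belief at x = 10 (var 4), measurement at x = 12 (var 4):
mean, var = gaussian_update(10.0, 4.0, 12.0, 4.0)
# The fused estimate sits between prior and measurement,
# and is more certain than either.
mean, var = gaussian_predict(mean, var, 3.0, 1.0)
```

The same predict/update cycle, generalized to vectors and matrices, is the Kalman filter covered in the lessons below.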

– Part 02-Module 03-Lesson 02_Kalman Filters

– Part 02-Module 03-Lesson 03_Lab Kalman Filters

– Part 02-Module 03-Lesson 04_Monte Carlo Localization

– Part 02-Module 03-Lesson 05_Build MCL in C++

– Part 02-Module 03-Lesson 06_Project Where Am I

– Part 02-Module 04-Lesson 01_Introduction to Mapping and SLAM

Learn how to create a Simultaneous Localization and Mapping (SLAM) implementation with ROS packages and C++. You’ll achieve this by combining mapping algorithms with what you learned in the localization lessons.
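To make the mapping side concrete, here is a hedged sketch of the log-odds cell update at the heart of occupancy grid mapping; the sensor-model values are illustrative choices for this sketch, not the lesson's.

```python
import math

# Inverse sensor model in log-odds form (illustrative values):
L_OCC = math.log(0.7 / 0.3)   # evidence when a cell is observed occupied
L_FREE = math.log(0.3 / 0.7)  # evidence when a cell is observed free

def update_cell(l_prev, occupied):
    """Accumulate one measurement's log-odds into a cell's total.
    Working in log-odds turns Bayesian fusion into simple addition."""
    return l_prev + (L_OCC if occupied else L_FREE)

def probability(l):
    """Convert a cell's log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Three consecutive "occupied" hits drive the cell's probability
# from the 0.5 prior toward 1.
l = 0.0
for _ in range(3):
    l = update_cell(l, occupied=True)
```

Each grid cell starts at log-odds 0 (probability 0.5, "unknown") and accumulates evidence independently as laser scans arrive.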

– Part 02-Module 04-Lesson 02_Occupancy Grid Mapping

– Part 02-Module 04-Lesson 03_Grid-based FastSLAM

– Part 02-Module 04-Lesson 04_GraphSLAM

– Part 02-Module 04-Lesson 05_Project Map My World Robot

– Part 02-Module 05-Lesson 01_Intro to RL for Robotics

– Part 02-Module 05-Lesson 02_RL Basics

– Part 02-Module 05-Lesson 03_Q-Learning Lab

– Part 02-Module 05-Lesson 04_Deep RL

– Part 02-Module 05-Lesson 05_DQN Lab

– Part 02-Module 05-Lesson 06_Deep RL Manipulator

– Part 02-Module 05-Lesson 07_Project Deep RL Arm Manipulation

– Part 02-Module 06-Lesson 01_Intro to Path Planning and Navigation

Learn different Path Planning and Navigation algorithms. Then, combine SLAM and Navigation into a home service robot that can autonomously transport objects in your home!
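As a warm-up for the planning lessons, here is a small illustrative grid planner using breadth-first search on a 4-connected grid; it is a simpler cousin of the algorithms taught below, and the function name and grid format are our own.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first search on a 4-connected grid. Returns a shortest
    path as a list of (row, col) cells, or None if the goal is
    unreachable. grid is a list of strings; '#' marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # frontier exhausted: no route exists

# Plan around a single obstacle in the middle of a 3x3 grid:
grid = ["...",
        ".#.",
        "..."]
path = bfs_path(grid, (0, 0), (2, 2))
```

A* (covered in the classic planning lesson) follows the same frontier-expansion pattern but orders the frontier by cost-plus-heuristic instead of first-in-first-out.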

– Part 02-Module 06-Lesson 02_Classic Path Planning

– Part 02-Module 06-Lesson 03_Lab Path Planning

– Part 02-Module 06-Lesson 04_Sample-Based and Probabilistic Path Planning

– Part 02-Module 06-Lesson 05_Research in Navigation

– Part 02-Module 06-Lesson 06_Project Home Service Robot

– Part 02-Module 07-Lesson 01_Strengthen Your Online Presence Using LinkedIn

– Part 01-Module 03-Lesson 01_GitHub

– Part 02-Module 07-Lesson 02_Optimize Your GitHub Profile

– Part 02-Module 08-Lesson 01_Completing the Program

– Part 03-Module 01-Lesson 01_Project Introduction

– Part 03-Module 01-Lesson 02_Project Details

– Part 04-Module 01-Lesson 01_Autonomous Systems Interview Practice

01. What is a Robot

 

Explore the Simulator

Unity Environment

We used the Unity game engine to build the simulated environment you’ll be navigating through in this project. Unity offers a free personal license, making it a great option for a program like this, and it runs on Linux, OS X, and Windows.

You don’t need to know anything more about Unity to use the simulator, but if you want to learn more or start building your own environments, check out their website! The code we used to build the environment for this project is open source and available in this repository.

Exploring the Simulator

[Video: Exploring the Simulator (01:50)]

Download and Launch the Simulator

 

The first step of the project is to download the simulator and familiarize yourself with how it works. Use the links below to get the simulator version that’s appropriate for your operating system.

MacOS Simulator Build
Linux Simulator Build
Windows Simulator Build

When you launch the simulator (by double-clicking it), you will have the option to set the resolution and graphics quality. Choose a lower resolution/quality for faster rendering. Be sure to check the box next to Windowed so the simulator doesn’t take up the full screen. Click on the Input tab to change the keyboard input definitions; this may be necessary if you are on a non-U.S. keyboard. These settings will be restored the next time you launch. Click Play to launch the simulator!

Once you click Play you’ll see a screen like the one below. Choose Training Mode to manually drive the rover around the environment. You’ll notice the app takes over your mouse immediately; use the Esc key to get it back.

Manual Controls

 

Experiment in Training Mode with the various manual functions.

  • Throttle, brake, and steering: W, A, S, D keys or arrow keys (you can also steer with the mouse)
  • Toggle mouse function between camera perspective and steering: Esc key
  • Change the viewing perspective: Tab key or mouse
  • Change zoom on the viewing camera: mouse scroll
  • Reset viewing camera zoom: middle mouse button (MMB)
  • Activate the robot arm to pick up a sample: left mouse button (LMB) or Enter key (only works when the on-screen readout shows "near objective = yes")

Have a look around and explore the environment!

Note: Unity vs. Gazebo

Later in this program you’ll use the Gazebo simulation environment because of its powerful physics engine and its seamless integration with the Robot Operating System (ROS), which you’ll learn about in the next lessons. Unity, however, offers much more photorealistic image quality than Gazebo, which can be a major advantage for computer vision applications. This program gives you exposure to both Unity and Gazebo because each is a powerful tool with its own unique advantages.

Simulation is a huge part of robotics development and several teams (including our own) are working to make the connection between ROS and Unity much more fluid. As such, going forward as a roboticist, you can expect to get even more exposure to working with game engines in simulation.
