INSTRUCTOR AND TA

    Md Jahidul Islam. Office Hours: Thursday 4:00 PM - 5:00 PM. @LAR-339D.
    Lecture: M/W/F 3:00 PM - 3:50 PM @LAR-330
    TA: TBD. Office Hours: TBD.

COURSE PREREQUISITES

    Microprocessor Applications, Embedded Systems, or equivalent courses
    Fluency in object-oriented programming (Python and/or C++)
    Basics of linear algebra and calculus

TEXTBOOKS

    Introduction to Robotics: Mechanics and Control (4th Edition).
    By John Craig. Pearson. ISBN-13: 978-0133489798.

    Probabilistic Robotics (Intelligent Robotics and Autonomous Agents series; 1st Edition).
    By Sebastian Thrun, Wolfram Burgard, and Dieter Fox. MIT Press.
    ISBN-13: 978-0262201629, ISBN-10: 0262201623.

RECOMMENDED MATERIALS

    Tutorials and Sample Projects
    Course Materials

COURSE SCHEDULE

Week 1-2
Lecture #1: Introduction to Robotics and AI
  • Robotics and AI overview: past, present, and future
  • Robot Operating System (ROS2) and OpenCV overview

Week 2-3
Lecture #2: ROS2 Pipeline for Robotics
  • Sensory integration and design choices
  • Robot Operating System (ROS2) and OpenCV overview

Homework 1: ROS2 Middleware Integration and Simulation
Part A: ROS2 Humble Introduction [20%]
  • Expectation: ROS2, Python (3.8+), and OpenCV (3.2+) should be installed and running on your machine, and you should know the operational basics. It is recommended that you follow the official ROS2 “beginner tutorials”. (A quick sanity check follows.)
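
A quick way to verify the Part A stack (a sanity check only, not a deliverable; run it inside a sourced ROS2 environment):

    # Sanity check: Python, OpenCV, and the ROS2 Python client library
    # (rclpy) should all be importable from the same interpreter.
    import sys
    import cv2
    import rclpy  # requires a sourced ROS2 environment

    print('Python:', sys.version.split()[0])   # expect 3.8+
    print('OpenCV:', cv2.__version__)          # expect 3.2+
    print('rclpy imported OK')
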
Part B: Using ROS2 for camera face detection in OpenCV [50%]
  • Expectation: You will use the usb_cam ROS2 package to visualize the image data from a USB camera. You will then create a custom node that subscribes to this image data and converts it for OpenCV-based face detection, drawing bounding boxes over the detected faces (see the sketch after Part C).
Part C: Publishing the output image as a ROS topic [25%]
  • Expectation: Once the bounding boxes are drawn, the output image will be converted back into a ROS image message and published to a custom topic for visualization.
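
A minimal sketch covering Parts B and C, assuming usb_cam publishes on /image_raw (its default) and using an OpenCV Haar cascade for the face detection; the node and topic names are placeholders:

    # Sketch: subscribe to the camera feed, detect faces, draw bounding
    # boxes, and republish the annotated image as a ROS2 image message.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge
    import cv2

    class FaceDetector(Node):
        def __init__(self):
            super().__init__('face_detector')
            self.bridge = CvBridge()
            self.cascade = cv2.CascadeClassifier(
                cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
            self.sub = self.create_subscription(Image, '/image_raw', self.on_image, 10)
            self.pub = self.create_publisher(Image, '/face_detection/image', 10)

        def on_image(self, msg):
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in self.cascade.detectMultiScale(gray, 1.1, 5):
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            self.pub.publish(self.bridge.cv2_to_imgmsg(frame, encoding='bgr8'))

    def main():
        rclpy.init()
        rclpy.spin(FaceDetector())

    if __name__ == '__main__':
        main()
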
Part D: Write a ROS2 launch file [5%]
  • Expectation: You will create a single ROS2 launch file that initiates all nodes from a single terminal command. The expected nodes are usb_cam, your custom node, and rqt_image_view (a sketch follows).
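
A minimal ROS2 Python launch file for Part D might look like the following; my_hw1_pkg and face_detector are placeholders for your own package and node, and the usb_cam executable name may differ across versions:

    # Sketch: one launch file that starts the camera driver, the custom
    # detection node, and the image viewer together.
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(package='usb_cam', executable='usb_cam_node_exe'),
            Node(package='my_hw1_pkg', executable='face_detector'),
            Node(package='rqt_image_view', executable='rqt_image_view'),
        ])
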

Week 3-4
Lecture #3: Spatial Descriptions and Transformations
  • Homogeneous, Euler, and Rodrigues rotations
  • Representation in matrix, vector, and quaternion space
  • Quaternion algebra and details on SO(3)

Week 4-5
Lecture #4: Kinematics: Manipulators and Mobile Robots
  • Forward Kinematics: DH notation (manipulators)
  • Adaptation: UGVs (TurtleBot Kinematics)
  • Inverse Kinematics (Basics)

Homework 2: Spatial Kinematics & Transformations in SO(3)
Part A: Coordinate frame of reference conversions [20%]
  • Expectation: You will convert Cartesian points into cylindrical and spherical coordinates (a sketch follows). From the given frames and measurements, you will find the transformation matrix corresponding to the translation and rotation between frames.
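
For reference, the Part A conversions in NumPy might look like this (a sketch, not required code):

    # Cartesian (x, y, z) -> cylindrical (rho, phi, z) and
    # spherical (r, theta, phi), with theta the polar angle.
    import numpy as np

    def cartesian_to_cylindrical(x, y, z):
        return np.hypot(x, y), np.arctan2(y, x), z

    def cartesian_to_spherical(x, y, z):
        r = np.sqrt(x**2 + y**2 + z**2)
        return r, np.arccos(z / r), np.arctan2(y, x)
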
Part B: Rotation matrix representation and analysis [25%]
  • Expectation: You will demonstrate a basic understanding of SO(3): given a 3x3 matrix, you will prove that it is a valid rotation matrix in SO(3), then find its axis and angle of rotation. You will also derive the axis of rotation in vectorized form and apply it (see the sketch below).
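
A minimal sketch of the Part B checks, using the standard trace/skew-part recovery of the angle and axis:

    import numpy as np

    def is_rotation(R, tol=1e-8):
        # Valid rotation: orthogonal (R^T R = I) with determinant +1
        return (np.allclose(R.T @ R, np.eye(3), atol=tol)
                and np.isclose(np.linalg.det(R), 1.0, atol=tol))

    def angle_axis(R):
        # Angle from the trace; axis from the skew-symmetric part
        # (valid for 0 < theta < pi)
        theta = np.arccos((np.trace(R) - 1.0) / 2.0)
        w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
        return theta, w / (2.0 * np.sin(theta))
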
Part C: Quaternion concepts [15%]
  • Expectation: You will learn the basics of quaternion algebra, particularly for representing rotations and transformations of a rigid body (robot joints). You will find the rotation matrix corresponding to a given quaternion, and the quaternion corresponding to a given Cartesian vector rotation. A few basic quaternion properties will need to be proved analytically as well.
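
For reference, the standard mapping from a unit quaternion q = (w, x, y, z) to its rotation matrix, which this part builds on:

    import numpy as np

    def quat_to_rot(w, x, y, z):
        # Assumes q is normalized: w^2 + x^2 + y^2 + z^2 = 1
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])
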
Part D: Forward kinematics [35%]
  • Expectation: You will learn to use a ROS2 library to visualize the kinematics of a manipulator arm in motion. The results will then need to be reported for different goal inputs. (The underlying DH chain is sketched below.)
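
The forward-kinematics math underneath this part is a chain of homogeneous transforms, one per joint; a sketch using the classic DH convention:

    import numpy as np

    def dh(theta, d, a, alpha):
        # Classic DH transform: Rot_z(theta) Trans_z(d) Trans_x(a) Rot_x(alpha)
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [ 0,       sa,       ca,      d],
                         [ 0,        0,        0,      1]])

    # End-effector pose = product of per-joint transforms, e.g.
    # T = dh(q1, d1, a1, alpha1) @ dh(q2, d2, a2, alpha2) @ ...
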
Part E: Rodrigues' proof [5%]
  • Expectation: You will derive the equivalent angle-axis formula for a rotation matrix in SO(3) from Rodrigues’ formula (stated below for reference).
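
For reference, Rodrigues' formula for a rotation by angle theta about a unit axis k, with K the skew-symmetric matrix of k:

    R = I + \sin\theta \, K + (1 - \cos\theta) \, K^2,
    \qquad
    K = \begin{pmatrix} 0 & -k_z & k_y \\ k_z & 0 & -k_x \\ -k_y & k_x & 0 \end{pmatrix}
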

Week 5-6
Lecture #5: Locomotion: Mobile Robots
  • Motion Gaits: 2-DOF, 3-DOF, and 6-DOF
  • Quaternion: Rotation Space and SLERP Interpolation

Week 6-7
Lecture #6: Robot Perception: UGVs / UAVs
  • Exteroceptive and interoceptive sensing
  • Gyroscope, accelerometers, IMU, GPS
  • Range sensors, camera vision, and LiDAR

Week 7-8
Lecture #7: Visual Perception in Robots
  • Vision basics: UGVs / AUVs / UAVs
  • Camera model: intrinsic and extrinsic parameters
  • Object detection and tracking
  • Homography estimation

Homework 3: Visual Perception and Motion Kinematics
Part A: Augmented visuals via homography estimation [30%]
  • Expectation: You will implement an image-overlay task that places a logo on a content image using homography estimation. The logo should be correctly warped to fit the given perspective, and you will output both the combined image and the warped logo (a minimal sketch follows this part).
  • Note: Up to 5% bonus if you can implement a filter for noise reduction.
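
A minimal OpenCV sketch of the Part A pipeline; the file names and the four hand-picked destination points are hypothetical:

    import cv2
    import numpy as np

    logo = cv2.imread('logo.png')            # hypothetical inputs
    scene = cv2.imread('content.jpg')
    h, w = logo.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])    # logo corners
    dst = np.float32([[120, 80], [420, 100], [410, 300], [110, 290]])
    H, _ = cv2.findHomography(src, dst)      # maps logo corners into the scene
    warped = cv2.warpPerspective(logo, H, (scene.shape[1], scene.shape[0]))
    mask = warped.sum(axis=2) > 0            # paste warped pixels over the scene
    scene[mask] = warped[mask]
    cv2.imwrite('warped.png', warped)
    cv2.imwrite('combined.png', scene)
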
Part B: Camera calibration using ROS/OpenCV/MATLAB [20%]
  • Expectation: You will calibrate a camera that you have access to using a pre-existing library, and report the intrinsic matrix K along with the camera model used. You can use your cellphone camera or any of our robots’ cameras for this assignment (an OpenCV sketch follows).
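
If you choose OpenCV for Part B, a checkerboard calibration sketch might look like this (board size and image paths are assumptions):

    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)                         # inner corners per row/column
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_pts, img_pts = [], []
    for path in glob.glob('calib/*.jpg'):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # K is the 3x3 intrinsic matrix; dist holds the distortion coefficients
    ret, K, dist, _, _ = cv2.calibrateCamera(
        obj_pts, img_pts, gray.shape[::-1], None, None)
    print('Intrinsic matrix K:\n', K)
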
Part C: Epipolar geometry concepts [10%]
  • Expectation: You will provide derivations and proofs for epipolar geometry concepts that demonstrate your understanding of the epipolar constraints (stated below) and 3D scene geometry.
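
For reference, the epipolar constraint between corresponding homogeneous image points x and x', with F the fundamental matrix, E the essential matrix, and K, K' the camera intrinsics:

    x'^{\top} F \, x = 0, \qquad E = K'^{\top} F \, K = [t]_{\times} R
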
Part D: Motion kinematics of a 2D robot in ROS2 [40%]
  • Expectation: You will use the ROS2 turtlesim package (included with the ROS2 installation) to perform various movement tasks in a 2D environment. A simple movement-pattern task will be done first, followed by a point-following task. Custom nodes will need to be created for both applications (see the sketch below).
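
A minimal sketch of a turtlesim mover for the first task; the speeds and rates are arbitrary choices (here the constant twist traces a circle):

    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import Twist

    class CircleMover(Node):
        def __init__(self):
            super().__init__('circle_mover')
            self.pub = self.create_publisher(Twist, '/turtle1/cmd_vel', 10)
            self.timer = self.create_timer(0.1, self.step)   # 10 Hz

        def step(self):
            cmd = Twist()
            cmd.linear.x = 1.0    # constant forward speed
            cmd.angular.z = 0.5   # constant turn rate -> circular path
            self.pub.publish(cmd)

    def main():
        rclpy.init()
        rclpy.spin(CircleMover())

    if __name__ == '__main__':
        main()
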

Mid-Term
Mid-term review class: first class of Week 8
  • Mid-term exam: individual, in-person, written exam, during regular class time
  • Syllabus: Lectures 1-7 and HW 1-3 concepts
  • Open notes: you can bring five sheets of paper as notes

Topics:
  • Rotations and transformations in SO(3): matrix and vector forms
  • Forward kinematics and motion modeling
  • Visual perception: camera model and homography estimation

Week 9-10
Lecture #8: Visual Odometry and 3D Robot Vision
  • 2-DOF, 3-DOF, and 6-DOF robots
  • Acoustic and optical localization
  • 3D robot vision and scene geometry

Week 10-11
Lecture #9A: Map-based Planning
  • Map-based algorithms: Bug0, Bug1, Bug2
  • Graph-based algorithms: BFS, DFS, Dijkstra, A*

Week 11-12
Lecture #9B: Probabilistic Planning Under Uncertainty
  • Sampling-based algorithms: PRM, RRT, RRT*
  • Target-centric planners
  • Online/active planning under uncertainty

Homework 4: Path Planning and Robot Localization
Note: Parts of HW4 and HW5 are group assignments, since they involve working on actual robot platforms.
We will form groups of 3-4 members in class for these two homework assignments.
Part A (Individual): Quaternion interpolation [30%]
  • Expectation: You will learn how to interpolate between two robot poses in quaternion space for smooth trajectory planning. You will implement functions to transform between different rotation representations, and then use them to implement SLERP interpolation in quaternion space (sketched below). Figures for varying interpolation scales will need to be shown.
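
A sketch of SLERP between two unit quaternions; the sign flip takes the shorter arc, and the near-parallel fallback avoids dividing by sin(theta) ~ 0:

    import numpy as np

    def slerp(q0, q1, t):
        q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
        dot = np.dot(q0, q1)
        if dot < 0.0:              # take the shorter great-circle arc
            q1, dot = -q1, -dot
        if dot > 0.9995:           # nearly parallel: linear interpolation
            q = q0 + t * (q1 - q0)
            return q / np.linalg.norm(q)
        theta = np.arccos(dot)     # angle between the two quaternions
        return (np.sin((1 - t) * theta) * q0
                + np.sin(t * theta) * q1) / np.sin(theta)
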
Part B (Group): Robot localization and path finding [70%]
  • Weight: 20% (mapping) + 20% (navigation) + 30% (planning) = 70%
  • Platform: Turtlebot, Robot Dog, or RPi simulation engine
  • Expectation: Using one of the given robot platforms, the group will use the given libraries to perform SLAM-based localization and mapping. With the generated map, the group will implement the Bug algorithms (Bug0, Bug1, Bug2) to avoid obstacles while directing the robot to goal positions (a high-level sketch of Bug0 follows).
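
A high-level sketch of the Bug0 decision loop; distance_to(), blocked(), wall_follow_step(), and move_toward() are hypothetical stand-ins for your robot interface:

    def bug0(robot, goal, tol=0.1):
        # Head straight for the goal; wall-follow whenever an obstacle
        # blocks the direct path, then resume the straight-line motion.
        while robot.distance_to(goal) > tol:
            if robot.blocked():
                robot.wall_follow_step()
            else:
                robot.move_toward(goal)
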

Week 12-13
Lecture #10A: Mobile Robot Localization
  • Map-based localization in 2D and 3D
  • Localization from camera, IMU, and/or range data
  • Localization from landmarks without a prior map

Week 13-14
Lecture #10B: State Estimation, Filtering, and Feedback Control
  • State estimation and planning under uncertainty
  • Kalman filtering (KF, EKF, UKF)
  • PID controllers (how to implement and tune)

Homework 5: Robot Localization, Filtering, & State Estimation
Note: Part C of this assignment is a group assignment, a continuation of the HW4B assignment; the other parts are individual. Groups are recommended to use the same robot platform for both homework assignments.
Part A (Individual): State estimation from landmarks [15%]
  • Expectation: You will derive analytical solutions for 2D/3D pose estimation of a robot from multiple known landmarks, using range data and/or bearing measurements (a least-squares sketch follows).
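
For the range-only 2D case, one standard route is to difference the squared range equations, which removes the quadratic terms and leaves a linear least-squares problem; a sketch with hypothetical landmark data:

    import numpy as np

    def localize_2d(landmarks, ranges):
        # landmarks: (N, 2) known positions; ranges: (N,) measured distances.
        # Subtracting landmark 0's range equation from the others gives
        # 2 (L_i - L_0) p = r_0^2 - r_i^2 + |L_i|^2 - |L_0|^2.
        L, r = np.asarray(landmarks, float), np.asarray(ranges, float)
        A = 2.0 * (L[1:] - L[0])
        b = (r[0]**2 - r[1:]**2) + np.sum(L[1:]**2, axis=1) - np.sum(L[0]**2)
        return np.linalg.lstsq(A, b, rcond=None)[0]   # estimated (x, y)
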
Part B (Individual): Kalman filters for object tracking [20%]
  • Expectation: You will learn how to implement a basic Kalman filter (KF) for 2D object tracking using Python-OpenCV libraries (sketched below). The steps of the Kalman filtering process, as well as its tuning, should be understood for a successful implementation.
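
A minimal sketch of a constant-velocity tracker built on cv2.KalmanFilter; the noise covariances are tuning knobs, not prescribed values:

    import cv2
    import numpy as np

    kf = cv2.KalmanFilter(4, 2)       # state (x, y, vx, vy), measurement (x, y)
    dt = 1.0                          # time step between frames
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1,  0],
                                    [0, 0, 0,  1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
    kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)

    prediction = kf.predict()                              # a priori state
    measurement = np.array([[120.0], [80.0]], np.float32)  # e.g. detector output
    estimate = kf.correct(measurement)                     # a posteriori state
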
Part C (Group): Robot localization and path finding [60%]
  • Weight: 20% (detection) + 20% (planner) + 20% (control) = 60%
  • Platform: Turtlebot, Robot Dog, or manipulator arm
  • Expectation: Continuing from the path-planning implementation in HW4B, your group will integrate a person/object detector for following motion. A motion/path planner will be implemented on the robot platform, and a KF-based control algorithm must be implemented so that the robot can follow the outputs of the detection and planning algorithms.
Part D (Individual): Implementing RRT/RRT* [5%]
  • Expectation: You will implement RRT/RRT* on the robot to traverse source-to-destination paths (a compact sketch follows).
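
A compact sketch of vanilla RRT on a 2D plane; collision checking is stubbed as a hypothetical is_free(), and the bounds and step size are arbitrary:

    import math
    import random

    def rrt(start, goal, is_free, step=0.5, goal_tol=0.5,
            iters=5000, bounds=(0.0, 10.0)):
        # Grow a tree from start by repeatedly steering the nearest node
        # toward a random sample; stop once a node lands near the goal.
        nodes, parent = [start], {start: None}
        for _ in range(iters):
            sample = (random.uniform(*bounds), random.uniform(*bounds))
            near = min(nodes, key=lambda n: math.dist(n, sample))
            d = math.dist(near, sample)
            if d == 0.0:
                continue
            new = (near[0] + step * (sample[0] - near[0]) / d,
                   near[1] + step * (sample[1] - near[1]) / d)
            if not is_free(near, new):    # segment near->new hits an obstacle
                continue
            nodes.append(new)
            parent[new] = near
            if math.dist(new, goal) < goal_tol:
                path, n = [], new         # walk back to the root
                while n is not None:
                    path.append(n)
                    n = parent[n]
                return path[::-1]
        return None                       # no path found within the budget
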

Final
Final exam review class: last lecture of the course (Week 15)
  • Final exam: individual, in-person, written exam, during the official finals period
  • Syllabus: Lectures 9-10 and HW 4-5 concepts
  • Open notes: you can bring five sheets of paper as notes

Topics:
  • Path planning algorithms: map-based and graph-based planners
  • Localization and state estimation filters
  • ROS2-OpenCV programming questions