CISC849 F2023
Course information
Title | CISC849-015 Robot Navigation and Autonomy |
Description | A hands-on approach to implementing mobile robot algorithms on a small wheeled platform, both in simulation and reality. We will focus on image- and depth-based sensing algorithms for obstacle segmentation, object recognition, and motion planning, including deep learning techniques in these areas. |
When | Tuesdays and Thursdays, 2:20-3:40 pm. When there is a homework due, no more than the first 30 minutes of each class will be in lecture format. The rest of the class period (and optionally the subsequent office hours) will be spent working on the robots. |
Where | Smith 211 |
Instructor | Christopher Rasmussen, 446 Smith Hall, cer@cis.udel.edu |
Office hours | Tuesdays and Thursdays, 3:40-5 pm in Smith 211 (starting Aug. 31) |
Grading | 15% of the grade comes from 3 quizzes, 25% from the final project, and 2% extra credit is available for completing the course evaluation, with the remaining weight coming from the programming assignments. Programming assignments will be graded on how many of the subtasks you complete or demonstrate successfully. For the overall course grade, a preliminary absolute mark will be assigned to each student based on the percentage of the total possible points they earn, according to the standard formula: A = 90-100, B = 80-90, C = 70-80, etc., with +'s and -'s given for the upper and lower third of each range, respectively. Based on the distribution of preliminary grades for all students (i.e., "the curve"), the instructor may increase these grades monotonically to calculate final grades. This means that your final grade can't be lower than your preliminary grade, and your final grade won't be higher than that of anyone who had a higher preliminary grade. I will try to keep you informed about your standing throughout the semester. If you have any questions about grading or expectations at any time, please feel free to ask me. |
Academic policies | Programming projects should be demo'd in class on the deadline day and uploaded to Canvas by midnight of that day (with a grace period of a few hours afterward...after sunrise is definitely late). Without a valid prior excuse, a late homework is a 0. To give you a little flexibility, you have 3 "late days" to use on homeworks to extend the deadline to the next class period without penalty. No more than one late day may be used per assignment. Late days will be subtracted automatically, but as a courtesy please notify the instructor by e-mail before the deadline if you intend to use one. Assignment submissions should consist of a directory containing all code (your .cpp/.py files, etc.), any output data generated (e.g., images, movies), and an explanation of your approach, what worked and what didn't, etc., in a separate text or HTML file. Do not submit executables or .o files, please! The directory you submit for each assignment should be packaged by tar'ing and gzip'ing it or just zip'ing it. Students can discuss problems with one another in general terms, but must work independently or within their teams as specified for each assignment. This also applies to online and printed resources: you may consult them as references (as long as you cite them), but the words and source code you turn in must be yours alone. The University's policies on academic dishonesty are set forth in the student code of conduct here. |
Instructions/Resources
Robot | Yes, our robot platform is a Roomba. Except it can't clean. |
ROS | We are using ROS2, to be exact the Humble Hawksbill release. My laptop and the Raspberry Pis on the robots are running Ubuntu Desktop 22.04 LTS, and I strongly recommend that you do the same. I say this not for convenience, but from hard experience last semester with another group of students: trying to install and run ROS2 natively on Windows or macOS is a recipe for frustration and disappointment. Here are installation instructions for Ubuntu. Several students with macOS had success using Ubuntu in a VM such as UTM; a VM brings extra issues such as setting up bridged mode for wifi, but it is workable. A lot of public sample code and tutorials are available in both C++ and Python, but expect mostly C++ examples from me. I am agnostic about which of these two languages you use for homeworks and projects, but you will get your best support from me in C++. (A minimal Python node sketch appears just below this table.) |
Readings | |
Gazebo | This is a 3-D robot simulator. |
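For reference, here is a minimal sketch of a ROS2 (Humble) node written with rclpy in Python. It is only illustrative: the node name, topic name, and message text below are invented for this example, and the lecture examples will mostly be the C++ (rclcpp) equivalent of the same structure.

```python
# Minimal ROS2 publisher node sketch (rclpy). Node/topic names are made up.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class HelloNode(Node):
    def __init__(self):
        super().__init__('hello_node')
        # Publish a String message on the 'chatter' topic once per second
        self.pub = self.create_publisher(String, 'chatter', 10)
        self.timer = self.create_timer(1.0, self.tick)
        self.count = 0

    def tick(self):
        msg = String()
        msg.data = f'hello robot {self.count}'
        self.count += 1
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = HelloNode()
    rclpy.spin(node)       # run until Ctrl-C
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

Running it requires a sourced ROS2 Humble environment, e.g., `python3` on the file after sourcing `setup.bash`, or `ros2 run` from a built package.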
Schedule
Note: The blue squares in the "#" column below indicate Tuesdays.
# | Date | Topic | Notes | Readings/links | Assignments/slides |
---|---|---|---|---|---|
1 | Aug. 29 | Introduction | Background, course information | | slides |
2 | Aug. 31 | hello robot, hello ROS | ROS basics, Create ROS interface | Creating & building ROS2 packages, ROS2 nodes, ROS2 topics, Create 3 topics | slides, HW #1 |
3 | Sep. 5 | Kinematics/Dynamics | Degrees of freedom, configuration space; wheeled systems (unicycle/car vs. differential drive) | MR Chap. 2, PA Chap. 13.1.2 | slides |
4 | Sep. 7 | ROS | ROS 2 workspaces and packages, subscribing and publishing, and timer callbacks | | slides, HW #1 due |
5 | Sep. 12 | HW #2 | | | HW #2 |
6 | Sep. 14 | Controllers | Waypoint following, line following, trajectory and wall following (pure pursuit) | RVC Chap. 4-4.1.2 | slides |
7 | Sep. 19 | HW #2 | |||
8 | Sep. 21 | Robot geometry | URDFs, ROS rviz2 and tf2, odometry | rviz2 user manual (Turtlebot3), Introduction to tf2, MR Chap. 13.4 | slides HW #2 due |
9 | Sep. 26 | Costmaps and discrete motion planning | Connecting to the lidar sensor; representing the environment as a map and basic planning | rplidar_ros; MR Chap. 10.4, Abbeel slides (skip SLAM, reflection maps) | slides, Quiz #1, HW #3, Isaac sim udbot Sep. 22 version |
10 | Sep. 28 | Estimation | Least-squares line-fitting and outlier rejection with RANSAC for lidar scans (see the fitting sketch after the schedule) | Scikit-learn ordinary least-squares and RANSAC | slides |
11 | Oct. 3 | HW #3 | |||
12 | Oct. 5 | Localization | Particle filters, MCL | Thrun particle filtering slides | slides |
13 | Oct. 10 | HW #3 | | | HW #3 due |
14 | Oct. 12 | Computer vision | Connecting to the RGB-D camera, line finding (see the Hough-lines sketch after the schedule) | realsense-ros Github, basic OpenCV color processing, Hough lines | slides, HW #4 |
15 | Oct. 17 | Computer vision | Tags/fiducials | AprilTag Github, apriltag_ros | slides |
16 | Oct. 19 | HW #4 | Coding time | ||
17 | Oct. 24 | Computer vision | Getting depth/3-D point cloud from the RealSense; object detection | realsense-ros Github, perception_pcl for further analysis, YOLOv5 (Python, lots of requirements), Training YOLOv5 on custom data | slides |
18 | Oct. 26 | HW #4 | Coding time | | Quiz #2, HW #4 due |
19 | Oct. 31 | HW #5 | | | HW #5 |
20 | Nov. 2 | SLAM | Problem statement, ROS slam_toolbox package | slam_toolbox Github, Thrun FastSLAM slides | slides |
21 | Nov. 7 | HW #5 | | | HW #5 due |
22 | Nov. 9 | Motion learning | Dynamical tasks, reinforcement learning (see the Gymnasium sketch after the schedule) | Gymnasium Github | slides, Final project |
23 | Nov. 14 | Project brainstorming | |||
 | Nov. 16 | NO CLASS (instructor away) | | | |
 | Nov. 21 | NO CLASS (Thanksgiving break) | | | |
 | Nov. 23 | NO CLASS (Thanksgiving break) | | | |
24 | Nov. 28 | Ethical & societal issues | | | slides |
25 | Nov. 30 | Final project | Coding time | ||
26 | Dec. 5 | Final project | Quiz then coding time | | Quiz #3 |
27 | Dec. 7 | Final project | Coding time | ||
 | Dec. 11/12 | Final project demos/presentations | | | Final project due |
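For the Sep. 28 estimation topic (row 10): a small sketch of ordinary least-squares vs. RANSAC line fitting with scikit-learn. The points below are synthetic stand-ins for lidar scan data, and the threshold value is only a guess, not a homework setting.

```python
# Least-squares vs. RANSAC line fitting on synthetic "lidar-like" points.
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 0.5 * x + 1.0 + rng.normal(scale=0.02, size=x.shape)   # points near a "wall"
y[::10] += rng.uniform(0.5, 1.0, size=y[::10].shape)       # a few outliers

X = x.reshape(-1, 1)

ols = LinearRegression().fit(X, y)                       # pulled toward the outliers
ransac = RANSACRegressor(residual_threshold=0.05).fit(X, y)
inliers = ransac.inlier_mask_                            # boolean mask of inlier points

print('OLS slope:    ', ols.coef_[0])
print('RANSAC slope: ', ransac.estimator_.coef_[0])
print('num inliers:  ', inliers.sum())
```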
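For the Oct. 12 computer-vision topic (row 14): a sketch of straight-line finding with OpenCV's Canny edge detector and the probabilistic Hough transform. The filename and parameter values are placeholders, not settings from the course camera.

```python
# Detect line segments in an image with Canny edges + probabilistic Hough.
import cv2
import numpy as np

img = cv2.imread('frame.png')                     # placeholder: e.g., a saved camera frame
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)

# Each returned entry is [[x1, y1, x2, y2]] in pixel coordinates
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=5)

if lines is not None:
    for (x1, y1, x2, y2) in lines[:, 0]:
        cv2.line(img, (x1, y1), (x2, y2), (0, 0, 255), 2)   # draw detected segments
cv2.imwrite('frame_lines.png', img)
```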
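For the Nov. 9 motion-learning topic (row 22): a sketch of the basic Gymnasium interaction loop with a random policy. CartPole-v1 is only a stand-in environment, not one of the course robots, and a real RL agent would replace the random action.

```python
# Basic Gymnasium environment loop with a random policy.
import gymnasium as gym

env = gym.make('CartPole-v1')
obs, info = env.reset(seed=0)

total_reward = 0.0
for _ in range(500):
    action = env.action_space.sample()            # random policy; an RL agent goes here
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    if terminated or truncated:
        obs, info = env.reset()

env.close()
print('total reward over 500 steps:', total_reward)
```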