SLAM and navigation implementation for kitbot
- Install necessary packages for SLAM with LIDAR
- Use kitware to drive the robot to a pose set by `rviz2`
- Clone this repository into the `src` directory of your colcon workspace for ROS
- Make sure you have cloned kitware and kitware_interface in `src` as well
- `cd` into the `kitware_slam` directory and run `setup.sh`
- Run `colcon build` from the workspace folder
The following steps should already be done on the hardware we give students.
- Make sure you have TAMProxy-Firmware running on a Teensy
- Install TAMProxy-pyHost
- Change the transformation for `base_link_to_base_laser_tf_node` in `launch/common.launch.py` to reflect the location of the LIDAR's base relative to the center of your robot (see the launch file sketch after this list). The argument format is `[x, y, z, roll, pitch, yaw, parent_frame, child_frame]` or `[x, y, z, qx, qy, qz, qw, parent_frame, child_frame]`
- Tune the PID and error tolerance constants in `kitware/differential_driver.py` for desired performance/accuracy (a hypothetical example follows the note below)
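For reference, a static transform in a ROS 2 Python launch file generally looks like the sketch below. The node declaration and the offset values here are placeholders rather than the actual contents of `launch/common.launch.py`; measure your own LIDAR offsets and plug them in.

```python
# Sketch only: a static transform publisher roughly as it might appear in
# launch/common.launch.py. The numbers are placeholders -- replace them with
# the measured position of your LIDAR relative to the center of the robot.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    base_link_to_base_laser_tf_node = Node(
        package='tf2_ros',
        executable='static_transform_publisher',
        name='base_link_to_base_laser_tf_node',
        # Argument format per this README: [x, y, z, roll, pitch, yaw,
        # parent_frame, child_frame]. All rotations are zero in this example,
        # so the angle ordering does not matter here.
        arguments=['0.10', '0.0', '0.15', '0', '0', '0',
                   'base_link', 'base_laser'],
    )
    return LaunchDescription([base_link_to_base_laser_tf_node])
```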
**Important:** Run `colcon build` from the workspace folder after modifying these files, otherwise the changes will not be applied.
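The PID tuning is just a matter of editing constants; the snippet below only illustrates the kind of values involved. The names are hypothetical and will likely differ from what `kitware/differential_driver.py` actually uses.

```python
# Hypothetical constants in the style of kitware/differential_driver.py; the
# real variable names in that file may differ. Larger gains give a snappier
# response, larger tolerances let the robot settle sooner but less precisely.
LINEAR_KP = 0.8           # forward speed per meter of distance error
ANGULAR_KP = 2.0          # turn rate per radian of heading error
LINEAR_TOLERANCE = 0.05   # meters: goal position counts as reached within this
ANGULAR_TOLERANCE = 0.05  # radians: goal heading counts as reached within this
```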
- Make sure the robot is running off the battery and not attached to anything
- `ssh -X` into the robot
- `cd` into your ROS workspace
- Run `source install/setup.bash` to set up your environment to use the built packages
- Launch:
  - If using LDLIDAR: run `ros2 launch kitware_slam ldlidar_slam.launch.py`
  - If using YDLIDAR: run `ros2 launch kitware_slam ydlidar_slam.launch.py`
- Open another terminal and `ssh -X` into the robot again
- Run `rviz2`
- Open the `rviz` configuration :TODO:
- Set a goal pose (a sketch of how the goal pose reaches the robot follows this list)
- Cross your fingers that the robot will go
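For context on the goal pose step: rviz2's "2D Goal Pose" tool publishes a `geometry_msgs/msg/PoseStamped` on the `/goal_pose` topic by default, which is presumably how the target reaches the driving code. The minimal sketch below is not the kitware implementation, just an illustration of that plumbing; the node and callback names are made up.

```python
# Minimal sketch (not the kitware code): receiving the goal pose that
# rviz2's "2D Goal Pose" tool publishes on /goal_pose.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class GoalPoseListener(Node):
    def __init__(self):
        super().__init__('goal_pose_listener')
        self.subscription = self.create_subscription(
            PoseStamped, '/goal_pose', self.on_goal_pose, 10)

    def on_goal_pose(self, msg: PoseStamped):
        # A driver would hand this pose to its controller; here we just log
        # where the robot was asked to go.
        p = msg.pose.position
        self.get_logger().info(f'New goal: x={p.x:.2f}, y={p.y:.2f}')


def main():
    rclpy.init()
    rclpy.spin(GoalPoseListener())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```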
- Make sure the kitbot works with the keyboard. If not, troubleshoot that first (check that the pin settings in `kitbot.py` are correct and the motor wiring is correct)