feat(#157): Started lidar mapping #160
base: main
Conversation
Simulation results
- Merge '…ents' into 157-feature-map-lidar-points-to-rgb-image (# Conflicts: # code/perception/launch/perception.launch)
- Additional time for progress: ~12h
- Additional effort to implement Buffer node and debugging: 4 hours
The most important part of this PR is the fork of lidar-camera (https://github.com/timdreier/lidar-camera). Since we did not have enough time, we could not fix all errors in this node. What works: we get a PointCloud filtered by the viewport of the corresponding camera. The points should also carry the color they have in the camera image, which does not work yet. Furthermore, the node creates a depth map. Unfortunately, the depth map is missing some points; this could be improved by tuning the LIDAR parameters so that more LIDAR points fall into the camera's viewport. The current output looks like this:

This PR also contains a buffer node, which simply overrides depth values in the depth map if they are greater than a threshold (5000 in this example). The hope was to get a more consistent image this way. However, this could cause problems because the depth map may then contain old values. It would be interesting to test this in future work. Illustrative sketches of the projection and buffering steps follow below.
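For reference, here is a minimal sketch of the projection step described above (not the actual code of the forked lidar-camera node): LIDAR points are transformed into the camera frame, projected with the camera intrinsics, clipped to the viewport, and rasterized into a depth map. The function name and parameters (`T_cam_lidar`, `K`, image size) are assumptions for illustration.

```python
# Sketch only: project LIDAR points into the camera viewport and build a depth map.
import numpy as np

def project_to_depth_map(points_lidar, T_cam_lidar, K, width, height):
    """points_lidar: (N, 3) LIDAR points; T_cam_lidar: 4x4 extrinsics; K: 3x3 intrinsics."""
    # Transform points into the camera frame (homogeneous coordinates).
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera (positive depth).
    pts_cam = pts_cam[pts_cam[:, 2] > 0.0]

    # Perspective projection onto the image plane.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Keep points that fall inside the camera viewport.
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    valid = (u >= 0) & (u < width) & (v >= 0) & (v < height)

    # Rasterize: keep the nearest depth per pixel; pixels without a hit stay 0.
    depth_map = np.zeros((height, width), dtype=np.float32)
    for ui, vi, d in zip(u[valid], v[valid], pts_cam[valid, 2]):
        if depth_map[vi, ui] == 0.0 or d < depth_map[vi, ui]:
            depth_map[vi, ui] = d
    return depth_map
```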
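And a minimal sketch of the buffering idea (a hypothetical helper, not the actual buffer node): pixels whose depth exceeds the threshold (5000 in the description above) fall back to the value remembered from earlier frames, which is exactly where stale values can sneak in.

```python
# Sketch only: keep a per-pixel buffer and reuse it where the new depth exceeds a threshold.
import numpy as np

class DepthBuffer:
    def __init__(self, threshold=5000.0):
        self.threshold = threshold
        self.buffer = None  # last known "good" depth per pixel

    def update(self, depth_map):
        if self.buffer is None:
            self.buffer = depth_map.copy()
            return depth_map

        filled = depth_map.copy()
        # Where the new value is above the threshold, fall back to the buffered value.
        stale = depth_map > self.threshold
        filled[stale] = self.buffer[stale]

        # Remember the current good values for future frames (old values may persist).
        good = depth_map <= self.threshold
        self.buffer[good] = depth_map[good]
        return filled
```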


Description
Map the LIDAR points onto the front RGB image in order to later obtain the depth (distance) of segments.
Fixes #157
Time invested
Tim Dreier: 8h (so far)
Type of change
Does this PR introduce a breaking change?
No.
Most important changes
It is now possible to get the depth of any segment in an image.
Checklist: