
Simple straight-line navigation goal using ORB_SLAM2 and rtabmap_ros visual SLAM

asked 2018-05-30 01:48:37 -0600

Astronaut

updated 2018-05-30 23:58:08 -0600

Hi,

I'm new to the navigation stack. I would like to do some obstacle avoidance/collision detection using the ORB_SLAM2 and rtabmap_ros visual SLAM packages. The robot is a kind of crane: the base is fixed, but the hook moves in x-y coordinate space, and obstacle avoidance is in real 3D Cartesian space. To begin with, I just need a simple straight-line navigation goal. I already have ORB_SLAM2 working with my bag file. So what setup is needed in order to use ORB_SLAM2 / rtabmap_ros with the navigation stack? I'm using Ubuntu 16.04 and ROS Kinetic.

Thanks


1 Answer


answered 2018-05-30 08:57:20 -0600

matlabbe

updated 2018-06-01 10:42:56 -0600

As the robot base is fixed, you may already know the position of the hook at any time, so you don't need SLAM. You can use only the navigation stack, updating the obstacle_layer with the camera data for obstacle avoidance and trajectory planning in 2D. If you need to make trajectories in 3D, look at the moveit package.
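As an illustration of that obstacle_layer setup, a minimal costmap_2d observation-source entry fed by a depth camera could look like the sketch below (the topic name is an assumption; adapt it to your sensor):

```yaml
# costmap_common_params.yaml (sketch -- topic name is an assumption)
obstacle_layer:
  observation_sources: camera_cloud
  camera_cloud:
    data_type: PointCloud2
    topic: /camera/depth/points   # your depth camera's point cloud
    marking: true                 # mark obstacles from sensor returns
    clearing: true                # raytrace free space along the rays
    obstacle_range: 2.5           # max range (m) at which to mark obstacles
    raytrace_range: 3.0           # max range (m) at which to clear space
```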

If you need to construct a map, you could feed the pose of the hook to rtabmap_ros and disable its loop closure detection (parameter Kp/MaxFeatures=-1, since we assume the pose is drift-free). Example:

$ roslaunch rtabmap_ros rtabmap.launch visual_odometry:=false odom_frame_id:=base frame_id:=hook rtabmap_args:="--delete_db_on_start --Kp/MaxFeatures -1"

TF base -> hook would be published by your robot controller.
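As a rough sketch of what that controller could compute before broadcasting the transform, assuming a simple planar crane geometry (trolley travel along x, hoist cable of known length, sway angle from vertical -- all hypothetical names, not from any real crane API):

```python
import math

def hook_position(trolley_x, hoist_length, sway_angle):
    """Hook position in the crane base frame (hypothetical geometry).

    trolley_x    -- trolley travel along the boom, in meters
    hoist_length -- length of the hoist cable, in meters
    sway_angle   -- cable sway from vertical, in radians
    """
    x = trolley_x + hoist_length * math.sin(sway_angle)
    z = -hoist_length * math.cos(sway_angle)  # hook hangs below the trolley
    return (x, 0.0, z)

# With no sway, the hook is straight below the trolley:
print(hook_position(2.0, 5.0, 0.0))  # -> (2.0, 0.0, -5.0)
```

In a real setup this translation would be packed into a geometry_msgs/TransformStamped and broadcast with tf2_ros at the controller's update rate.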

EDIT

If the transformation between the hook and the base cannot be known, SLAM would be used to localize the hook (treating the hook as if it were a robot). With ORB_SLAM2, it is possible to get the pose and publish it on TF. Configure move_base to use this frame through TF, so that navigation knows where the robot is. For the map, ORB_SLAM2 doesn't provide a 2D occupancy grid out of the box, and I'm not sure how you could generate one. With rtabmap_ros, the pose is already published on TF and a 2D occupancy grid is available. See the navigation stack tutorials to learn how to set up move_base.

cheers,
Mathieu


Comments

The hook is moving: only the base is fixed, but the hook moves in 3D Cartesian (world) space, so I still need SLAM. I need to detect objects and I need real-time SLAM. So what would the solution be with ORB_SLAM2? For rtabmap_ros, I just enable its loop closure detection, right?

Astronaut ( 2018-05-31 00:00:06 -0600 )

Are you able to add some position sensors to know the position of the hook relative to the base? If it is possible, I suggest using that approach instead of SLAM to avoid drift problems. If not, I'm not sure about ORB_SLAM2, but rtabmap_ros can provide the map->odom->base_link TF transforms for move_base.

matlabbe ( 2018-05-31 13:47:13 -0600 )

Because I will have 3 sets of sensors (cameras and LIDAR): one in the operator cabin, a second on the hook trolley, and a third on the hook block. I need to see how all of them react. If I use only the one on the hook, then I need SLAM for sure. So how can I integrate ORB_SLAM (1 or 2) with a navigation goal?

Astronaut ( 2018-05-31 22:09:38 -0600 )

Can you show me a minimal launch file for rtabmap_ros so I can build the map and do localization in RViz? I would like to start from that. I made a bag file from an mp4 video from the camera on the hook block. Can you please show me a launch file so I can see the map and the localization of the hook?

Astronaut ( 2018-06-02 07:00:54 -0600 )

rtabmap_ros doesn't work with a single camera stream; at least a stereo or RGB-D camera is required. For a stereo example, see this tutorial. Subscribe to /rtabmap/grid_map to get the map. If you want to use only a single camera, stick with ORB_SLAM.
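Once /rtabmap/grid_map is available, cells can be looked up by converting world coordinates to row-major indices into the nav_msgs/OccupancyGrid data array using the map's origin and resolution. A small illustrative sketch (plain Python, standard OccupancyGrid conventions):

```python
def world_to_index(wx, wy, origin_x, origin_y, resolution, width, height):
    """Convert a world coordinate (m) to a row-major index into
    nav_msgs/OccupancyGrid.data, or None if outside the grid."""
    mx = int((wx - origin_x) / resolution)  # column
    my = int((wy - origin_y) / resolution)  # row
    if 0 <= mx < width and 0 <= my < height:
        return my * width + mx
    return None

# 25 cm cells, origin at (0, 0): the point (1.0, 1.0) falls in cell (4, 4)
print(world_to_index(1.0, 1.0, 0.0, 0.0, 0.25, 100, 100))  # -> 404
```

Each value in the data array is then -1 (unknown), 0 (free), or up to 100 (occupied).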

matlabbe ( 2018-06-02 08:47:31 -0600 )

OK, thanks. Can you help me with a launch example for mapping and localization in the case of ORB_SLAM?

Astronaut ( 2018-06-02 15:37:33 -0600 )

No, sorry; you may get more help asking on the ORB_SLAM GitHub. Beware that if you go with monocular SLAM, you will have a scale ambiguity that can make navigation less trivial. I suggest getting at least a stereo camera or some sort of visual-inertial odometry approach to avoid the scale problem.
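On the scale problem: a monocular trajectory is only defined up to an unknown scale factor, and one known real-world distance (e.g. a measured trolley travel between two poses) is enough to recover it. A minimal illustration, with made-up names:

```python
import math

def recover_scale(pose_i, pose_j, known_distance):
    """Scale factor that maps a monocular-SLAM trajectory to metric units,
    given one real-world distance measured between two of its poses."""
    slam_distance = math.dist(pose_i, pose_j)  # distance in SLAM units
    return known_distance / slam_distance

# SLAM says the two poses are 5 units apart; we measured 10 m in reality:
print(recover_scale((0, 0, 0), (0, 3, 4), 10.0))  # -> 2.0
```

Multiplying all translations by this factor yields a metric map; without it, goals sent to move_base would be in arbitrary units. Stereo or visual-inertial odometry avoids the issue entirely.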

matlabbe ( 2018-06-02 19:16:47 -0600 )

I need an online, real-time map of the surroundings of the hook block, i.e. a dense 3D occupancy map (with voxels, bounding boxes) around the hook block. If I can use SLAM, then I can close the loop and solve the two unknowns. Is there any dense 3D visual occupancy mapping approach using 2 cameras that I can start with?

Astronaut ( 2018-06-19 03:01:04 -0600 )
