2023-07-13 15:31:01 -0500 | received badge | ● Famous Question (source) |
2020-01-24 01:13:31 -0500 | received badge | ● Famous Question (source) |
2019-05-20 02:36:12 -0500 | marked best answer | robot_localization IMU and visual odometry from rtabmap I realize this topic has come up a lot, but I've read every related question on here and nothing is quite working yet. I have a depth camera and an IMU; the IMU is rigidly attached to the camera. I'm using rtabmap to produce visual odometry and localize against a map I previously created, and I've written my own IMU node, called Tacyhon, for my InvenSense MPU-9250 board. Here are my frames and rosgraph... The acceleration and angular velocity seem to be working fine; however, the orientation is reported relative to camera_link, not relative to map. Thus a 90-degree rotation results in robot_localization thinking the IMU is facing 180 degrees from its starting location, since it is 90 degrees rotated from camera_link. How can I keep the acceleration and angular velocity from the IMU topic relative to camera_link, but make the orientation data relative to map, so that robot_localization can fuse the orientation from the IMU with the orientation from rtabmap? Here is my launch file. Any help would be appreciated! |
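The frame composition the question describes can be sketched as plain quaternion algebra: the IMU's reported orientation (relative to camera_link) has to be left-multiplied by the fixed map→camera_link rotation before it is fused. A minimal sketch, using illustrative 90-degree yaw rotations; the angles and frame names are assumptions, not values from the actual launch file:

```python
import math

def quat_mul(a, b):
    # Hamilton product, (w, x, y, z) convention
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def yaw_quat(yaw):
    # Pure rotation about z, (w, x, y, z) convention
    return (math.cos(yaw / 2), 0.0, 0.0, math.sin(yaw / 2))

# Fixed map -> camera_link rotation (90 deg yaw, hypothetical)
q_map_cam = yaw_quat(math.pi / 2)
# Raw IMU orientation, reported relative to camera_link (90 deg yaw)
q_cam_imu = yaw_quat(math.pi / 2)
# Re-express the IMU orientation relative to map before fusing it
q_map_imu = quat_mul(q_map_cam, q_cam_imu)  # 180 deg yaw relative to map
```

In a running system the same composition would be done in the IMU driver or a small republisher node (e.g. via tf2) so that the Imu message's orientation field obeys a world-fixed reference frame while the angular velocity and acceleration stay in the sensor frame.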
2018-07-10 02:19:06 -0500 | received badge | ● Notable Question (source) |
2018-05-11 08:00:13 -0500 | received badge | ● Popular Question (source) |
2018-04-09 04:29:14 -0500 | received badge | ● Famous Question (source) |
2018-02-28 20:32:47 -0500 | received badge | ● Notable Question (source) |
2017-11-14 12:13:09 -0500 | received badge | ● Famous Question (source) |
2017-11-02 00:37:12 -0500 | asked a question | image_transport latency reduction techniques So far, no matter what camera I use, image_transport seems to introduce ~40 |
2017-09-13 19:32:50 -0500 | received badge | ● Famous Question (source) |
2017-09-13 19:32:50 -0500 | received badge | ● Notable Question (source) |
2017-08-11 11:29:27 -0500 | received badge | ● Famous Question (source) |
2017-08-11 11:29:27 -0500 | received badge | ● Notable Question (source) |
2017-07-22 07:56:36 -0500 | received badge | ● Famous Question (source) |
2017-05-25 07:50:11 -0500 | received badge | ● Notable Question (source) |
2017-05-23 23:43:55 -0500 | commented question | IMU message definition Was #6 ever solved? What is the usual best practice for IMUs with magnetometers? My IMU orientation vector is relative t |
2017-05-17 20:00:46 -0500 | received badge | ● Popular Question (source) |
2017-05-14 20:56:51 -0500 | asked a question | robot_localization IMU and visual odometry from rtabmap I realize this topic has come up a lot, but I've read every rela |
2017-05-01 19:19:57 -0500 | received badge | ● Popular Question (source) |
2017-04-27 22:40:48 -0500 | edited question | Axis conversion between OSVR and ROS I'm using the vrpn_client_ros package to connect an OSVR headset to ROS. However, t |
2017-04-27 22:17:53 -0500 | received badge | ● Student (source) |
2017-04-27 22:16:54 -0500 | asked a question | Axis conversion between OSVR and ROS I'm using the vrpn_client_ros package to connect an OSVR headset to ROS. However, t |
2017-04-15 09:35:21 -0500 | received badge | ● Notable Question (source) |
2017-02-05 23:29:49 -0500 | commented answer | Persistent map frame in RTAB-Map localization mode Thanks for the update. Unfortunately, |
2017-02-01 21:27:46 -0500 | commented answer | Persistent map frame in RTAB-Map localization mode Just to confirm, where is the parameter |
2017-02-01 20:51:10 -0500 | commented answer | Persistent map frame in RTAB-Map localization mode |
2017-01-21 14:42:33 -0500 | commented question | Persistent map frame in RTAB-Map localization mode I saw this in the rtabmap multi-session mapping video. There must be a way to get the absolute coordinates relative to the map. |
2016-12-19 21:00:16 -0500 | received badge | ● Popular Question (source) |
2016-12-19 13:12:00 -0500 | received badge | ● Notable Question (source) |
2016-12-17 00:00:45 -0500 | received badge | ● Enthusiast |
2016-12-16 11:12:21 -0500 | received badge | ● Popular Question (source) |
2016-12-16 10:58:51 -0500 | answered a question | robot_localization high cpu and no output Problem solved here: https://github.com/cra-ros-pkg/robot_... |
2016-12-14 23:53:42 -0500 | asked a question | rtabmap odometry rate in rviz |
2016-12-14 23:45:12 -0500 | asked a question | robot_localization high cpu and no output I have an IMU (orientation only) and rtabmap connected to robot_localization. When I include the pose from the IMU and the odom from rtabmap, robot_localization works fine and uses 1-2% CPU. However, when I add the twist messages from the IMU, robot_localization produces only about 10 messages before ceasing to publish any output. When I watch it in htop, it uses 100-105% CPU. I assume it is blocking, as indicated by no messages being published and Ctrl-C failing to work. I examined the twist data going into robot_localization from the IMU and it looks fine; it arrives at about 375 Hz, like the pose messages. rtabmap's odom message arrives at about 10 Hz. Transforms have been applied, but I'm not sure they correctly relate the rtabmap reference frame to the IMU frame. Any idea why this would be happening? |
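A minimal sketch of the robot_localization parameters this setup implies; the topic names are assumptions, and each 15-element `*_config` vector toggles, in order, x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az:

```yaml
frequency: 30            # filter output rate, decoupled from the 375 Hz IMU input
two_d_mode: false

odom0: /rtabmap/odom     # assumed topic name
odom0_config: [true,  true,  true,
               true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false]

imu0: /imu/data          # assumed topic name
imu0_config: [false, false, false,
              true,  true,  true,    # orientation
              false, false, false,
              true,  true,  true,    # angular velocity (the twist data)
              false, false, false]
imu0_differential: false
```

If fusing the high-rate twist stalls the filter, a common first check is whether the IMU and odometry messages carry consistent timestamps and frame_ids, since a missing or wrong transform can make the filter spin on every incoming measurement.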
2016-11-15 17:10:47 -0500 | received badge | ● Popular Question (source) |
2016-11-14 18:32:47 -0500 | asked a question | Integrating an IMU with rtabmap My rtabmap is having a very hard time keeping track of orientation during both localization and mapping. I'm wondering if there is an easy way to feed in IMU data so that the direction of gravity can help orient the images and produce better results. That is, can I associate orientation data with every frame so that it improves rtabmap's SLAM capabilities? |
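The gravity-alignment idea in the question can be sketched without any rtabmap-specific API: a static accelerometer reading pins down roll and pitch (yaw is unobservable from gravity alone), which is exactly the constraint that would help orient each frame. A minimal sketch, assuming the IMU at rest measures +g along its z axis when level:

```python
import math

def gravity_roll_pitch(ax, ay, az):
    """Estimate roll and pitch (radians) from a static accelerometer reading.

    Assumes a z-up sensor convention (+g on z when level); yaw cannot be
    recovered from gravity alone.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Example: the sensor rolled 0.3 rad about its x axis, otherwise static
g = 9.81
roll, pitch = gravity_roll_pitch(0.0, g * math.sin(0.3), g * math.cos(0.3))
```

The formulas only hold when the sensor is not accelerating; in practice the gravity estimate is low-pass filtered or fused with the gyroscope before being attached to image frames.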
2016-11-14 18:24:31 -0500 | received badge | ● Supporter (source) |
2016-11-14 18:24:27 -0500 | received badge | ● Popular Question (source) |
2016-11-02 21:16:51 -0500 | commented answer | 6DOF Localization with stereo camera against RGBD dense point cloud This solution is only localization based on loop-closure images, right? That is, it is not localizing against any point cloud or 3D data, but only against the loop-closure images and their vectors? |
2016-11-02 01:23:47 -0500 | asked a question | 6DOF Localization with stereo camera against RGBD dense point cloud I have a point cloud of an environment generated by a Kinect One using rtabmap. I have a stereo camera, and I'm looking for a ROS package capable of performing full 6DOF localization with the stereo camera against this existing point cloud. Although I have found many solutions for stereo SLAM, I am having a hard time finding anything for localization-only using a stereo camera; what I have found seems to be depth-based only. I'm looking for something that can perform this localization against the full RGBD cloud and is compatible with Kinetic. Are there any packages that do this? I would like to produce a pose estimate primarily based on the stereo camera, augmented by IMUs, IR rangefinders, and other monocular cameras. Presumably, once I have a stereo localization, I could pass the other sensors through the robot_localization package to get a fused pose estimate. Any suggestions for a stereo localization package? |
2015-05-20 18:23:39 -0500 | answered a question | Problem detecting yaml-cpp in Hydro installation on MACOS Mavericks I'm encountering the same issue you are, but pkg-config is installed in /usr/local/bin for me. Any suggestions? I found this link, which gave some more insight, but it still hasn't solved the problem. |