2023-05-31 01:22:52 -0500 | received badge | ● Nice Question (source) |
2021-02-15 07:42:46 -0500 | received badge | ● Famous Question (source) |
2018-09-10 06:16:00 -0500 | received badge | ● Famous Question (source) |
2017-11-07 02:17:58 -0500 | received badge | ● Taxonomist |
2017-04-19 13:04:57 -0500 | received badge | ● Notable Question (source) |
2017-02-06 07:34:38 -0500 | received badge | ● Popular Question (source) |
2016-11-24 12:58:49 -0500 | received badge | ● Notable Question (source) |
2016-11-24 12:58:49 -0500 | received badge | ● Popular Question (source) |
2016-06-03 13:20:28 -0500 | asked a question | camera calibration with small checkerboard In the camera calibration tutorial they use a very large board with a checkerboard pattern. Is a human-sized checkerboard really necessary? It seems to me that a smaller checkerboard should be equivalent as long as the camera focus is correct. Can I use a checkerboard printed on a letter-sized sheet of paper with good results? |
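For reference, a small printed board can work as long as the calibrator is told the true printed square size. A sketch of the standard monocular calibration invocation (topic names and board dimensions here are assumptions — adjust `--size` to your board's interior-corner count and `--square` to the measured edge length):

```shell
# Sketch: calibrating with a letter-sized printout.
# --size is interior corners (cols x rows), --square is the square edge in metres
# (roughly 0.025 m fits an 8x6 pattern on letter paper).
rosrun camera_calibration cameracalibrator.py \
  --size 8x6 --square 0.025 \
  image:=/camera/image_raw camera:=/camera
```

The trade-off is working distance: a small board must be held closer to the camera to cover enough of the image, so it calibrates well only at that depth.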
2016-06-01 09:24:28 -0500 | received badge | ● Famous Question (source) |
2016-05-30 05:52:53 -0500 | received badge | ● Famous Question (source) |
2016-05-18 02:57:48 -0500 | received badge | ● Student (source) |
2016-05-17 10:59:55 -0500 | asked a question | local costmap without sensors? I'm having some problems tuning the local planner on my robot, so I'm trying to simulate in rviz without sensor data. How do I populate the local costmap with obstacles from the global costmap? My params look like this right now: costmap_common.yaml: global_costmap.yaml: local_costmap.yaml: but the local costmap is empty, so the robot tries to drive through obstacles. |
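One way to test a local planner without live sensors is to seed the local costmap from the static map instead of from observation sources. A minimal sketch of a `local_costmap.yaml`, assuming the `costmap_2d` parameter names (frame names are placeholders for this robot):

```yaml
# local_costmap.yaml -- sketch: fill the local costmap from the static map
local_costmap:
  global_frame: map           # must be the map frame when using the static map
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: true            # take obstacles from the map server
  rolling_window: false       # rolling_window must be off when static_map is true
```

With `static_map: true` the costmap no longer follows the robot, so this is only suitable for a bounded test map.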
2016-05-17 10:51:45 -0500 | commented question | RTAB-map with robot_localization I think it was just due to dropped frames. The Kinect v2 apparently has problems with certain USB hardware, and it starts to drop frames after about 30 seconds on my laptop. |
2016-05-09 19:32:12 -0500 | received badge | ● Popular Question (source) |
2016-05-09 19:32:12 -0500 | received badge | ● Notable Question (source) |
2016-05-08 10:27:37 -0500 | received badge | ● Notable Question (source) |
2016-04-05 03:12:36 -0500 | asked a question | RTAB-map with robot_localization I'm building a simple differential drive robot with the kinect+fake laser setup from the tutorials, and now I'm trying to do some navigation with move_base. My transforms look like this, except instead of an IMU I'm combining wheel encoder odometry and visual odometry from RTAB-map. When visual odometry is used alone, it's mostly fine until the robot gets close to a wall or something. I'm trying to use robot_localization to combine the two odometries and feed the result into RTAB-map as /odom_filtered. With robot_localization, after a while the map looks like this: the 3D and 2D maps become chaotic and shift around each frame. My view_frames looks like this: any idea what I'm doing wrong? The visual odometry is only published at around 1 Hz; could that be a problem? |
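A typical way to fuse wheel and visual odometry with robot_localization is an EKF with two odometry inputs. A sketch of the relevant parameters, assuming the standard `ekf_localization_node` parameter names (the topic names here are placeholders for this setup, not something given in the question):

```yaml
# ekf params sketch -- two odometry sources into one filtered odometry
frequency: 30
two_d_mode: true              # planar robot: ignore z, roll, pitch
odom_frame: odom
base_link_frame: base_footprint
world_frame: odom

odom0: /wheel_odom            # wheel encoder odometry (assumed topic name)
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
odom0_differential: false

odom1: /rtabmap/odom          # visual odometry (assumed topic name)
odom1_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]
odom1_differential: true      # treat low-rate, drift-prone VO as relative motion
```

Fusing velocities from the wheels and differential pose from the VO is one common pattern for exactly this kind of setup; a 1 Hz VO source is usable in the filter, but its covariance and the `differential` setting matter a lot.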
2016-04-01 20:52:48 -0500 | commented answer | RTAB-map 2d occupancy grid ah, did not know about rqt_graph. I had put the kinect bridge and rtabmap stuff in one launch file, which apparently produced some kind of conflict. I've got grid_map working now. thanks for the help |
2016-03-25 09:56:36 -0500 | received badge | ● Popular Question (source) |
2016-03-24 02:37:15 -0500 | received badge | ● Famous Question (source) |
2016-03-24 02:37:15 -0500 | received badge | ● Famous Question (source) |
2016-03-24 01:09:18 -0500 | commented answer | RTAB-map 2d occupancy grid I'm running this launch file for the kinect2_bridge: https://github.com/introlab/rtabmap_r... but I don't see any configs that should conflict with generating a 2d map |
2016-03-24 01:01:03 -0500 | commented answer | RTAB-map 2d occupancy grid I have RGB-D images and /scan working, but my expectation was that RGB-D + /scan = grid_map. rtabmap node creates the topic /rtabmap/grid_map but nothing is published to it. I guess it's a bit confusing because there's no tutorial or demo showing how grid_map works, or even how to enable it. |
2016-03-23 12:41:32 -0500 | commented answer | RTAB-map 2d occupancy grid how do I know which topics are necessary for grid_map? I thought only /scan is required for the grid map |
2016-03-23 12:39:23 -0500 | received badge | ● Supporter (source) |
2016-03-22 22:24:15 -0500 | asked a question | RTAB-map 2d occupancy grid I'm trying to get /rtabmap/grid_map working. I'm using the kinect + fake 2d laserscan method in the tutorial, and there is data being published to /scan When I rostopic echo /rtabmap/grid_map, nothing is displayed. I expected that if the subscribe_laserScan argument is on, rtabmap will output a 2d occupancy grid map under /rtabmap/grid_map. Is that incorrect? What other settings could affect grid_map? |
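For the grid map to be produced, the rtabmap node has to actually subscribe to the scan alongside the RGB-D topics. A launch-file sketch, assuming the argument and remap names used by rtabmap_ros at the time (camera topic names are placeholders for a Kinect setup):

```xml
<!-- sketch: enable laser-scan subscription so /rtabmap/grid_map is published -->
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
  <param name="subscribe_depth"     value="true"/>
  <param name="subscribe_laserScan" value="true"/>
  <remap from="scan"            to="/scan"/>
  <remap from="rgb/image"       to="/camera/rgb/image_rect_color"/>
  <remap from="depth/image"     to="/camera/depth_registered/image_raw"/>
  <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
</node>
```

If any one of the subscribed topics never arrives, the synchronizer blocks and nothing (including grid_map) is published, which is why `rqt_graph` is useful for checking that all inputs are actually connected.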
2016-03-22 22:13:06 -0500 | marked best answer | RTAB-Map with odometry and laserscan Just getting started with ROS. I've got RTAB-Map working with a Kinect v2, but I'm seeing loss of tracking when there aren't enough feature points in the camera view. I was wondering if adding odometry input would alleviate this problem on the actual robot? Would a laser scanner improve accuracy as well? I'm contemplating buying a laser scanner as per: http://wiki.ros.org/rtabmap_ros/Tutor... but I'm not certain how the laser scan interacts with RTAB-Map. It says that the laser scan is used to construct a 2d occupancy grid, but that can already be done without RTAB-Map. Is the 2d laser scan data used to improve performance of the 3d SLAM at all? |
2016-03-22 22:13:06 -0500 | received badge | ● Scholar (source) |
2016-03-18 20:18:34 -0500 | received badge | ● Notable Question (source) |
2016-02-11 13:49:14 -0500 | received badge | ● Popular Question (source) |
2016-02-06 08:04:54 -0500 | received badge | ● Famous Question (source) |
2016-02-05 20:24:13 -0500 | received badge | ● Enthusiast |
2016-02-04 15:56:38 -0500 | commented question | XV11 LIDAR on Jade ok, I'll try that. The diagram on the tutorial had both at 3.0 so I thought that was the best way. |
2016-02-04 13:01:36 -0500 | commented question | XV11 LIDAR on Jade I've also tried powering the lidar and motor from 2.8 to 3.3 volts on a bench supply. At the extremes of that range I stop getting data; the debug output shown is from 3.0 V. I've checked to make sure the current isn't being limited; it draws about 100 mA. |
2016-02-04 12:46:03 -0500 | commented question | XV11 LIDAR on Jade I don't have the hardware connected now but /rpms is also empty I think. I'll check it when I hook it up again. |
2016-02-04 12:17:37 -0500 | commented question | XV11 LIDAR on Jade yeah, the /scan topic only appears after running the driver. I've tried both versions of the firmware too. |
2016-02-03 16:54:17 -0500 | received badge | ● Notable Question (source) |
2016-02-03 08:39:14 -0500 | received badge | ● Popular Question (source) |
2016-02-02 20:58:29 -0500 | received badge | ● Editor (source) |
2016-02-02 20:54:37 -0500 | asked a question | XV11 LIDAR on Jade I'm following these tutorials on using the XV11 lidar with ROS: http://wiki.ros.org/xv_11_laser_drive... http://wiki.ros.org/xv_11_laser_drive... Looking at the raw input on /dev/ttyUSB0 it looks like I'm getting data, but the driver's output topics are empty, and it's the same for both versions of the firmware. I'm thinking the driver might have problems with ROS Jade? I'm not sure what else could be going wrong. Has anyone else got this working with ROS Jade? Are there any other drivers I could try? |
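For completeness, a sketch of launching and checking the driver as described on the xv_11_laser_driver wiki (the firmware version value depends on your unit; `/dev/ttyUSB0` is assumed from the question):

```shell
# Sketch: run the XV11 driver, then verify that data is flowing.
rosrun xv_11_laser_driver neato_laser_publisher \
  _port:=/dev/ttyUSB0 _firmware_version:=2

# In another terminal:
rostopic hz /scan    # should report ~5 Hz if the lidar is spinning at rated RPM
rostopic echo /rpms  # motor speed; if this is empty too, the serial parse is failing
```

If `/rpms` is also silent while raw bytes are visible on the port, the driver is likely failing to sync on the packet format — which points at the firmware-version setting or the serial framing rather than the ROS distribution.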