2016-11-18 05:51:20 -0500 | received badge | ● Necromancer (source) |
2016-07-18 06:19:14 -0500 | commented answer | TF Tree Shared var I think if the transforms between M and A, B, and C are fixed, you have to set up your tf tree like this: Global -> M -> A. I prefer the fixed global frame at the top. But if you can't set up this configuration, manually compute and broadcast the transform. |
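The "manually compute the transform" step above can be sketched in plain Python. This is a minimal 2D illustration (x, y, yaw) of composing a Global -> M transform with an M -> A transform to obtain Global -> A, which you could then broadcast with tf; the numeric values are hypothetical, and a real tf setup works with full 3D quaternion transforms.

```python
import math

def compose(t1, t2):
    """Compose two 2D transforms (x, y, yaw): the result maps points
    from the child frame of t2 into the parent frame of t1."""
    x1, y1, a1 = t1
    x2, y2, a2 = t2
    return (x1 + x2 * math.cos(a1) - y2 * math.sin(a1),
            y1 + x2 * math.sin(a1) + y2 * math.cos(a1),
            a1 + a2)

# Hypothetical fixed transforms: Global -> M, then M -> A.
global_to_m = (1.0, 0.0, math.pi / 2)
m_to_a = (2.0, 0.0, 0.0)

# Composing them yields Global -> A, ready to broadcast.
global_to_a = compose(global_to_m, m_to_a)
```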
2016-07-18 06:09:32 -0500 | commented question | Set offset to local costmap and rotate with robot Sorry, I didn't understand what the exact problem is with setting the base frame at the center of the robot, especially the part about the Ackermann kinematics. Can you please explain more? |
2016-07-17 08:08:05 -0500 | commented question | Using move_base without map Where are the plugins in your |
2016-07-17 08:08:05 -0500 | received badge | ● Commentator |
2016-07-15 13:33:48 -0500 | answered a question | TF Tree Shared var Well, no. Check out the tf tutorials. |
2016-07-15 07:29:39 -0500 | received badge | ● Autobiographer |
2016-07-15 07:26:59 -0500 | commented question | velocity smoother I don't think you should be thinking about smoothing the velocity. The local planners already available provide acceleration-control parameters that you can use to limit changes in the output velocity. |
2016-07-15 02:20:55 -0500 | commented question | Set offset to local costmap and rotate with robot |
2016-07-15 01:38:50 -0500 | commented answer | robot_localization does not produce odom->base_link transform @mohakbhardwaj: I think you should start with a simpler configuration; currently you have set several advanced parameters, and finding the exact problem is not that easy. By the way, I haven't tested robot_localization with only a single IMU, but I don't think the result would be satisfying. |
2016-07-14 03:59:55 -0500 | answered a question | Source data types for navigation stack (PointCloud or PointCloud2?) Yes, you can set the data_type to PointCloud2. More information is available in the wiki. |
2016-07-14 00:40:19 -0500 | answered a question | robot_localization does not produce odom->base_link transform Thanks to the information you've provided, now I can help you solve your problem. The
while the correct tf tree must look like this:
Your base frame has two parents, which is clearly not possible. Each frame can have only one parent. In your case, you are running Just swap the parent and child frames in the input arguments and your problem should be solved. The result must look like this: |
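Swapping the parent and child arguments of a static transform publisher, as suggested above, is equivalent to broadcasting the inverse transform. A minimal 2D sketch (x, y, yaw) of that inversion, with hypothetical values:

```python
import math

def invert(t):
    """Invert a 2D transform (x, y, yaw). Broadcasting parent->child
    with this inverse is the same as swapping parent and child in the
    static_transform_publisher arguments."""
    x, y, a = t
    c, s = math.cos(a), math.sin(a)
    # Inverse rotation applied to the negated translation: -R^T * t.
    return (-(c * x + s * y), -(-s * x + c * y), -a)

# Hypothetical base_link -> imu transform; inverting gives imu -> base_link.
base_to_imu = (2.0, 3.0, math.pi / 2)
imu_to_base = invert(base_to_imu)
```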
2016-07-14 00:16:29 -0500 | answered a question | Local path planning with Kinect OK, what you are asking is a little complicated. Without SLAM or odometry it's simply not possible at all. You need a world-fixed frame to know where you are right now, without broadcasting the transform between Enough about the transformation, but please read REP 105 for more information. Since you don't have a laser scanner and no odometry is available, you can use packages that compute visual odometry from your Kinect data and provide Then comes the no-global-plan part. Just create a local costmap param file and pass it to the node you want to initialize your costmap with. To achieve your desired behavior, the closest available source code is You are free to write your own code from scratch: just subscribe to your goal topic and do the rest. I'm not sure why you don't want to use the global planner, but if you really insist on what you are doing, now you know what to do. Good luck. |
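The "subscribe to your goal topic and do the rest" idea above can be sketched as a toy reactive controller. This is a hypothetical illustration, not any planner's actual code: given the robot pose and the goal in the same fixed frame, it returns clamped (linear, angular) velocity commands; obstacle avoidance from the local costmap is deliberately omitted, and all gains are made up.

```python
import math

def go_to_goal(pose, goal, k_lin=0.5, k_ang=1.0, max_v=0.3, max_w=1.0):
    """Toy go-to-goal controller. pose is (x, y, yaw) and goal is
    (x, y), both in the same world-fixed frame (see REP 105).
    Returns (v, w): linear and angular velocity commands."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    dist = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - pose[2]
    # Normalize the heading error to [-pi, pi].
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    v = min(k_lin * dist, max_v)                       # slow down near the goal
    w = max(-max_w, min(k_ang * heading_error, max_w))  # clamp the turn rate
    return v, w

# Goal straight ahead: drive forward at the capped speed, no turning.
cmd = go_to_goal((0.0, 0.0, 0.0), (1.0, 0.0))
```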
2016-07-13 15:04:31 -0500 | commented question | robot_localization does not produce odom->base_link transform It's a little spooky... your robot_localization parameters seem fine, but it's strange that the tf tree shows that |
2016-07-13 09:34:12 -0500 | commented question | How to display laser scan data from Kinect I don't fully understand your problem. The laser scan data are in the polar coordinate system; if you are not familiar with that, you can look it up on the internet. The data stored in the ranges array are distances in meters. Can you add a screenshot of your RViz window displaying the laser data? |
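The polar-to-Cartesian relationship mentioned above can be made concrete with a short sketch. It mirrors how a LaserScan message is laid out (ranges in meters, beam angles derived from angle_min and angle_increment); the range_max cutoff and sample values are illustrative.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max=10.0):
    """Convert laser scan ranges (polar: a distance in meters per beam)
    into Cartesian (x, y) points in the scan frame. Invalid readings
    (inf/NaN or beyond range_max) are skipped."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r) or r > range_max:
            continue
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three beams at -90, 0, and +90 degrees, each seeing an obstacle 1 m away.
pts = scan_to_points([1.0, 1.0, 1.0], -math.pi / 2, math.pi / 2)
```

This is essentially what RViz does when it draws a LaserScan topic as points around the sensor.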
2016-07-13 08:53:47 -0500 | commented question | How to display laser scan data from Kinect You can visualize your laser scan data using RViz, but just by looking at the data you can't tell whether it contains the information you want. |
2016-07-13 08:51:43 -0500 | commented answer | Setting default tolerance on Global Planner with Move Base Anytime. Just let me know if you need more help. |
2016-07-13 02:25:48 -0500 | commented question | robot_localization does not produce odom->base_link transform Can you post a tf tree of the frames you've got? There are several things that I suspect might be wrong, like running a static_transform_publisher that broadcasts imu->base, which is obviously wrong, or even base_link_frame being set to "base"... but first post that tf tree! |
2016-07-13 02:11:44 -0500 | answered a question | Setting default tolerance on Global Planner with Move Base So I looked up the global planner source code on GitHub and found the problem. I checked the default_tolerance parameter in the code. As you can see, there is a default_tolerance_ member that is read from the parameter server. Afterwards I looked up the usages of this parameter. There's only a single line that uses it: I dug further in ... Unfortunately, the tolerance parameter in makePlan has no usages at all, which means changing this parameter won't change the behavior of the global planner. I didn't fully check your parameters, but they seem fine. I think with a small modification to the global planner source code you'll get the result you want. |
2016-07-13 01:02:23 -0500 | commented question | How to display laser scan data from Kinect OK, I'm not sure why you are facing this problem... anyway, haven't you tried this line? |
2016-06-25 03:21:51 -0500 | received badge | ● Enthusiast |
2016-01-12 00:44:15 -0500 | received badge | ● Good Answer (source) |
2015-05-11 02:41:15 -0500 | received badge | ● Nice Answer (source) |
2015-01-29 01:25:51 -0500 | received badge | ● Teacher (source) |
2015-01-29 01:25:51 -0500 | received badge | ● Necromancer (source) |
2014-02-28 23:27:43 -0500 | received badge | ● Supporter (source) |
2014-02-24 20:48:08 -0500 | answered a question | Problem with ROS Networking. You also have two other options. I wanted to roslaunch a GUI, so after hours of searching I found a way! If you want to run the node and the GUI on the remote machine, establish the ssh connection and then run this command: or (if the first one didn't work): The turtlesim node will then be launched on the remote machine. But if you want to get the GUI on your main computer, just use: The good thing is that by editing my env.sh file I was able to roslaunch my nodes; I just added the first command at the end of my env.sh file. |