
Arwen's profile - activity

2016-11-18 05:51:20 -0500 received badge  Necromancer (source)
2016-07-18 06:19:14 -0500 commented answer TF Tree Shared var

I think if the transforms between M and A, B, and C are fixed, you have to set up your tf tree like this: Global -> M -> A (and likewise M -> B and M -> C). I prefer having the fixed global frame at the top. But if you can't set up this configuration, manually compute and broadcast the transform.

2016-07-18 06:09:32 -0500 commented question Set offset to local costmap and rotate with robot

Sorry, I didn't understand what the exact problem is with setting the base frame at the center of the robot, especially the part about the Ackermann kinematics. Can you please explain more?

2016-07-17 08:08:05 -0500 commented question Using move_base without map

Where are the plugins in your global_costmap_params.yaml?

2016-07-17 08:08:05 -0500 received badge  Commentator
2016-07-15 13:33:48 -0500 answered a question TF Tree Shared var

Well, no.

Check out the tf tutorials.

tf builds up a tree structure of frames; it does not allow closed loops in the frame structure. This means that each frame has exactly one parent, but it can have multiple children.

2016-07-15 07:29:39 -0500 received badge  Autobiographer
2016-07-15 07:26:59 -0500 commented question velocity smoother

I don't think you should be thinking about smoothing the velocity yourself. The available local planners already provide acceleration limit parameters that you can use to limit changes in the output velocity.
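
For example, with base_local_planner's TrajectoryPlannerROS these limits are plain ROS parameters in your planner config; a minimal sketch (the values shown are just the documented defaults):

    TrajectoryPlannerROS:
      acc_lim_x: 2.5      # max forward acceleration (m/s^2)
      acc_lim_theta: 3.2  # max rotational acceleration (rad/s^2)
      max_vel_x: 0.5      # upper bound on forward velocity (m/s)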

2016-07-15 02:20:55 -0500 commented question Set offset to local costmap and rotate with robot

costmap_2d has a parameter called robot_base_frame; I think there might be something wrong with this parameter. If the frame you set there is not at the center of the footprint, the costmap won't be centered on the footprint either.
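
Something like this in your local costmap config should do, assuming base_link sits at the center of your footprint (a sketch, not your actual file):

    local_costmap:
      robot_base_frame: base_link  # frame at the center of the footprint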

2016-07-15 01:38:50 -0500 commented answer robot_localization does not produce odom->base_link transform

@mohakbhardwaj: I think you should start with a simpler configuration; you currently have several advanced parameters set, which makes finding the exact problem difficult. By the way, I haven't tested robot_localization with only a single IMU, but I don't think the result would be satisfying.

2016-07-14 03:59:55 -0500 answered a question Source data types for navigation stack (PointCloud or PointCloud2?)

Yes, you can set the data_type to PointCloud2.

More information is available in the wiki.
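
For example, an observation source in your costmap config could look like this (the topic and frame names are just placeholders):

    observation_sources: point_cloud_sensor
    point_cloud_sensor: {sensor_frame: camera_link, data_type: PointCloud2, topic: /camera/depth/points, marking: true, clearing: true}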

2016-07-14 00:40:19 -0500 answered a question robot_localization does not produce odom->base_link transform

Thanks to the information you've provided, I can now help you solve your problem.

The ekf_localization_node that you run provides the transform between odom and base, and the static_transform_publisher instance also broadcasts a transform between imu and base. So this is the result:

odom -> base, imu -> base

while the correct tf tree should look like this:

odom -> base -> imu

Your base frame has two parents, which is clearly not possible: each frame can have only one parent.

In your case, you are running static_transform_publisher with the wrong input arguments:

    <node pkg="tf" type="static_transform_publisher" name="imu_to_robot_base" args="0 0 0 0 0 0 $(arg imu_frame_id) base 100" />

Just swap the parent and child frames in the input arguments and your problem should be solved. The result should look like this:

    <node pkg="tf" type="static_transform_publisher" name="imu_to_robot_base" args="0 0 0 0 0 0 base $(arg imu_frame_id) 100" />
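
To verify the fix, you can regenerate the tf tree and check that imu now hangs off base:

    rosrun tf view_frames
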
2016-07-14 00:16:29 -0500 answered a question Local path planning with Kinect

OK, what you are asking is a little complicated. Without SLAM or odometry it's simply not possible.

You need a world-fixed frame to understand where you are right now; without broadcasting the transform between the map frame (or odom frame) and your base_link frame, you can't tell how far you are from your goal. Let me explain with an example. You publish a goal point 1 meter away from your robot along the x axis and pass it to a planner, which computes a velocity command, and the robot starts moving. But now you have to know how far you have traveled and how far you are from your goal. If the goal point is expressed in the base_link frame, it is always 1 meter away from your position and you will never reach it. So the goal must be in the odom or map frame, and the transform between that fixed frame and your base_link frame must be available.

Enough about transforms; please read REP 105 for more information.

Since you don't have a laser scanner and no odometry is available, you can use packages that compute visual odometry from your Kinect data and provide the odom -> base_link transform.

Then comes the "no global plan" part. Just create a local costmap parameter file and pass it to the node you want to initialize your costmap with.

To achieve your desired behavior, the closest available source code is move_base, but you need to modify it. The goal it receives must be passed directly to the local planner using the setPlan() function. In fact, you have to remove all usages of the global costmap and global planner in the code. The velocity command is then computed in a loop until you reach the goal.

You are also free to write your own code from scratch: just subscribe to your goal topic and do the rest, as in the sketch below. I'm not sure why you don't want to use a global planner, but if you really insist on what you are doing, now you know what to do.
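
Here is a rough sketch of that loop using the nav_core plugin interface (untested; the planner choice, node name, and topic names are just examples):

    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>
    #include <geometry_msgs/PoseStamped.h>
    #include <pluginlib/class_loader.h>
    #include <nav_core/base_local_planner.h>
    #include <costmap_2d/costmap_2d_ros.h>
    #include <tf/transform_listener.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "local_only_planner");
      ros::NodeHandle nh;
      ros::Publisher vel_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 1);

      tf::TransformListener tf(ros::Duration(10));
      // Local costmap only; this reads your local costmap param file.
      costmap_2d::Costmap2DROS local_costmap("local_costmap", tf);

      // Load any nav_core local planner as a plugin.
      pluginlib::ClassLoader<nav_core::BaseLocalPlanner> loader(
          "nav_core", "nav_core::BaseLocalPlanner");
      boost::shared_ptr<nav_core::BaseLocalPlanner> planner =
          loader.createInstance("base_local_planner/TrajectoryPlannerROS");
      planner->initialize("local_planner", &tf, &local_costmap);

      // Treat the received goal as a one-pose "plan" and feed it directly
      // to the local planner, skipping the global planner entirely.
      // Remember: the goal must be expressed in the odom (fixed) frame.
      geometry_msgs::PoseStamped goal;  // fill this in from your goal topic
      std::vector<geometry_msgs::PoseStamped> plan(1, goal);
      planner->setPlan(plan);

      ros::Rate rate(10.0);
      geometry_msgs::Twist cmd;
      while (ros::ok() && !planner->isGoalReached()) {
        if (planner->computeVelocityCommands(cmd))
          vel_pub.publish(cmd);
        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }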

Good luck.

2016-07-13 15:04:31 -0500 commented question robot_localization does not produce odom->base_link transform

It's a little spooky... your robot_localization parameters seem fine, but it's strange that the tf tree shows imu -> base being broadcast by ekf_se. You may want to post your robot_localization launch file and the exact line you're using to run static_transform_publisher.

2016-07-13 09:34:12 -0500 commented question How to display laser scan data from Kinect

I don't fully understand your problem. Laser scan data are in polar coordinates; if you are not familiar with that, you can look it up on the internet. The values stored in the ranges array are distances in meters. Can you add a screenshot of your RViz window displaying the laser data?
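
For example, converting the i-th reading to Cartesian coordinates looks like this (a sketch, assuming scan is a sensor_msgs::LaserScan message):

    // Angle of the i-th beam, then polar -> Cartesian in the scan frame.
    float angle = scan->angle_min + i * scan->angle_increment;
    float x = scan->ranges[i] * std::cos(angle);
    float y = scan->ranges[i] * std::sin(angle);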

2016-07-13 08:53:47 -0500 commented question How to display laser scan data from Kinect

You can visualize your laser scan data using RViz, but just by looking at the raw numbers you can't tell whether they contain the information you want.

2016-07-13 08:51:43 -0500 commented answer Setting default tolerance on Global Planner with Move Base

Anytime. Just let me know if you need more help.

2016-07-13 02:25:48 -0500 commented question robot_localization does not produce odom->base_link transform

Can you post a tf tree of the frames you've got? There are several things I suspect might be wrong, like running a static_transform_publisher that broadcasts imu -> base (which is obviously wrong) or even the base_link_frame being set to "base"... but first post that tf tree!

2016-07-13 02:11:44 -0500 answered a question Setting default tolerance on Global Planner with Move Base

So I looked up the global_planner source code on GitHub and found the problem.

I checked the default_tolerance parameter in the code. As you can see, there is a default_tolerance_ member which is read from the parameter server:

    private_nh.param("default_tolerance", default_tolerance_, 0.0);

Afterwards I looked up the usages of this member. There's only a single line that uses it:

    return makePlan(start, goal, default_tolerance_, plan);

I dug further in ...

    bool GlobalPlanner::makePlan(const geometry_msgs::PoseStamped& start, const geometry_msgs::PoseStamped& goal, double tolerance, std::vector<geometry_msgs::PoseStamped>& plan)

Unfortunately, the tolerance parameter in makePlan isn't used anywhere at all, which means changing this parameter won't change the behavior of the global planner.

I didn't fully check your parameters, but they seem fine. I think with a little modification to the global planner source code you'll get the result you want.

2016-07-13 01:02:23 -0500 commented question How to display laser scan data from Kinect

OK, I'm not sure why you are facing this problem... anyway, haven't you tried this line?

    ROS_INFO_STREAM(scan->ranges[i]);
2016-06-25 03:21:51 -0500 received badge  Enthusiast
2016-01-12 00:44:15 -0500 received badge  Good Answer (source)
2015-05-11 02:41:15 -0500 received badge  Nice Answer (source)
2015-01-29 01:25:51 -0500 received badge  Teacher (source)
2015-01-29 01:25:51 -0500 received badge  Necromancer (source)
2014-02-28 23:27:43 -0500 received badge  Supporter (source)
2014-02-24 20:48:08 -0500 answered a question Problem with ROS Networking.

You also have two other options.

I wanted to roslaunch a GUI, and after hours of searching I found a way!

If you want to run the node and its GUI on the remote machine, run this command after establishing the ssh connection:

    export DISPLAY=:0

or, if the first one didn't work:

    export DISPLAY=:0.0

The turtlesim node will then be launched on the remote machine.

But if you want to get the GUI on your main computer, just use:

    ssh -X

The good thing is that by editing my env.sh file I was able to roslaunch my nodes: I just added the first command at the end of my env.sh file, as in the sketch below.
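
For reference, such an env.sh (the env-loader script that roslaunch runs on the remote machine) would look roughly like this; the distro path is from my own setup, adjust for yours:

    #!/bin/sh
    # Source the ROS environment, then export the display before
    # handing control to whatever command roslaunch wants to run.
    . /opt/ros/hydro/setup.sh
    export DISPLAY=:0
    exec "$@"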