
RND's profile - activity

2022-05-26 02:55:33 -0500 marked best answer frontier exploration theory and usage

Hi,

I'm trying to understand how to use the frontier_exploration package for my project. I have studied the paper describing this approach by Yamauchi (1997). I would like to know whether this implementation of frontier exploration is based on the most basic theory (i.e. the greedy approach, where the robot chooses its closest frontier) or whether some objective function is used to decide which frontier to explore.

Thanks.

--EDIT:

When I tried to run the global_map.launch file and provided a polygon boundary + a start point, I got the following result:

[ WARN] [1425309222.899948026]: Please select an initial point for exploration inside the polygon
[ INFO] [1425309234.966230727]: Sending goal
[ INFO] [1425309235.049964691]: Using plugin "static"
[ INFO] [1425309235.207880621]: Requesting the map...
[ INFO] [1425309235.485626067]: Resizing costmap to 4000 X 4000 at 0.050000 m/pix
[ INFO] [1425309235.712224527]: Received a 4000 X 4000 map at 0.050000 m/pix
[ INFO] [1425309235.712476446]: Subscribing to updates
[ INFO] [1425309235.727304703]: Using plugin "explore_boundary"
[ INFO] [1425309235.923136494]: Using plugin "sensor"
[ INFO] [1425309235.969323279]:     Subscribed to Topics: laser
[ INFO] [1425309236.050620763]: Using plugin "inflation"
[ WARN] [1425309236.434748546]: Map update loop missed its desired rate of 5.0000Hz... the loop actually took 0.2870 seconds

Nothing worked and the exploration was not performed. The nodes that I am running are: RosAria (robot control), sicktoolbox_wrapper (laser driver), a tf node, and gmapping to build the map. Do I need to run some other node to handle the actionlib side?
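In case it helps to show what I mean, below is a minimal sketch of the actionlib client I assume I would otherwise need to write myself. The action type (frontier_exploration/ExploreTaskAction), the server name ("explore_server") and the goal fields are my assumptions from reading the package, so please correct me if they are wrong:

#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <frontier_exploration/ExploreTaskAction.h>
#include <geometry_msgs/Point32.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "exploration_client_sketch");

    // Server name "explore_server" is an assumption on my part.
    actionlib::SimpleActionClient<frontier_exploration::ExploreTaskAction> client("explore_server", true);
    client.waitForServer();

    frontier_exploration::ExploreTaskGoal goal;

    // Boundary polygon in the map frame (a 4 m x 4 m square as an example).
    goal.explore_boundary.header.frame_id = "map";
    geometry_msgs::Point32 p;
    p.x = -2.0; p.y = -2.0; goal.explore_boundary.polygon.points.push_back(p);
    p.x =  2.0; p.y = -2.0; goal.explore_boundary.polygon.points.push_back(p);
    p.x =  2.0; p.y =  2.0; goal.explore_boundary.polygon.points.push_back(p);
    p.x = -2.0; p.y =  2.0; goal.explore_boundary.polygon.points.push_back(p);

    // Initial exploration point, which must lie inside the polygon.
    goal.explore_center.header.frame_id = "map";
    goal.explore_center.point.x = 0.0;
    goal.explore_center.point.y = 0.0;

    client.sendGoal(goal);
    client.waitForResult();
    return 0;
}

If global_map.launch already starts an equivalent client node, then I assume the polygon and initial point selected in rviz should be enough, and the problem is elsewhere in my setup.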

2022-05-26 02:54:43 -0500 received badge  Good Question (source)
2020-12-31 08:58:44 -0500 received badge  Nice Question (source)
2019-12-03 09:38:49 -0500 received badge  Great Question (source)
2019-08-21 07:26:43 -0500 received badge  Favorite Question (source)
2019-06-24 18:03:16 -0500 received badge  Good Question (source)
2018-10-18 07:24:20 -0500 received badge  Nice Question (source)
2018-08-08 12:51:19 -0500 marked best answer Robot coordinates in map

Hi,

I am developing an exploration algorithm based on frontier exploration. In my code, I subscribe to the /map topic and process the data[] vector from its messages. On subscribing to /map I get a 4000 x 4000 pixel map (i.e. height = width = 4000 pixels). Each pixel has its own x and y coordinate, and I use the cell values (0, 100 or -1) to extract the (x,y) coordinates of the free cells (no obstacles) that are adjacent to unknown cells (cells not yet explored).

Now I need to get the robot location in (x,y) coordinates within that map, i.e. coordinates that index into the 4000 x 4000 grid (the data[] vector returned by /map). I have tried to listen to the transform between /map and /odom, but the robot pose I get with respect to the map (i.e. according to the transform) still does not make much sense to me. I am following this code and my robot pose output looks like this:

position: 
  x: -9.87982119798
  y: 16.8167273361
  z: 0.0
orientation: 
  x: 0.0
  y: 0.0
  z: 0.627633768897
  w: 0.778508736072

This does not make much sense to me, because I need an (x,y) value that corresponds to an (x,y) cell coordinate in the map. This image might help you understand what I mean and the kind of result I'm after.
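To make the kind of result I'm after concrete, here is a minimal sketch of the conversion I think I need, using the origin and resolution from the map metadata; the frame names ("map", "base_link") and the use of tf here are my assumptions:

#include <ros/ros.h>
#include <nav_msgs/OccupancyGrid.h>
#include <tf/transform_listener.h>

// Convert the robot pose in the map frame (metres) into a cell index of the
// data[] array, using the map origin and resolution from the metadata.
void mapCallback(const nav_msgs::OccupancyGrid::ConstPtr& map)
{
    static tf::TransformListener listener;
    tf::StampedTransform transform;
    try {
        // Robot pose in the map frame (map -> base_link).
        listener.lookupTransform("map", "base_link", ros::Time(0), transform);
    } catch (tf::TransformException& ex) {
        ROS_WARN("%s", ex.what());
        return;
    }

    double wx = transform.getOrigin().x();
    double wy = transform.getOrigin().y();

    // World coordinates (metres) -> grid cell coordinates.
    int cell_x = (int)((wx - map->info.origin.position.x) / map->info.resolution);
    int cell_y = (int)((wy - map->info.origin.position.y) / map->info.resolution);
    int index  = cell_y * map->info.width + cell_x;

    ROS_INFO("Robot is at cell (%d, %d), data[] index %d", cell_x, cell_y, index);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "robot_cell_sketch");
    ros::NodeHandle n;
    ros::Subscriber sub = n.subscribe("map", 1, mapCallback);
    ros::spin();
    return 0;
}

If that is the right idea, then data[index] would be the occupancy of the robot's current cell, and (cell_x, cell_y) would be directly comparable with my frontier cell coordinates.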

Thank you in advance.

2018-02-21 04:33:39 -0500 marked best answer gmapping localization information

Hello ROS community,

Since gmapping is a SLAM algorithm, I would like to use both the map and the localization computed by it. From what I have seen so far, gmapping only publishes /map and does not provide any such localization information (i.e. an estimate of the robot pose). My question is this:

Is the gmapping package in ROS performing SLAM? If so, can we get localization information directly from it, rather than having to use other packages like amcl, robot_pose_ekf and robot_localization?

I need to have a fairly accurate robot pose estimate within the map that is the result of SLAM gmapping, rather than using the robot pose estimated from the odometry data. I need to use this in an exploration algorithm.
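What I have in mind is something like the following sketch: since gmapping maintains the map -> odom transform, I assume that looking up map -> base_link through tf already gives the SLAM pose estimate, and I would just convert it into a pose message for my exploration code. The frame names and topic name here are assumptions on my part:

#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <geometry_msgs/PoseStamped.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "gmapping_pose_sketch");
    ros::NodeHandle n;
    ros::Publisher pose_pub = n.advertise<geometry_msgs::PoseStamped>("slam_pose", 10);

    tf::TransformListener listener;
    ros::Rate rate(10);

    while (n.ok()) {
        tf::StampedTransform transform;
        try {
            // map -> odom comes from gmapping, odom -> base_link from odometry,
            // so this lookup should give the SLAM estimate of the robot pose.
            listener.lookupTransform("map", "base_link", ros::Time(0), transform);
        } catch (tf::TransformException& ex) {
            ROS_WARN_THROTTLE(1.0, "%s", ex.what());
            rate.sleep();
            continue;
        }

        geometry_msgs::PoseStamped pose;
        pose.header.stamp = transform.stamp_;
        pose.header.frame_id = "map";
        tf::pointTFToMsg(transform.getOrigin(), pose.pose.position);
        tf::quaternionTFToMsg(transform.getRotation(), pose.pose.orientation);
        pose_pub.publish(pose);

        rate.sleep();
    }
    return 0;
}

If this is not the intended way to get the pose out of gmapping, please let me know what is.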

Also, I seem to have a localization problem in gmapping: if the robot is rotated by 180 degrees in an alley, the resulting map is a mess, as shown below: gmapping result

I appreciate any help. Thank you!

RND

2017-12-20 20:29:39 -0500 marked best answer ROS compatibility with PowerBot

Hi,

I'm a beginner to ROS and robot control in general. Are the libraries provided by ROS compatible with the PowerBot from mobilerobots.com (PowerBot)?

Thanks. R.

2017-12-20 20:27:58 -0500 received badge  Famous Question (source)
2017-12-20 00:07:30 -0500 received badge  Popular Question (source)
2016-08-19 07:29:07 -0500 received badge  Nice Question (source)
2016-06-02 20:44:33 -0500 received badge  Famous Question (source)
2016-05-11 03:13:29 -0500 received badge  Famous Question (source)
2016-05-08 16:07:07 -0500 marked best answer Map not built in slam_gmapping

Hi,

I am trying to build a map using slam_gmapping. My platform is an ActivMedia MobileRobots PowerBot equipped with sonar sensors and a SICK LMS laser range finder. I am using the ROSARIA package to interact with the robot and get the odometry data, and the sicktoolbox_wrapper package to interface with the laser range finder. I have tested each node separately to make sure that the data I want is being published over the respective topics. I tested ROSARIA by running rostopic echo /RosAria/pose and also by writing code that subscribes to that topic. Similarly, I ran rostopic echo scan to see that the laser range finder is really working, and I can see the data being streamed. I also tried to visualize it in rviz, but I got an error saying that the frame [laser] does not exist. I don't know how to fix this, but since I saw the data being published over rostopic I assumed that the laser scanner was set up and ready to be used.

So I followed this tutorial and built a bag by moving my robot around the room. The bag was created "successfully", since no errors were given. At this point I wanted to verify that all the nodes were running and that the right publishing and subscribing was taking place, so I ran rqt_graph and got this: setup rqt.png. From what I know of ROS, everything is set up according to the tutorial, with the slam_gmapping node subscribed to both /scan and /tf (the only difference from the tutorial is that it uses base_scan, while the topic from sicktoolbox_wrapper is just scan).
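Roughly, the steps I followed from the tutorial were these (reconstructed from memory, so the bag and topic names here are placeholders rather than exactly what I typed):

rosbag record -O mylaserdata /scan /tf      # record while driving the robot around
rosparam set use_sim_time true              # use the bag's clock when replaying
rosrun gmapping slam_gmapping scan:=scan    # gmapping listening on my scan topic
rosbag play --clock mylaserdata.bag         # replay the recorded data
rosrun map_server map_saver -f mymap        # save the map once the bag finishes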

I thought that having all this set up would result in a successful build of the map. However, when I ran gmapping and replayed the bag that I had recorded I got this error:

Warning: TF_OLD_DATA ignoring data from the past for frame base_link at time 1.42253e+09 according to authority /play_1422525800501358444
Possible reasons are listed at http://wiki.ros.org/tf/Errors%20explained
         at line 260 in /tmp/buildd/ros-hydro-tf2-0.4.12-0precise-20141016-1214/src/buffer_core.cpp

When the bag stopped replaying and I attempted to save the map, no map was saved; it just kept "Waiting for map". Also, during the process of 'building' the map, I tried to visualize it in rviz but again I got a warning message telling me "Map is not found".

Can anyone help me identify the fault and explain what is going on? Thanks!

EDIT

From this rqt_graph (slam_gmapping.png) I can see that the slam_gmapping node is publishing on the /map topic. I have added a Map display in rviz and set its topic to /map, but in the Status I get a warning saying: No Map Received (Rviz Slam_gmapping.png). How is this possible?

EDIT

I run the sicktoolbox_wrapper node as follows:

rosrun sicktoolbox_wrapper sicklms _port:=/dev/ttyS1 _baud ...
(more)
2016-05-08 16:06:26 -0500 marked best answer rosaria failed to connect to robot

Hi,

I want to use the RosAria package in order to send commands to and interface with my robot. I have managed to successfully install and build the rosaria package and have also tested it multiple times in simulation (using MobileSim by MobileRobots).

Now I want to connect rosaria to the actual robot, Powerbot (a pioneer robot by Adept MobileRobots). I have successfully connected to the robot ONCE by running

rosrun rosaria RosAria _port:=/dev/ttyS0

Then I killed the node (^C) but when I tried to connect to it once again by re-running the same line (above), I got this error:

powerbot67@powerbot67-desktop:~/catkin_ws/src/rosaria$ rosrun rosaria RosAria
[ INFO] [1422444061.117081691]: RosAria: using port: [/dev/ttyS0]
Could not connect to simulator, connecting to robot through serial port /dev/ttyS0.
Syncing 0
No packet.
Syncing 0
No packet.
Trying to close possible old connection
Syncing 0
No packet.
Syncing 0
No packet.
 Robot may be connected but not open, trying to dislodge.
Syncing 0
No packet.
 Robot may be connected but not open, trying to dislodge.
Syncing 0
No packet.
Could not connect, no robot responding.
Failed to connect to robot.
[ERROR] [1422444067.613784823]: RosAria: ARIA could not connect to robot! (Check ~port parameter is correct, and permissions on port device.)
[FATAL] [1422444067.613882512]: RosAria: ROS node setup failed...

Now, $USER is a member of the dialout group, and I'm sure that the robot connection works because: (a) I connected to the robot through rosaria ONCE already, and (b) before I installed rosaria I was working with the ARIA library directly and everything worked brilliantly.

The error suggests that I check that the ~port parameter is correct, but I'm sure it is, since I ran the same command previously.

Can somebody help me understand this please? Thanks!

2016-04-11 10:25:25 -0500 received badge  Famous Question (source)
2016-03-17 16:31:47 -0500 received badge  Famous Question (source)
2016-03-17 16:31:47 -0500 received badge  Notable Question (source)
2015-12-06 09:51:18 -0500 received badge  Famous Question (source)
2015-11-03 01:34:44 -0500 received badge  Notable Question (source)
2015-11-03 01:34:44 -0500 received badge  Popular Question (source)
2015-11-03 01:31:47 -0500 received badge  Notable Question (source)
2015-11-01 02:26:38 -0500 marked best answer Odom Axes not in line with base_link

Hi,

I am using a PowerBot to build a map with the gmapping algorithm. To set up my robot, I am using the ROSARIA package to control the motors, get pose estimates from odometry, etc. This is a ROS wrapper for the ARIA library provided by ActivMedia MobileRobots.

I am aware that I need some tf configuration in order to align the odometry frame with base_link and the laser frame with base_link. I have followed the tutorials and understood the concept. I came across this tutorial and followed it to create my transforms while using ROSARIA. However, in doing so, I have noticed that the odometry axes are not aligned with the base_link axes. The laser axes are aligned perfectly, as can be seen in this screenshot. The odometry axes are the ones far off from the other two.

I am aware that ROSARIA publishes its own tf as can be seen from rosgraph.png. The current transform tree according to my code is this. The code that I am using to build the transformations is the following:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>

void poseCallback(const nav_msgs::Odometry::ConstPtr& odomsg)
{
    // TF odom => base_link
    static tf::TransformBroadcaster odom_broadcaster;
    odom_broadcaster.sendTransform(
        tf::StampedTransform(
            tf::Transform(tf::Quaternion(odomsg->pose.pose.orientation.x,
                                         odomsg->pose.pose.orientation.y,
                                         odomsg->pose.pose.orientation.z,
                                         odomsg->pose.pose.orientation.w),
                          tf::Vector3(odomsg->pose.pose.position.x/1000.0,
                                      odomsg->pose.pose.position.y/1000.0,
                                      odomsg->pose.pose.position.z/1000.0)),
            odomsg->header.stamp, "/odom", "/base_link"));
    ROS_DEBUG("odometry frame sent");
    ROS_INFO("odometry frame sent: y=[%f]", odomsg->pose.pose.position.y/1000.0);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "pioneer_tf");
    ros::NodeHandle n;

    ros::Rate r(100);

    tf::TransformBroadcaster broadcaster;

    // subscribe to pose info
    ros::Subscriber pose_sub = n.subscribe<nav_msgs::Odometry>("RosAria/pose", 100, poseCallback);

    while (n.ok()) {
        // base_link => laser
        broadcaster.sendTransform(
            tf::StampedTransform(
                tf::Transform(tf::Quaternion(0, 0, 0), tf::Vector3(/*0.034, 0.0, 0.250*/ 0.385, 0, 0.17)),
                ros::Time::now(), "/base_link", "/laser"));

        ros::spinOnce();
        r.sleep();
    }

    return 0;
}

Of course, the maps created using this setup are a mess. This happens after I move the robot with the joystick for a bit. Initially, on starting up ROSARIA and the transform node, the axes are aligned; it is only after the robot moves that they lose their alignment. Can someone help me understand what is wrong with my transform tree and how I can fix it? Thanks!

EDIT

This is a typical example of what happens to my robot's odometry when I drive it around a little bit. Basically, it moved forwards, then turned around a desk and then moved forwards some more. I do not expect such bad odometry.

2015-09-04 10:15:55 -0500 received badge  Famous Question (source)
2015-07-13 17:32:50 -0500 received badge  Popular Question (source)
2015-07-13 17:32:50 -0500 received badge  Notable Question (source)
2015-07-08 11:28:29 -0500 commented question rosaria failed to connect to robot

Can you specify the error that you are getting?

2015-07-08 02:29:07 -0500 commented question rosaria failed to connect to robot

@anirban you can either follow the above comments, or check that the port you are using is valid. How are you connecting to rosaria?

2015-06-18 07:47:09 -0500 commented answer move_base without base_local_planner

How do you configure the Costmap2DROS to work with navfn in code? I need to give a goal to navfn and obtain its planned path. I have opened a question here: navfn question

2015-06-18 07:35:15 -0500 commented question navfn and costmap_2d

Shall I place the code in a separate file? I would need others to see this code since I do not know what I'm doing wrong.

2015-06-18 05:22:32 -0500 asked a question navfn and costmap_2d

Hello,

I am trying to use the classes navfn::NavfnROS and costmap_2d::Costmap2DROS in order to calculate a path from a start location to a goal location. I need to do this such that I can evaluate the paths to a set of goals in the world. I have implemented the following code:

#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <costmap_2d/costmap_2d_ros.h>
#include <costmap_2d/layered_costmap.h>
#include <navfn/navfn_ros.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/Point.h>
#include <fstream>
#include <iostream>

using namespace std;
using geometry_msgs::Point;

int main(int argc, char** argv) 
{

    ros::init(argc, argv, "dijkstra");
    ros::NodeHandle n;

    tf::TransformListener tf(ros::Duration(10));
    costmap_2d::Costmap2DROS costmap("global_costmap", tf);

    costmap_2d::LayeredCostmap layers("map", false, true);

    std::vector<Point> polygon;
    geometry_msgs::Point vector_point;
    vector_point.x = -0.5;
    vector_point.y = -0.33;
    vector_point.z = 0.0;
    polygon.push_back(vector_point);
    vector_point.x = -0.5;
    vector_point.y = 0.33;
    vector_point.z = 0.0;
    polygon.push_back(vector_point);
    vector_point.x = 0.5;
    vector_point.y = 0.33;
    vector_point.z = 0.0;
    polygon.push_back(vector_point);
    vector_point.x = 0.5;
    vector_point.y = -0.33;
    vector_point.z = 0.0;
    polygon.push_back(vector_point);
    layers.setFootprint(polygon);

    //ROS_INFO("Inscribed Radius = %f, Circumscribed Radius = %f", polygon.at(0).x, polygon.at(1).x);

    costmap.getCostmap();
    costmap.start();

    navfn::NavfnROS navfn;
    navfn.initialize("my_navfn_planner", &costmap);

    std::vector<geometry_msgs::PoseStamped> plan;   // Initialize vector that stores the plan poses

    double goal_x = 1.0;
    double goal_y = 0.6;

    // Declare file name that will be used to store path planned poses
    ofstream myfile("plan_poses.txt");

    if(!myfile) {
        cout << "Cannot open file for writing!" << endl;
    }

    ROS_INFO("Giving goal to path planner in the world");
    ROS_INFO("GOAL: x = %f, y = %f", goal_x,goal_y);

    // construct a pose message for the start pose
    geometry_msgs::PoseStamped start_pose;
    start_pose.header.frame_id = "map";
    start_pose.header.stamp = ros::Time::now();

    start_pose.pose.position.x = 0.0;
    start_pose.pose.position.y = 0.0;
    start_pose.pose.position.z = 0.0;
    start_pose.pose.orientation.x = 0.0;
    start_pose.pose.orientation.y = 0.0;
    start_pose.pose.orientation.z = 0.0;
    start_pose.pose.orientation.w = 1;

    // construct a pose message for the goal
    geometry_msgs::PoseStamped goal_pose;
    goal_pose.header.frame_id = "map";
    goal_pose.header.stamp = ros::Time::now();

    goal_pose.pose.position.x = goal_x;
    goal_pose.pose.position.y = goal_y;
    goal_pose.pose.position.z = 0.0;
    goal_pose.pose.orientation.x = 0.0;
    goal_pose.pose.orientation.y = 0.0;
    goal_pose.pose.orientation.z = 0.0;
    goal_pose.pose.orientation.w = 1;


    geometry_msgs::Point goal;
    goal.x = 1.0;
    goal.y = 0.6;
    goal.z = 0.0;

    navfn.computePotential(goal);
    double potential = navfn.getPointPotential(goal);
    ROS_INFO("Goal potential = %f", potential);
    navfn.makePlan(start_pose, goal_pose, plan);
    ROS_INFO("Size of path = %d", plan.size());

    int plan_size = plan.size();
    for (int i=0; i < plan_size; i++)
    {
        myfile << plan.at(i).pose.position.x << "," << plan.at(i).pose.position.y << endl;
    }

    return 0;
}

In this code I am attempting to get the global costmap of the current map and use it to plan a path by navfn. However, the output on my console is this:

[ INFO] [1434622104.004652179]: Using plugin "obstacle_layer"
[ INFO] [1434622104.171565187]:     Subscribed to Topics: 
[ INFO] [1434622104.221140799]: Using plugin "inflation_layer"
[ERROR] [1434622104.254045545]: You must specify at least three points for the robot footprint ...
(more)
2015-06-12 12:43:43 -0500 received badge  Popular Question (source)