
sumanth's profile - activity

2020-10-23 06:37:35 -0500 marked best answer Gmapping problem

Hello, I am trying to build a map of the area with a differential drive mobile robot. These are the steps I followed:

  1. Open the Kinect with OpenNI using "roslaunch openni_launch openni.launch".
  2. Convert the point cloud to a laser scan using "rosrun depthimage_to_laserscan depthimage_to_laserscan image:=/camera/depth/image_raw". After step 2 I am able to see the laser scan in RViz properly.

  3. Run gmapping with "rosrun gmapping slam_gmapping scan:=/scan tf:=/odom". Alongside this, I am running the node which publishes /odom (the odometry data from the real robot).

  4. Then open rviz with "rosrun rviz rviz".

But in RViz I am unable to see any map being built.

Questions:

  1. Is the procedure I followed correct, or is there something I am missing?

  2. For publishing the odometry data I am using the example from the following link: http://wiki.ros.org/navigation/Tutori... with the following changes:

    • Added a callback to subscribe to the incoming data from the robot.
    • Changed the velocities to suit a differential drive robot.
    • A small change in the calculation of the robot position based on the odometry.

But the odometry message published from this has many fields:

    nav_msgs::Odometry odom;
    odom.header.stamp = current_time;
    odom.header.frame_id = "odom";

    //set the position
    odom.pose.pose.position.x = x;
    odom.pose.pose.position.y = y;
    odom.pose.pose.position.z = 0.0;
    odom.pose.pose.orientation = odom_quat;

    //set the velocity
    odom.child_frame_id = "base_link";
    odom.twist.twist.linear.x = vx;
    odom.twist.twist.linear.y = vy;
    odom.twist.twist.angular.z = vth;

Which of these should I give to gmapping?
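
Note that slam_gmapping does not read the nav_msgs/Odometry topic at all: it takes odometry from the tf tree, as the odom → base_link transform. So besides publishing the message above, that transform has to be broadcast too. A minimal sketch in the style of the navigation tutorial linked above (x, y and th stand in for your own pose integration):

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <nav_msgs/Odometry.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "odom_publisher");
      ros::NodeHandle nh;
      ros::Publisher odom_pub = nh.advertise<nav_msgs::Odometry>("odom", 50);
      tf::TransformBroadcaster broadcaster;

      double x = 0.0, y = 0.0, th = 0.0;  // placeholders: integrate from your encoders
      ros::Rate rate(10.0);
      while (nh.ok())
      {
        ros::Time now = ros::Time::now();
        geometry_msgs::Quaternion q = tf::createQuaternionMsgFromYaw(th);

        // 1) Broadcast odom -> base_link on tf; this is what gmapping consumes.
        geometry_msgs::TransformStamped t;
        t.header.stamp = now;
        t.header.frame_id = "odom";
        t.child_frame_id = "base_link";
        t.transform.translation.x = x;
        t.transform.translation.y = y;
        t.transform.rotation = q;
        broadcaster.sendTransform(t);

        // 2) Publish the nav_msgs/Odometry message for other consumers
        //    (e.g. the local planner); gmapping itself does not need it.
        nav_msgs::Odometry odom;
        odom.header.stamp = now;
        odom.header.frame_id = "odom";
        odom.child_frame_id = "base_link";
        odom.pose.pose.position.x = x;
        odom.pose.pose.position.y = y;
        odom.pose.pose.orientation = q;
        odom_pub.publish(odom);

        rate.sleep();
      }
      return 0;
    }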

Please find the rqt_graph attached (image).

Please find the frames.pdf attached (image).

Many thanks in advance

2017-08-11 13:49:47 -0500 edited answer Fix for - OpenCV Error: Unspecified Error -

Sharing the source code where it gives the error will help to identify the problem! But from the error "could not find

2017-08-11 13:47:44 -0500 answered a question Fix for - OpenCV Error: Unspecified Error -

Sharing the source code where it gives the error will help to identify the problem! But as the error says "could not fi

2017-08-11 13:32:00 -0500 marked best answer How to use Navigation stack

Hi,

For my project, I am thinking of using the navigation stack with a Kinect as the sensor for mapping.

But how do I publish the odometry data to the navigation stack? And how do I use the Twist data coming from the navigation stack?
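
As for the Twist data: move_base publishes geometry_msgs/Twist velocity commands on the cmd_vel topic, which you convert into wheel speeds yourself. A minimal sketch for a differential drive (WHEEL_SEPARATION and sendToMotors are placeholders for your own robot):

    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>

    const double WHEEL_SEPARATION = 0.40;  // placeholder: distance between wheels [m]

    void cmdVelCallback(const geometry_msgs::Twist::ConstPtr& msg)
    {
      // Differential drive kinematics: split the commanded linear and
      // angular velocity into left and right wheel speeds.
      double v_left  = msg->linear.x - msg->angular.z * WHEEL_SEPARATION / 2.0;
      double v_right = msg->linear.x + msg->angular.z * WHEEL_SEPARATION / 2.0;
      // sendToMotors(v_left, v_right);  // hypothetical: forward to your controller
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "cmd_vel_to_wheels");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("cmd_vel", 10, cmdVelCallback);
      ros::spin();
      return 0;
    }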

thanks in advance

2016-12-08 16:16:16 -0500 marked best answer Problems with navigation stack

I have a map of the area in which I want my differential drive mobile robot to move around to given goals. I have already mapped the area with the Kinect using gmapping; now I want to use the navigation stack.

I have followed the setup described on the navigation stack tutorial page: http://wiki.ros.org/navigation/Tutori...

But after running the move_base.launch file I get the following warnings:

[ WARN] [1408947873.713176792]: Request for map failed; trying again...
[ WARN] [1408947874.215030143]: Request for map failed; trying again...
[ WARN] [1408947874.716821881]: Request for map failed; trying again...
[ WARN] [1408947875.219003710]: Request for map failed; trying again...
[ WARN] [1408947875.721026682]: Request for map failed; trying again...
[ WARN] [1408947876.223351709]: Request for map failed; trying again...
[ WARN] [1408947876.725076883]: Request for map failed; trying again...
[ WARN] [1408947876.920729621]: Waiting on transform from base_link to map to become available before running costmap, tf error: 
[ WARN] [1408947877.227117599]: Request for map failed; trying again...
[ WARN] [1408947877.729052070]: Request for map failed; trying again...
[ WARN] [1408947878.230988466]: Request for map failed; trying again...
[ WARN] [1408947878.733617816]: Request for map failed; trying again...
[ WARN] [1408947879.235487601]: Request for map failed; trying again...
[ WARN] [1408947879.738477007]: Request for map failed; trying again...
[ WARN] [1408947880.240812581]: Request for map failed; trying again...
[ WARN] [1408947880.743107807]: Request for map failed; trying again...
[ WARN] [1408947881.245370459]: Request for map failed; trying again...
[ WARN] [1408947881.747222844]: Request for map failed; trying again...
[ WARN] [1408947881.971717380]: Waiting on transform from base_link to map to become available before running costmap, tf error: 
[ WARN] [1408947882.249109598]: Request for map failed; trying again...
[ WARN] [1408947882.750964052]: Request for map failed; trying again...
[ WARN] [1408947883.254429857]: Request for map failed; trying again...
[ WARN] [1408947883.756259117]: Request for map failed; trying again...

Here is my graph, attached as an image.

So what am I missing here that is stopping the navigation stack from running?
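
"Request for map failed" is the costmap repeatedly calling the static_map service, which map_server provides, so the most likely cause is that no map_server is running with your saved map. The base_link → map warning similarly points at a missing localizer such as amcl, which publishes the map → odom transform. A launch sketch covering both (the map path is a placeholder):

<launch>
    <!-- Serve the map saved from gmapping; the path is a placeholder -->
    <node pkg="map_server" type="map_server" name="map_server" args="/path/to/my_map.yaml"/>
    <!-- amcl localizes against that map and publishes map -> odom -->
    <node pkg="amcl" type="amcl" name="amcl"/>
</launch>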

Many thanks in advance.

2016-10-06 18:38:58 -0500 received badge  Famous Question (source)
2016-09-06 15:23:40 -0500 marked best answer Mapping with Kinect sensor

I am working on a robot which should eventually move autonomously to a specified location in the map (for which I am planning to use the navigation stack).

To do that I first want to map the environment. I have a Kinect sensor available, so which mapping package should I use to generate a map (2D or 3D) that can later be fed to the navigation stack?

I can fetch the odometry ticks from the microcontroller and feed them to the mapping stack. As the Kinect is rigidly mounted on the robot, I can publish a static transform from the Kinect to the base link.

Are these sufficient to use a mapping stack? The main question is which mapping stack to go with; any examples showing its use would be a great help to start with.

Many thanks in advance.

2016-05-08 15:31:38 -0500 marked best answer Can I also link yaw rate sensor data with the map?

I am using gmapping to map the area with my robot; at present I am using odometry data and data from the Kinect for mapping.

Update after dornhege's answer:

I get the odometry ticks from the robot, which I convert to the distance travelled by each wheel; theta comes from the difference of the distances travelled by the wheels divided by the wheel base length. I publish this on /odom.

Presently I am using this published /odom data for the mapping and navigation.
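
For reference, a minimal sketch of that ticks-to-pose computation (TICKS_PER_METER and WHEEL_BASE are placeholders for your robot's encoder resolution and wheel separation):

    #include <cmath>

    const double TICKS_PER_METER = 5000.0;  // placeholder encoder scale
    const double WHEEL_BASE      = 0.40;    // placeholder wheel separation [m]

    // Integrate one batch of encoder ticks into the pose (x, y, th).
    void integrateTicks(int ticks_left, int ticks_right,
                        double& x, double& y, double& th)
    {
      double d_left  = ticks_left  / TICKS_PER_METER;   // distance per wheel
      double d_right = ticks_right / TICKS_PER_METER;
      double d   = (d_right + d_left) / 2.0;            // forward travel
      double dth = (d_right - d_left) / WHEEL_BASE;     // heading change
      x  += d * std::cos(th + dth / 2.0);               // midpoint integration
      y  += d * std::sin(th + dth / 2.0);
      th += dth;
    }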

I also have a yaw rate sensor on the robot which gives the yaw angle (along with acceleration values). How can I use this with robot_pose_ekf, which also accepts roll and pitch angles? I only have yaw angles, so how can I get fused odometry data with yaw using robot_pose_ekf?
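
One common approach (an assumption on my part, not the only option): wrap the yaw into a sensor_msgs/Imu message on the imu_data topic that robot_pose_ekf listens to, and mark roll and pitch as unusable through large covariances. A sketch:

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>
    #include <tf/transform_datatypes.h>

    // Hypothetical helper: publish a yaw-only reading for robot_pose_ekf.
    void publishYaw(ros::Publisher& imu_pub, double yaw)
    {
      sensor_msgs::Imu imu;
      imu.header.stamp = ros::Time::now();
      imu.header.frame_id = "base_link";  // assumption: sensor aligned with the base
      imu.orientation = tf::createQuaternionMsgFromYaw(yaw);
      // Trust yaw, distrust roll/pitch: a huge variance tells the filter
      // to effectively ignore those axes.
      imu.orientation_covariance[0] = 1e6;   // roll
      imu.orientation_covariance[4] = 1e6;   // pitch
      imu.orientation_covariance[8] = 0.01;  // yaw (placeholder variance)
      imu_pub.publish(imu);
    }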

Many thanks in advance.

2016-05-08 15:31:25 -0500 received badge  Famous Question (source)
2016-05-08 15:27:17 -0500 marked best answer Hector Mapping without odometry

Hello,

I am using a Kinect sensor on my custom robot.

I want to achieve mapping with the Kinect sensor without using odometry. How can I achieve this?

Should hector_mapping be used for this? If so, how can I use it?
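
For context, hector_mapping estimates motion by scan matching, which is exactly what makes it usable without odometry. A launch sketch using the parameters documented on the hector_mapping wiki (frame and topic names assume the usual setup):

<launch>
    <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
        <!-- With no odometry, point odom_frame at base_frame so the
             map -> base_link transform comes from scan matching alone. -->
        <param name="base_frame" value="base_link"/>
        <param name="odom_frame" value="base_link"/>
        <param name="scan_topic" value="scan"/>
        <param name="pub_map_odom_transform" value="true"/>
    </node>
</launch>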

2016-05-08 15:27:00 -0500 marked best answer Help using a serial library on Ubuntu for communication with ROS

Hello,

I am currently working on a custom robot project, for which I need my robot to communicate with ROS running on a Linux (Ubuntu) PC. So I need a serial library running on the Linux PC that is capable of communicating with both ROS and the robot.

I could use rosserial, but I don't have C++ support on my controller (rather, I can't write code in C++, as the existing code is in C and I want to reuse the available framework); the microcontroller on the robot is a non-Arduino one.

I have tried using the r2serial code from this link: http://code.google.com/p/team-diana/s...

I am running this code on the ROS PC. With it I am able to send data from the Linux PC to the robot, but I am not able to receive data from the robot on the Linux PC.

Can anyone guide me on what's going wrong with the code I am using? Or is there any other solution or serial library that I can use for my project?
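
If a plain POSIX serial port is enough, a bridge node does not need any special library at all. A minimal sketch using termios (device name, baud rate and topic are placeholders) that reads raw bytes from the controller and republishes them:

    #include <ros/ros.h>
    #include <std_msgs/String.h>
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "serial_bridge");
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<std_msgs::String>("robot_rx", 10);

      // Open and configure the port; the device name is a placeholder.
      int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
      if (fd < 0) { ROS_ERROR("cannot open serial port"); return 1; }
      termios tio;
      tcgetattr(fd, &tio);
      cfmakeraw(&tio);                 // raw mode: no line editing or translation
      tio.c_cflag |= (CLOCAL | CREAD); // local line, enable receiver
      cfsetispeed(&tio, B115200);      // placeholder baud rate
      cfsetospeed(&tio, B115200);
      tcsetattr(fd, TCSANOW, &tio);

      char buf[256];
      while (ros::ok())
      {
        ssize_t n = read(fd, buf, sizeof(buf));  // blocking read from the MCU
        if (n > 0)
        {
          std_msgs::String msg;
          msg.data.assign(buf, n);
          pub.publish(msg);                      // republish the raw bytes
        }
      }
      close(fd);
      return 0;
    }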

Many Thanks in advance.

2015-11-12 10:26:25 -0500 received badge  Famous Question (source)
2015-09-29 06:43:44 -0500 received badge  Famous Question (source)
2015-07-28 02:42:03 -0500 received badge  Famous Question (source)
2015-06-24 13:01:34 -0500 received badge  Famous Question (source)
2015-01-05 04:45:25 -0500 received badge  Famous Question (source)
2014-12-05 05:36:48 -0500 received badge  Nice Question (source)
2014-11-18 13:29:11 -0500 received badge  Notable Question (source)
2014-11-18 10:52:28 -0500 received badge  Notable Question (source)
2014-11-14 08:36:40 -0500 received badge  Notable Question (source)
2014-11-14 08:36:40 -0500 received badge  Famous Question (source)
2014-11-12 22:54:39 -0500 received badge  Popular Question (source)
2014-11-06 05:45:21 -0500 asked a question Ethernet stack for ROS

I have a microcontroller with a TCP/IP Ethernet stack available, which is capable of sending messages over TCP/IP. Is there any way for ROS to receive these messages on a Linux PC running ROS?

Something similar to rosserial, perhaps?
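
Nothing rosserial-like is strictly required here: a ROS node is an ordinary Linux process, so it can accept a plain TCP connection and republish whatever the microcontroller sends. A minimal sketch (port number and topic name are placeholders):

    #include <ros/ros.h>
    #include <std_msgs/String.h>
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstring>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "tcp_bridge");
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<std_msgs::String>("mcu_rx", 10);

      // Listen for the microcontroller on a placeholder port.
      int srv = socket(AF_INET, SOCK_STREAM, 0);
      sockaddr_in addr;
      std::memset(&addr, 0, sizeof(addr));
      addr.sin_family = AF_INET;
      addr.sin_addr.s_addr = INADDR_ANY;
      addr.sin_port = htons(5000);
      bind(srv, (sockaddr*)&addr, sizeof(addr));
      listen(srv, 1);
      int conn = accept(srv, NULL, NULL);  // blocks until the MCU connects

      char buf[256];
      while (ros::ok())
      {
        ssize_t n = recv(conn, buf, sizeof(buf), 0);
        if (n <= 0) break;                 // connection closed or error
        std_msgs::String msg;
        msg.data.assign(buf, n);
        pub.publish(msg);                  // republish the raw payload
      }
      close(conn);
      close(srv);
      return 0;
    }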

thanks in advance.

2014-10-06 09:52:11 -0500 received badge  Notable Question (source)
2014-10-06 09:52:11 -0500 received badge  Famous Question (source)
2014-10-01 12:40:15 -0500 received badge  Famous Question (source)
2014-10-01 09:11:12 -0500 received badge  Famous Question (source)
2014-09-25 09:17:06 -0500 received badge  Famous Question (source)
2014-09-04 00:44:57 -0500 commented answer Problems with static tf publisher

Many thanks

2014-09-04 00:44:50 -0500 commented answer Problems with static tf publisher

Perfect: robot_state_publisher and the static_transform_publisher were both publishing the transform between base_link and camera_link. I have also confirmed this by running roswtf.

Solved the problem.

2014-09-04 00:42:33 -0500 marked best answer Problems with static tf publisher

I am using my own URDF file for my custom-built robot. The robot is properly visible and moves correctly in rviz if I run the simulation with the /odom publisher.

But when I open the camera nodelet with openni_launch and publish a static tf, the laser scanner in the simulation starts to rotate randomly, and I have no idea why this is happening.

I publish the static tf in a launch file which contains:

<launch>
        <node pkg="tf" type="static_transform_publisher" name="base_to_laser" args="0.1 0 0.455 0 0 0 base_link camera_link 100"/>
</launch>
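
One way to check whether two nodes are fighting over the same transform (which, per the comments above, turned out to be the cause here) is tf's monitoring tool, which lists every node broadcasting between two frames:

    rosrun tf tf_monitor base_link camera_link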

Below is my rqt_graph, attached as an image.

Please find the frames.pdf attached (image).

and the slam_gmapping node gives the following warnings:

[ INFO] [1409662993.035030893]: Laser is mounted upwards.
 -maxUrange 9.99 -maxUrange 9.99 -sigma     0.05 -kernelSize 1 -lstep 0.05 -lobsGain 3 -astep 0.05
 -srr 0.1 -srt 0.2 -str 0.1 -stt 0.2
 -linearUpdate 1 -angularUpdate 0.5 -resampleThreshold 0.5
 -xmin -100 -xmax 100 -ymin -100 -ymax 100 -delta 0.05 -particles 30
[ INFO] [1409662993.055169092]: Initialization complete
update frame 0
update ld=0 ad=0
Laser Pose= 0.257682 -1.50465 1.55431e-15
m_count 0
Registering First Scan
update frame 6
update ld=0.0282843 ad=1.5708
Laser Pose= 0.237682 -1.48465 -1.5708
m_count 1
Scan Matching Failed, using odometry. Likelihood=-0.0229554
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4160.76
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1 ...
2014-09-03 11:58:51 -0500 received badge  Notable Question (source)
2014-09-03 01:47:06 -0500 commented answer robot is not aligned to the laser scanner frame

Martin, many thanks, super help. I had overlooked the coordinate frame of my scanner. Thanks!

2014-09-03 00:40:50 -0500 commented answer robot is not aligned to the laser scanner frame

Martin, I have updated the question with the URDF; can you please revisit it?

2014-09-03 00:40:09 -0500 edited question robot is not aligned to the laser scanner frame

I have my own URDF file. When I try to do mapping using the slam_gmapping node and then open rviz with the map and robot model, I can see that what the actual laser scanner sees is offset by exactly 90 degrees from how the laser scanner is oriented in the simulation.

For more details, please see the attached screenshot.

Please find my URDF here:

<robot name="my_robot">

  <link name="base_link">
    <visual>
      <geometry>
        <box size="0.6 0.35 0.15"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="white">
        <color rgba="0.2 1 0.3 1"/>
      </material>
    </visual>
  </link>

 <link name="lwheel">
    <visual>
      <geometry>
        <cylinder length="0.04" radius="0.1"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="0 0 0 1"/>
      </material>
    </visual>
  </link>

  <joint name="base_to_lwheel" type="fixed">
    <parent link="base_link"/>
    <child link="lwheel"/>
    <origin xyz="-0.1 -0.2 -0.025" rpy="1.5708 0 0"/>
  </joint>

 <link name="rwheel">
    <visual>
      <geometry>
        <cylinder length="0.04" radius="0.1"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="0 0 0 1"/>
      </material>
    </visual>
  </link>

  <joint name="base_to_rwheel" type="fixed">
    <parent link="base_link"/>
    <child link="rwheel"/>
    <origin xyz="-0.1 0.2 -0.025" rpy="-1.5708 0 0"/>
  </joint>

  <link name="fwheel_left">
    <visual>
      <geometry>
        <cylinder length="0.05" radius="0.03"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="0 0 0 1"/>
      </material>
    </visual>
  </link>

  <joint name="base_to_fwheel_left" type="fixed">
    <parent link="base_link"/>
    <child link="fwheel_left"/>
    <origin xyz="0.22 -0.1 -0.095" rpy="1.5708 0 0"/>
  </joint>

  <link name="fwheel_right">
    <visual>
      <geometry>
        <cylinder length="0.05" radius="0.03"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="0 0 0 1"/>
      </material>
    </visual>
  </link>

  <joint name="base_to_fwheel_right" type="fixed">
    <parent link="base_link"/>
    <child link="fwheel_right"/>
    <origin xyz="0.22 0.1 -0.095" rpy="-1.5708 0 0"/>
  </joint>

   <link name="camera_link">
    <visual>
      <geometry>
        <box size="0.28 0.065 0.04"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="0 0 0 1"/>
      </material>
    </visual>
  </link>

   <link name="scan_support">
    <visual>
      <geometry>
        <cylinder length="0.36" radius="0.015"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="black">
        <color rgba="1 0.2 0.1 1"/>
      </material>
    </visual>
  </link>

  <joint name="base_to_scan_support" type="fixed">
    <parent link="base_link"/>
    <child link="scan_support"/>
    <origin xyz="0.10 0 0.255" rpy="0 0 0"/>
 </joint>

  <joint name="base_to_scanner" type="fixed">
    <parent link="base_link"/>
    <child link="camera_link"/>
    <origin xyz="0.10 0 0.455" rpy="0 0 -1.5708"/>
 </joint>

</robot>

Any insights into what might be wrong here?
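
Given the accepted answer's hint (see the comments above: the scanner's coordinate frame had been overlooked), the suspect is the -1.5708 yaw in base_to_scanner, which turns camera_link 90 degrees relative to the frame the scan is actually published in. A sketch of the joint without that rotation (an assumption: the x-axis of camera_link should point forward):

  <joint name="base_to_scanner" type="fixed">
    <parent link="base_link"/>
    <child link="camera_link"/>
    <!-- assumption: no yaw, so camera_link's x-axis points forward,
         matching the frame the laser scan is published in -->
    <origin xyz="0.10 0 0.455" rpy="0 0 0"/>
  </joint>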

Many thanks in advance.

2014-09-03 00:38:58 -0500 received badge  Notable Question (source)
2014-09-03 00:01:08 -0500 commented answer robot is not aligned to the laser scanner frame

But ideally this should be done, right?

2014-09-02 20:18:14 -0500 received badge  Popular Question (source)
2014-09-02 14:53:30 -0500 received badge  Popular Question (source)
2014-09-02 10:25:35 -0500 commented answer robot is not aligned to the laser scanner frame

No, I have changed the reference frame in Grid to my scanner frame, but the observation is still the same.

2014-09-02 08:36:34 -0500 commented question robot is not aligned to the laser scanner frame

The reference frame in Grid was set to the fixed frame.