
jgdo's profile - activity

2020-04-28 09:54:26 -0500 commented answer How to load values in bash file to a cpp file??

Alternatively you can directly read the environment variable in C++ using std::getenv() https://en.cppreference.com/w/cp
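
For illustration, a minimal sketch of what that looks like (the variable name MY_ROBOT_CONFIG is just a placeholder):

    #include <cstdlib>   // std::getenv
    #include <iostream>
    #include <string>

    int main() {
        // std::getenv returns nullptr if the variable is not set,
        // so check before constructing a std::string from it.
        const char* value = std::getenv("MY_ROBOT_CONFIG");
        if (value == nullptr) {
            std::cerr << "MY_ROBOT_CONFIG is not set" << std::endl;
            return 1;
        }
        std::string config(value);
        std::cout << "MY_ROBOT_CONFIG = " << config << std::endl;
        return 0;
    }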

2018-05-10 11:12:44 -0500 received badge  Famous Question (source)
2018-04-26 02:50:06 -0500 received badge  Notable Question (source)
2018-04-26 02:50:06 -0500 received badge  Popular Question (source)
2018-04-07 03:13:20 -0500 asked a question Moveit rviz plugin generates invalid quaternion

Moveit rviz plugin generates invalid quaternion Hi all, I'm trying to set up MoveIt for a custom robot arm. Basical

2017-03-13 07:24:46 -0500 received badge  Famous Question (source)
2016-07-21 09:00:52 -0500 received badge  Famous Question (source)
2016-05-09 03:30:26 -0500 received badge  Favorite Question (source)
2015-10-15 02:19:06 -0500 received badge  Notable Question (source)
2015-10-15 02:19:06 -0500 received badge  Popular Question (source)
2015-06-26 03:55:16 -0500 received badge  Notable Question (source)
2015-06-02 17:23:14 -0500 received badge  Notable Question (source)
2014-12-18 17:35:26 -0500 received badge  Popular Question (source)
2014-12-18 06:57:56 -0500 asked a question Catkin -DCMAKE_INSTALL_PREFIX flag

Hi,

According to the catkin tutorial, I can specify the target location of a catkin package installation using the -DCMAKE_INSTALL_PREFIX flag:

catkin_make install  -DCMAKE_INSTALL_PREFIX=<path>

I noticed that the specified target location is only respected if some packages have to be built first. If everything is already built, the specified location is ignored and the one from the last successful build is used instead.

To make it always work you have to add --force-cmake to catkin_make:

catkin_make --force-cmake install  -DCMAKE_INSTALL_PREFIX=<path>

Maybe it would be useful to mention that in the tutorial.

Regards - jgdo -

2014-09-01 21:36:53 -0500 received badge  Popular Question (source)
2014-05-20 00:53:36 -0500 asked a question Visualizing a map in Browser (ros2djs)

Hello everyone,

I'm trying to run the example for visualizing a map with the 2D ROS JavaScript library (ros2djs) in a web browser, communicating through rosbridge, but the canvas that is supposed to show the map stays black. Curiously, the 3D example works perfectly and shows the published map, so I assume the rosbridge/map_server setup is right. Still, I'd like to use the 2D viewer since I don't need the 3D capability.

Is there anything I could have configured wrong if the map is shown in 3D but not in 2D?

  • regards jgdo -
2013-11-25 02:42:01 -0500 received badge  Nice Answer (source)
2013-11-24 23:36:20 -0500 received badge  Teacher (source)
2013-11-24 23:11:44 -0500 answered a question An error with class methods subscribers

Hi,

Methods need an object to be called on, so you have to pass the object pointer when calling subscribe(). Also, you need to specify the class scope of RopCallback.

=>

node_listener_->subscribe( "rop_client", 1000, &RopClient::RopCallback, this);

See also The Publisher/Subscriber overview
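
For illustration, a minimal self-contained sketch (the message type std_msgs::String and the node name are assumptions; only the subscribe() call mirrors your code):

    #include <ros/ros.h>
    #include <std_msgs/String.h>

    class RopClient {
    public:
        explicit RopClient(ros::NodeHandle& nh) {
            // Member function pointer plus the object pointer (this).
            sub_ = nh.subscribe("rop_client", 1000, &RopClient::RopCallback, this);
        }

    private:
        void RopCallback(const std_msgs::String::ConstPtr& msg) {
            ROS_INFO("Received: %s", msg->data.c_str());
        }

        ros::Subscriber sub_;
    };

    int main(int argc, char** argv) {
        ros::init(argc, argv, "rop_client");
        ros::NodeHandle nh;
        RopClient client(nh);
        ros::spin();
        return 0;
    }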

Regards

  • jgdo -

BTW: I didn't know there was a Russian gcc ;)

2013-11-24 03:56:16 -0500 asked a question App Manager multiple apps

Hi everyone,

I was wondering if there are any plans or considerations to add support for running multiple applications at the same time inside the rocon app manager. Currently the app manager supports only one active application at a time, which means that when switching from one app to another, everything has to be shut down and then restarted.

If multiple apps were allowed, there could be an app for basic functionality (e.g. motor control nodes and localization) that stays running the whole time, while the user switches between higher-level apps like teleop or navigation without stopping everything.

  • jgdo -
2013-02-15 05:03:17 -0500 received badge  Enthusiast
2012-11-08 06:04:55 -0500 received badge  Famous Question (source)
2012-11-08 06:04:55 -0500 received badge  Popular Question (source)
2012-11-08 06:04:55 -0500 received badge  Notable Question (source)
2012-09-05 15:40:23 -0500 received badge  Famous Question (source)
2012-09-05 15:40:23 -0500 received badge  Notable Question (source)
2012-07-25 12:18:53 -0500 received badge  Popular Question (source)
2012-06-08 11:13:24 -0500 commented answer gmapping seems to ignore odometry data

BTW, I have a self-made motor controller connected to the notebook and not a Roomba, so I don't use the turtlebot driver at all but a custom driver node. Is it right that you just have to multiply delta(angle) by odom_angular_scale_correction when doing the odometry calculation?
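
I mean something like this (just a sketch, all variable names made up; only the correction value comes from the calibration output):

    #include <cmath>

    // Example value reported by the calibration tool.
    const double odom_angular_scale_correction = 1.007505;

    // Minimal differential-drive odometry step.
    void integrateOdometry(double d_left, double d_right, double wheel_base,
                           double& x, double& y, double& theta) {
        double d_center = 0.5 * (d_left + d_right);
        double d_theta  = (d_right - d_left) / wheel_base;

        // Scale the measured rotation increment by the correction factor.
        d_theta *= odom_angular_scale_correction;

        x += d_center * std::cos(theta + 0.5 * d_theta);
        y += d_center * std::sin(theta + 0.5 * d_theta);
        theta += d_theta;
    }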

2012-06-08 11:04:03 -0500 commented answer gmapping seems to ignore odometry data

OK, it has something to do with the number of particles you use. If I set it to 300 I get a good map, at 100 a bad one (loop closing doesn't work then). But I wonder how fast your computer is; I have an i3-2100 at 3.1 GHz and I had to slow the whole playback down to r = 0.05 to get good results.

2012-06-08 06:51:07 -0500 commented answer gmapping seems to ignore odometry data

No, I just get the standard messages like "Registering Scans:Done".

2012-06-08 06:33:07 -0500 commented answer gmapping seems to ignore odometry data

How exactly did you get that map, especially the right one? As you can see in the bag file, I got much worse results.

2012-06-08 06:26:47 -0500 commented answer gmapping seems to ignore odometry data

I have tried the calibration and it said "Multiply the 'turtlebot_node/odom_angular_scale_correction' parameter with 1.007505". What do I have to do with this value now?

My robot doesn't have a gyro yet, so it can't mess up anything (or can it?)

2012-06-07 10:58:23 -0500 answered a question gmapping seems to ignore odometry data

Hi,

Here is a bag file containing scan, tf, odometry and the map: link text

As you can see, at the beginning gmapping tries to match the new scans against the existing part of the map instead of extending it, which causes the estimated position to jump. To get a good map it is necessary to drive backwards into a corner so the Kinect can see two opposite walls at the same time.

I hope this data is enough for you; if not, I can record a longer example, but the behavior is the same.

Thanks again and regards - jgdo -

2012-06-07 10:51:01 -0500 received badge  Supporter (source)
2012-06-06 11:55:41 -0500 received badge  Student (source)
2012-06-06 11:50:33 -0500 answered a question gmapping seems to ignore odometry data

Hi Hunter,

Thanks for your reply! I'm not using the TurtleBot but a self-made robot with a Kinect and a notebook, so I don't have any calibration software for it. What does the turtlebot calibration do? On my robot I adjusted the odometry by manually driving and rotating, and it seems to work well, since I can drive around and when I come back to the original place the computed coordinates are near zero.

  • jgdo -
2012-06-06 10:53:55 -0500 asked a question gmapping seems to ignore odometry data

Hi,

I'm new to ROS and just managed to get the Kinect fake laser scan and a driver node for my robot hardware working. Now I tried slam_gmapping to create a map of my room, but the map quality is far from the maps you can see on YouTube, e.g. http://www.youtube.com/watch?feature=...

It seems to me like gmapping ignores the published odometry data and frames, because the estimated position jumps the whole time. Even when the robot rotates on the spot, the estimated position sometimes jumps by around 30-40 cm. The curious thing is that my odometry is quite good and the laser scans would fit perfectly onto the existing map without any jumping.

Is there a way to tell gmapping that it should stick to the odometry and make only small corrections?

Thanks and regards - jgdo -


Update:

Thanks for your reply! I'm not using the TurtleBot but a self-made robot with a Kinect and a notebook, so I don't have any calibration software for it. What does the turtlebot calibration do? On my robot I adjusted the odometry by manually driving and rotating, and it seems to work well, since I can drive around and when I come back to the original place the computed coordinates are near zero.


Here is a bag file containing scan, tf, odometry and the map.

As you can see, at the beginning gmapping tries to match the new scans against the existing part of the map instead of extending it, which causes the estimated position to jump. To get a good map it is necessary to drive backwards into a corner so the Kinect can see two opposite walls at the same time.

I hope this data is enough for you; if not, I can record a longer example, but the behavior is the same.

Thanks again and regards - jgdo -

2012-01-02 13:15:12 -0500 marked best answer remote gui for ros applications

If you have ROS installed on the other machine, you can use the graphical tools (rviz, rxgraph, etc) across your network: http://ros.org/wiki/ROS/Tutorials/MultipleMachines

2011-12-09 08:18:49 -0500 received badge  Editor (source)
2011-12-09 08:18:49 -0500 edited question remote gui for ros applications

Hi

I was wondering if there is something like a remote GUI that applications can use? Something like a server on a desktop PC that clients can connect to and ask to display a window with buttons and so on, so the user can interact with the application without having to forward X?

regards jgdo