2015-04-08 05:50:28 -0500 | received badge | ● Famous Question (source) |
2014-09-06 03:13:46 -0500 | received badge | ● Notable Question (source) |
2014-09-06 03:13:46 -0500 | received badge | ● Popular Question (source) |
2014-08-29 16:49:31 -0500 | received badge | ● Enthusiast |
2014-07-04 16:00:48 -0500 | received badge | ● Supporter (source) |
2014-07-04 15:21:44 -0500 | commented question | Failed to install yaml-cpp0.2.6-dev You can use the checkinstall command to install it so that rosdep will not complain: sudo checkinstall --pkgname=yaml-cpp --provides=yaml-cpp --pkgversion=0.3.0 (use the pkgversion matching the release you downloaded) |
2014-07-02 14:35:31 -0500 | answered a question | Raspberry Pi TurtleSim Build Error The solution I found for this problem is to edit the file turtle.cpp (mentioned in the error) and replace 0.0 in the file with qreal(0.0); it will compile fine after that. |
2013-10-07 06:59:01 -0500 | received badge | ● Editor (source) |
2013-10-07 06:57:20 -0500 | asked a question | How to provide yaw input in asctec_hl_interface Hi, I am working on a Pelican and trying to use the asctec_hl_interface with a motion capture system. The asctec_hl_interface tutorial for position control says that the yaw is taken from the external pose estimate. But when I checked this by moving the markers around, only the position is taken from the motion capture system; the yaw is controlled entirely by the compass. For some weird reason, the compass cannot provide a good enough yaw estimate in the indoor environment I am working in. So I need to feed in, or put a higher trust on, the yaw from the motion capture system, but I do not know how to do this. The paper describing the implementation also does not explain how the yaw from the motion capture system is fed into the position controller. Please let me know if anyone can provide details on this. |