2019-06-18 10:57:22 -0500 | received badge | ● Nice Answer (source) |
2016-04-14 16:34:58 -0500 | received badge | ● Famous Question (source) |
2016-01-21 04:14:42 -0500 | received badge | ● Enthusiast |
2016-01-18 09:50:50 -0500 | answered a question | Include Ueye into ROS Hydro [Solved] The UEye ROS package is included in Hydro, so yes: http://wiki.ros.org/ueye . Try checking the official wiki before asking ;) You will also need to install the UEye drivers: https://en.ids-imaging.com/download-u... |
2016-01-18 02:15:01 -0500 | commented answer | Huge drift using single IMU state estimation Thanks. I was expecting bad results, but not this bad. I will use it with visual odometry from the beginning, then. |
2016-01-18 02:13:17 -0500 | received badge | ● Scholar (source) |
2016-01-15 09:35:47 -0500 | received badge | ● Notable Question (source) |
2016-01-14 16:14:50 -0500 | received badge | ● Popular Question (source) |
2016-01-14 10:26:09 -0500 | asked a question | Huge drift using single IMU state estimation I am using an XSens MTi IMU to get odometry. I don't expect it to be very precise, because using robot_localization with only an IMU is not enough, but the data doesn't make any sense. Here's a sample IMU message with the sensor lying on a table: I also checked REP 103 and it seems fine. Here's the robot_localization launch file: The result after a few seconds lying on the table is this: I can see the linear twist change when I move the IMU on the table, but in the end the speeds keep increasing at a high rate. What may be ... (more) |
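The drift described in this question is expected from IMU-only estimation: the filter double-integrates acceleration, so even a small constant accelerometer bias grows linearly in velocity and quadratically in position. A minimal sketch of that growth, with a hypothetical bias value typical of a consumer MEMS IMU:

```python
def integrate_bias(bias, dt, steps):
    """Double-integrate a constant accelerometer bias (m/s^2):
    returns the accumulated velocity and position error."""
    v = 0.0
    x = 0.0
    for _ in range(steps):
        v += bias * dt  # velocity error grows linearly with time
        x += v * dt     # position error grows quadratically
    return v, x

# Assumed 0.05 m/s^2 bias over 10 s at 100 Hz:
v_err, x_err = integrate_bias(0.05, 0.01, 1000)
# roughly v = b*t = 0.5 m/s and x = 0.5*b*t^2 = 2.5 m after only 10 s
```

This is why fusing an absolute or velocity reference (e.g. visual odometry, as suggested in the accepted answer) is needed to bound the drift.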
2016-01-13 04:44:32 -0500 | commented question | Poor reference frame for local sensing? What I understand is that the robot's position relative to the map is known but subject to errors, so whenever possible sensor data and local actions should be handled with respect to the robot's own frames, which are less prone to change. |
2016-01-13 04:38:38 -0500 | answered a question | TurtleBot ROS - How to move to specific angle (using Twist) What you want is a controller. If the TurtleBot doesn't come with one, you should implement it. As you say, the easiest way is to make a node that listens for desired poses and sends a Twist to the robot. The most straightforward option is proportional control, in which the speed is proportional to the remaining angle error. Of course there are other options, but I don't see why this is ugly. |
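The proportional controller suggested in this answer can be sketched in a few lines of plain Python; the gain and speed limit below are hypothetical values, and in a real node the result would be published as `Twist.angular.z`:

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi] so the robot turns the short way."""
    return math.atan2(math.sin(a), math.cos(a))

def proportional_yaw_control(current_yaw, desired_yaw, kp=1.5, max_speed=1.0):
    """Return an angular velocity proportional to the heading error,
    clamped to the robot's maximum turn rate."""
    error = wrap_angle(desired_yaw - current_yaw)
    speed = kp * error
    return max(-max_speed, min(max_speed, speed))
```

As the robot approaches the desired angle the error, and therefore the commanded speed, goes to zero, which is exactly the behavior the answer describes.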
2016-01-13 02:33:21 -0500 | received badge | ● Teacher (source) |
2016-01-13 02:21:23 -0500 | commented answer | Tf origin is away from the base_link Well, in the 3rd picture you can clearly see the problem: Origin_base_plate should be on top of Origin_global, but instead it has the same offset. Try moving the origin of the robot in SolidWorks and see if this relationship changes. |
2016-01-13 02:18:18 -0500 | commented answer | writing tf baselink -> laser On the other hand, that code uses the information in TF (so the static transformation also needs to be set) to transform the points from the laser reference frame into the robot reference frame. With this, the point data can be used properly. Do you see the difference? |
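The frame change this comment refers to is, in the planar case, just a rotation plus a translation. A minimal sketch with assumed mounting values (in practice TF performs this in 3D for you):

```python
import math

def transform_point(x, y, tx, ty, yaw):
    """Express a point given in the laser frame in the base frame,
    where (tx, ty, yaw) is the laser's pose in the base frame."""
    xb = tx + x * math.cos(yaw) - y * math.sin(yaw)
    yb = ty + x * math.sin(yaw) + y * math.cos(yaw)
    return xb, yb

# Assumed mount: laser 0.2 m ahead of base_link, facing backwards (yaw = pi).
# A point 1 m in front of the laser is then 0.8 m behind the base.
```

This is why a scan used without the base_link -> laser transform ends up misplaced with respect to the robot.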
2016-01-13 02:15:38 -0500 | commented answer | writing tf baselink -> laser I think you should write this as a comment on my answer next time, haha. A static transform publisher adds the relationship between those frames to the TF tree, so every other ROS node can obtain it. |
2016-01-12 07:09:27 -0500 | commented answer | Tf origin is away from the base_link Nevertheless, if you don't care about the inertial values, you can offset the base_plate visual to where the base_plate should be, and then offset the rest of the links to move along with it. Once you figure out the offset this should be fairly easy. |
2016-01-12 07:07:38 -0500 | commented answer | Tf origin is away from the base_link I think it won't be easy. Moreover, your inertial values will probably be wrong too if you change things manually. I usually design the pieces separately and build the URDF file by hand. It is not very difficult and allows more control, but it is not suited for bigger robots. |
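Building the URDF by hand, as this comment suggests, means writing the links and joints with explicit origin offsets. A minimal sketch with hypothetical names and dimensions:

```xml
<robot name="my_robot">
  <link name="base_link">
    <visual>
      <geometry><box size="0.4 0.3 0.1"/></geometry>
      <!-- offset the visual so the link origin stays at the robot center -->
      <origin xyz="0 0 0.05" rpy="0 0 0"/>
    </visual>
  </link>
  <link name="base_scan"/>
  <joint name="scan_joint" type="fixed">
    <parent link="base_link"/>
    <child link="base_scan"/>
    <!-- assumed mount: laser 0.2 m forward and 0.15 m above base_link -->
    <origin xyz="0.2 0 0.15" rpy="0 0 0"/>
  </joint>
</robot>
```

Because every `origin` is written explicitly, there is no hidden SolidWorks export origin to fight with, which is the extra control the comment refers to.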
2016-01-12 06:47:52 -0500 | commented question | How to use camera calibration with ubuntu 14.04? This is a problem caused by the migration of some non-free features from OpenCV to external packages. I cannot recall exactly what I did to solve it, but you should check the OpenCV forums. It happened to me with a SURF feature detector. |
2016-01-12 06:47:51 -0500 | answered a question | writing tf baselink -> laser Your laser will output a point cloud or an array of points on a topic ("/scan", if I understood you correctly). These messages will have a frame_id (for example, "base_scan"). To use these points in your program you have to subscribe to this topic (see the Basic Tutorials), and if you want to use the points with respect to the robot base you will have to publish the transformation from base_link to base_scan. This can be done with a static transform publisher. |
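The static transform mentioned in this answer is usually published from a launch file. A sketch with assumed frame names and mounting offsets (the `args` are x y z yaw pitch roll, then the parent frame, child frame, and publish period in milliseconds):

```xml
<launch>
  <!-- publish base_link -> base_scan: laser 0.2 m forward, 0.15 m up, no rotation -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_scan"
        args="0.2 0 0.15 0 0 0 base_link base_scan 100" />
</launch>
```

Once this node is running, any other node can ask TF for points from the scan expressed in the base_link frame.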
2016-01-12 06:47:51 -0500 | answered a question | Tf origin is away from the base_link I think your problem is that the robot is not placed at the origin of coordinates in SolidWorks. As a result the first model (the base plate) is displaced with respect to the base frame, and all the subsequent joints are also set with respect to this SolidWorks origin. You should try to place the robot at the SolidWorks origin and then export it again.
2016-01-12 03:17:22 -0500 | received badge | ● Supporter (source) |