2020-04-16 09:21:13 -0500 | received badge | ● Famous Question (source) |
2020-04-16 09:21:13 -0500 | received badge | ● Notable Question (source) |
2019-10-31 05:57:51 -0500 | received badge | ● Famous Question (source) |
2019-10-31 05:57:51 -0500 | received badge | ● Notable Question (source) |
2019-10-31 05:57:51 -0500 | received badge | ● Popular Question (source) |
2019-10-15 17:37:47 -0500 | received badge | ● Famous Question (source) |
2018-06-15 17:15:53 -0500 | marked best answer | transform (x,y,z) coordinate from kinect to /map frame using tf Hi, I'm tracking an object using a Kinect and trying to publish the tf transform of that object in the "/map" frame. I'm able to return the object's x, y and depth data using OpenCV, but I'm struggling with the tf part. I've written a function which is meant to update the transform, and I've set it up in my launch file. But when I run RViz with "/map" as the fixed frame, the drone/base_link tf isn't shown on screen. I've attached a picture of what it looks like at this link. The x, y and depth values from the Kinect look okay, but the transform is wrong. Am I doing something wrong here? Also, here's my tf tree, which looks about right to me. Link here |
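A later comment in this thread mentions feeding raw pixel coordinates (e.g. -438 and -323) into the transform; tf expects metres, so pixel and depth values first need to be back-projected through the camera model. A minimal sketch of that step, assuming hypothetical Kinect-style intrinsics (the real values come from your driver's camera_info topic):

```python
# Hypothetical Kinect-style intrinsics -- placeholders only; use the
# values published on your camera's camera_info topic in practice.
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # optical centre in pixels

def pixel_to_camera(u, v, depth_m):
    """Back-project pixel (u, v) with depth in metres into the camera's
    optical frame using the pinhole model: X = (u - cx) * Z / fx."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# A point on the optical axis two metres away maps to (0, 0, 2):
print(pixel_to_camera(319.5, 239.5, 2.0))  # (0.0, 0.0, 2.0)
```

The resulting metric point is what should go into the transform that gets broadcast, not the pixel coordinates themselves.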
2017-10-07 13:26:52 -0500 | received badge | ● Notable Question (source) |
2017-08-01 07:03:55 -0500 | received badge | ● Famous Question (source) |
2017-08-01 07:03:55 -0500 | received badge | ● Notable Question (source) |
2017-07-12 17:06:38 -0500 | received badge | ● Famous Question (source) |
2017-04-28 10:25:18 -0500 | asked a question | Transform from map to odom for robot using rtabmap I built a custom robot using URDF and I'm trying to view the position |
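For background on this question: rtabmap's map->odom transform is the correction that keeps map->base_link consistent with what odometry reports. With hypothetical 2D poses, the composition can be sketched with plain homogeneous matrices (numbers below are made up for illustration):

```python
import math

def se2(x, y, theta):
    """2D pose as a 3x3 row-major homogeneous transform."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def mul(a, b):
    """Compose two 3x3 transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(t):
    """Invert a rigid 2D transform: R^T and -R^T * translation."""
    c, s, x, y = t[0][0], t[1][0], t[0][2], t[1][2]
    return [[c, s, -(c * x + s * y)],
            [-s, c, s * x - c * y],
            [0.0, 0.0, 1.0]]

# Suppose SLAM localises the robot at (2.0, 1.0) in map, while raw
# odometry has drifted to (1.9, 0.8) in odom (hypothetical numbers).
map_T_base = se2(2.0, 1.0, 0.0)
odom_T_base = se2(1.9, 0.8, 0.0)

# map->odom is the correction that makes the chain consistent:
# map->base = (map->odom) * (odom->base)
map_T_odom = mul(map_T_base, inv(odom_T_base))
print(round(map_T_odom[0][2], 6), round(map_T_odom[1][2], 6))  # 0.1 0.2
```

The SLAM node publishes exactly this correction rather than the robot pose directly, which is why the tf tree is map -> odom -> base_link.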
2017-04-20 14:02:27 -0500 | commented answer | transform (x,y,z) coordinate from kinect to /map frame using tf I think I'm probably not calculating the transform properly. -438 and -323 are actual pixel positions of the object I'm |
2017-04-20 09:24:35 -0500 | received badge | ● Notable Question (source) |
2017-04-20 09:24:35 -0500 | received badge | ● Popular Question (source) |
2017-04-20 07:02:23 -0500 | commented answer | transform (x,y,z) coordinate from kinect to /map frame using tf I took your advice and took a look at TF -> Frames. The X,Y,Z values in the relative position are about right for dro |
2017-04-19 17:44:13 -0500 | received badge | ● Popular Question (source) |
2017-04-19 10:55:36 -0500 | asked a question | transform (x,y,z) coordinate from kinect to /map frame using tf Hi, I'm tracking an object using a kinect and trying to |
2017-04-19 02:06:06 -0500 | received badge | ● Popular Question (source) |
2017-02-23 18:15:59 -0500 | commented question | Visualising positions of a detected object in Rviz using find-object-2d @matlabbe, I wasn't publishing tf when running the kinect bridge. Silly error really. I also managed to solve my first question. Thanks for all your help |
2017-02-20 07:33:27 -0500 | asked a question | Visualising positions of a detected object in Rviz using find-object-2d I'm using find-object-2d to detect an object together with a Kinect 2, and I'm trying to display the trajectory, i.e. the points the object has moved through within a room, in RViz. I just want to draw lines between the last known position of the object and its current position. I modified tf_example_node.cpp but I'm having a few problems. My issues are:
This is my code (more) |
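The bookkeeping for this kind of trajectory drawing can be sketched independently of ROS: keep a rolling buffer of the object's last known positions and draw each consecutive pair as a segment (in ROS these points would typically fill a visualization_msgs/Marker of type LINE_STRIP; the class below is a hypothetical stand-in):

```python
from collections import deque

class Trajectory:
    """Rolling buffer of an object's last known positions; consecutive
    pairs are the line segments to draw (e.g. via a LINE_STRIP marker)."""

    def __init__(self, max_points=100):
        # Bounded so the drawn trail doesn't grow without limit.
        self.points = deque(maxlen=max_points)

    def update(self, x, y, z):
        """Record the object's newest detected position."""
        self.points.append((x, y, z))

    def segments(self):
        """Pairs of consecutive points: the lines to render."""
        pts = list(self.points)
        return list(zip(pts, pts[1:]))

traj = Trajectory()
for p in [(0, 0, 1), (1, 0, 1), (1, 1, 1)]:
    traj.update(*p)
print(traj.segments())  # two segments connecting the three detections
```

Each detection callback would call `update()` with the transformed position and republish the marker with the current point list.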
2017-01-29 13:47:41 -0500 | received badge | ● Enthusiast |
2017-01-20 06:35:25 -0500 | asked a question | Help 3D navigation using a known map Hello, I'm using rtabmap and a Kinect to generate a map of a corridor. I was wondering if it's possible to fly a UAV to a known point within the generated map. My project statement requires me to:
I'm able to use rtabmap to generate a map of an environment; however, I'm not sure where to start on the second point and can't seem to find anything relating to it. Could someone please point me in the right direction? |
2016-11-13 08:25:25 -0500 | asked a question | How to visualize laser sensor data from a bag file I've been looking at laser sensor datasets online over the weekend and I'm not quite sure how to visualize the output in Rviz. I don't have a laser sensor but I want to get an understanding of what the output of these devices look like. For example, at the following link, there's a bag file which you can download and play using rosbag play. I run the command and I can get an output of the x y z position data by using
on the /microstrain/imu topic but when I run rviz nothing shows up. I guess my question is:
|
2016-11-12 06:55:46 -0500 | received badge | ● Popular Question (source) |
2016-11-11 09:55:32 -0500 | commented answer | catkin_make won't recognise ROS Package Thanks, that makes sense now. I was wondering what |
2016-11-11 09:40:28 -0500 | received badge | ● Scholar (source) |
2016-11-11 09:09:58 -0500 | commented answer | catkin_make won't recognise ROS Package Yes it is. |
2016-11-11 09:08:20 -0500 | asked a question | catkin_make won't recognise ROS Package Hi, I'm new to ROS and I've been trying to check out the package at the following link: https://github.com/dimatura/loam_cont... . I've cloned it into my catkin workspace, but when I try to build with catkin_make it doesn't seem to recognise the package. This is my output: Am I doing something wrong? Could someone please point me in the right direction? |
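A common cause of this symptom is cloning the repository into the workspace root rather than its src/ directory, since catkin_make only indexes packages (directories containing a package.xml) under src/. A minimal sketch of the expected layout, with a hypothetical package name and the clone/build commands shown as comments so it runs offline:

```shell
# Build a throwaway workspace to show the layout catkin expects.
WS="$(mktemp -d)/catkin_ws"
mkdir -p "$WS/src"

# The clone must go INSIDE src/:
#   cd "$WS/src" && git clone <repository-url>
# Simulated here with a hypothetical package:
mkdir -p "$WS/src/my_package"
touch "$WS/src/my_package/package.xml"   # catkin finds packages via package.xml

ls "$WS/src"   # my_package
# Then, from the workspace root:
#   cd "$WS" && catkin_make && source devel/setup.bash
```

After building, sourcing devel/setup.bash in each new terminal is what makes the package visible to tools like rosrun and roslaunch.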