
Icehawk101's profile - activity

2019-09-09 02:05:57 -0600 received badge  Nice Answer (source)
2019-07-16 12:03:08 -0600 received badge  Nice Answer (source)
2019-03-06 14:13:46 -0600 marked best answer Problem with rtabmap_ros and nonfree OpenCV

Hello,

I am trying to use rtabmap_ros with SIFT and SURF. I've installed nonfree OpenCV and can find the folder in /usr/include/opencv2. For some reason though whenever I run rtabmap_ros I get the message

[ WARN] (2016-04-15 10:21:56.026) Features2d.cpp:349::create() SURF/SIFT features cannot be used because OpenCV was not built with nonfree module. ORB is used instead.

Anyone know why rtabmap_ros can't find nonfree?

2018-02-03 01:14:15 -0600 received badge  Famous Question (source)
2017-10-30 10:47:53 -0600 received badge  Famous Question (source)
2017-10-02 07:03:47 -0600 commented answer Realsense depth calibration

We wound up using a laser scanner instead, so I didn't bother continuing with this.

2017-08-21 11:06:12 -0600 received badge  Notable Question (source)
2017-06-29 13:46:43 -0600 received badge  Good Answer (source)
2017-04-14 05:27:18 -0600 marked best answer Any problems with RPLIDAR's low scan rate?

Hey All,

I am considering getting an RPLIDAR for 3D mapping and navigation of an indoor environment using an autonomous UAV. The sensor has a much lower scan rate than most lidar systems I've used, 5.5 Hz compared to 40-50 Hz. Has this presented any issues to anyone?

2017-03-22 13:49:37 -0600 marked best answer Realsense depth calibration

Hey All,

I am using a Realsense F200 with the realsense_camera package for 3D mapping. The camera doesn't provide registered depth images, which everything seems to need. I looked at the depth_image_proc package but it requires rectified depth and rgb images as well as the calibration data. I should be able to calibrate the rgb camera with the camera_calibration package, but how do I calibrate the depth camera? I looked at how to do it for the Kinect but I don't think that will work for the Realsense as I just get a black image when I cover the laser projector. If anyone knows how to calibrate the depth camera or get registered depth images out of this camera I could use the help.

Thanks

2017-03-01 07:38:50 -0600 received badge  Necromancer (source)
2017-02-11 13:14:37 -0600 received badge  Famous Question (source)
2017-01-05 03:24:19 -0600 received badge  Notable Question (source)
2016-11-21 08:20:22 -0600 received badge  Popular Question (source)
2016-11-14 15:14:00 -0600 received badge  Famous Question (source)
2016-11-13 17:15:07 -0600 received badge  Popular Question (source)
2016-11-03 08:46:55 -0600 commented question How to setup tf for real robot?

Closing this as it has been answered

2016-11-03 08:42:59 -0600 asked a question Roslaunch not remotely starting master

Hello,

I have run into an odd little problem. I am using a UAV with an on-board computer running ROS and a separate base station computer. I've done the network setup and the two computers are communicating properly. I'm using roslaunch files with machine tags to control which computer individual nodes start on. This works fine except for one thing: the terminal tells me that it cannot connect to the master unless I go onto the UAV computer and run roscore first. I find this odd, as generally one does not need to run roscore separately when using a launch file. Has anyone else run into this, or does anyone know what's causing it?

Thanks

[EDIT]
I don't have ROS_IP set on either of them. What I do have is export ROS_MASTER_URI=http://UAVSartrex:11311 in the .bashrc of both the base station and UAV computers.
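
For illustration, a minimal sketch of the kind of launch file described above; the machine address matches the ROS_MASTER_URI setting, while the user, env-loader path, and the nodes are placeholders:

<launch>
  <!-- On-board computer; env-loader must point to a script that sources the ROS environment -->
  <machine name="uav" address="UAVSartrex" user="uav_user" env-loader="/opt/ros/indigo/env.sh" />

  <!-- Started on the UAV computer over SSH because of the machine attribute -->
  <node machine="uav" pkg="mavros" type="mavros_node" name="mavros" />

  <!-- No machine attribute, so this starts on the computer running roslaunch -->
  <node pkg="rviz" type="rviz" name="rviz" />
</launch>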

2016-11-03 08:38:30 -0600 received badge  Civic Duty (source)
2016-11-03 08:27:34 -0600 commented question why my differencial drive base moves like a snake?

Far more information is needed in order to answer your question

2016-09-21 10:01:49 -0600 answered a question Quadcopter navigation with obstacle avoidance?
  1. You have to generate that yourself. You can make a URDF model of the quadcopter or write a transform publisher to provide the TF data. For the odometry, you can get IMU data from /mavros/imu/data and, if you have a PX4Flow (or a clone), optical flow from /px4flow/raw/optical_flow_rad. I also use laser scan matching to generate odometry from the lidar. Pass all of that through an EKF or UKF (I use robot_localization) to get a combined odometry estimate; a rough config sketch is shown below.

I am planning to use moveit for 3D navigation but have only just started looking into it. If you have gotten any further in this I would like to hear about it.
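
As a rough sketch of the fusion step in point 1, an ekf_localization_node parameter file could look something like the following; the odometry topic names are placeholders (the PX4Flow output would first need converting to an odometry or twist message), and the enabled fields are only an example:

frequency: 30
two_d_mode: false            # a UAV moves in 3D

# Odometry from laser scan matching: fuse x, y, yaw
odom0: scanmatch_odom
odom0_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]

# Odometry derived from the PX4Flow: fuse the body-frame velocities vx, vy
odom1: px4flow_odom
odom1_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, false,
               false, false, false]

# IMU from mavros: fuse orientation, angular velocity, and linear acceleration
imu0: /mavros/imu/data
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]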

2016-09-21 09:54:31 -0600 commented question Quadcopter navigation with obstacle avoidance?

Have you gotten any further in this? I am doing something similar

2016-09-13 06:48:31 -0600 commented answer Integration of px4, px4flow with AMCL localization for quadrotor

We ordered a PX4Flow to see if it would work with our project. Haven't gotten it yet but interested to see how it does. I'm thinking of using an EKF to fuse the odom from the flow with the odom from laser scan matching for increased accuracy. I never had much luck with rtabmap.

2016-09-12 10:48:43 -0600 commented answer Integration of px4, px4flow with AMCL localization for quadrotor

Have you made any headway in the last few months?

2016-08-30 09:45:55 -0600 received badge  Nice Answer (source)
2016-08-29 10:44:55 -0600 received badge  Famous Question (source)
2016-08-29 07:28:59 -0600 edited answer How to setup tf for real robot?

You need to look at the TF Tutorials. #1 for both C++ and Python is setting up a transform broadcaster.

[EDIT]

Been a few months :P. You can use the URDF model to publish the transforms: the <joint> tag lets you specify the parent link, child link, and the origin transform between them.
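
For example, a fixed joint between a base link and a laser link might look like this (link names and the offset are placeholders):

<joint name="base_to_laser" type="fixed">
  <parent link="base_link"/>
  <child link="laser_link"/>
  <!-- x y z in metres, roll pitch yaw in radians -->
  <origin xyz="0.1 0.0 0.2" rpy="0 0 0"/>
</joint>

With robot_state_publisher running, each fixed joint like this is broadcast as a transform between the two links.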

If you want to make a quick transform tree though, here is the transform broadcaster I made for a robot while working on my MASc. Building the tree this way doesn't give you a visual model, but it does provide the transforms between parts.
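
A minimal broadcaster along those lines, with placeholder frame names and offsets rather than the original robot's, would look roughly like this:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "simple_tf_broadcaster");
  ros::NodeHandle nh;

  tf::TransformBroadcaster br;
  ros::Rate rate(50.0);

  while (nh.ok())
  {
    // base_link -> laser: fixed 10 cm forward, 20 cm up, no rotation
    tf::Transform transform;
    transform.setOrigin(tf::Vector3(0.1, 0.0, 0.2));
    transform.setRotation(tf::Quaternion(0.0, 0.0, 0.0, 1.0));
    br.sendTransform(tf::StampedTransform(transform, ros::Time::now(),
                                          "base_link", "laser"));
    rate.sleep();
  }
  return 0;
}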

2016-08-17 09:24:07 -0600 received badge  Notable Question (source)
2016-08-03 14:21:55 -0600 commented question ROS for Aerial Robotics

What do you mean by aerial robotics? There is mavros for interfacing with Pixhawk/PX4/APM flight modules if that's what you are looking for.

I've only used ROS on Linux because that is the OS it is native to. I've heard that it is less stable on other platforms but I've never tested that.

2016-07-28 10:42:36 -0600 commented answer Trying to a build a map from sensor data (Gmapping)

You can follow that tutorial if you want to build your own transform broadcaster. As you only need a single transform for this at the moment, you can also use the tf static_transform_publisher.

2016-07-27 11:59:25 -0600 answered a question Trying to a build a map from sensor data (Gmapping)

More or less. Gmapping subscribes to the /scan topic, so you will need to remap /my_robot/laser/scan to /scan or vice versa.

For the tf, you need a transform from the laser sensor to the robot base. You can add the sensor to the URDF model if you want a visualization of it in RVIZ or you can use a static transform publisher.

Gmapping requires an odometry source. You don't state if you have one or not so I figured I would just point it out.
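
Putting those pieces together, a minimal launch sketch could look like this; the scan topic, frame names, and the laser offset are examples to replace with your own values:

<launch>
  <!-- Static transform from the robot base to the laser frame:
       x y z yaw pitch roll parent child period_ms -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_laser"
        args="0.1 0 0.2 0 0 0 base_link laser 100" />

  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
    <!-- gmapping listens on /scan, so remap it onto the robot's laser topic -->
    <remap from="scan" to="/my_robot/laser/scan" />
    <param name="base_frame" value="base_link" />
    <param name="odom_frame" value="odom" />
  </node>
</launch>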

2016-07-25 07:18:14 -0600 commented answer get odometry from car-like robot

step to get the change in positions, then add them for each variable.

2016-07-25 07:17:54 -0600 commented answer get odometry from car-like robot

Skid-steering and differential drive are pretty much the same thing kinematically. For the odometry, you need the x, y, and theta velocities and positions. The positions can be found by integrating the velocities over time. In other words, for each odom message multiply the velocities by the time
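
A short sketch of that integration for a differential-drive base (the velocities would come from your drive kinematics or encoders; names are illustrative):

#include <cmath>

// Running pose estimate, accumulated by integrating body-frame velocities.
double x = 0.0, y = 0.0, theta = 0.0;

// vx: forward velocity (m/s), vtheta: angular velocity (rad/s),
// dt: time since the last odometry update (s).
void integrateOdometry(double vx, double vtheta, double dt)
{
  // Change in pose over this time step (no sideways velocity for differential drive)
  double dx = vx * std::cos(theta) * dt;
  double dy = vx * std::sin(theta) * dt;
  double dtheta = vtheta * dt;

  // Add the changes to the running totals
  x += dx;
  y += dy;
  theta += dtheta;
}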

2016-07-21 13:28:23 -0600 commented question Hector Navigation Questions!!

Unfortunately, I've never used hector_navigation. I've always teleoperated around an area to build the map and then navigated autonomously on it. Stefan Kohlbrecher might be able to help, though; I can't @-mention him since he has a two-part name, but you can search for his user name.

2016-07-20 07:26:37 -0600 commented answer hector slam and arduino help please

Ok, so you've built a map using hector_slam and now you want to autonomously drive around that map? Is that right?

2016-07-20 07:23:52 -0600 commented answer get odometry from car-like robot

The kinematics of differential drive vehicles are well known. Search Google for the kinematic equations, then write a node to convert the robot velocities to wheel velocities and send those wheel velocities to the motor drivers.

2016-07-20 01:15:28 -0600 received badge  Famous Question (source)
2016-07-19 09:34:16 -0600 commented answer hector slam and arduino help please

Hector_slam does not require odometry.

2016-07-18 10:05:27 -0600 answered a question hector slam and arduino help please

Hector_slam is used for mapping, not autonomous control. You could try taking a look at hector_exploration_planner.

Normally, a path planner publishes a set of velocities for the robot as a whole. You will need to write a node that uses the inverse kinematic equations for your particular robot to determine the individual wheel velocities required to achieve the desired motion.
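
A bare-bones sketch of such a node for a differential-drive base; the topic names, wheel separation, and wheel radius are placeholders for your own robot:

#include <ros/ros.h>
#include <geometry_msgs/Twist.h>
#include <std_msgs/Float64.h>

// Placeholder geometry; measure these on the actual robot.
const double WHEEL_SEPARATION = 0.30;  // metres between the wheel centres
const double WHEEL_RADIUS     = 0.05;  // metres

ros::Publisher left_pub, right_pub;

void cmdVelCallback(const geometry_msgs::Twist::ConstPtr& cmd)
{
  // Differential-drive inverse kinematics: body velocity -> wheel angular velocities
  double v = cmd->linear.x;
  double w = cmd->angular.z;

  std_msgs::Float64 left, right;
  left.data  = (v - w * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS;
  right.data = (v + w * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS;

  left_pub.publish(left);
  right_pub.publish(right);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "diff_drive_kinematics");
  ros::NodeHandle nh;

  left_pub  = nh.advertise<std_msgs::Float64>("left_wheel_cmd", 10);
  right_pub = nh.advertise<std_msgs::Float64>("right_wheel_cmd", 10);
  ros::Subscriber sub = nh.subscribe("cmd_vel", 10, cmdVelCallback);

  ros::spin();
  return 0;
}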

2016-07-15 10:39:05 -0600 commented question hector localization package against known map

Should I assume nothing has happened with this?

2016-07-11 13:48:30 -0600 commented answer How does gMapping work?

Probabilistic Robotics is a great book.

2016-07-08 14:18:56 -0600 answered a question How to publish and subscribe to a boolean message?

If you are using move_base to drive to a location autonomously, its action interface (actionlib) returns a result when the robot reaches the goal. Otherwise, if you want to use the bool method, it is the same as any other callback: make a global variable to hold a flag, called reached below, and populate it with the data value from the message.

bool reached = false;

void positionreached(const std_msgs::Bool::ConstPtr& Reached)
{
  reached = Reached->data;
}

[EDIT]

Right, Arduino. Try:

void positionreached(const std_msgs::Bool& Reached)
{
  reached = Reached.data;
}

Also, reached only needs to be bool, not std_msgs::Bool.
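
For completeness, on the Arduino side the callback is registered with a rosserial subscriber roughly like this (the topic name is a placeholder):

#include <ros.h>
#include <std_msgs/Bool.h>

ros::NodeHandle nh;
bool reached = false;

void positionreached(const std_msgs::Bool& Reached)
{
  reached = Reached.data;
}

ros::Subscriber<std_msgs::Bool> sub("position_reached", &positionreached);

void setup()
{
  nh.initNode();
  nh.subscribe(sub);
}

void loop()
{
  // Process incoming messages; reached is updated inside the callback
  nh.spinOnce();
  delay(10);
}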

2016-07-07 10:20:22 -0600 received badge  Nice Answer (source)