How to get robot pose, odometry & sensor data for Genetic algorithm input?

asked 2019-07-27 20:26:49 -0600

MalarJN

updated 2019-08-01 06:03:59 -0600

I am using ROS Kinetic in an Ubuntu 16.04 LTS VM.

I have a 2-part question.

1) I used the following commands to simulate SLAM using a turtlebot2 in Gazebo:

$ roslaunch turtlebot_gazebo turtlebot_world.launch world_file:=/home/malar/turtlebot/src/turtlebot_simulator/turtlebot_gazebo/worlds/

$ roslaunch rtabmap_ros demo_turtlebot_mapping.launch simulation:=true

$ roslaunch rtabmap_ros demo_turtlebot_rviz.launch

$ roslaunch frontier_exploration global_map.launch

Now the question is:

I need the robot's pose, odometry data and sensor data at every time instant t to generate a set of candidate locations, run them through a GA, and select the best location. That location should then be fed to frontier exploration as the robot's intermediate destination.

How do I get the robot's pose, odometry data and sensor data? Will this information come from demo_turtlebot_mapping.launch or from frontier_exploration?
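In this simulation the pose and sensor streams come from the Gazebo/rtabmap side, not from frontier_exploration: the TurtleBot publishes odometry and laser scans as topics you can subscribe to from your own node. A minimal sketch follows; the topic names `/odom` and `/scan` are assumptions based on the standard TurtleBot setup, so verify them with `rostopic list`. The quaternion-to-yaw helper is plain math with no ROS dependency.

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Extract yaw (rotation about the z axis) from a quaternion -- plain math, no ROS needed."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def run_listener():
    # rospy and the message types only exist inside a ROS environment,
    # so the imports live here rather than at module level.
    import rospy
    from nav_msgs.msg import Odometry
    from sensor_msgs.msg import LaserScan

    def odom_cb(msg):
        # Robot pose and velocity at time msg.header.stamp
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        yaw = yaw_from_quaternion(q.x, q.y, q.z, q.w)
        rospy.loginfo("pose: x=%.2f y=%.2f yaw=%.2f", p.x, p.y, yaw)

    def scan_cb(msg):
        # Laser sensor data: one range reading per beam
        rospy.loginfo("scan: %d ranges, closest obstacle %.2f m", len(msg.ranges), min(msg.ranges))

    rospy.init_node("ga_input_listener")
    rospy.Subscriber("/odom", Odometry, odom_cb)   # topic names: check `rostopic list`
    rospy.Subscriber("/scan", LaserScan, scan_cb)
    rospy.spin()
```

Inside the callbacks you can cache the latest messages in your node and hand them to the GA whenever you generate candidate locations.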

How would I then feed this data into frontier_exploration? Could someone share sample code to achieve this or point me in the right direction?
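One common route while prototyping, rather than injecting the point into frontier_exploration's own action interface, is to send the GA-chosen location as a navigation goal to move_base, which the TurtleBot navigation stack runs underneath. The sketch below assumes a `move_base` action server is up; the yaw-to-quaternion helper is plain math.

```python
import math

def quaternion_from_yaw(yaw):
    """Quaternion (z, w) components for a rotation of `yaw` about the z axis -- plain math."""
    return math.sin(yaw / 2.0), math.cos(yaw / 2.0)

def send_goal(x, y, yaw, frame="map"):
    # ROS imports kept inside the function so the sketch loads without ROS installed.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    z, w = quaternion_from_yaw(yaw)
    goal.target_pose.pose.orientation.z = z
    goal.target_pose.pose.orientation.w = w

    client.send_goal(goal)
    client.wait_for_result()       # blocks until the robot reaches (or fails) the goal
    return client.get_state()
```

You would call `send_goal(best_x, best_y, best_yaw)` with the GA winner; once the robot arrives, frontier exploration (or another GA round) can take over again.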


2) To implement the GA I am planning to use DEAP. Has anyone used DEAP before? I would appreciate any help, directions or tips on using DEAP (anything that would make it easier to implement the setup described in part 1 above).


Note: I am new to ROS, Python and Ubuntu. I do have experience implementing GAs in MATLAB, so I am hoping I can achieve this with a little assistance.

Thank you,

