How to get robot pose, odometry & sensor data for Genetic algorithm input?
I am using ROS Kinetic in an Ubuntu 16.04 LTS VM.
I have a 2-part question.
1) I used the following commands to simulate SLAM using a turtlebot2 in Gazebo:
$ roslaunch turtlebot_gazebo turtlebot_world.launch world_file:=/home/malar/turtlebot/src/turtlebot_simulator/turtlebot_gazebo/worlds/Malar_Maze.world
$ roslaunch rtabmap_ros demo_turtlebot_mapping.launch simulation:=true
$ roslaunch rtabmap_ros demo_turtlebot_rviz.launch
$ roslaunch frontier_exploration global_map.launch
Here is the question: I need the robot's pose, odometry and sensor data at every instant 't' so I can generate a set of candidate locations, run them through a GA, and pick the best one. That best location should then be fed to frontier_exploration as the robot's intermediate destination.
How do I get the robot's pose, odometry data and sensor data? Is this information published by demo_turtlebot_mapping.launch or by frontier_exploration?
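For context on what collecting this data could look like: in a typical TurtleBot Gazebo setup the odometry is published on /odom (nav_msgs/Odometry) and the laser scan on /scan (sensor_msgs/LaserScan); the topic names may differ in your launch files, so check with `rostopic list`. A minimal sketch, under those assumptions, that collects both and projects scan endpoints into candidate goal points (the candidate-generation rule here is only an illustration, not your GA):

```python
#!/usr/bin/env python
import math


def candidate_points(x, y, yaw, ranges, angle_min, angle_increment,
                     clearance=0.5):
    """Project each valid scan ray endpoint (pulled back by a clearance
    margin) into the world frame as a candidate goal for the GA to score."""
    pts = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # skip rays with no return
        a = yaw + angle_min + i * angle_increment
        d = max(r - clearance, 0.0)
        pts.append((x + d * math.cos(a), y + d * math.sin(a)))
    return pts


def main():
    # ROS wiring; only runs inside a ROS (Kinetic) environment.
    import rospy
    from nav_msgs.msg import Odometry
    from sensor_msgs.msg import LaserScan
    from tf.transformations import euler_from_quaternion

    state = {'pose': None}

    def odom_cb(msg):
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]
        state['pose'] = (p.x, p.y, yaw)

    def scan_cb(msg):
        if state['pose'] is None:
            return  # no odometry received yet
        x, y, yaw = state['pose']
        pts = candidate_points(x, y, yaw, msg.ranges,
                               msg.angle_min, msg.angle_increment)
        rospy.loginfo("generated %d candidate points", len(pts))

    rospy.init_node('ga_input_collector')
    rospy.Subscriber('/odom', Odometry, odom_cb)
    rospy.Subscriber('/scan', LaserScan, scan_cb)
    rospy.spin()


if __name__ == '__main__':
    main()
```

For a full SLAM pose (rather than raw odometry) you could instead look up the map -> base_link transform with a tf listener, since rtabmap publishes the corrected pose through tf.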
How would I then feed the chosen location back into frontier_exploration? Could someone share sample code to achieve this, or point me in the right direction?
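One possible route (an assumption on my part, not the only way): since demo_turtlebot_mapping.launch brings up move_base, you could bypass frontier_exploration's own goal selection and send the GA winner directly as a move_base goal via actionlib. A sketch, where pick_goal is only a placeholder for the real GA output:

```python
#!/usr/bin/env python
import math


def pick_goal(candidates, x, y):
    """Placeholder for the GA: simply take the farthest candidate, which
    crudely favours unexplored space. Replace with the DEAP result."""
    return max(candidates, key=lambda p: math.hypot(p[0] - x, p[1] - y))


def send_goal(gx, gy):
    """Send (gx, gy) in the map frame to move_base and block until done.
    Assumes rospy.init_node() has already been called."""
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = gx
    goal.target_pose.pose.position.y = gy
    goal.target_pose.pose.orientation.w = 1.0  # identity orientation

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()
```

If you specifically need to drive frontier_exploration rather than move_base, its explore_server also exposes an actionlib interface; check the frontier_exploration wiki page for the exact action type it expects.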
References:
http://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot
http://wiki.ros.org/frontier_exploration
2) To implement the GA I plan to use DEAP. Has anyone used DEAP before? I would appreciate any help, directions or tips on using it (anything that would make it easier to implement the setup described in '1' above).
Reference: http://wiki.ros.org/deap
Note: I am new to ROS, Python and Ubuntu, but I have experience implementing GAs in MATLAB, so I am hoping to achieve this with a little assistance.
Thank you,
Malar.
Asked by MalarJN on 2019-07-27 20:26:49 UTC