Creating /tf (position) from IMU, 2D/3D SLAM (LiDAR+IMU+Octomap) (no need for high accuracy)
Hello, I am trying to build a simple SLAM system consisting of:
- RPLiDAR A2 M8
- Razor IMU 9DOF M0
Used packages:
- octomap_mapping,
- razor_imu_9dof (razor-pub.launch)
- rplidar_ros (rplidar.launch)
Own packages:
- mctr-132_ros (broadcasts a world frame at 0,0,0 and a laser frame whose translation is 0,0,0 and whose orientation is the quaternion from the IMU; see the sketches after this list)
- mctr-132_laser (only converts LaserScan to PointCloud)
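
For reference, this is roughly what the transform broadcast in mctr-132_ros does. It is only a minimal sketch: the frame names ("world", "laser") and the IMU topic name are assumptions, and the world frame is simply treated as the tf root here.

```python
#!/usr/bin/env python
# Minimal sketch of the mctr-132_ros transform broadcast: a laser frame at
# translation (0, 0, 0) whose orientation is the quaternion coming from the
# IMU, expressed in a fixed world frame. Frame/topic names are assumptions.
import rospy
import tf
from sensor_msgs.msg import Imu

rospy.init_node("imu_tf_broadcaster")
br = tf.TransformBroadcaster()

def imu_cb(msg):
    q = msg.orientation
    br.sendTransform((0.0, 0.0, 0.0),        # no translation -> mapping only works from a fixed spot
                     (q.x, q.y, q.z, q.w),   # orientation taken straight from the IMU
                     msg.header.stamp,
                     "laser",                # child frame (assumed name)
                     "world")                # parent frame (assumed name)

rospy.Subscriber("imu", Imu, imu_cb)
rospy.spin()
```

The mctr-132_laser part is essentially the standard laser_geometry projection (again just a sketch; topic names are assumptions, and I use PointCloud2 here since that is what octomap_server consumes):

```python
#!/usr/bin/env python
# Sketch of mctr-132_laser: project each LaserScan into a PointCloud2
# using laser_geometry and republish it (e.g. for octomap_server).
import rospy
from sensor_msgs.msg import LaserScan, PointCloud2
from laser_geometry import LaserProjection

rospy.init_node("scan_to_cloud")
projector = LaserProjection()
pub = rospy.Publisher("cloud", PointCloud2, queue_size=10)

def scan_cb(scan):
    # projectLaser converts the ranges/angles into a PointCloud2 in the scan frame
    pub.publish(projector.projectLaser(scan))

rospy.Subscriber("scan", LaserScan, scan_cb)
rospy.spin()
```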
System: Ubuntu 16.04, ROS Kinetic
I am at the stage where I insert those points into the map and can visualize it in RViz, but I only take the IMU rotation into account. The position is published to /tf as 0, 0, 0, so I can build a map of the environment around a fixed spot. But I would like to move my device around, so I need a proper /tf with a position (translation) as well. I was thinking about robot_pose_ekf, but it needs odometry, and I have no idea how to create fake odometry from a 9-DOF IMU.
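
To clarify what I mean by "fake odometry": as far as I understand, robot_pose_ekf listens on an odom topic of type nav_msgs/Odometry, so I assume a fake source would have to publish something like the sketch below. The pose is left at zero because that is exactly the part I do not know how to fill in from the IMU alone (frame names are assumptions, and this is not something I have working):

```python
#!/usr/bin/env python
# Sketch of the nav_msgs/Odometry message I assume robot_pose_ekf would need.
# The pose is left at zero because filling it in from the IMU is the open question.
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion

rospy.init_node("fake_odom")
pub = rospy.Publisher("odom", Odometry, queue_size=10)
rate = rospy.Rate(50)

while not rospy.is_shutdown():
    odom = Odometry()
    odom.header.stamp = rospy.Time.now()
    odom.header.frame_id = "odom"        # assumed frame names
    odom.child_frame_id = "base_link"
    odom.pose.pose.position.x = 0.0      # <-- position would have to come from somewhere
    odom.pose.pose.position.y = 0.0
    odom.pose.pose.orientation = Quaternion(0.0, 0.0, 0.0, 1.0)
    # robot_pose_ekf also expects meaningful pose covariances, omitted here
    pub.publish(odom)
    rate.sleep()
```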
Q1: Is it possible to create fake odometry from an IMU, and if so, how?
Many thanks
PS: I would also upload a photo, but 5 points are required...
PS: I tried Google Cartographer, hector_localization and rf2o_laser_odometry, but I failed to build/run them...