
jcardenasc93's profile - activity

2023-01-26 04:34:33 -0500 received badge  Famous Question (source)
2018-10-22 12:19:22 -0500 received badge  Teacher (source)
2018-10-22 12:19:22 -0500 received badge  Self-Learner (source)
2018-01-22 07:18:58 -0500 received badge  Student (source)
2016-05-23 01:19:18 -0500 received badge  Famous Question (source)
2016-05-11 21:06:27 -0500 received badge  Notable Question (source)
2016-04-27 11:47:06 -0500 received badge  Famous Question (source)
2016-04-27 11:14:28 -0500 received badge  Popular Question (source)
2016-04-25 23:48:25 -0500 asked a question Cannot visualize the map Hector SLAM

Hi everyone. I don't know whether the problem is my RViz settings or my tf transforms, but here it is: I am using a Kinect to run SLAM with the Hector SLAM stack. First I convert the RGB-D data to LaserScan messages with the depthimage_to_laserscan package, then I set the necessary tf transforms according to the Hector SLAM wiki. Everything appears to work, but the map cannot be seen in RViz: in the image you can see the laser scan data, yet the mapping process just doesn't work. Here is my launch file:

<launch>
    <include file="$(find freenect_launch)/launch/freenect.launch">
        <arg name="depth_registration" value="true"/>
    </include>

    <!-- Kinect cloud to laser scan -->
    <node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan" name="depthimage_to_laserscan">
        <remap from="image" to="/camera/depth_registered/image_raw"/>
        <remap from="camera_info" to="/camera/depth_registered/camera_info"/>
        <remap from="scan" to="/kinect_scan"/>
        <param name="range_max" type="double" value="4"/>
    </node>

    <param name="pub_map_odom_transform" value="true"/>
    <param name="map_frame" value="map"/>
    <param name="base_frame" value="base_frame"/>
    <param name="odom_frame" value="base_frame"/>

    <!-- static tf's -->
    <node pkg="tf" type="static_transform_publisher" name="map_2_basef" args="0 0 0 0 0 0 /map /base_frame 100"/>
    <node pkg="tf" type="static_transform_publisher" name="basef_2_baselink" args="0 0 0 0 0 0 /base_frame /base_link 100"/>
    <node pkg="tf" type="static_transform_publisher" name="baselink_2_laserlink" args="0 0 0 0 0 0 /base_link /camera_link 100"/>
    <node pkg="tf" type="static_transform_publisher" name="base_2_nav_link" args="0 0 0 0 0 0 /base_frame /nav 100"/>

    <node pkg="rviz" type="rviz" name="rviz" args="-d $(find hector_slam_launch)/rviz_cfg/mapping_demo.rviz"/>

    <include file="$(find hector_mapping)/launch/mapping_default.launch"/>
    <include file="$(find hector_geotiff)/launch/geotiff_mapper.launch"/>
</launch>

(screenshot: RViz shows the laser scan, but no map is built)
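For reference, a minimal sketch of an alternative arrangement, assuming hector_mapping's standard behavior (frame and node names here are illustrative, not taken from any official launch file): when `pub_map_odom_transform` is true, hector_mapping wants to publish the map-to-odom transform itself, so a static `map` → `base_frame` publisher competes with it, and the mapping parameters need to reach the hector_mapping node rather than sit in the global namespace.

```xml
<launch>
    <!-- Sketch only: let hector_mapping own the map -> base_frame transform
         instead of publishing it statically. -->
    <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
        <param name="pub_map_odom_transform" value="true"/>
        <param name="map_frame"  value="map"/>
        <param name="base_frame" value="base_frame"/>
        <param name="odom_frame" value="base_frame"/>
        <remap from="scan" to="/kinect_scan"/>
    </node>

    <!-- Static transforms only below the base frame -->
    <node pkg="tf" type="static_transform_publisher" name="basef_2_baselink"
          args="0 0 0 0 0 0 /base_frame /base_link 100"/>
    <node pkg="tf" type="static_transform_publisher" name="baselink_2_laserlink"
          args="0 0 0 0 0 0 /base_link /camera_link 100"/>
</launch>
```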

2016-04-18 20:33:15 -0500 received badge  Famous Question (source)
2016-04-13 07:51:24 -0500 received badge  Notable Question (source)
2016-03-26 12:01:17 -0500 asked a question Rtabmap fake 2D laser without odometry

I would like to know whether it's possible to set up a launch file for RGB-D SLAM through the RTAB-Map package that enables the fake 2D laser from the Kinect. I followed this example from the wiki: I extracted only the "Kinect cloud to laserscan" part and added it to the rgbd_mapping.launch file of the RTAB-Map stack. But when I run the launch file, I get this error after opening RViz:

[ERROR] (2016-03-26 11:41:54.956) Rtabmap.cpp:890::process() RGB-D SLAM mode is enable and no odometry is provided. Image 1190 is ignored!

I don't understand it, because in the launch file the visual odometry is set to "true".
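A hedged sketch of how the two pieces might be wired together, following the pattern of the rtabmap_ros wiki examples (topic and parameter names may need adjusting to the installed version): the rtabmap node must both subscribe to the fake scan and still receive odometry on its odom topic, otherwise images are ignored with exactly this error.

```xml
<!-- Sketch: feed the fake 2D laser into rtabmap while keeping visual odometry. -->
<node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan" name="depthimage_to_laserscan">
    <remap from="image" to="/camera/depth_registered/image_raw"/>
    <remap from="scan"  to="/kinect_scan"/>
</node>

<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
    <param name="subscribe_scan" value="true"/>
    <remap from="scan" to="/kinect_scan"/>
    <!-- Odometry must still arrive, e.g. from rtabmap_ros/rgbd_odometry;
         otherwise rtabmap reports "no odometry is provided" and drops images. -->
</node>
```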

2016-03-21 22:49:05 -0500 commented answer RTAB-Map with odometry and laserscan

Thanks for the info and I really like your loop closure method

2016-03-18 20:26:42 -0500 received badge  Notable Question (source)
2016-03-18 20:23:43 -0500 commented question RTAB-Map with odometry and laserscan

As you said, the problem occurs when there are not enough features in the current image. The odometry info probably improves performance. I don't think the laser scan alleviates that: it is used to get better mapping results, but it has no influence on the issue you mention.

2016-03-18 20:18:17 -0500 commented question bash: /opt/ros/jade/setup.bash: No such file or directory

I don't know why that is happening. On the other hand, have you tried installing the Indigo version, which is the right version for Ubuntu 14.04?

2016-03-18 20:08:35 -0500 commented answer RGBDSLAM alternatives for Indigo

Actually, I'm using RTAB-Map, and it works really well out of the box. It's very configurable according to your needs.

2016-03-18 20:01:20 -0500 commented question How to install perception_oru package from github?

Try cloning it into the workspace directory that is sourced in your ~/.bashrc.

2016-03-18 20:01:20 -0500 received badge  Commentator
2016-03-18 19:56:04 -0500 answered a question rosserial_python Error

A simple way to check the correct serial port is to use the Arduino IDE: in the Tools menu, go to "Port" and it shows the current port the Arduino is connected to. Once you have checked it, the _port:= parameter has to match that port.
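Outside the Arduino IDE, the candidate device files can also be listed with plain Python; this is just a convenience sketch using the standard library (the helper name is mine, not part of rosserial):

```python
import glob

def candidate_arduino_ports():
    """List device files where an Arduino typically shows up on Linux.

    UNO-style boards enumerate as /dev/ttyACM*, while boards using an
    FTDI/CH340 USB-serial adapter show up as /dev/ttyUSB*.
    """
    return sorted(glob.glob("/dev/ttyACM*") + glob.glob("/dev/ttyUSB*"))

if __name__ == "__main__":
    for port in candidate_arduino_ports():
        print(port)
```

Whatever this prints on your machine is the value to pass as _port:= to serial_node.py.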

2016-03-18 19:46:21 -0500 commented question Robot Position in slam_gmapping in python

Are you using RViz to visualize the occupancy grid? If so, you can see the current position of the platform, given the correct tf settings.

2016-03-18 19:38:30 -0500 commented question Install ROS on Linux Mint 17.3?

Did you try to install it? I think there should be no problem, because Linux Mint 17.3 is still based on Ubuntu 14.04. Just follow the tutorial on the wiki for the Indigo version.

2016-03-18 19:25:24 -0500 answered a question Exploration and SLAM

Hi, I haven't worked with hector_slam yet, but regarding your question about checking the map: you can use RViz to visualize the current state of your /map topic.

2016-03-18 19:16:08 -0500 commented answer RGBDSLAM errors

Hi. First, what kind of sensor are you using? On the other hand, there is a really great package based on RGB-D SLAM: check out RTAB-Map. I used it to do SLAM with a handheld Kinect and it works well.

2016-03-18 19:06:10 -0500 answered a question Control DC Motor with arduino

It depends on your application: what kind of control do you want to apply (speed, direction, etc.)? That kind of control over a DC motor is really a matter of control circuitry; I guess the only control achievable through a ROS node with an Arduino alone is on/off control. Please give us more info about what exactly you want to do.

2016-02-10 19:35:13 -0500 received badge  Popular Question (source)
2016-02-10 19:13:19 -0500 answered a question ros servo control arduino fails

The problem, as @gary-servin said below, was insufficient current. It was solved with an external power supply.

2016-02-10 19:10:13 -0500 commented question ros servo control arduino fails

Many thanks, that was the problem: there wasn't enough current. With an external power supply it works fine.

2016-02-01 22:48:10 -0500 commented question ros servo control arduino fails

Yes @gary-servin, I changed the permissions with chmod, and when running rosserial_python serial_node.py I passed the port parameter (in my case /dev/ttyACM0). At first it looks like it works, but when the servo begins to move, the node crashes.

2016-01-31 15:51:34 -0500 asked a question ros servo control arduino fails

I want to control a servo under ROS; for that I am using an Arduino UNO, following this tutorial. When I run rosserial_python serial_node.py and then publish the angle, I get this error:

Traceback (most recent call last):
  File "/home/jcardenasc93/catkin_ws/src/rosserial/rosserial_python/nodes/serial_node.py", line 82, in <module>
    client.run()
  File "/home/jcardenasc93/catkin_ws/src/rosserial/rosserial_python/src/rosserial_python/SerialClient.py", line 495, in run
    self.requestTopics()
  File "/home/jcardenasc93/catkin_ws/src/rosserial/rosserial_python/src/rosserial_python/SerialClient.py", line 392, in requestTopics
    self.port.flushInput()
  File "/usr/lib/python2.7/dist-packages/serial/serialposix.py", line 500, in flushInput
    termios.tcflush(self.fd, TERMIOS.TCIFLUSH)
termios.error: (5, 'Input/output error')

Could anyone help me, please? I don't understand why this is happening.

2015-10-25 09:38:46 -0500 received badge  Notable Question (source)
2015-10-20 17:53:36 -0500 commented question rosserial_client avr Tutorial Error

@gvdhoorn thanks for your attention. Yes, it is copy-paste. Anyway, I ran the node with the "_port:=" and "_baud:=" arguments, but it still doesn't work.

2015-10-16 19:02:33 -0500 received badge  Popular Question (source)
2015-10-15 22:29:23 -0500 asked a question rosserial_client avr Tutorial Error

So this is the problem: I want my ATmega16 to work as a ROS node, so I followed this tutorial. When I run the rosserial_python node it mentions (rosrun rosserial_python serial_node.py /dev/ttyUSB0, which is the correct port in my case), I get the following error:

[ERROR] [WallTime: 1414579883.599380] Unable to sync with device; possible link problem or link software version mismatch such as hydro rosserial_python with groovy Arduino

I checked my installed rosserial version and it matches my ROS version, so I am really confused about this error. My ROS version is Indigo, and to be sure about the rosserial version I didn't install the rosserial package from the Debian repository; I cloned it into my workspace from this repo. I really appreciate any info or answers to solve this.

Thanks to ROS community
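Besides a version mismatch, a common cause of this sync error is a baud-rate mismatch between the AVR firmware and serial_node.py. A hedged launch-file sketch (the port and baud values are examples, not taken from the question; the baud must match whatever the firmware was built with):

```xml
<node pkg="rosserial_python" type="serial_node.py" name="serial_node">
    <!-- Must match the baud rate compiled into the AVR firmware -->
    <param name="port" value="/dev/ttyUSB0"/>
    <param name="baud" value="57600"/>
</node>
```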

2015-09-22 01:22:37 -0500 received badge  Famous Question (source)
2015-09-07 19:00:08 -0500 commented answer rtabmap_ros + Kinect = 2D map. How to? (newbie)

Hi @troman, could you explain to me how you got hector_slam to work? I've been trying to use this package to create a 2D map with the Kinect, but I still cannot even get a map. I really appreciate any info you can give me.

2015-09-07 18:51:02 -0500 marked best answer Is possible to use openni with ROS jade?

I recently installed the Jade version, and I want to use the OpenNI driver with the Microsoft Kinect because this driver has great performance.

2015-08-31 15:58:32 -0500 received badge  Notable Question (source)
2015-08-31 12:22:47 -0500 received badge  Editor (source)
2015-08-31 12:20:36 -0500 received badge  Famous Question (source)
2015-08-31 04:40:58 -0500 received badge  Popular Question (source)
2015-08-30 14:02:16 -0500 asked a question Build a map using kinect

First of all, I am new to the ROS framework. I have started a new project that requires building a map, and for that I want to use the Kinect sensor. So my problem is this: I am building a custom robot and haven't finished its construction yet, but I created a basic URDF file with its description to simulate it in RViz. I looked for mapping tutorials and found some packages that can help me, but I am not sure how to start. For example, gmapping requires odometry info, and I can't understand how to provide that kind of information in an RViz simulation. I just want to see the map being built live in RViz using the Kinect sensor. Could anyone help me or point me to some resources to accomplish that task? Thanks to the ROS community.
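For bring-up experiments without real odometry, one common trick (a sketch for testing only, not a mapping solution; frame names are the conventional ones) is to publish a static identity odom → base_link transform so that tf-consuming tools can run at all; gmapping itself still needs genuine odometry to build a good map.

```xml
<!-- Sketch: fake, motionless odometry for bring-up tests only -->
<node pkg="tf" type="static_transform_publisher" name="fake_odom"
      args="0 0 0 0 0 0 /odom /base_link 100"/>
```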

2015-07-28 15:44:42 -0500 received badge  Notable Question (source)
2015-07-21 12:01:59 -0500 commented answer Is possible to use openni with ROS jade?

Hi Christian, thanks for your answer. I am not really sure how to build it from source. Is it just cloning it into catkin_ws/src and then sourcing the packages with source ~/catkin_ws/devel/setup.bash? And my other question: do those packages really work on the Jade version?

2015-07-21 09:43:15 -0500 received badge  Popular Question (source)