
JamesWell's profile - activity

2020-02-07 07:19:12 -0500 received badge  Famous Question (source)
2020-02-07 07:19:12 -0500 received badge  Notable Question (source)
2019-09-24 04:16:53 -0500 received badge  Nice Answer (source)
2019-04-04 08:29:57 -0500 received badge  Famous Question (source)
2018-04-30 13:30:52 -0500 received badge  Famous Question (source)
2018-02-21 03:06:49 -0500 received badge  Famous Question (source)
2017-07-24 08:34:32 -0500 received badge  Notable Question (source)
2017-07-24 08:34:32 -0500 received badge  Popular Question (source)
2017-04-02 22:49:06 -0500 received badge  Enlightened (source)
2017-04-02 22:49:06 -0500 received badge  Good Answer (source)
2017-02-22 14:35:24 -0500 received badge  Famous Question (source)
2017-02-12 02:57:57 -0500 received badge  Notable Question (source)
2017-02-12 02:57:57 -0500 received badge  Popular Question (source)
2017-02-01 07:32:03 -0500 commented answer Arduino /cmd_vel and /odom with Pololu motors w/ encoder

Haha recommending rosserial is totally fine, I just don't want to be linked to the tutorials. I will write a node on the Arduino that will listen to the /cmd_vel topic and write the PWM value to the motors. I hope you can give feedback after I post the code as an edit in the original question.
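
The cmd_vel-to-PWM step described in this comment boils down to standard differential-drive mixing. A minimal sketch in plain Python (not the actual Arduino node; WHEEL_SEPARATION, MAX_SPEED, and PWM_MAX are made-up placeholder values that would need to match the real robot):

```python
# Hypothetical sketch: convert a Twist-style (linear, angular) command into
# left/right wheel PWM values for a differential/skid-steer base.

WHEEL_SEPARATION = 0.30  # meters between left and right wheels (assumed)
MAX_SPEED = 1.0          # wheel speed in m/s that maps to full PWM (assumed)
PWM_MAX = 255            # 8-bit PWM range on the Arduino MEGA

def twist_to_pwm(linear_x, angular_z):
    """Return (left_pwm, right_pwm), each in [-PWM_MAX, PWM_MAX]."""
    # Standard differential-drive mixing: the angular term speeds up one
    # side and slows down the other.
    v_left = linear_x - angular_z * WHEEL_SEPARATION / 2.0
    v_right = linear_x + angular_z * WHEEL_SEPARATION / 2.0

    def clamp(v):
        # Scale wheel speed to the PWM range and saturate.
        return max(-PWM_MAX, min(PWM_MAX, int(v / MAX_SPEED * PWM_MAX)))

    return clamp(v_left), clamp(v_right)
```

On the Arduino side the same arithmetic would run inside the /cmd_vel subscriber callback, with the sign of each value selecting the motor direction pin.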

2017-02-01 07:25:40 -0500 received badge  Notable Question (source)
2017-02-01 03:42:19 -0500 received badge  Popular Question (source)
2017-01-31 16:11:14 -0500 received badge  Critic (source)
2017-01-31 16:02:41 -0500 received badge  Notable Question (source)
2017-01-31 16:02:41 -0500 received badge  Popular Question (source)
2017-01-31 15:59:42 -0500 asked a question Arduino /cmd_vel and /odom with Pololu motors w/ encoder

Hi guys,

I have been struggling with my robot for 3-4 days now. I want the Arduino MEGA, connected to 4 Pololu motors with encoders, to subscribe to the /cmd_vel topic and publish to the /odom topic. I have installed rosserial and gone through the tutorials, and I have tried modifying ros_arduino_bridge to fit my hardware without success, so please do not suggest these as solutions :) I have also tried getting the ticks from the motors with mild success, but nothing useful. I have tested all the motors by sending a variety of PWM signals from the MEGA, and they seem to function just fine.

Is it even possible with the motor controllers I am using? I'm honestly hoping someone can point me in the right direction. Everything online is just people linking to the rosserial tutorial page...

My robot is a 4WD platform with the following motor driver:

[Schematic from the Pololu page showing the pinouts of the MAX14870 DC Motor Driver Carrier]
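
For the /odom side of the question, the usual approach is to dead-reckon from encoder tick deltas. An illustrative sketch (not the poster's code; TICKS_PER_METER and WHEEL_SEPARATION are assumed values that must be measured on the real robot):

```python
import math

# Integrate encoder tick deltas into an (x, y, theta) odometry pose for a
# differential-drive base. Both constants below are placeholders.

TICKS_PER_METER = 4000.0   # encoder ticks per meter of wheel travel (assumed)
WHEEL_SEPARATION = 0.30    # meters between left and right wheels (assumed)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Return the new (x, y, theta) after one encoder update."""
    d_left = d_ticks_left / TICKS_PER_METER
    d_right = d_ticks_right / TICKS_PER_METER
    d_center = (d_left + d_right) / 2.0              # distance moved by base center
    d_theta = (d_right - d_left) / WHEEL_SEPARATION  # heading change (rad)
    # Integrate along the average heading over the interval.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

The resulting pose is what a node would stamp into a nav_msgs/Odometry message and broadcast as the odom→base_link transform.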

2016-07-07 10:15:21 -0500 commented question Converting the remaining part of a catkin cmakelist to rosbuild

Using the work from an article presented at ICRA 2015; GitHub repo here: https://github.com/angusleigh/leg_tra...

2016-07-07 08:16:41 -0500 asked a question Converting the remaining part of a catkin cmakelist to rosbuild

I have tried to convert this catkinized CMakeLists.txt to rosbuild, but without success. I'm running ROS Fuerte. Here is the original:

cmake_minimum_required(VERSION 2.8.3)
project(leg_tracker)

find_package(catkin REQUIRED COMPONENTS
  roscpp
  rospy
  std_msgs
  tf
  interactive_markers
  rosbag
  image_geometry
  message_generation
)

add_message_files(
  FILES
  Person.msg
  PersonArray.msg
  Leg.msg
  LegArray.msg
)

generate_messages(   
  DEPENDENCIES  
  std_msgs  
  geometry_msgs
)  


catkin_package(
  CATKIN_DEPENDS message_runtime
)


include_directories(
    INCLUDE include
    ${catkin_INCLUDE_DIRS}
)



add_executable(
  detect_leg_clusters 
  src/detect_leg_clusters.cpp
  src/laser_processor.cpp
  src/cluster_features.cpp
)
target_link_libraries(
  detect_leg_clusters 
  ${catkin_LIBRARIES}
)


add_executable(
  local_occupancy_grid_mapping
  src/local_occupancy_grid_mapping.cpp
  src/laser_processor.cpp
)
target_link_libraries(
  local_occupancy_grid_mapping 
  ${catkin_LIBRARIES}
)


add_executable(
  extract_positive_training_clusters
  src/extract_positive_training_clusters.cpp
  src/laser_processor.cpp
)
target_link_libraries(
  extract_positive_training_clusters
  ${catkin_LIBRARIES}
)


add_executable(
  train_leg_detector
  src/train_leg_detector.cpp
  src/laser_processor.cpp
  src/cluster_features.cpp  
)
target_link_libraries(
  train_leg_detector
  ${catkin_LIBRARIES}
)

add_dependencies(local_occupancy_grid_mapping ${PROJECT_NAME}_generate_messages_cpp)
add_dependencies(local_occupancy_grid_mapping ${PROJECT_NAME}_gencpp)
add_dependencies(local_occupancy_grid_mapping ${PROJECT_NAME}_gencfg)
add_dependencies(detect_leg_clusters ${PROJECT_NAME}_generate_messages_cpp)
add_dependencies(detect_leg_clusters ${PROJECT_NAME}_gencpp)
add_dependencies(detect_leg_clusters ${PROJECT_NAME}_gencfg)
add_dependencies(extract_positive_training_clusters ${PROJECT_NAME}_generate_messages_cpp)
add_dependencies(extract_positive_training_clusters ${PROJECT_NAME}_gencpp)
add_dependencies(extract_positive_training_clusters ${PROJECT_NAME}_gencfg)
add_dependencies(train_leg_detector ${PROJECT_NAME}_generate_messages_cpp)
add_dependencies(train_leg_detector ${PROJECT_NAME}_gencpp)
add_dependencies(train_leg_detector ${PROJECT_NAME}_gencfg)


install(
  PROGRAMS scripts/joint_leg_tracker.py
  DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
)


install(
  PROGRAMS scripts/individual_leg_tracker.py
  DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
)

I've reduced the file down to this

cmake_minimum_required(VERSION 2.4.6)
include($ENV{ROS_ROOT}/core/rosbuild/rosbuild.cmake)

# Set the build type.  Options are:
#  Coverage       : w/ debug symbols, w/o optimization, w/ code-coverage
#  Debug          : w/ debug symbols, w/o optimization
#  Release        : w/o debug symbols, w/ optimization
#  RelWithDebInfo : w/ debug symbols, w/ optimization
#  MinSizeRel     : w/o debug symbols, w/ optimization, stripped binaries
#set(ROS_BUILD_TYPE RelWithDebInfo)

rosbuild_init()
include($ENV{ROS_WORKSPACE}/rosbuild.cmake)

#set the default path for built executables to the "bin" directory
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR}/bin)
#set the default path for built libraries to the "lib" directory
set(LIBRARY_OUTPUT_PATH ${PROJECT_SOURCE_DIR}/lib)

#uncomment if you have defined messages
rosbuild_genmsg()
#uncomment if you have defined services
#rosbuild_gensrv()

rosbuild_add_executable(
  detect_leg_clusters 
  src/detect_leg_clusters.cpp
  src/laser_processor.cpp
  src/cluster_features.cpp
)

rosbuild_add_executable(
  local_occupancy_grid_mapping
  src/local_occupancy_grid_mapping.cpp
  src/laser_processor.cpp
)

rosbuild_add_executable(
  extract_positive_training_clusters
  src/extract_positive_training_clusters.cpp
  src/laser_processor.cpp
)

rosbuild_add_executable(
  train_leg_detector
  src/train_leg_detector.cpp
  src/laser_processor.cpp
  src/cluster_features.cpp  
)

And the package.xml has been renamed to manifest.xml and changed accordingly. Here is the original package.xml:

 <?xml version="1.0"?>
 <package>
  <name>leg_tracker</name>
  <version>0.0.0</version>
  <description>The leg_detector package</description>
  <maintainer email="angus@todo.todo">angus</maintainer>
  <license>TODO</license>


  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>roscpp</build_depend>
  <build_depend>rospy</build_depend>
  <build_depend>std_msgs</build_depend>
  <build_depend>tf</build_depend>
  <build_depend>rosbag</build_depend>
  <build_depend>image_geometry</build_depend>
  <build_depend>message_generation</build_depend>
  <build_depend>libfftw3</build_depend>

  <run_depend>roscpp</run_depend>
  <run_depend>rospy</run_depend>
  <run_depend>std_msgs</run_depend>
  <run_depend>tf</run_depend>
  <run_depend>rosbag</run_depend>
  <run_depend>image_geometry</run_depend>
  <run_depend>message_runtime</run_depend>  
  <run_depend>libfftw3</run_depend>


</package>

I've reduced this and commented out message_generation and libfftw3, since rosmake couldn't find those packages:

<package>

  <name>leg_tracker</name>
  <version>0.0.0</version>
  <description brief="The leg_detector package">The leg_detector package</description>
  <maintainer email="angus@todo.todo">angus</maintainer>
  <license>TODO</license>

  <depend package="roscpp"/>
  <depend package="rospy"/>
  <depend package="tf"/>
  <depend package="rosbag"/>
  <depend package="image_geometry"/>
  <!--<depend package="message_generation"/> missing dependency fuerte-->
  <!--<depend package="libfftw3 ...
(more)
2016-06-24 04:20:00 -0500 asked a question Calculating incident angle of laserscan and laserpoints 2D

Hi guys,

I'm trying to calculate the angle of incidence between the laser scanner's rays and the surfaces they hit. I'm having trouble calculating the surface normal from neighbouring points before actually computing the angle of incidence. Here's a figure of what I'm trying to accomplish (the laser scanner is at the corner of the robot); I'm trying to find alpha1 and alpha2:

[Figure: angle of incidence, showing alpha1 and alpha2]

To calculate the normal vector for each point I've tried using PCL, but the values are just (0,0,-1) for all points. I suspect this is because the PCL function expects a 3D point cloud and I'm actually passing a 2D point cloud. I've used the code directly from the PCL documentation:

#include <pcl/point_types.h>
#include <pcl/features/normal_3d.h>

{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);

  ... read, pass in or create a point cloud ...

  // Create the normal estimation class, and pass the input dataset to it
  pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
  ne.setInputCloud (cloud);

  // Create an empty kdtree representation, and pass it to the normal estimation object.
  // Its content will be filled inside the object, based on the given input dataset (as no other search surface is given).
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ> ());
  ne.setSearchMethod (tree);

  // Output datasets
  pcl::PointCloud<pcl::Normal>::Ptr cloud_normals (new pcl::PointCloud<pcl::Normal>);

  // Use all neighbors in a sphere of radius 3cm
  ne.setRadiusSearch (0.03);

  // Compute the features
  ne.compute (*cloud_normals);

  // cloud_normals->points.size () should have the same size as the input cloud->points.size ()*
}

If there's an easier way to calculate the normal for each point, please let me know. Thank you!
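
For a 2D scan there may be no need for PCL at all: the local surface direction at each point can be taken from its two neighbouring points, and the incidence angle follows from the dot product with the ray. A rough sketch of that idea (plain Python; the scanner is assumed to sit at the origin, and `incidence_angle` is a hypothetical helper, not a PCL function):

```python
import math

# Estimate the local surface tangent at a scan point from its two
# neighbours, rotate it 90 degrees to get a normal, then compute the
# angle between the laser ray and that normal.

def incidence_angle(prev_pt, pt, next_pt):
    """Angle (rad) between the ray to `pt` and the surface normal there."""
    # Local surface tangent from the two neighbouring points
    tx, ty = next_pt[0] - prev_pt[0], next_pt[1] - prev_pt[1]
    # Normal is the tangent rotated by 90 degrees
    nx, ny = -ty, tx
    # Ray from the scanner (at the origin) to the point
    rx, ry = pt
    dot = rx * nx + ry * ny
    # abs() ignores which way the normal faces; clamp guards acos.
    cos_a = abs(dot) / (math.hypot(rx, ry) * math.hypot(nx, ny))
    return math.acos(min(1.0, cos_a))
```

For a wall hit head-on this gives 0; for a ray striking the same wall at 45 degrees it gives pi/4, which matches the alpha1/alpha2 picture above.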

2016-06-16 05:25:10 -0500 commented question What does "use_tf_scan_transform" parameter in Hector slam represent?

I think you are right about "use_tf_scan_transform"; see this answer to a similar question: http://answers.ros.org/question/48778...

2016-06-15 05:09:27 -0500 received badge  Popular Question (source)
2016-06-11 03:27:46 -0500 commented question How to plot laser scan data into Matlab ?

That's a Python error. You should indent your Python code properly. You can paste the full script here and I can check the indentation if necessary, but try correcting it yourself first.

2016-06-09 06:07:38 -0500 commented question How to plot laser scan data into Matlab ?

Have you tried msg.min... etc?

2016-06-09 03:59:52 -0500 commented question How to plot laser scan data into Matlab ?

Try running rostopic list to see if your laser scans are published (they probably are). Then you can start by changing '/odom' to the laser topic.

2016-06-07 17:19:56 -0500 commented answer joint_state_publisher gui unresponsive

My bad, I'm still learning to use rqt_graph :) Here it is.

2016-06-07 14:41:30 -0500 commented answer joint_state_publisher gui unresponsive

I know that the joint_state_publisher isn't meant for controlling a robot. I'm writing a walkthrough for a friend of mine so we can build a robot together. The reason I asked was that I suspect the Kinetic distro might have something to do with it: node graph.

2016-06-07 14:27:09 -0500 commented question joint_state_publisher gui unresponsive

Thank you, I will remember that for future questions.

2016-06-06 14:34:07 -0500 asked a question joint_state_publisher gui unresponsive

I have a problem where, when I try to move the sliders, it is near impossible to move them anywhere. The GIF below probably doesn't display, so here's the Imgur link to an example: non responsive joint_state_publisher gui

This is on Kinetic, Ubuntu Xenial 16.04. Can anyone explain why this happens and help me resolve it? If I can't resolve the above, I'm considering just creating a GUI with rqt or controlling the joints through a script :)

2016-06-06 02:52:51 -0500 received badge  Nice Answer (source)
2016-06-03 13:03:06 -0500 received badge  Commentator
2016-06-03 13:03:06 -0500 commented question When will rosbag::Player be released?

You could create an issue on their GitHub page. Even if they respond with a 'no', it will be nice for them to know the comment is there so they can remove it accordingly.

2016-06-03 12:50:00 -0500 answered a question problems in hector slam

I've used Hector SLAM a lot myself, and all of your issues are common with it. It is important to remember the environment (video) in which Team Hector Darmstadt used the robot: that particular environment is rich in features and there are literally no long corridors, so for them the Hector SLAM mapping and navigation algorithms were perfect.

Using the algorithm in areas with great laser ambiguity is another story. I'm going to try to give a short explanation for each question and a couple of ideas for coding a solution.

  1. The corridor problem occurs when one scan looks almost exactly like the next: imagine that many laser points register the walls beside the robot, but none of the points ahead of or behind the robot reach any surface (assuming the robot is travelling forward). You can enable odometry so that the estimated pose from odometry helps the scan matcher, but make sure you have good odometry. I combined wheel odometry with a cheap IMU to first get a better odometry estimate, then enabled it in Hector SLAM, and it helped some.

  2. I've never experienced this; the robot here travels on very flat surfaces. Sudden turns will ruin the mapping, though.

  3. This can be resolved by tuning the parameters in the mapping_default.launch file, specifically the angle update threshold and the map resolution parameters. I've actually run into this problem mostly because of memory issues; I'm not sure why, but it really helped to tune the parameters!

There are different things you can do to make Hector SLAM more robust: one is loop closure, and another is this awesome method, Modeling Laser Intensities For Simultaneous Localization and Mapping, which would be really cool to implement!

Hope this is useful. If all fails, you could look into mapping with vision-based systems like LSD-SLAM, ORB-SLAM, RTAB-Map, etc.

2016-06-03 12:29:27 -0500 commented question How to use urdf_to_graphiz command to see the link relationships of my xacro models?

What distro are you running? Try rosrun xacro xacro --inorder model.urdf.xacro > model.urdf, but write which distro you are using first: xacro.py is deprecated in newer distros.

2016-06-03 12:27:16 -0500 commented answer I cannot find beginner_tutorials

Also, your path environment might not be set up correctly; can you post the output of echo $ROS_PACKAGE_PATH (without the quotes)?

2016-06-03 12:24:18 -0500 answered a question E: Unable to locate package ros-kinetic-desktop-full

"E: Unable to locate package ors-kinetic-desktop-full", didn't you just mistype 'ros'?