
py_trees_ros experiences

asked 2018-12-03 13:23:56 -0600

ChriMo

Hello community,

I'm interested in hearing from users who have experience with py_trees_ros. Is anyone using Python behaviour trees on real-world robots? Is py_trees_ros powerful enough to handle the behaviours of a small autonomous wheeled robot? It seems to me that py_trees_ros is an excellent high-level prototyping environment for learning ROS and robotics.

Thanks in advance for any hints or links to real-world examples.

Cheers, ChriMo


Comments

Not an answer, but there are some discussions that might help; I just listed them on py_trees_ros's page.

130s ( 2018-12-03 13:43:37 -0600 )

Yes, I saw these theoretical discussions and read all the related docs. What about practice? What are the pros and cons in real-life projects?

ChriMo ( 2018-12-05 12:29:35 -0600 )

1 Answer


answered 2019-03-23 11:06:26 -0600

Daniel Stonier

updated 2019-03-23 11:07:50 -0600

I have been involved in the development of py_trees over the years:

Before it was released as open source, it was originally developed at Yujin Robot to handle the application layer of the fleet robots that they were developing. Among the original goals were:

  1. Simple enough for an intern to take care of the application layer (req: python programming skills)
  2. Apply to the robots we were deploying at various field tests around the world
  3. Provide sufficient tooling for monitoring, logging, replay - basically allow us to root cause problems at the application level
  4. Work easily with other tools that mock the robot so we could put the application layer under CI

It took a few iterations to get there, but ultimately we met all of our goals. We even found that the control engineers started moving their decision-making logic into the behaviour trees whenever it didn't have low-latency requirements, simply because that got their components into a framework that was monitored, logged, and mocked, and let them interact with other parts of the robot (e.g. the notifications subsystem) without dragging that subsystem in as a dependency of the control module. Docking is a good example: LED or sound notifications fired as the robot passed through the different states of the docking process.
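To illustrate the kind of decision logic being described, here is a deliberately library-free sketch of the core behaviour-tree idea (sequence and selector composites ticking children and propagating a status). The real py_trees API differs in its details, and all class and node names below are hypothetical, chosen only for this example:

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Behaviour:
    """Base node: subclasses implement tick() and return a Status."""
    def __init__(self, name):
        self.name = name
    def tick(self):
        raise NotImplementedError

class Sequence(Behaviour):
    """Ticks children in order; stops at the first non-SUCCESS child."""
    def __init__(self, name, children):
        super().__init__(name)
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS

class Selector(Behaviour):
    """Ticks children in order; stops at the first non-FAILURE child."""
    def __init__(self, name, children):
        super().__init__(name)
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

class Condition(Behaviour):
    """Wraps a boolean check as a leaf node."""
    def __init__(self, name, predicate):
        super().__init__(name)
        self.predicate = predicate
    def tick(self):
        return Status.SUCCESS if self.predicate() else Status.FAILURE

class Action(Behaviour):
    """Wraps a side-effecting callable as a leaf node."""
    def __init__(self, name, fn):
        super().__init__(name)
        self.fn = fn
    def tick(self):
        self.fn()
        return Status.SUCCESS

# Hypothetical docking scenario: recharge when the battery is low,
# otherwise fall through to patrolling.
battery_low = True
log = []
root = Selector("root", [
    Sequence("recharge", [
        Condition("battery_low?", lambda: battery_low),
        Action("dock", lambda: log.append("docking")),
    ]),
    Action("patrol", lambda: log.append("patrolling")),
])
assert root.tick() == Status.SUCCESS
assert log == ["docking"]
```

The appeal described in the answer is visible even in this toy: the docking action lives as an ordinary leaf, so cross-cutting concerns like notifications can be added as sibling leaves in the tree rather than as dependencies of the docking controller itself.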

These days it has been picked up by CARLA for the scenario layer of their autonomous-driving simulations, and Toyota Research Institute is doing the same internally. Blue Ocean Robotics has been using it for a while, and I have been notified of a few other robotics companies in the valley area taking advantage of it, but I do not know to what extent.



Question Tools

3 followers

Stats

Asked: 2018-12-03 13:23:56 -0600

Seen: 59 times

Last updated: Mar 23