
Stage + Camera

asked 2011-04-13 by Sagnik (updated 2011-04-13 by Eric Perko)


I would like to use Stage to simulate a robot with a camera (or a blob-finder) as a sensor. I have added a camera in the world file, but I am not able to get camera data published on /camera/image_raw and the other camera sensor topics. With a real-life camera on a robot, I assume one would have to install the necessary driver for the camera and start it to get data published on /camera/image_raw. Do I need to install a camera driver for this simulated camera? What am I missing here?
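(For context, a camera block in a Stage world file looks roughly like the following. Parameter names follow the Stage camera model; the values here are only illustrative, not the asker's actual configuration.)

```
camera
(
  # position relative to the parent robot model
  pose [ 0.1 0.0 0.0 0.0 ]
  resolution [ 640 480 ]     # image size in pixels
  range [ 0.2 8.0 ]          # near/far clip distances in meters
  fov [ 70.0 40.0 ]          # horizontal/vertical field of view in degrees
  pantilt [ 0.0 0.0 ]
)
```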

Thanks, Sagnik


3 Answers


answered 2011-04-13 by Eric Perko (updated 2011-04-15)

stageros only supports laser and odometry output. If you want support for cameras in ROS, simulated by Stage, you should file an enhancement ticket against the simulator_stage stack.

I'm pretty certain that there is support for simulating a camera in the simulator_gazebo stack, but I've never used it.

Update with responses to Sagnik's further questions:

The URDF tutorials are a good place to start, though it's true that getting a robot into Gazebo is a lot more work than getting it into Stage. You'll also have to build the Ackermann steering for your model yourself, as there is no simple off-the-shelf Ackermann steering controller that I'm aware of; see the linked discussions for hints on how to go about doing that.
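Since there is no ready-made Ackermann controller, the underlying steering geometry is worth sketching. Here is a minimal Python sketch using the bicycle-model approximation; the function and parameter names are my own illustration, not from any ROS package:

```python
import math

def ackermann_angles(steer_angle, wheelbase, track):
    """Convert a bicycle-model steering angle (rad) into left/right
    front-wheel angles for an Ackermann-steered car.

    steer_angle > 0 means a left turn. Returns (left, right) in radians.
    """
    if abs(steer_angle) < 1e-9:
        return 0.0, 0.0
    # Turning radius of the equivalent bicycle model.
    radius = wheelbase / math.tan(steer_angle)
    # The inner wheel follows a tighter circle, so it must turn more sharply.
    left = math.atan(wheelbase / (radius - track / 2.0))
    right = math.atan(wheelbase / (radius + track / 2.0))
    return left, right

# Example: a 0.3 rad left-turn command on a car with a 2.5 m wheelbase
# and 1.5 m track; the left (inner) wheel ends up steeper than the right.
left, right = ackermann_angles(0.3, 2.5, 1.5)
```

A controller node would run this conversion on each incoming steering command before sending the two wheel angles to the joint controllers.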

If you have no need for a full simulation of the robot dynamics, you could always look into how Stage simulates the camera (what format the image comes back in) and add support for publishing simulated camera output to the stageros package.
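To illustrate that second route: the main job is taking the raw frame Stage returns and filling in the fields a sensor_msgs/Image message defines. A rough sketch in plain Python (a real implementation would be C++ inside stageros; the field names follow the ROS message definition, but the function and frame data here are hypothetical):

```python
def pack_image(width, height, rgba_bytes, frame_id="camera", stamp=0.0):
    """Pack a raw RGBA frame into the fields of a sensor_msgs/Image."""
    assert len(rgba_bytes) == width * height * 4
    return {
        "header": {"stamp": stamp, "frame_id": frame_id},
        "height": height,
        "width": width,
        "encoding": "rgba8",   # 4 bytes per pixel, R-G-B-A order
        "is_bigendian": 0,
        "step": width * 4,     # bytes per image row
        "data": rgba_bytes,
    }

frame = bytes(64 * 48 * 4)      # a synthetic all-black 64x48 test frame
msg = pack_image(64, 48, frame)
```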



Just in case anyone in the present is checking, stageros does have a camera sensor these days.

— SL Remy (2014-12-04)

answered 2011-04-14 by BlackManta

It's true that a lot of the Player/Stage functionality was taken out (mainly because much of it would have been redundant). However, I too miss the built-in blob tracking. I am not sure exactly what you are doing, but you could use a real camera and hold up a colored card if you needed the robot to respond a certain way (good for testing, at least). I recommend the OpenCV packages included in ROS.
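As an illustration of what blob tracking does under the hood, here is a minimal pure-Python blob finder over a 2D grid of labeled cells. This is only a sketch of the idea, not OpenCV's or Stage's implementation, and all names are illustrative:

```python
from collections import deque

def find_blobs(grid, target):
    """Return bounding boxes (min_row, min_col, max_row, max_col) of
    4-connected regions of `target` cells in a 2D grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != target or seen[r][c]:
                continue
            # BFS flood fill from this seed cell, growing the bounding box.
            queue = deque([(r, c)])
            seen[r][c] = True
            box = [r, c, r, c]
            while queue:
                y, x = queue.popleft()
                box = [min(box[0], y), min(box[1], x),
                       max(box[2], y), max(box[3], x)]
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and grid[ny][nx] == target):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            blobs.append(tuple(box))
    return blobs

image = ["..RR.",
         "..RR.",
         ".....",
         "R...."]
print(find_blobs(image, "R"))   # two separate "red" blobs
```

On a real camera image you would first threshold pixels by color to build the label grid, then track the blob centroids frame to frame.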



answered 2011-04-15 by Sagnik

Thanks guys for your responses.

I am working on a simulation of a car-like robot with Ackermann steering. The end goals are to make the robot do basic tasks like navigation (using laser sensors), object detection using a camera sensor, and maybe even VSLAM later on. The simulation effort is of course meant to speed up development and debugging.

Having said that, I already have a simulation running in Stage which handles the navigation, and right now I am trying out the visual sensors.

@Eric: I checked out the camera sensor usage in Gazebo as you suggested, but simulating a robot from scratch in Gazebo seems like a much more difficult task than simulating one in Stage (since Gazebo does an accurate simulation of the robot dynamics). Could you suggest a link/doc as a head start? (Most of the tutorials I find are about using a built-in simulation.)

@BM: That's a useful suggestion. I'll try it out.

Thanks, Sagnik



I've updated my answer to address your question.
— Eric Perko (2011-04-15)
