
Odometry with optical mouse?

asked 2014-07-01 07:47:37 -0600 by Antoine

Hi guys,

I am currently looking at using odometry with the position given by an optical mouse.

As I couldn't find any package/node driver that would allow me to have access to the position of the mouse, I was wondering if anyone had already looked into that and could give some feedback.

Thanks in advance for the replies




Very good approach! Thinking outside the box at its best!

ccapriotti (2014-07-01 16:58:53 -0600)

I can't really take any credit for that; many people have already successfully done what I'm trying to do :). I was just wondering if anyone has done it with ROS.

Antoine (2014-07-01 19:59:34 -0600)

Since we are thinking outside of the box here, you could use two mouse-based sensors: one on the robot, close to the floor, and another on your wheel's surface (mechanics permitting), so you could measure skidding, implement cliff avoidance, and some other good stuff.

ccapriotti (2014-07-02 23:01:11 -0600)

I am definitely interested in any idea that may help the odometry work better, although I do not see where you want to put the second sensor. Could you maybe draw a quick sketch? Or just elaborating a bit might be enough for me to understand.

Antoine (2014-07-03 01:50:20 -0600)

I think he means to fix the second mouse above one of the wheels, with a very small distance between the mouse's sensor and the wheel's surface (the wheel has to be flat in this case).

Mehdi. (2014-07-03 02:14:13 -0600)

Hi, did you ever end up working on this type of odometry? Any good resources?

stevej_80 (2015-11-26 07:02:17 -0600)

4 Answers


answered 2014-07-03 15:42:59 -0600 by ccapriotti, updated 2014-07-05 02:51:45 -0600

Just adding the sketch as requested:

@Mehdi: Your description is right. Indeed, that was what I meant.

One comment: if the sensor on the base is too far from the floor, it may not work, or you will need lenses to focus/compensate for the distance.

This also reduces the height of your base and the possibility of going over obstacles.

But it is cheap, and tends to be precise: optical mice typically work at resolutions of around 300 dpi.
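To get a feel for what that resolution means in distance terms, here is a small sketch of the counts-to-meters conversion (300 dpi is the figure quoted above; check your actual sensor's datasheet):

```python
INCH_M = 0.0254   # meters per inch
DPI = 300         # sensor counts per inch (assumed, per the figure above)

def counts_to_meters(counts, dpi=DPI):
    """Distance in meters for a given number of raw sensor counts."""
    return counts * INCH_M / dpi

# 300 counts at 300 dpi is one inch of travel:
print(counts_to_meters(300))   # ~0.0254 m
```

So each count corresponds to well under a tenth of a millimeter, which is why these sensors can give such fine-grained odometry.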

Quick explanation of the sketch: gray wheel: free wheel; black wheels (yellow cores): drive wheels. The "robot" is mounted on a triangular base.

[sketch image]

Ok, so let's think about the possibilities on what kind of data you can gather from this.

The concept is simple physics: distance over time.

1. If the wheel sensor reads movement, you can calculate the linear speed of the wheel.

2. If the linear speed of the wheel and the linear speed detected by the base sensor are the same, your robot is moving normally.

3. If the linear speed of the wheel is greater than the linear speed of the "floor", your drive wheel is skidding.

4. If the linear speed of the floor is greater than the linear speed of the wheel, your robot is being dragged or going down a very slippery path.

5. If the base sensor detects movement in both the X and Y axes, your robot is either making a curve or spinning. You can calculate the angular speed of the curve if you need it.

6. If you detect that the wheel speed is fluctuating, your robot may be on some sort of low-cohesion soil, like pebbles.
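Cases 2–5 above can be sketched as a simple comparison of the two sensor readings. This is an illustrative helper, not code from the thread; the function name, speed units (m/s), and tolerance are made up:

```python
def classify_motion(wheel_speed, floor_speed, floor_vy=0.0, tol=0.01):
    """Classify robot motion from the two mouse sensors' speed readings.

    wheel_speed: linear speed seen by the sensor watching the wheel surface
    floor_speed: forward (x) speed seen by the base sensor watching the floor
    floor_vy:    sideways (y) speed from the base sensor
    tol:         tolerance (m/s) for treating two speeds as equal
    """
    if abs(floor_vy) > tol:                      # case 5: sideways motion
        return "turning or spinning"
    if abs(wheel_speed - floor_speed) <= tol:    # case 2: speeds agree
        return "moving normally"
    if wheel_speed > floor_speed:                # case 3: wheel outruns floor
        return "wheel skidding"
    return "being dragged or sliding"            # case 4: floor outruns wheel

print(classify_motion(0.50, 0.50))        # moving normally
print(classify_motion(0.50, 0.10))        # wheel skidding
print(classify_motion(0.10, 0.50))        # being dragged or sliding
print(classify_motion(0.30, 0.30, 0.2))   # turning or spinning
```

Case 6 (fluctuating wheel speed) would need a short history of readings rather than a single sample, e.g. thresholding the variance over a sliding window.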



For tracking the wheels, encoders seem better suited, although they work the old-fashioned way.

dornhege (2014-07-03 18:25:51 -0600)

With regard to odometry from mice: if I remember correctly from experiments a few years ago, the distance is indeed a problem, as your comment says. Unless you have a very flat surface like a table top (i.e. the same surface a mouse works on), it won't work.

dornhege (2014-07-03 18:27:14 -0600)

Thanks for the sketch ccapriotti. So if the sensor on the wheel tells you that the wheel is rotating but the one at the back tells you that the robot isn't moving, then you can tell that your robot is skidding. Is that about right, or can you actually get more out of your sensors?

Antoine (2014-07-04 19:53:49 -0600)

I mean except for the position from the back sensor and the rotation direction of the wheel from the other sensor

Antoine (2014-07-04 20:15:07 -0600)

Thanks ccapriotti for the explanation, you kept it nice and simple, I got it now. I strongly recommend the link to the first paper that Stefan Kohlbrecher gave, great practical solutions.

Antoine (2014-07-06 22:01:42 -0600)

2nd link actually

Antoine (2014-07-07 01:34:08 -0600)

answered 2014-07-04 23:10:17 -0600, updated 2014-07-04 23:12:52 -0600

If you put other lenses on the sensors you can actually get some distance between them and the surface. There's been very interesting research on using mouse sensors for obstacle avoidance on UAVs, as well as using multiple sensors on ground vehicles for highly accurate odometry. See for example [link] or [link].


answered 2014-07-02 21:27:20 -0600 by Mehdi.

You can check this

and this

The Python code to get mouse coordinates on the screen is really short, and you can easily integrate it into your project. In that case you may need to calibrate the pixels-to-meters ratio yourself by doing some tests. If the cursor gets stuck at the border of the screen, you could also implement something to automatically move it back to the opposite border so that it can continue moving.
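The integration step described above might look something like this. The class name and the calibration ratio are made up for illustration; in practice you would measure the ratio by driving the robot a known distance and comparing the pixel count:

```python
class MouseOdometry:
    """Integrate screen-cursor positions (pixels) into displacement (meters)."""

    def __init__(self, meters_per_pixel):
        self.meters_per_pixel = meters_per_pixel  # from your own calibration tests
        self.last = None
        self.x = 0.0
        self.y = 0.0

    def update(self, px, py):
        """Feed the current cursor position; returns accumulated (x, y) in meters."""
        if self.last is not None:
            dx = px - self.last[0]
            dy = py - self.last[1]
            self.x += dx * self.meters_per_pixel
            self.y += dy * self.meters_per_pixel
        self.last = (px, py)
        return self.x, self.y

# Suppose calibration showed 1000 pixels per meter:
odom = MouseOdometry(meters_per_pixel=0.001)
odom.update(100, 100)          # first sample, nothing accumulated yet
print(odom.update(600, 100))   # 500 px in x -> roughly (0.5, 0.0)
```

Note that this only works for a single mouse, which is exactly the limitation Antoine raises in the comment below: the screen cursor is one shared state, so two sensors cannot be read this way.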

If you are more into hacking hardware and getting odometry data directly, without plugging the mouse into a computer, then check this:



Thanks for the ideas Mehdi. My first approach was to get the real-world coordinates of the mouse from the position of the cursor on the screen. But you cannot use this method if you ever want to use more than one of these sensors, or at least I do not see how you could make it work.

Antoine (2014-07-03 02:07:55 -0600)

I am actually using the rosserial_arduino package with an Arduino Uno to read the data out of the sensor. Seems to work well enough, though I haven't put it on an actual robot yet.

Antoine (2014-07-03 02:11:04 -0600)

Try it then, and keep us updated.

Mehdi. (2014-07-03 02:12:46 -0600)

answered 2014-07-03 16:14:17 -0600

What operating system will the mouse driver run on? On Linux, reading mouse data is a simple file read from the input device (e.g. /dev/input/mice).

You can just create a ROS node that reads the mouse device file into an input-event data structure, then looks at the event's type/code/value fields to tell which direction and how far the mouse moved, and publishes that information on a ROS topic for other nodes to consume.
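The read-and-decode part of such a node could be sketched as follows (publishing omitted). This assumes the classic /dev/input/mice interface, which delivers 3-byte PS/2-style packets: a button-flags byte followed by signed dx and dy bytes:

```python
import struct

def parse_mice_packet(packet):
    """Decode one 3-byte /dev/input/mice packet into (buttons, dx, dy)."""
    flags, dx, dy = struct.unpack('Bbb', packet)  # unsigned flags, signed deltas
    return flags & 0x7, dx, dy  # low three bits: left/right/middle buttons

# Example packet: left button pressed, moved +5 in x and -5 in y
buttons, dx, dy = parse_mice_packet(bytes([0x09, 0x05, 0xFB]))
print(buttons, dx, dy)   # 1 5 -5

# On the robot you would read the device in a loop, roughly:
# with open('/dev/input/mice', 'rb') as f:
#     while True:
#         buttons, dx, dy = parse_mice_packet(f.read(3))
#         # ...accumulate dx, dy and publish as odometry...
```

The /dev/input/event* devices mentioned in the answer use the larger struct input_event format instead, which separates axis and button events; either interface works for this purpose.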



... and it can tell speed too!

ccapriotti (2014-07-03 16:25:30 -0600)

Thanks for the link, I will definitely look into that. Although I am focusing more on solutions involving 'hacking', by connecting to the output of the sensor. It seems to be the way to go, as I am probably going to buy off-the-shelf sensors in the future (as opposed to buying an actual mouse).

Antoine (2014-07-04 20:29:42 -0600)





Seen: 3,298 times

Last updated: Jul 05 '14