Neato XV-11 to ROS, SLAM – Tutorial


Documentation on the ROS Wiki can sometimes be a little hard for beginners to understand. In this tutorial, we’re going to go through the software steps for connecting the Neato XV-11 sensor (which I will refer to as Neato for short) to ROS and later also to SLAM. The reader is expected to have basic knowledge of ROS. All of this was done on Ubuntu 14.04.



The first thing to do is to connect Neato to your computer and test it out. The documentation on ROS Wiki HERE is pretty good, so I’m just going to skip this step. Although my setup is a little different, it should work just the same.

My setup:




Next you have to set up the driver for reading data from Neato. Here are the commands for the terminal, but if you get lost, HERE is the source.

cd catkin_ws/src

git clone

cd ..

catkin_make

source devel/setup.bash


in new terminal:

source devel/setup.bash

rosrun xv_11_laser_driver neato_laser_publisher _port:=/dev/ttyUSB0

in new terminal:

rosrun rviz rviz

Now a new window called RViz will open, and you have to configure it so it shows the raw data from the sensor. Again, follow the source HERE for details, but here is a picture of what you should do:

RViz setup



Then you should be able to see something like this:



Problems that can occur:

  • If you see the points in RViz but they are randomly dispersed, start the driver with this command instead:
    rosrun xv_11_laser_driver neato_laser_publisher _port:=/dev/ttyUSB0 _firmware_version:=2
  • If you don’t see any points on the screen, check your RViz config again and check the voltage on your power supply. It should be exactly 3 V.
  • If you get weird output from the laser (dots change colour to all red, dots disappear and appear again), check your connections again and wire it exactly as seen in the picture. The FTDI adapter can’t power the laser radar alone… I can’t stress this enough… this took me a long time to realize -.-
  • If you get errors about permissions, check out THIS link on how to grant permission for using the USB device.
  • Also check that Neato is recognized as ttyUSB0. If this isn’t the case, change the command for running the Neato driver accordingly. You can list the connected serial devices with the command:
    ls /dev/tty*
  • If at any point your scan is “kind of blinking”, this can cause problems with SLAM. It can be due to a bad connection to the Neato lidar, so make sure to solder on a good connector.
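If the points look wrong, it can also help to sanity-check the raw serial stream before involving ROS at all. The community-reverse-engineered v2 protocol uses 22-byte packets: a 0xFA start byte, an index byte (0xA0–0xF9, four angles per packet), a 16-bit motor speed in 1/64 rpm units, then four 4-byte readings. Here is a rough Python sketch of a decoder; the field layout comes from community documentation of the XV-11, not from this tutorial, so treat it as an assumption (the trailing checksum bytes are ignored for brevity):

```python
def parse_packet(pkt):
    """Decode one 22-byte XV-11 packet into (rpm, [(angle_deg, distance_mm or None), ...])."""
    if len(pkt) != 22 or pkt[0] != 0xFA:
        raise ValueError("not a valid XV-11 packet")
    base_angle = (pkt[1] - 0xA0) * 4        # each packet carries 4 consecutive angles
    rpm = (pkt[2] | (pkt[3] << 8)) / 64.0   # motor speed is reported in 1/64 rpm units
    readings = []
    for i in range(4):
        b = pkt[4 + 4 * i : 8 + 4 * i]
        invalid = bool(b[1] & 0x80)             # bit 7 of the second byte flags a bad reading
        dist_mm = b[0] | ((b[1] & 0x3F) << 8)   # 14-bit distance in millimetres
        readings.append((base_angle + i, None if invalid else dist_mm))
    return rpm, readings
```

Feed it packets read straight from /dev/ttyUSB0 (e.g. with pyserial); a healthy sensor should report roughly 300 rpm and mostly non-None distances.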


Make sure that you are getting the right data from the sensor before going to the next step!
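One quick quantitative check before moving on: dump the ranges array from a single /scan message (e.g. via rostopic echo) into a Python list and measure what fraction of the readings is plausible. This helper is only an illustration, and the range limits are my assumptions for the XV-11:

```python
def scan_health(ranges, range_min=0.06, range_max=5.0):
    """Fraction of readings in one revolution that fall inside the sensor's valid range."""
    if not ranges:
        return 0.0
    valid = sum(1 for r in ranges if range_min <= r <= range_max)
    return valid / len(ranges)
```

A healthy revolution should score well above 0.5 indoors; the all-zero revolutions reported in the comments below score exactly 0.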



Neato XV-11 sensor with SLAM

Next up is setting up the Hector_SLAM package to work with Neato.

Download the SLAM package from HERE and unzip the files into the “src” folder of your workspace, next to the “xv_11_laser_driver” package. Then “catkin_make” and “source devel/setup.bash” the whole workspace again.

Now some files have to be changed in order to prepare the SLAM for Neato sensor. If you want to understand exactly what you are changing, you can read about it HERE, but following is a brief description:

When reading the text below, look at the picture above.



ROS has a set of conventions called REPs (REP 105 covers coordinate frames for mobile platforms), and its tf system handles transforming from one coordinate system to another. For example, if your LIDAR is not quite in the centre of your robot, you can tell ROS the offset of the sensor, and it will automatically be included in the localization. Take 5 minutes and read THIS great, short explanation of the image above.
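As a concrete illustration, a laser mounted 10 cm ahead of the robot’s centre could be declared with a static transform publisher in a launch file. The numbers and node name here are purely illustrative, not the values used later in this tutorial:

```xml
<launch>
  <!-- args: x y z yaw pitch roll parent_frame child_frame period_ms -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_laser"
        args="0.1 0 0 0 0 0 base_link laser 100" />
</launch>
```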

You can tell ROS the offset data with the following .launch files:

Go to the directory:


and add the neato.launch file found on this GitHub.

and in directory


1.) Change the name of mapping_default.launch to mapping_default_old.launch.

2.) Then copy mapping_default.launch from GitHub to the launch directory.

Just to be clear – you should now have two files in your launch directory: a) the original one, named mapping_default_old.launch, and b) the one from GitHub, named mapping_default.launch.


Now you have to change the driver a little bit:

cd YOUR WS/src/xv_11_laser_driver/src/

nano neato_laser_publisher.cpp

Find the word “neato_laser” (search without the quotes) and change it to “laser”. The whole line then looks like:

priv_nh.param("frame_id", frame_id, std::string("laser"));

Because you changed a .cpp file, you have to catkin_make the whole workspace again:

cd YOUR WS

catkin_make

source devel/setup.bash

After that you should be able to run Hector mapping with commands:

rosrun xv_11_laser_driver neato_laser_publisher _port:=/dev/ttyUSB0 _firmware_version:=2

and in new terminal:

source devel/setup.bash

roslaunch hector_slam_launch neato.launch

A new Rviz window should appear and again you have to set it up. Here is a picture of my config:






You should now see something like this:


Things that can go wrong here:

  • If you get your map, but after closing and reopening everything you don’t get it any more: restart the whole roscore. With the cheap FTDI adapters the USB connection probably doesn’t close properly, and that causes problems. This is why the USB driver is launched separately from the mapping, so you can close RViz as often as you like and make changes while the comms with Neato keep running in the background.
  • If at any point your scan is “kind of blinking”, this can cause problems with SLAM. It can be due to a bad connection to the Neato lidar, so make sure to solder on a good connector.


Next challenge: making an odom message and transform in ROS HERE,

all ROS tutorials HERE,

and the latest news about my work can be found HERE.








32 Responses

  1. Brent Lamb says:

    I am trying to duplicate your setup. I have added all of the files, and when I run the launch file, I get:
    lookupTransform base_link to neato_laser timed out. Could not transform laser scan into base_frame.
    I would appreciate any help.

  2. Brent says:

    Here is a copy of my node list:
    brent@brent:~$ rosnode list

    Also, here is the warning:
    [ WARN] [1440939624.108133570]: No transform between frames /map and /base_link available after 20.007983 seconds of waiting. This warning only prints once.

    • Janez says:

Yes, there should be a /scan node in your rosnode list (did you rename it?). Did you start the driver in one terminal, go to another tab, source the WS, and only then start neato.launch? Can we talk over email, where you can send me screenshots?

      • Tyler says:

Hey, thank you for everything. This has really helped. I am having the same problem as above. My tf tree looks the same as the frames.pdf on your GitHub, and I am getting a map; it just jumps around in rviz and can’t seem to match the scans properly, because it says No transform between frames /map and /base_link. Any thoughts would be great. My email is Thanks in advance.

  3. Brent Lamb says:

I copied and pasted your neato.launch file into launch, hector_slam_launch, and copied and pasted your mapping_default file into launch, hector_mapping. I start the neato driver in a separate terminal before starting neato.launch. I can see data from rviz with a different configuration, but when I change it to your configuration I don’t get anything.

  4. Sanjaya Kumar says:

    Thanks so much for this awesome writeup! I was able to get my turtlebot-with-neato-lidar drawing hectormaps. First I turned off Odometry as in your script and after that was working, I removed the /odom/base_link static transform, but added a /nav/odom static transform. Then I started the minimal Turtlebot, turned on the neato-lidar, and drove it around using joystick teleop. On my Ubuntu desktop, I had your main launch script running (which basically read the Laser Scan Data from the XV11 and the Odometry data from the iCreate) and it drew a pretty good map!

  5. Janez says:

    Always nice to see a good reply… Thanks!

  6. dchang0 says:

    Hi– thanks for the awesome write-up! I followed your instructions and got pretty far, but the hector map that is produced has the entire map getting copied and overlapped onto the existing map whenever I turn/rotate the sensor. After a few turns, the resulting map is a huge mess, basically a giant blob, and totally unusable.

    So it seems there are large rotational errors and small translational errors.

    The two primary differences between my setup and yours are:

    a) I am using a different serial-to-USB setup sold here:

    b) I am not using odometry of any kind. In fact, I am using the launch files from this other GitHub/YouTube example:

    If you watch his video, he does a lot of the same moves I am trying to do, walking/turning at a fairly quick speed, and yet his map doesn’t start overlapping onto itself.

    Any ideas or suggestions as to what may be going wrong?

    I do notice that in your video of the laser scan (no mapping) where you are scrunching your hand in front of the sensor, all the rainbow-colored dots in the scan are much closer together than in my scan. Perhaps your LIDAR is rotating slower or taking more measurements per second, or something.

    Thanks in advance for any help you can provide!

    • Janez says:

Yeah, I had the same problem with my SLAM, which I solved by adding odometry. I have also seen and wondered at how the guy in the video does it without odom, but decided not to bother further in that direction.

      One cause for “jumps” in SLAM could be a bad connection of LIDAR. I ended up soldering all the contacts, just to be sure (because in rviz /scan image was blinking uncontrollably)

I am using the neato ROS driver mentioned in my tutorials, and the Neato LIDAR does not output data when spinning too fast or too slow, so I think that’s not the case. In the video I’m pretty close to the lidar with my hand, so that may be the cause of confusion :)

      • dchang0 says:

        Thanks for your quick and helpful reply. Sure enough, you are right about the blinking of the LIDAR. When I run the first test (the simple /scan), the map will be good for a while and then almost the entire scan will start blinking in and out on its own (almost every dot disappears except maybe one or two). I did not do the soldering in this case–it was purchased as a complete assembly from Get Surreal’s online shop. I am contacting him for support right now.

        Just to confirm–is your Neato XV-11 scan now “rock solid” with very few blink-outs? What should I expect this XV-11 to be capable of?

One more question: which version of ROS were you using? There seems to be a difference in hector_slam between Indigo and Jade, in that the code from both you and njzhngyife doesn’t work at all in Indigo and works fine in Jade…

        Thank you again for your help!

        • dchang0 says:

          To clarify my question about your XV-11 being “rock solid”–is there anything you can do or are there any conditions under which you can cause the LIDAR scan to start blinking (aside from the too fast or too slow possibility)? For instance, sudden turns or such? Or does it really just work all the time once you got it soldered correctly?

          • Janez says:

No problem :) I had the very same problem in the beginning, and after soldering the scan was still not “rock solid”. But it was good enough for amcl and Hector SLAM (aside from occasional “jumps”). Also, right now I have the lidar connected to the RPi2 GPIO serial pins, and it works much better than on the cheap FTDI. With this setup, fast turns of the robot and shaking do not corrupt the scan at all.

All in all, Neato is a GREAT low-cost lidar. When you get it to output data, it’s quite accurate and without much noise, so I recommend it for projects that don’t need great accuracy. AMCL works OK, SLAM returns a good map, and I am now setting up move_base and it seems OK.

I use Jade and haven’t tried it on Indigo yet…

          • dchang0 says:

            Happy New Year!

            Okay, a follow-up on my flickering/blinking laser scan. The seller suggested I watch the ASCII data coming out of the unit, and so I set it up so that I could watch the data (using rostopic echo /scan) and the laser scan in RViz at the same time. When RViz started flickering, I would quickly look at the ASCII data. And, what I saw was that the data would be entire sets of 0.0mm readings. That is to say, 360-degrees of 0.0mm all the way around, for one or more revolutions.

            So this is not the same as not sending any data at all. It is actively sending 0.0mm readings.

            Hope this helps someone else. I will be sending in the XV-11 for replacement. If this happens again with the new unit, then perhaps something like insufficient USB power or a bug in the xv_11_laser_driver code or in the Teensyduino 2.0 sketch (the UART to USB code) is occurring. I will cross that bridge if I get there…

            Thanks again for your help, Janez! If you have the time, please answer my other questions. If not, I’m sure I’ll figure it out eventually.

          • Janez says:

            Happy new year to you too :)

Yes, I also get all zeros if I echo /scan when it flickers. If you have a Raspberry Pi, try connecting it to the serial GPIO. I have a tutorial on it here:

This way you can eliminate the connectors and the LIDAR controller and see if you get better results. I wouldn’t rush to return the LIDAR before testing this… :)

  7. dchang0 says:

    Very interesting–you attached the LIDAR directly to the Raspberry Pi! How did you get the motor to keep the RPM steady at 300rpm +/-2rpm?

    I did another run, and the map went well until one point where the unit got lost again and the map jumped/overlapped itself. The axes stayed in place, so it made it seem like the room had rotated by a large amount around the sensor.

    I wonder–is it possible to mitigate/prevent this by using an IMU? After all, the sensor did not actually rapidly rotate in that instant, and data from the gyroscope and accelerometer showing that no rapid rotation occurred should cancel out the flicker-out of the LIDAR data for that instant.

    Did you include an IMU in your setup (other than the odometer)?

    I am thinking of using an STM32 Discovery board instead of the Teensyduino that GetSurreal uses and having the STM32 board wirelessly send the data to a computer that can then convert it into ROS through the xv_11_laser_driver. The inspiration is this video:

    He is sending the raw data to MATLAB over Bluetooth. The STM32 board is much more powerful than the Teensyduino, so it should never choke on the sheer amount of data coming out of the XV-11 LIDAR…

    • Janez says:

Oh, I connected the motor to a separate 3 V power supply, and the sensor is connected to 3V3 on the Raspberry Pi GPIO (you could need 5 V if you are using a newer version of Neato).

No, I haven’t tested anything with an IMU, but it’s definitely an interesting idea!
I see that Hector SLAM does not natively support an IMU, but maybe it can be added with other packages (

      And the guy on the video does not experience flickering at all!

      • dchang0 says:

        Thanks for the quick reply, and thanks for the link!

        I have been looking into the IMU + hector_slam question and running into confusing answers and a lack of instructions/tutorials. Some say the IMU is necessary, others say it is not. The Team Hector member Stefan Kohlbrecher has said that it is necessary and named the part number for the handheld setup as a “IMU Analog Combo Board Razor – 6DOF Ultra-Thin IMU” from Sparkfun, SKU number SEN-10010 in the comments under this video:

        I have played back the bag file from the above video, and while it is obvious that the IMU is in use, it does not seem that it is actually necessary. I suspect that all it does is make slightly cleaner maps.

        In the Team Hector presentation, they show they used an MTI XSens (I think for the ground robot, definitely not for the handheld unit): They did not specify which of the XSens models they used.

        I am thinking of using the Bosch BNO055 IMU fusion sensor chip (approx $20 to $40) for the IMU, but I still have no idea how to actually integrate it into the hector_slam system, and anyone who has asked the Team Hector guys for an answer has not gotten one yet. I will probably end up not integrating it since I am not familiar enough with the code and math behind it. The cost is low enough to buy one and play with it just in case I find a way to integrate it.

        These guys show how to hook the Bosch BNO055 up to ROS:

        But that does not actually show how to integrate it into hector_slam. The best thing about the BNO055 is that it does not require calibration; that is a requirement for our project.

        Thanks for the help with the Raspberry Pi. If I use the PWM on the Get Surreal XV-11 controller to keep the LIDAR running at 300rpm, I can run the serial connection to the RPi and see if by bypassing the Teensyduino, the flickering goes away. But I am running out of time, so this will likely not happen. We will likely just return or exchange it and buy a Hokuyo LIDAR instead.

  8. Christian says:

    Thank you, excellent tutorial!!

  9. Etienne says:


Thanks for this great tutorial. After having a few problems installing ROS on an RPi 2 with Ubuntu 14.04 (I will probably give more details in your specific tutorial for the RPi and ROS), I’ve connected my Neato lidar directly to the RPi serial port. I’m using this piece of hardware to make the lidar turn at the right speed:
I’ve checked with a separate program that I do indeed have data coming in on the serial port.
Now I’m stuck at the last step of this tutorial: when I type “roslaunch hector_slam_launch neato.launch”
I get this error: “Invalid roslaunch XML syntax: not well-formed (invalid token): line 52, column 70”
Any idea of the problem, or at least where I should search?

    • Janez says:

This error means you have a syntax problem in the neato.launch file. Have you been modifying it yourself? If yes, try correcting the syntax, or download and replace the file again. Let me know if it helped :)

    • Etienne says:

Ok, I found the mistake. I didn’t copy the neato.launch file properly. Sorry for posting.
Now I have another problem: the node “/neato_laser_publisher” doesn’t appear in the node list when I launch “roslaunch hector_slam_launch neato.launch”.
And since I couldn’t launch rviz to check what’s happening, I’m in the dark (I’m on an RPi2; I’ll post in the other tutorial about launching rviz on an RPi2)

  10. Kevin says:

Hello, I need some help with my problem.
I’ve installed ROS Indigo on my Ubuntu 14.04.5 LTS using this
and then I tried to run the xv-11 laser node (
but in part “0.1 Startup the XV11 driver”, it says that I must have the “cwru-ros-pkg environment setup” in my terminal, and I don’t have any clue how to set up this cwru-ros-pkg in ROS Indigo.
I don’t know what’s wrong with my steps T.T Is there something missing from my ROS Indigo install?
    Thank you

  11. Gloer says:


Thanks for a nice tutorial. It worked right out of the box, except I had to use /dev/ttyACM0 instead of /dev/ttyUSB0.
How do I integrate this with a TurtleBot? Can you please give me some direction?


  12. Jane says:

    Hi, thanks for the great post about xv-11 lidar & hector slam.
    I have some problem with hector slam.
First, I followed this step ( and the test was OK.
I want to test the xv-11 with SLAM, so I followed all of your steps (copy neato.launch, copy mapping_default.launch, modify the xv-11 source, catkin_make and source devel/)
and set up a new RViz window like yours (set Grid reference frame to scanmatcher_frame, set Map topic to /map, set Axes reference frame to scanmatcher_frame), but with the warning (No transform between frames /map and /base_link available after 20.002310 seconds of waiting), I cannot get any data from the xv-11 lidar.
    I think I have a same problem with “Brent Lamb”.
    Could you give me an advice?

    • Janez says:

      Hi Jane.
Try running all of your system and then, in another sourced terminal, type “rosrun tf view_frames”. This will make a PDF of your transform setup. If you send me this, maybe I can tell you what’s wrong.
