Give your next Robot 3D vision: Kinect V1 with ROS Noetic

Kinect sensors are an amazing piece of technology. I would even call them revolutionary. But sadly, the product was a bit ahead of its time, and now big ‘M’ has stopped producing it. They are going to bring Kinect back in another avatar, but I don’t think it will be as cheap as its older brother.

The Kinect 360’s primary target was gaming, where it never took off. But another set of people recognised its value and adopted it. Roboticists, DIY enthusiasts and researchers in the field of computer vision recognised it for what it was: a cheap way to test and build their ideas. This, my dear friend, is a fact even now. How do I know? I just bought one for the same purpose; it arrived today.

But this amazing sensor has a bit of a learning curve when it comes to making it run on your system for the first time. So if you are ever going to be in my shoes, let me try to make it at least a little easier for you. Follow along as I take control of my Kinect 360 using ROS. I have tested this on Pop!_OS 20.04 with ROS Noetic installed.

As always, start with an update and upgrade.

sudo apt-get update
sudo apt-get upgrade

Install the dependencies

sudo apt-get install git-core cmake freeglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev

Get the libfreenect repository from GitHub

git clone https://github.com/OpenKinect/libfreenect.git

Make and install

cd libfreenect
mkdir build
cd build
cmake -L ..
make
sudo make install
sudo ldconfig /usr/local/lib64/
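If the build and install went through, the library and demo binaries should now be discoverable. Here is a quick, fail-soft sanity check (no Kinect hardware needed):

```shell
# Check that libfreenect is in the linker cache and the demos are on PATH;
# the fallback messages print a hint instead of failing hard.
ldconfig -p | grep -i freenect || echo "libfreenect not in linker cache - rerun sudo ldconfig"
command -v freenect-glview || echo "freenect-glview not on PATH - check the make install output"
```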

To use the Kinect without sudo every time, add your user to the video and plugdev groups.

sudo adduser $USER video
sudo adduser $USER plugdev
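Note that group changes only apply to new login sessions, so log out and back in (or reboot) after the adduser commands. A small sketch to check whether the current session already has both groups:

```shell
# Report whether the current session is already in the two groups;
# if either is missing, log out and back in before using the Kinect.
for g in video plugdev; do
    if id -nG | tr ' ' '\n' | grep -qx "$g"; then
        echo "$g: ok"
    else
        echo "$g: missing - log out and back in"
    fi
done
```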

Next we will add some device rules

sudo nano /etc/udev/rules.d/51-kinect.rules

Paste the following, then press Ctrl+O and Enter to save, and Ctrl+X to exit.

# ATTR{product}=="Xbox NUI Motor"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02b0", MODE="0666"
# ATTR{product}=="Xbox NUI Audio"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ad", MODE="0666"
# ATTR{product}=="Xbox NUI Camera"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ae", MODE="0666"
# Kinect for Windows
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02c2", MODE="0666"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02be", MODE="0666"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02bf", MODE="0666"
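After saving, it helps to reload udev so the new rules apply without a reboot. A fail-soft sketch (the rules path is the file we just created; udevadm or sudo may be unavailable in minimal environments):

```shell
# Count the Kinect entries in the rules file, then try to reload udev;
# if reloading fails, replugging the Kinect after a reboot also works.
RULES=/etc/udev/rules.d/51-kinect.rules
if [ -f "$RULES" ]; then
    echo "found $(grep -c idProduct "$RULES") Kinect rules"
else
    echo "rules file missing at $RULES"
fi
sudo -n udevadm control --reload-rules 2>/dev/null \
    && sudo -n udevadm trigger 2>/dev/null \
    || echo "could not reload rules - replug the Kinect or reboot"
```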

Now we need the audio firmware for microphone support. Open a terminal inside the libfreenect folder and run the following to generate the ‘audios.bin’ file.

python3 src/fwfetcher.py

Now we need to copy ‘audios.bin’ to a specific location.

sudo cp src/audios.bin /usr/local/share/libfreenect
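To confirm the firmware file landed where libfreenect looks for it (the path below is the default install prefix from the build above):

```shell
# Check the installed audio firmware file and report its size;
# prints a hint if the fetch or copy step was skipped.
FW=/usr/local/share/libfreenect/audios.bin
if [ -f "$FW" ]; then
    echo "audios.bin installed ($(stat -c %s "$FW") bytes)"
else
    echo "audios.bin missing - rerun fwfetcher.py and the copy step"
fi
```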

Run the following to check the audio.

freenect-micview

A window with a live audio waveform should open, and the waveform should reflect your speech. Next we will try the depth camera.

freenect-glview

Congratulations! Your Kinect now works on Ubuntu!

Now let’s make it work with ROS. We need to get some launch files for visualization.

Create your catkin workspace directory; skip this step if you already have one set up.

mkdir -p ~/catkin_ws/src

Now we will download the required ROS package.

cd ~/catkin_ws/src
git clone https://github.com/ros-drivers/freenect_stack.git

Now we will run catkin_make so that the ROS installation on our system picks up the new package.

cd ..
catkin_make

Source the setup file in the newly created ‘devel’ directory so that our ROS environment can find the launch files.

source devel/setup.sh

Now we will launch the freenect example for depth registration which allows you to get the point cloud with RGB data superimposed over it.

roslaunch freenect_launch freenect.launch depth_registration:=true
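In another terminal (with the same ROS environment sourced), you can confirm the driver is actually publishing. A quick check, assuming the default freenect_launch topic names:

```shell
# List the camera topics published by freenect_launch;
# fails soft when ROS is not sourced in this shell.
if command -v rostopic >/dev/null 2>&1; then
    rostopic list 2>/dev/null | grep camera \
        || echo "no camera topics - is freenect.launch running?"
else
    echo "rostopic not found - source /opt/ros/noetic/setup.bash first"
fi
```

You should see `/camera/depth_registered/points` among the topics when depth registration is enabled.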

Let’s now visualize the topics from Kinect on Rviz, open a new terminal and launch rviz.

rviz

We will now need to set a few parameters in RViz to visualize the depth registration data.

  1. In ‘Global Options’, set the ‘Fixed Frame’ to ‘camera_link’.
  2. Add a ‘PointCloud2’ display and set its topic to ‘/camera/depth_registered/points’.
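Setting these by hand every session gets old. RViz can save the display setup (File > Save Config As) and load it on startup; the `kinect.rviz` filename below is just an example, not a file this tutorial created:

```shell
# Launch RViz with a previously saved display configuration
# (~/kinect.rviz is a hypothetical example path);
# falls back to a hint when rviz is not on PATH.
command -v rviz >/dev/null 2>&1 \
    && rviz -d ~/kinect.rviz \
    || echo "rviz not found - is ROS sourced?"
```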

Now wait for a few seconds to get the points on display!

[Screenshot: the point cloud of me, looking like a zombie]

Hope I was of some help, best of luck on your Kinect Adventures! See you in the next post!

