Range Finding

One method of determining the distance to an object is to use a laser and a camera. Here I implement a simple laser range finder with a Logitech C310 webcam, ROS, and OpenCV.

Theory of Operation

The principle of operation is based on Todd Danko’s DIY Laser Range Finder. The basic idea is that if we set up the camera and laser pointer just right, we can find the distance to an object from how far the laser dot appears from the center of the image. This is illustrated below.

A camera and laser pointer are mounted in parallel. The distance to the point the laser dot hits can be found from the vertical position at which the dot appears in the camera image. (Taken from DIY Range Finder)

Demonstration of how pixels from center (pfc) change as objects get farther away. "Image" is the actual image we see from the camera.

If we can find θ and measure H (the height between the laser and the camera’s optical axis), then by basic trigonometry we can find the distance, D, using

D = H / tan(θ)

Now, how do we find θ, given that we know H and can measure the pixels from center (pfc) of the laser dot? We can solve this by assuming that θ is a linear function of pfc:

θ ≈ a1 · pfc + a0

So, how do we find a1 and a0? This is just a matter of fitting a line, which is not too difficult if we know the input (pfc) and the output (θ). For this calibration phase we can find θ by measuring the pfc at a known D:

θ = arctan(H / D)

So, if we take some measurements, for example with H = 3.6 cm at fixed, known distances and record the pfc:

pfc (px)    D (cm)    arctan(H/D) (rad)
229         12        0.29145679
139.5       20        0.17809294
92          30        0.11942893
34          75        0.04796319
22          100       0.03598446
11          177       0.02033618
2.5         296       0.01216156

After recording the data, we can then use MATLAB’s or NumPy’s polyfit to solve for a1 and a0:

import numpy as np

x = [229, 139.5, 92, 34, 22, 11, 2.5]  # pfc (px)
y = [0.29145679, 0.17809294, 0.11942893, 0.04796319,
     0.03598446, 0.02033618, 0.01216156]  # arctan(H/D) (rad)
a1, a0 = np.polyfit(x, y, 1)  # polyfit returns coefficients highest degree first

Now that we know a1 and a0, we can find D by plugging the linear approximation of θ into our very first equation:

D = H / tan(a1 · pfc + a0)
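As a minimal sketch (the function and parameter names here are my own), the distance calculation becomes:

import math

H = 3.6  # cm, the laser-to-camera height used in the calibration above

def distance_cm(pfc, a1, a0):
    # Linear approximation of theta, then basic trigonometry.
    return H / math.tan(a1 * pfc + a0)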

So, how well does the linear approximation perform? Here is a comparison:

The measured distance versus the distance using a linear approximation of θ with pfc.

Finding the Dot / pfc

Example image with laser dot hitting the wall.

An issue with using a camera to detect the laser dot, as opposed to an infrared sensor, is that we only sense in the visible spectrum. Hence, to detect the laser dot we assume that if something is “reddish” then it is the laser dot. Before that, let us first convert the image from RGB to Hue-Saturation-Value (HSV). HSV allows us to look at a colour regardless of its intensity (i.e. how dark or bright it is). By looking only at hue, we can filter for “reddish” things (a good post on this can be found here) a lot more easily than if we had to deal with the various combinations of green and blue that go into reds with RGB. A good resource for figuring out the range of colours you’re looking for is this online HSV colour map. Now, for the filtering itself, we will use OpenCV’s InRangeS. The result is a binary image where 1 (white) pixels were “reddish” while 0 (black) pixels were not.
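My pipeline used the legacy cv.InRangeS call; a rough sketch of the same idea with the modern OpenCV Python API (the threshold values here are illustrative and would need tuning for your laser and lighting) might look like:

import cv2
import numpy as np

def threshold_reddish(frame_bgr):
    # Work in HSV so hue can be thresholded independently of brightness.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis in OpenCV (hue runs 0-179), so we
    # threshold both ends and combine the results.
    lower = cv2.inRange(hsv, np.array([0, 100, 100]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 100, 100]), np.array([179, 255, 255]))
    return cv2.bitwise_or(lower, upper)  # white = "reddish", black = not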

Thresholding out "reddish" things -- very liberally. The door ends up being thresholded as well.

By just looking around, you can probably find some “reddish” things in your immediate vicinity. These “reddish” things can make finding the dot more confusing. That is why, in the next step, we employ a mask. For our purposes the mask blocks out everything but a central bottom column where the laser dot can appear (a sketch follows the figure below).

Mask for the dot. Only the white part of the image is considered, everything else is "masked" out.
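A minimal sketch of such a mask (the exact column geometry is an assumption; in practice it is tuned to the mount):

import numpy as np

def make_mask(height, width, column_width=60):
    # Black everywhere except a white column in the bottom-center region.
    mask = np.zeros((height, width), dtype=np.uint8)
    x0 = (width - column_width) // 2
    mask[height // 2:, x0:x0 + column_width] = 255
    return mask

The mask is then applied to the thresholded image with a bitwise AND, e.g. cv2.bitwise_and(binary, binary, mask=make_mask(*binary.shape[:2])).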

With the mask in place, let us see what is left:

The image thresholded for "reddish" w/ the mask applied. The white speck is the laser dot.

That white speck is the laser dot. It can be cleaned up with a morphological closing if desired, though that may not be necessary. We can then use OpenCV’s FindContours to detect blobs, specifically our laser dot. It finds the boundaries of blobs for us, and using those boundaries we can easily find the number of y-pixels a blob is from the center, which gives us pfc. With pfc we can find the distance.

The detected blob with its red bounding box around the laser dot. This bounding box can easily be used to determine the y-pixels from center (pfc).
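Putting that together, here is a sketch of the pfc measurement (again with the modern API; note that cv2.findContours returns two values in OpenCV 4 and three in OpenCV 3):

import cv2

def find_pfc(masked_binary):
    contours, _ = cv2.findContours(masked_binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no laser dot detected
    blob = max(contours, key=cv2.contourArea)  # assume the biggest blob is the dot
    x, y, w, h = cv2.boundingRect(blob)
    center_y = masked_binary.shape[0] / 2.0
    return abs((y + h / 2.0) - center_y)  # y-pixels from center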

Implementation

iRobot Create robot used for the tracking part of project.

Lego NXT robot used briefly.

In this project I worked mainly with the iRobot Create and a bit with the Lego NXT 2.0. In both cases, the robots were controlled through ROS.

Soldered Laser Pen.

For the camera I used a Logitech C310, and for the laser I used a $20 Staples laser pen. The laser pen was powered by 2xAAA batteries and activated by pressing a button. I cut the laser pen down, soldering ground to the spring contact and positive to the brass shell of the pen. The laser is wedged into the mount in such a way that the button is pushed in at all times, so it is always active.

Rather than using batteries, I just grabbed a Seeeduino Mega board and powered the laser off its 3.3V pin. To mount the camera and laser pointer, I printed a plastic part using a MakerBot Thing-O-Matic 3D printer (mk 6).

The camera and laser mount model for the Makerbot 3D printer. It was designed using Autodesk 123D. It is on its side for easier printing.

3D printed camera / laser mount.

Results

The operating range in my setup was approximately 15 – 300 cm, with the last 200 – 300 cm being not very reliable. The minimum range is determined by H and the field of view of the camera.

This setup works OK around 16 – 100 cm, but has some issues as demonstrated in the video below. Notice the detection of the laser reflection in some cases.

As previously mentioned, the system works in the visible spectrum and we assume anything “reddish” is the laser dot. This results in an obvious disadvantage: anything red in the mask area will likely lead to incorrect results.

A red object that is also thresholded along with the laser dot. No reading could be taken with it present.

The biggest issue (in my opinion) with this method is the decreased resolution/accuracy at farther distances (>60 cm). The function mapping θ/pfc to D is asymptotic (like f(x) = 1/x), as illustrated in the comparison plot above. This translates into a wildly uneven distribution of pixels to distance: for example, the range 60 – 300+ cm is represented by only about 50 pixels. Because the function is asymptotic, there is no great way to improve on this except by moving towards a typical laser range finder, which uses the time of flight of a laser pulse to determine the range. Given that a web camera such as the one I am using can at best reach about 30 fps (~33 ms per frame) before any image processing, it is unlikely to have the temporal resolution required to be even a half-decent detector for a time-of-flight method.
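To make the uneven mapping concrete, here is a small sketch that re-runs the calibration fit from earlier and evaluates the model at a few pfc values (the printed distances are approximate):

import numpy as np

H = 3.6  # cm
pfc = np.array([229, 139.5, 92, 34, 22, 11, 2.5])
D = np.array([12, 20, 30, 75, 100, 177, 296])
a1, a0 = np.polyfit(pfc, np.arctan(H / D), 1)

# The last ~50 pixels cover everything from roughly 50 cm to beyond 300 cm.
for p in (50, 25, 12, 6, 3):
    print("pfc = %3d px -> D ~ %.0f cm" % (p, H / np.tan(a1 * p + a0)))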

Code

All of the code for my project can be downloaded from https://bitbucket.org/raw/csc578c.
