Manual Exposure vs Auto Exposure for ELP 2 MP USB Camera

For our drone flying project, we have been using the ELP 2 Megapixel USB Camera. The auto exposure on this camera works in most situations, but we found that it does not always adjust to bright sunlight. In preparation for demonstrating our computer-controlled drone at the Maker Faire, I wanted to have a plan in case we were outdoors. It was a good thing too, since we were assigned an outdoor booth next to the Drone Combat arena.

We detect the location of our drone by using blob detection on four paper circles that we have taped to the top of the drone. Originally, we were using a medium green color, but we found that under some lighting conditions, our code would confuse the light blue color on the body of the drone with the green circles. I thought about making our blob detection code more robust, but the Maker Faire was quickly approaching! Instead we decided to make our flying area more controlled. We used white poster board as the background for our flying area and I tested some different colors for the paper circles. Red circles were good, except that our code got confused if one of our hands was reaching into the flying area. Black was not good in dim light. In the end, we decided on a dark purple with a blue undertone.

Testing different circle colors
The winning color: dark purple

OpenCV provides a way to set a webcam’s manual exposure, but there are two problems. The first is that OpenCV is not well-documented. I could find the documentation stating that I should be able to set the exposure value, but it was not at all clear what values to pass! The second problem is that your particular webcam may not support programmatic setting of the exposure. Thus, when your code doesn’t work, it can be difficult to determine if your code is wrong or if your webcam just won’t allow it!

OpenCV’s VideoCapture.set() is the method to use. If you look at the documentation, you will see that there is a property named CV_CAP_PROP_EXPOSURE. It took me some time to discover that depending on the version of OpenCV you are using, the property’s name might actually be CAP_PROP_EXPOSURE.

There is no hint as to what the exposure value should be set to, but luckily for me, I found a mapping for the ELP 2 MP webcam on this page by Joan Charmant. He shows that the exposure values range between -1 and -13. You can programmatically set the exposure in this manner:

vc = cv2.VideoCapture(1)
vc.set(cv2.CAP_PROP_EXPOSURE, -6)  # any value in the -1 to -13 range for this camera

Unfortunately, I could not figure out a programmatic way to set the exposure back to auto. If you know how, please add a comment! Be aware that for some webcams, such as this one, the manual exposure setting is stored in on-board memory, so even after you close your program or power off the webcam itself, the manual exposure will still be set!

As a workaround, I found a way to bring up the DirectShow property pages so that I could use the DirectShow GUI to set the manual exposure or to turn auto exposure back on.


Here’s the code to launch the DirectShow property page:


During the Maker Faire, our demonstration area was shaded by a tent for most of the day, but around 2 PM our flying area was part sun and part shade. We delayed the inevitable by moving our table back with the shade, but eventually we had to move our table back to the front of the booth and into the sun. On Saturday, the afternoon was mostly overcast, and the camera’s auto exposure worked most of the time. I was surprised that our blob detection code even worked when people walked in front of our booth and made our flying area partly shaded by their shadows.

Sunday was mostly sunny, and the webcam’s auto exposure did not work when it was very bright. At these times, I opened up the DirectShow property pages and set the exposure manually so that our demo would still work. Maker disaster averted!

Blob Detection With Python and OpenCV

In my previous post, I described how to set up Python and OpenCV on your computer. Now I will show you how to use OpenCV’s computer vision capabilities to detect an object.

OpenCV’s SimpleBlobDetector will be the primary tool that we will be using. With SimpleBlobDetector, you can distinguish blobs in your image based on different parameters such as color, size, and shape.

As an OpenCV novice, I searched Google to help me get started with the Python OpenCV code. You will find that OpenCV is very powerful and extensive, but unfortunately it is not well documented. Some classes and functions are described well, but some just list a method’s parameters with a terse description. I suppose we can’t have everything. On the bright side, there are many tutorials and examples to help you out.

Here are a few tutorials that we found helpful:

  • Blob Detection using OpenCV – a nice brief introduction to SimpleBlobDetector.
  • Ball Tracking with OpenCV – this example is more extensive, and he has a nice animated gif at the top of his page showing the ball tracking in action. We use cv2.inRange() like he does, but we then use SimpleBlobDetector instead of findContours().
  • OpenCV’s Python Tutorials Page – I don’t have the patience to go through tutorials when I just need a quick solution, but I did look through a few of the tutorials on this page when the need arose. We based some of our color threshold code on the Changing Colorspaces tutorial in the Image Processing in OpenCV section.

For our drone flying project, we put four colored paper circles on top of our Cheerson CX-10 mini-drone to make detection simpler.

Drone image taken by webcam

When we were testing out our detection, we took a bunch of jpg photos with our webcam under different conditions and we put them in the ./images directory. In this code example, we loop through the image files and we try to detect the purple circles on our drone for each image.

The full code is up on Github with the rest of the project. Here is the beginning of the code. We set up our import statements, and then we need to undistort the image. For our webcam, the image is distorted around the edges – like a fishbowl effect.

Now to the heart of our code. We run cv2.GaussianBlur() to blur the image, which helps remove noise. The webcam image is in the BGR (Blue, Green, Red) color space and we need it in HSV (Hue, Saturation, Value), so the next call is cv2.cvtColor(image, cv2.COLOR_BGR2HSV).

We need to separate the purple circles from the rest of the image. We do this by using cv2.inRange() and passing in the range of HSV values that we want separated out from the image. We had to do some experimentation to get the correct values for our purple circles. We used this range-detector script in the imutils library to help us determine which values to use. Unfortunately, the range of HSV values varies widely under different lighting conditions. For example, if our flying area has a mixture of bright sunlight and dark shadows, then our detection does not work well. We control for this by shining bright LED lights over the flying area.

Result of running cv2.inRange() to separate out only the purple pixels

Now we use SimpleBlobDetector to find our blobs and they are saved in keypoints.

If we found more than 4 blobs, then we keep the four largest. We draw green circles around the blobs we found and we display these four images:

  1. The original image after undistort and Gaussian blur (frame)
  2. The image with the purple circles separated out and shown in white (mask)
  3. The image with the purple circles separated out and shown in their original color (res)
  4. The original image with green circles drawn around the purple circles (im_with_keypoints)
Image after blob detection (im_with_keypoints)

If there are multiple images in the directory, then we go through this whole process for the next image. Now our code can see where our drone is!

Find Out Which Channels You Can Get For Free With an Antenna

There’s a great tool from TV Fool to help you determine which channels you can receive over the air (OTA) at your house. Yes, your house. You type in your address and it will give a list of channels that you will be able to receive for free with an antenna! It will even show you where the signals are coming from so that you can optimize your signal strength by pointing your antenna in that direction.

Check out TV Fool’s TV Signal Locator

We use a Winegard FlatWave Amplified Antenna mounted at the top of the wall above our media cabinet about 8 feet off the ground. We live in a flat suburban area and we are able to get all of the main network channels in HD for free! I love that we can watch the Super Bowl and the Oscars in HD. We get lots of kids channels and even re-runs of The Brady Bunch. My kids have watched almost every episode of this good, wholesome show.

Once you have your HD antenna, take your set up to the next level by adding a DVR. With a DVR, you can record your OTA shows and watch them at your leisure. Our DVR comparison guide is here to help you choose the DVR that is right for you!

How to Set Up Your Python OpenCV Development Environment

For our drone flying project, we needed a way for our computer to detect the location of our mini-drone through the use of a webcam mounted above the flying area. We are not at all familiar with computer vision algorithms, but we do know how to call functions from a Python library! We made use of OpenCV (Open Source Computer Vision), which is available for Python and C++.

For our Python environment, we chose Python(x,y). Python(x,y) is a version of Python developed specifically for scientific calculations and visualizations. If you are a fan of Matlab, then you will feel right at home with Python(x,y).

This is what you need to do to set up a Python(x,y) development environment with OpenCV.

    1. Install the latest revision of the python(x,y) package. This includes Spyder (Scientific PYthon Development EnviRonment). Download Python(x,y) here.
    2. For the Python(x,y) install, choose Custom install and select the PySerial 2.7-1 component (under the Python group). PySerial is needed to communicate with an Arduino.
    3. Optional: We also like to add the WinMerge component (under the Other group) when installing Python(x,y), but it is not required.
    4. You will also need to install the opencv2 package. Download opencv2 here.
    5. Unzip the opencv2 package and copy
      opencv\build\python\2.7\x86\cv2.pyd to <python dir>\Lib\site-packages\ where the default Windows location for <python dir> is C:\Python27

Note: If your computer supports it, copy opencv\build\python\2.7\x64\cv2.pyd instead of the x86 version. I tried the x64 copy first, but it did not work for me when run, so I copied the x86 version instead. See below for how to check whether OpenCV is loading properly.

Now it’s time to check if your development environment is working. Start Python(x,y) and you will see this window:


Click on the small blue and red Spyder icon that looks like a spider web to start the Spyder IDE. Here is what the Spyder IDE looks like:

Spyder IDE

The bottom right portion of the IDE shows the IPython console. You can run scripts or call Python commands directly in the IPython console.

In the IPython console, type import cv2 and hit enter.

If there is a problem, then you will receive an error, likely an error about “No module named cv2”. If that happens, then check that you copied the OpenCV files to the correct location as described in Step 5 above.

If everything is working, then the console will accept your command and show a prompt for your next command like this:

import cv2

Hooray, you have successfully set up Python(x,y) and OpenCV! Nothing to it, right? Now let’s see what we can do with OpenCV. Take a look at our post on blob detection with OpenCV.

Teach your PC to fly a Mini-Drone!

A few months ago, I watched this TED talk where they set up an indoor arena and did some amazing things with drones.  It got me thinking, and it inspired me to build something like that for myself – but on a much smaller and cheaper scale.

In the video, they use an expensive real-time infrared motion tracking system (I am guessing something like these Optitrack systems) to measure the positions of the drones, and then use a computer to calculate and send control signals to coordinate the drones. At a high level, my setup works in a similar way, as shown in this diagram:

Here’s a photo of what my setup looks like:

drone setup
Photo of the first working setup.

This is a list of the items needed to build this:

  • USB camera: ELP 2megapixel Hd Free Driver USB Camera Support Mjpeg Linux Android Windows Developing Board,usb Camera Module
  • Arduino microcontroller board: Arduino UNO R3 Board Module With DIP ATmega328P (Blue)
  • Nordic Semiconductor 2.4GHz wireless card: Addicore nRF24L01+ Wireless AddiKit with Socket Adapter Boards and Jumper Wires
  • Cheerson CX-10 mini-drone: Cheerson CX-10 Mini 29mm Diameter 4CH 2.4GHz 6 Axis Gyro RC Quadcopter UFO RTF Green
  • 2 blade guards for the Cheerson CX-10: Upgrade Cheerson Cx-10 Propeller Prop Blade Guard Cover Bumper Protection Protector Green White

Total cost for these items was around $85. In addition to the above, you might also need a folding table and stack of books to hold up the webcam as I did, but you can probably think up something more refined!
Here is a video of it working:

Here are some links to further information on how this all works:

  1. Setting up the programming environment on your PC
  2. Detecting the circles from the webcam
  3. Finding a low-latency web camera

Source code:

  1. nrf24_cx10_pc  – The source code for the Arduino to send 2.4GHz wireless signals to the drone
  2. pc-drone – The Python / OpenCV code used to track the drone and decide on how to adjust the drone controls

We will also be sharing this project at the Bay Area Maker Faire from May 20-22, so please stop by the booth and check it out!

What is an OTA DVR and why would I want one?

OTA DVR stands for Over-the-air Digital Video Recorder. Basically, this is a digital video recorder (think TiVo) that allows you to record programs from over-the-air broadcast signals.

Long ago, before there was such a thing as cable or satellite TV, everyone watched TV by attaching an antenna to the television set and pulling in the broadcast signal, often with a set of rabbit ears.

These signals are still being broadcast in most of the country, although the format and the quality are much improved.  Instead of the fuzzy analog images of before, stations are now broadcasting in 720p or 1080i high-definition digital signals.  In many cases, the bandwidth and image quality are equal to or even better than what your cable or satellite provides.  Plus, the antennas don’t look as dorky anymore (hopefully you have a better TV too!).

Why Do I need a DVR if I have Streaming?

When I first cancelled my cable subscription (“cut the cord”) and set up an antenna way back in 2011, the main thing I missed was the DVR that was previously provided by the cable company.  I could still get all the local broadcast networks (ABC, CBS, Fox, NBC, PBS). When ESPN became available through streaming on Sling TV (it’s also now available on Sony’s Vue), I had access to pretty much everything I needed. However, I still missed my DVR because:

  1. I couldn’t skip commercials anymore!  Basically, if you’re streaming, they can force you to sit through as many commercials as they want.
  2. The skip-forward/skip backward functions are pretty crappy in every streaming app I’ve ever seen, especially compared to the responsiveness of a DVR.  I’d rather watch a recorded TV show from a broadcast channel versus from a streaming service just because of this.
  3. Not all streaming channels work that well, especially when watching live sports events.  There are glitches, dropouts, and sometimes the resolution drops (and I have 100Mbps internet service). At least in my case, the reliability of the streaming signal is not as good as the reliability of the antenna signal.  The antenna signal is slightly less reliable than a cable signal, but the antenna is free, so I find it to be a good compromise.

Anyway, that’s my rant about why DVRs are still nice to have. If you got this far, then you probably want to know how to get your own OTA DVR.  Check out my guide here.


Finding a low-latency webcam

When trying to use a webcam in a computer vision application as part of a real-time control system, the latency is often just as important as the frame rate. Unfortunately, the latency for a webcam is often not specified, especially not for low-cost webcams.

One simple way to measure the webcam latency is to point the camera at a computer screen that is displaying the view from the camera and also printing the current time on the screen.  You end up with infinite recursion images like this:

Latency Test Images
Measurement of ELP-USB500W02M-L21 (5 MP) webcam latency
Measurement of ELP-USBFHD01M-L36 webcam latency
Measurement of PS3 Eyecam latency

The difference between the time which is overlaid on the image (the largest type) and the time shown in the image from the webcam (the next largest) is the latency.  The Python / OpenCV2 code I used to capture these screenshots is up on Github.

Here are the results for three cameras I measured:

  • ELP 5 MP USB camera with 2.1mm wide-angle lens (ELP-USB500W02M-L21): ~115 to ~130 ms latency, 10 fps
  • ELP 2 MP USB camera (ELP-USBFHD01M-L36): ~105 ms latency, 30 fps
  • PlayStation Eye (PS3 Eyecam): ~75 ms latency, 60 fps

All cameras were set to capture at 640 x 480.  The above cameras are all consumer-grade cameras, costing about $45 for the ELP models and only $5 (!) for the PS3 Eyecam.  As a comparison point, $280 would get you the Slim 3U from Optitrack, which is specifically designed for motion capture and has an 8.33 ms latency. Let me know if you find any other sub-$100 cameras that perform better!

ELP 5 Mega-pixel USB camera with 2.1mm Wide Angle Lens

This camera has nice image quality, but the frame rate is slow and the latency was inconsistent.  The lag is very evident the first time you use the camera.

ELP 2megapixel USB Camera with 3.6mm lens

This camera had good image quality, tolerable latency, and a 30fps frame rate.  I tested the version with a 3.6mm lens, but the base camera model USBFHD01M is also available with a 170-degree fisheye lens or a 2.1mm lens.  There is a nice review of this camera here.

This is the camera that I ultimately chose for my computer vision project.

PlayStation Eyecam

For a cost of $5, this is a very interesting camera.  The latency of this camera was the most consistent, and it is also capable of higher frame rates.  In order to use it on a Windows system, you will want to purchase a driver from Code Laboratories at a cost of $2.99 (it works very well). One tip: you may need to create a cleye.config file and save it in “C:\Program Files\Code Laboratories\CL-Eye Driver” to get greater than 30fps from the camera/driver.  This file contains this text:

<?xml version="1.0" encoding="UTF-8"?>
<item name="mode" value="advanced" />

The image quality from this camera was not that great, though (you can see its image is blurrier than those from the other two cameras), so that is why I did not choose this camera for my project.  It would be great for applications where there is fast motion and image quality is not as critical.

Industrial cameras and other links

In addition to the Optitrack camera mentioned above, here are a few more cameras I found discussed on Reddit that offer low latency. These are higher cost industrial cameras:

This project measured latency using the same technique shown here, with a Pi camera on a Raspberry Pi.

How to Get Your Boss to Buy You a Quadcopter

Every year my boss asks for suggestions for a team-building activity to have with his staff.  Past activities have included such thrills as bocce ball and an indoor trampoline park (tip: this is a great idea if your team consists of 8-year-olds). This year I suggested that he buy everyone some drones and that we run some drone races. My suggestion was rejected – we went bowling instead.
Well, I took matters into my own hands. I manage a team of engineers myself, and I was determined to do something a little different this year.

Step 1. Plan

Instead of the usual holiday lunch, I bought everyone on my staff a Cheerson CX-10 quadcopter.  These are tiny little things – only about 2.5″ across – and they’re great for flying around indoors.  Best of all, they only cost about $16 on Amazon. I gave them to everybody during a staff meeting and scheduled another “meeting” about 3 weeks later to give everyone time to practice.  Encourage your team to practice – most likely they’ll need it.

The original basic version of the Cheerson CX-10.
$16 on Amazon. 

Step 2. Execute

At this next “meeting,” we had two events on the agenda:

Obstacle course

We put 5 foam pads on the ground and on some tables.  Most of them were about 8″ x 8″, but one was about 4″ x 4″ for an extra challenge.  The goal was to take off and land on each of the foam pads and return to the starting point.  A successful landing on a large landing pad earned 50 points, or 100 points for the smaller one. We had a time limit of 3 minutes to complete the course.  Any crashes or other mishaps that required the pilot to touch the quadcopter (such as flipping it back upright) were penalized with -30 points.

Foam landing pad. 50 points!


Race

We set up a simple race course using a few poles as markers to make a loop.  We had heats with 4 quadcopters flying at a time.  Going around in a loop is harder than it looks.  In both heats, the winner was the one who managed to fly three laps without crashing.  Practice helps.


During the 3 weeks of practice, about 1/4 of my team managed to damage their quadcopters, so make sure that you have some spare parts available. Refer them to this repair guide.

Step 3. Profit

In addition to having a fun afternoon, I got to introduce a bunch of my coworkers to flying quadcopters, and we still fly them around sometimes (when no one else is watching).

Don’t forget this final critical step: Record your expenses and submit them under the category “Internal Meeting / Meal / Entertainment.”

Next year I think we’ll scale things up and go for a larger quadcopter we can fly outside – maybe a Hubsan X4.

3D Printed Spaceship Blade Guards for the Cheerson CX-10

If you’ve got access to a 3D printer, or are just looking for an excuse to use one, these little blade guards are a neat project.  Making your quadcopter look like a spaceship seems to be quite the thing; here are links to some examples.

Starship and Millennium Falcon propeller guards.  This Thingiverse project has designs for both the Hubsan Q4 and the Cheerson CX-10 – make sure you choose the right one!
Millennium Falcon blade guard for the Cheerson CX-10. I think this may have been the original design; the others may be variants.
Millennium Falcon Spaceship Cheerson CX-10 blade guard remix.
Cheerson CX-10 quadcopter wheels. This project is called “quadcopter wheels,” but it could pass for a rudimentary TIE Fighter too.



Cheerson CX-10 Repair Guide and Tips

For all of you who received Cheerson CX-10 mini-quadcopters as holiday gifts and are learning to fly, bookmark this page because you’ll need it soon.  Here’s a list of some common repairs – you might as well learn about it now.

FYI – The Cheerson CX-10 is also sometimes sold as the Cheerson Q4.  This repair guide also largely applies to the similar Cheerson CX-10A and Cheerson CX-10C models.

Also just for reference, here is the manual for the Cheerson CX-10.

0.05 Ounces of Prevention

If you received this as a gift:

Cheerson CX-10

and it didn’t come with a prop guard (which weighs 0.05 ounces / 1.5g), then you should definitely get one. It will dramatically reduce the number of broken propellers you have to deal with, particularly while you are first learning.

Blade Guard and 16 replacement propellers for Cheerson CX-10
2 blade guards for the Cheerson CX-10

Replacing the propellers on your CX-10

Removing the propellers: The propellers can be removed from the motor shaft by pulling them straight upwards.  Sometimes they are loose, but sometimes it can take a little force. A pair of wirecutters gently put around the prop can help pry off the more stubborn ones.  Just take care not to bend the motor shaft or nick it.

wirecutter for propeller removal

You will need to remove the propellers when they break, but even more often something will get tangled in them (in my house, that tends to be hair or carpet fuzz). Removing the propeller makes it easy to remove the offending tangle.

Proper Propeller Placement:

Cheerson CX-10 propeller placement
Arrows indicate motor rotation direction. Make sure you put the propeller on so that when it rotates it pulls the quadcopter up!

The key thing to remember when you put the propellers back is to match each propeller to its motor’s direction of rotation.  There are two types of propellers – some rotate clockwise, and some rotate counterclockwise. There is actually a letter code marking the propellers, but it’s easy enough to just remember the direction of rotation of the motor and make sure that when the propeller rotates, it pulls the quadcopter upwards.

Shorted out motor wires

This is a common issue. On one of my quadcopters, the motor wiring was worn and exposed on all four motors.  This is what it looks like:

Worn insulation and exposed wiring on Cheerson CX-10 motor

The problem is that the wiring will sometimes short against the metal body of the motor.

In my case, the quadcopter would still turn on and pair with the controller, but once you tried to fly it, the motors would spin for a half-second and then stop.  The LEDs would then start blinking as if it needed to be charged again (even though it had just been charged).

On another quadcopter, it would fly for a few seconds, and then fall out of the sky.  Amusing, but frustrating.

Here is the fix:

  1. Remove the propellers
  2. Unscrew the four screws in the bottom.
  3. Gently pry off the bottom white enclosure (note the clips on each end of the PCB)
  4. Take off the blue case

You should end up with something that looks like this:

CX-10 parts disassembled
CX-10 PCB, motors, and battery.

5. Fixing the insulation on the worn motors is tricky because added insulation might make the motor too thick to slide back into the case.  In one case, I just removed the motor and remounted it so that the exposed part faced away from the motor body.  In another case, I used tweezers to put in place a very small piece of black electrical tape to insulate the wire.

6. Put everything back together (be careful when replacing the propellers!)

Some other posts about the CX-10