VIEWPOINT: Choosing the right high-speed imaging system for military applications

May 1, 2006

By Andrew Bridges

With a wide range of systems currently available, it can be difficult to select a slow-motion imaging system for a particular military application. Whether checking the sabot separation from the latest tank-killing shell or recording a missile launch, here are the performance parameters and specifications that users should consider before purchasing a slow-motion imaging system.

Today’s high-speed video cameras operate over a wide range of frame rates, from 60 frames per second (fps) to 250,000 fps. All high-speed video cameras operate at full resolution up to a certain speed, then reduce resolution (a technique known as windowing) to achieve faster speeds.

When recording a cyclical event that takes place a certain number of times per second, the camera generally must capture a minimum of three images per cycle for the user to view and understand the phenomenon. If a chain gun fires 600 rounds per minute (or 10 rounds per second), for example, the camera must record the firing process at a minimum of 30 frames per second to capture the action for easy viewing.
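
As a rough illustration, that rule of thumb reduces to a one-line calculation; the function name below is hypothetical, not a camera API.

```python
# Minimal sketch (not from the article): minimum frame rate for a cyclical event,
# using the three-images-per-cycle rule of thumb described above.

def min_frame_rate(cycles_per_second: float, images_per_cycle: int = 3) -> float:
    """Minimum camera speed (fps) needed to resolve a cyclical event."""
    return cycles_per_second * images_per_cycle

rounds_per_second = 600 / 60                 # chain gun: 600 rounds per minute
print(min_frame_rate(rounds_per_second))     # 30.0 fps, matching the figure above
```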

A high-speed digital camera captures images of an Atlas missile launch.

With a noncyclical event such as a missile launch, careful planning is critical for capturing the action at the most significant moment. It is important to determine what temporal detail must be measurable in the finished image sequence or output video. For example, if a projectile is traveling at 500 meters per second (the Sidewinder missile easily exceeds this) and there is a 100-meter field of view (FOV), the projectile will pass through the image window in one-fifth of a second, or 200 milliseconds.

However, if you need to capture 100 frames within this 100-meter FOV, you will need a camera that can take an image every 2 milliseconds, or 500 fps. If the FOV shrinks to 10 meters (still a relatively large FOV) with all other criteria remaining the same, the camera will require 10 times the speed (5,000 fps) to capture the same 100 frames.
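
The same reasoning can be sketched in a few lines; the function and parameter names are illustrative only.

```python
# Illustrative sketch: frame rate needed to capture a given number of frames
# while a projectile crosses the field of view.

def required_fps(velocity_m_per_s: float, fov_m: float, frames_wanted: int) -> float:
    time_in_view_s = fov_m / velocity_m_per_s    # how long the projectile stays in frame
    return frames_wanted / time_in_view_s

print(required_fps(500, 100, 100))   # 500.0 fps with a 100-meter FOV
print(required_fps(500, 10, 100))    # 5000.0 fps when the FOV shrinks to 10 meters
```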

Frame rate will determine how many images the user will see, regardless of whether it is a cyclical or single event.

Record duration

Record duration simply refers to how much of the event (in seconds) you need to record. Currently, all high-speed video cameras use onboard digital random access memory (RAM) to save images. It is possible to “push” or extend record duration by reducing speed or resolution, but in essence you have to determine how long you need to record. Though costs for RAM have recently come down, it is still costly to load a PC-based system such as the 1024 PCI with 24 gigabytes of RAM.
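
As a back-of-the-envelope sketch, record duration can be estimated from memory size, image size, and frame rate; the figures below (1,024 x 1,024 pixels, 8-bit monochrome, 1,000 fps) are assumptions for illustration, not the capacity of any particular system.

```python
# Rough sketch of how onboard RAM bounds record duration.

def record_duration_s(ram_gb: float, width: int, height: int,
                      bytes_per_pixel: float, fps: float) -> float:
    frame_bytes = width * height * bytes_per_pixel
    total_frames = (ram_gb * 1024**3) / frame_bytes
    return total_frames / fps

# 24 GB of RAM, 1,024 x 1,024 pixels, 8-bit monochrome, 1,000 fps (illustrative numbers)
print(round(record_duration_s(24, 1024, 1024, 1, 1000), 1))   # about 24.6 seconds
```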

If the event occurs intermittently, the user must trigger the camera to capture the appropriate action. Today’s digital high-speed cameras can remain in record mode almost indefinitely as they cycle data through their memory buffers on a first in, first out (FIFO) basis.

When a buffer is full, the first image recorded is automatically overwritten. The system continues to overwrite data until it receives a trigger signal, such as an optical or audio trigger, switch closure, or digital TTL trigger (for example, a fire command or keyboard keystroke). Depending upon how the operator configures the system, it can save all images recorded before the trigger signal is received, save everything only after the signal comes in, or save a variable percentage of pre- and post-trigger images.
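
The behavior can be sketched in software terms as a simple ring buffer; the class below is purely illustrative of the idea, not how any camera’s firmware is written.

```python
# Minimal sketch of the circular (FIFO) buffer and trigger behavior described above.
# Real cameras implement this in hardware; names here are hypothetical.
from collections import deque

class RingBufferRecorder:
    def __init__(self, capacity_frames: int, pretrigger_fraction: float = 0.5):
        self.buffer = deque(maxlen=capacity_frames)   # oldest frames overwritten first
        self.capacity = capacity_frames
        self.pretrigger_fraction = pretrigger_fraction
        self.post_needed = None                       # frames still to record after trigger

    def record(self, frame):
        self.buffer.append(frame)
        if self.post_needed is not None:
            self.post_needed -= 1

    def trigger(self):
        # Keep the pre-trigger share of the buffer, then fill the remainder
        # with post-trigger frames before stopping.
        self.post_needed = int(self.capacity * (1 - self.pretrigger_fraction))

    def done(self) -> bool:
        return self.post_needed is not None and self.post_needed <= 0

rec = RingBufferRecorder(capacity_frames=1000)        # e.g., one second at 1,000 fps
for i in range(5000):
    rec.record(i)
    if i == 3000:
        rec.trigger()                                 # fire command, TTL pulse, keystroke...
    if rec.done():
        break
# rec.buffer now holds roughly 500 pre-trigger and 500 post-trigger frames
```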

Sophisticated systems should offer the ability to record a pre-determined number of frames whenever the system receives a trigger signal. For example, if an artillery piece fires a round every three seconds, you can record multiple firings of perhaps 64 frames without having to stop to download.

Furthermore, you can partition the available memory into as many as 64 segments. This enables users to make many short recordings without having to stop to offload a full buffer of images. Some systems can automatically download some or all of the saved images to a networked hard drive, perhaps thousands of miles away, before automatically rearming to await the next trigger signal.
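
As a quick illustration of partitioning, dividing an assumed frame budget across 64 segments looks like this (the total frame count is an assumption, not a quoted specification).

```python
# Simple arithmetic sketch of memory partitioning.
total_frames_in_ram = 8192                # illustrative figure only
segments = 64
print(total_frames_in_ram // segments)    # 128 frames available per trigger/segment
```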

Spatial resolution

New developments in motion-tracking algorithms enable motion analysis software to track accurately to about one tenth of a pixel, but whenever possible, you should have the full quota of pixels necessary to discern what you are viewing.

To achieve the desired framing rate (camera speed), you may be forced to sacrifice some resolution, as it is currently impossible to record megapixel-resolution digital images at 10,000 fps.

Because all high-speed video cameras reduce resolution to achieve higher speeds, users should determine what the pixel resolution is at the speed required before selecting a camera. However, if your field of view can be covered by fewer pixels, or by a window with irregular dimensions (such as 1,024 pixels wide by 624 high), it is possible either to increase the record speed (to 5,000 fps on the APX-RS) or to increase the record duration, improving the chances of covering an unpredictable event.
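
As a hedged sketch, assuming a camera with a roughly fixed pixel-throughput budget, the trade-off between window size and speed can be estimated as follows; the 1,024 x 1,024 at 3,000 fps starting point is an assumption, not a quoted specification.

```python
# Many high-speed cameras have an approximately fixed pixel-throughput budget,
# so a smaller window permits a higher frame rate. Numbers are illustrative.

def max_fps(full_width: int, full_height: int, full_speed_fps: float,
            window_width: int, window_height: int) -> float:
    pixel_throughput = full_width * full_height * full_speed_fps
    return pixel_throughput / (window_width * window_height)

print(round(max_fps(1024, 1024, 3000, 1024, 624)))   # roughly 4,900 fps with a 1,024 x 624 window
```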

Bit depth: monochrome and color

Bit depth refers to how many shades of gray a sensor uses to represent the range from pure black to pure white. Older systems use 8 bits, or 256 shades; newer systems offer either 10 bits (1,024 shades) or 12 bits (4,096 shades).

For the most part, 8 or 10 bits are more than enough, given that standard Microsoft Windows displays render only 8 bits per channel. To appreciate an additional two or four bits, you would need to invest in specialized, expensive hardware and displays. However, the additional bits can be useful in advanced image-processing and analysis applications.

Most high-speed systems are available in both monochrome and color versions. Both use the same basic monochrome sensor, but the color version adds a color filter array. Most systems adopt a color matrix known as a Bayer pattern to produce acceptable-looking colors from what is in reality a black-and-white sensor.

This interpolated color assigns three values to every pixel (one each for red, green, and blue), which is why color images have three times the bit depth: 24 bits versus 8, or 30 versus 10. If you do not have a critical need for color images, it is best to select a monochrome system, as these tend to be less expensive and significantly more sensitive while providing comparable image quality.
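
A quick, illustrative comparison of per-frame storage shows why that tripling matters; note that the raw Bayer data off the sensor remains a single channel, and the factor of three applies to the interpolated color output.

```python
# Illustrative only: per-frame storage for monochrome output versus the same
# image interpolated to three color channels.

def frame_megabytes(width: int, height: int, bits_per_channel: int, channels: int) -> float:
    return width * height * channels * bits_per_channel / 8 / 1e6

print(frame_megabytes(1024, 1024, 8, 1))   # ~1.05 MB per 8-bit monochrome frame
print(frame_megabytes(1024, 1024, 8, 3))   # ~3.15 MB once interpolated to 24-bit color
```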

Shuttering and light sensitivity

Shutter speed is often confused with framing rate, but they are in fact two separate parameters. If a high-speed camera is recording at 1,000 frames per second, ideally it is gathering light (exposing the sensor) for one-thousandth of a second. However, with complex digital gating electronics, the actual time the sensor is exposed to light can be reduced to microseconds.

Maintaining the proper ratio of frame rate to shutter exposure time is crucial to avoid blurring, a very important consideration when working with high-speed events. For example, if the velocity of a projectile is known, it is relatively easy to calculate the maximum acceptable shutter time to avoid blur. This is especially important when using a high-speed camera fitted with automatic exposure control, because an incorrectly programmed system can cause the camera to increase exposure time when ambient lighting drops below a certain threshold.
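
As a hedged sketch of that calculation, assuming the field of view and sensor width are known, the maximum exposure follows directly; the names and numbers below are illustrative.

```python
# The longest usable exposure is the time in which the projectile moves no more
# than an acceptable number of pixels across the image.

def max_shutter_s(velocity_m_per_s: float, fov_m: float,
                  image_width_px: int, max_blur_px: float = 1.0) -> float:
    meters_per_pixel = fov_m / image_width_px
    return max_blur_px * meters_per_pixel / velocity_m_per_s

# 500 m/s projectile, 10-meter FOV across 1,024 pixels, one pixel of blur allowed
print(max_shutter_s(500, 10, 1024))   # about 2e-05 s, i.e. roughly 20 microseconds
```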

Downloading data

If users need to save an image sequence for later review or analysis, there are a few options.

You may be able to use the camera’s standard (RS-170) video output to connect to a frame grabber or good old-fashioned VCR. Most PCs, including laptops, will contain Ethernet, FireWire, or USB ports to connect to external devices. You should be able to download images directly into a recognized and usable format (for example, AVI, JPEG, TIFF).

It is best to stay away from protocols requiring specialized hardware unless your requirements demand them. For example, if you are filming explosives or projectiles and need to operate a camera from several miles away, downloading images becomes more complex, and some sort of telemetry or onsite download capability will likely be required.

In hostile environments, clients are often concerned with quickly offloading data to prevent it from being lost (such as in the event shrapnel or other foreign object debris penetrates the camera’s enclosure). Unfortunately, onboard flash memory tends to offer little capacity and fairly long download times, and is just as likely as the camera to be “rearranged.”

Some systems require post-mission file conversion (which can be very frustrating, especially if a client is waiting anxiously), or can download quickly to a modified controller but then take days to transfer into the “real world.” PC-based systems, by contrast, can record at 1,000 fps directly inside the PC.

The system downloads directly to the hard drive (or networked hard drive) via the computer’s PCI bus for the fastest transfer. When testing different cameras, be sure to see how long it takes to access real data in a format and media type you can use.

System housing

What type of physical package does your application require? There are a wide variety of systems available, from inexpensive, low-resolution plastic units to huge systems built specifically for long record times (for example, for covering a missile’s launch or reentry into the Earth’s atmosphere). Some PCI systems use lower cost CCD or supersensitive megapixel CMOS sensors and have been adapted for use in personal computers or with laptops. More complex systems require housing that is engineered to operate reliably in the most hostile environments.

For systems that are operated remotely by computer, it is essential to become familiar with the software that is supplied with the camera. The software should be easy to use and intuitive. Some manufacturers will supply their systems with an SDK (Software Developer’s Kit) to enable advanced users to develop their own interface or to integrate camera control into an existing custom interface.

Andrew Bridges is with Photron Inc. in San Diego (www.photron.com).
