Physics-based radar simulation

Real-time, high-fidelity visual scene simulation is becoming ubiquitous in the training, modeling, and simulation community. With this trend comes a growing need for more than simple "out-the-window" scene simulation. Full training, modeling, and simulation require simulation systems designers to include actual sensors to bring complete realism to an exercise.

By Chris Blasband, William Jorch, and Mark Sigda

Photon Research Associates Inc.

As radar becomes increasingly sophisticated - and test flights more and more expensive - the need grows to simulate radar performance as well as to simulate and test pilot and radar operator performance. To do this, simulation systems designers need the ability to simulate the output of different types of radar - especially imaging radars such as synthetic aperture radar (SAR), Doppler beam sharpening, and real beam ground map (RBGM).

Engineers from Photon Research Associates in San Diego have developed a real-time simulation of imaging radar that operates against synthetic environments. They base this new technology on modeling true physics rather than on statistical representations of a scene. First, the physics-based approach maps scene materials such as dirt, snow, concrete, and metal to radar cross section values; then it uses those values as inputs to the simulated sensor display.

RadarWorks, a module in the Vega environment from MultiGen-Paradigm Inc. in Dallas, generates radar scenes and simulates specific radar displays. The first release of RadarWorks includes two imaging radar modes: one for SAR and the other for RBGM.

RadarWorks is a commercial off-the-shelf (COTS) package with two main software components: RadarVision and the sensor model. RadarVision generates, in real time, a pixelized radar cross section map for frequencies from 1 to 27 GHz and for four polarization pairs - VV, VH, HV, and HH. The sensor model emulates the actual radar device; it currently models one imaging radar mode, RBGM.

RadarWorks is available as a custom development system or as a run-time library. Since RadarWorks is physics-based, it uses the Texture Material Mapper to generate material textures from all of the visual database RGB textures. RadarWorks replaces the photo textures in the 3-D polygonal database with the material textures. The system uses material information, radar/polygon geometry, and radar parameters to generate radar cross section values. The radar cross section values come from a table lookup and interpolation on the natural-materials radar cross section database and cultural features database.
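In outline, that lookup-and-interpolation step works like the following sketch, assuming a table of mean backscatter coefficients gridded by incident angle. The material names, grid points, and coefficient values here are illustrative, not taken from the actual databases:

```python
import numpy as np

# Illustrative backscatter tables: mean backscatter coefficient (dB) per
# material, gridded by incident angle in degrees. Values are invented for
# demonstration; the real database also spans frequency and polarization.
ANGLES = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
BACKSCATTER_DB = {
    "soil":    np.array([-5.0, -8.0, -12.0, -17.0, -25.0]),
    "asphalt": np.array([-8.0, -12.0, -18.0, -24.0, -32.0]),
}

def backscatter(material: str, incident_angle_deg: float) -> float:
    """Linearly interpolate the tabulated coefficient at the given angle."""
    return float(np.interp(incident_angle_deg, ANGLES, BACKSCATTER_DB[material]))

def texel_backscatter(fractions: dict, incident_angle_deg: float) -> float:
    """Blend a texel's material fractions in linear power space, then
    convert the blended return back to decibels."""
    linear = sum(f * 10.0 ** (backscatter(m, incident_angle_deg) / 10.0)
                 for m, f in fractions.items())
    return 10.0 * np.log10(linear)

# A texel that is 70% soil and 30% asphalt, viewed at 30 degrees incidence.
sigma = texel_backscatter({"soil": 0.7, "asphalt": 0.3}, 30.0)
```

The blend is done in linear power rather than in decibels because radar returns add as power, not as logarithms.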

The RadarWorks sensor model then uses the radar cross section map generated from this process. The sensor model generates a radar display by convolving the radar cross section clutter map with user-defined values for radar parameters such as beamwidth, range bin size, antenna pattern, and signal-processing information.
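The convolution step can be sketched as smearing each range row of the clutter map with an azimuth antenna pattern. The Gaussian beam shape and bin sizes below are assumptions for illustration, not the RadarWorks sensor model itself:

```python
import numpy as np

def gaussian_beam(half_beamwidth_bins: float, n: int) -> np.ndarray:
    """Illustrative antenna pattern: a Gaussian over azimuth bins,
    normalized to unit sum so total returned power is preserved."""
    x = np.arange(n) - n // 2
    pattern = np.exp(-0.5 * (x / half_beamwidth_bins) ** 2)
    return pattern / pattern.sum()

def smear_clutter_map(rcs_map: np.ndarray, pattern: np.ndarray) -> np.ndarray:
    """Convolve each range row (axis 1 = azimuth) of the RCS clutter map
    with the azimuth antenna pattern; this models the loss of azimuthal
    resolution imposed by a finite beamwidth."""
    return np.apply_along_axis(
        lambda row: np.convolve(row, pattern, mode="same"), axis=1, arr=rcs_map)

clutter = np.zeros((4, 9))     # 4 range bins x 9 azimuth bins
clutter[:, 4] = 1.0            # a single bright scatterer per range bin
display = smear_clutter_map(clutter, gaussian_beam(1.5, 7))
```

The point scatterer in each row spreads into a beam-shaped blob, which is exactly the azimuth smearing a real-beam display exhibits.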

RadarWorks is built on top of Vega, IRIS Performer, and OpenGL and is currently designed to operate on Silicon Graphics Onyx computers with InfiniteReality or RealityEngine2 graphics hardware.

RadarWorks generates an "ideal" SAR image by developing a pixelized radar cross section map based on user-selected ground squint angle and radar incident angle. The SAR image is "ideal" because it lacks specific sensor effects such as defocusing and motion compensation. RadarWorks also simulates an RBGM display, as a radar operator sees it, in real time as the user flies through the Vega scene. The user selects an imaging radar mode and specifies important RBGM radar parameters such as frequency, polarization, beamwidth, minimum/maximum range extent, minimum/maximum scan angle, and scan time.

Vega is MultiGen-Paradigm's software environment for real-time visual simulation and virtual reality; it provides for rapid prototyping, building, editing, and running of applications. The software runs on Silicon Graphics workstations with the IRIX operating system version 6.2 or above, as well as on Windows NT-based PCs and workstations.

MultiGen-Paradigm software engineers build the Silicon Graphics version of Vega on IRIS Performer, IRIS GL, and OpenGL, and base the Windows NT version on the company's own rendering engine and OpenGL. Vega has two primary components: the Vega development libraries and the LynX graphical user interface. Vega also has several graphics tools, a C application programming interface, sample databases, and support for all optional Vega modules. Using Vega, the user may load and manipulate a database, create and move observers and players, and control the visual and audio environment.

RadarWorks, meanwhile, uses the same 3-D polygonal databases as Vega. Vega, while running on the Silicon Graphics, supports all visual database formats for which there is a Performer loader. In addition to the standard Vega input database of polygons and textures, RadarWorks uses the following supplemental inputs:

- Material-coded textures. These are generated with Paradigm's Texture Material Mapper, with which the user creates material maps for each Vega photographic texture by converting each texel from RGB colors into a linear combination of three materials. Materials can be natural (tree, dirt, snow, etc.) or man-made. The user can easily expand the database describing these materials.

- Database of mean backscatter coefficients for natural terrain. RadarWorks uses an extensive terrain database from validated measurement programs, updated at regular intervals, to generate a high-resolution, real-time radar scene for all radar frequencies, polarizations, and incident angles. It contains all of the backscatter coefficients from different terrain types (soil, rock, asphalt, etc.) as a function of the radar parameters. Photon Research obtained this database from Artech House Inc. in Norwood, Mass.

- Database of radar cross section values for cultural features. This database gives radar cross section as a function of frequency, polarization, and geometry for each cultural feature in the scene. RadarWorks comes with a database of sample cultural features, such as bridges, water towers, and buildings, as a function of frequency, polarization, incident angle, and azimuth angle. The format of the cultural feature database is documented so the user can incorporate any measured or modeled data desired.

Paradigm's Texture Material Mapper enables the user to transform all scene database textures into the material codes needed at runtime. Paradigm's infrared simulation software environment, SensorVision, uses this material information to render spatially variant radiance on scene polygons. RadarWorks uses this information to render proper radar cross section values on scene polygons.

In principle, the scene database required for sensors is no different from any other scene database that Vega uses, except in one important point - sensors require knowledge of what real materials are associated with the scene. This knowledge is vital to accurate quantitative rendering for all sensor simulations.

The user classifies textures with Texture Material Mapper to specify, for example, that one kind of green is painted metal and another green is vegetation. Once the user maps each texture in a scene database to real materials, those mapping assignments can be reused for any simulation that uses those textures.
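One plausible way to decompose a texel's RGB color into a linear combination of materials is a constrained least-squares fit against nominal material colors. The signature values below are invented for illustration; this is a sketch of the idea, not Texture Material Mapper's actual method:

```python
import numpy as np

# Assumed nominal RGB signatures (0..1 scale) for three materials.
MATERIALS = ["vegetation", "soil", "painted_metal"]
SIGNATURES = np.array([
    [0.20, 0.45, 0.15],   # vegetation: green-dominant
    [0.50, 0.40, 0.30],   # soil: brownish
    [0.30, 0.60, 0.35],   # painted metal: a different green
]).T                      # shape (3 channels, 3 materials)

def material_weights(rgb: np.ndarray) -> np.ndarray:
    """Least-squares material fractions for one texel, clipped to be
    non-negative and renormalized to sum to one."""
    w, *_ = np.linalg.lstsq(SIGNATURES, rgb, rcond=None)
    w = np.clip(w, 0.0, None)
    return w / w.sum()

# A texel whose color exactly matches the vegetation signature.
w = material_weights(np.array([0.20, 0.45, 0.15]))
```

The clip-and-renormalize step keeps the result a valid linear combination even when the raw fit produces small negative weights.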

SensorVision and RadarWorks read the same Texture Material Mapper files, so the user needs only to classify a scene database with Texture Material Mapper once. SensorVision uses the optical properties data for the materials and RadarWorks uses the electromagnetic properties.

The RadarVision portion of RadarWorks computes the radar cross section of each pixel based on the radar/polygon geometry, frequency, and polarization of the radar and material of every texel in the scene. It determines the radar cross section through table lookup and interpolation on the supplemental databases.

RadarVision correlates with the Vega visual "out-the-window" view because it uses the same 3-D geometric representation of a scene and the same RGB textures as Vega. There is true geometric and spatial correlation because both products start from the same database.

The output of RadarVision is quantitative, in that it expresses each pixel in units of dBsm. Quantitative results allow the user to examine the true magnitude of the radar return from all objects in a scene. RadarVision computes radar shadowing from terrain and cultural features in real time. It computes shadows via IRIS Performer, thus making use of the Silicon Graphics hardware. This guarantees that geometry-dependent shadows are computed in real time for every object as the user flies through the Vega scene.
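Expressing each pixel in dBsm is a plain decibel conversion of radar cross section relative to one square meter:

```python
import math

def to_dbsm(rcs_m2: float) -> float:
    """Radar cross section in square meters -> decibels relative to 1 m^2."""
    return 10.0 * math.log10(rcs_m2)

def from_dbsm(dbsm: float) -> float:
    """Inverse conversion: dBsm back to square meters."""
    return 10.0 ** (dbsm / 10.0)

# A 1 m^2 target is 0 dBsm; a 100 m^2 return is 20 dBsm.
one_meter = to_dbsm(1.0)
building = to_dbsm(100.0)
```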

Additional features in RadarVision include:

- polygons that are material textured and spatially correlated with Vega RGB textures;

- a flexible database format that allows input of user-defined measured or modeled radar cross section data;

- no restriction to SAR-type trajectories; the user can fly freely to produce accurate radar cross section maps based on real-time geometry computations; and

- user-defined radar resolution down to the sub-meter level.

RadarWorks 1.0 includes RBGM mode, which simulates the mapping-radar mode in which the screen displays returns from the antenna with no signal processing. The amplitude of the signal modulates the brightness or color of the display. The azimuth and range of a pixel on the screen relate directly to the azimuth and range of the objects that reflect the radar energy. The azimuthal beamwidth of the antenna limits the azimuthal resolution of the display.

The RadarWorks RBGM mode uses the RadarVision radar cross section map as input. As the user flies around the area of interest, RadarVision updates the radar cross section map based upon the frequency and polarization of the radar and the radar/scene geometry. RadarWorks convolves the RBGM antenna pattern with the RadarVision scene, based upon user-defined quantities, and performs the proper range/azimuth cell integrations to produce a true RBGM plan position indicator (PPI) display that updates in real time.
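Placing a range/azimuth cell on a PPI screen is a polar-to-Cartesian mapping. The screen layout below (ownship at bottom center, zero azimuth straight up, square display) is one common convention, not necessarily the RadarWorks display geometry:

```python
import math

def ppi_pixel(range_m: float, azimuth_deg: float,
              max_range_m: float, screen_px: int) -> tuple:
    """Map a range/azimuth return to PPI screen coordinates, with the
    ownship at the bottom center and zero azimuth pointing straight up.
    Positive azimuth is to the right of aircraft center."""
    r = range_m / max_range_m * (screen_px - 1)   # radial distance in pixels
    a = math.radians(azimuth_deg)
    x = (screen_px - 1) / 2 + r * math.sin(a)
    y = (screen_px - 1) - r * math.cos(a)         # screen y grows downward
    return round(x), round(y)

# A return dead ahead at maximum range lands at the top center of a
# 513-pixel display; a zero-range return sits at the ownship position.
```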

Since the RBGM mode does not involve any radar signal processing, the RadarWorks user inputs describe only real characteristics of the radar, including antenna beamwidth, range resolution, and scan angles. User inputs, or controllable parameters, for RBGM include:

- 3dB beamwidth of the antenna;

- scan angle to the left and right of aircraft center;

- radar incident angle (antenna tilt);

- range bin size (range resolution);

- beginning (minimum) and ending (maximum) range for the map (scan volume size and center);

- azimuthal increment at which to step the beam for developing the map; and

- scan speed.
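These inputs map naturally onto a small configuration record. The field names below are hypothetical and do not reflect the actual RadarWorks API:

```python
from dataclasses import dataclass

@dataclass
class RBGMParams:
    """Illustrative container for the RBGM user inputs listed above;
    field names are hypothetical, not the RadarWorks interface."""
    beamwidth_3db_deg: float      # 3 dB antenna beamwidth
    scan_left_deg: float          # scan angle left of aircraft center
    scan_right_deg: float         # scan angle right of aircraft center
    incident_angle_deg: float     # radar incident angle (antenna tilt)
    range_bin_m: float            # range bin size (range resolution)
    min_range_m: float            # beginning (minimum) range for the map
    max_range_m: float            # ending (maximum) range for the map
    azimuth_step_deg: float       # azimuthal increment for stepping the beam
    scan_speed_deg_per_s: float   # scan speed

    def beam_positions(self) -> int:
        """Number of beam positions in one scan of the map."""
        span = self.scan_left_deg + self.scan_right_deg
        return int(span / self.azimuth_step_deg) + 1

p = RBGMParams(3.0, 45.0, 45.0, 10.0, 30.0, 1000.0, 40000.0, 0.5, 60.0)
```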

Doppler beam sharpening (DBS) mode will come in the second release of RadarWorks. In DBS, the radar transmits several pulses of energy at each beam position and processes them so that reflecting objects within the antenna beam are separated azimuthally based on their Doppler frequency. Engineers expect DBS to include the current RBGM controls plus additional parameters such as PRF and FFT size.
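The Doppler separation at one beam position can be sketched as an FFT across slow time, taking one sample per pulse for a single range bin. The PRF, FFT size, and scatterer Doppler frequencies below are illustrative:

```python
import numpy as np

def doppler_bins(pulse_samples: np.ndarray) -> np.ndarray:
    """Magnitude spectrum across slow time; each output bin groups returns
    by Doppler frequency, which DBS then maps to azimuth within the real
    antenna beam."""
    return np.abs(np.fft.fftshift(np.fft.fft(pulse_samples)))

prf, n = 1000.0, 64                    # illustrative PRF (Hz) and FFT size
t = np.arange(n) / prf                 # slow-time sample instants
# Two scatterers inside the same real beam at different Doppler frequencies
# (125 Hz and -250 Hz), so they land in different azimuth cells after the FFT.
echo = np.exp(2j * np.pi * 125.0 * t) + 0.5 * np.exp(2j * np.pi * -250.0 * t)
mag = doppler_bins(echo)
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / prf))   # bin frequencies
```

Both tones fall exactly on FFT bins here (the bin spacing is PRF/FFT size = 15.625 Hz), so each scatterer produces a single clean peak.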

True SAR mode, also planned for the second release of RadarWorks, addresses the fact that a real antenna's azimuthal resolution is limited by the size of its aperture, where larger apertures yield improved resolution. RadarWorks will simulate SAR, which synthetically creates a large aperture by successively transmitting pulses and collecting data (as the aircraft flies) as if the operation were taking place at each element of a very long antenna.
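The aperture/resolution trade can be made concrete with the standard approximations: real-beam azimuth resolution of roughly lambda * R / D for a physical aperture D, versus roughly lambda * R / (2 * L) for a synthetic aperture of length L. The numbers below are illustrative:

```python
def real_beam_resolution(wavelength_m: float, range_m: float,
                         aperture_m: float) -> float:
    """Approximate azimuth resolution of a real antenna: lambda * R / D."""
    return wavelength_m * range_m / aperture_m

def sar_resolution(wavelength_m: float, range_m: float,
                   synthetic_aperture_m: float) -> float:
    """Approximate azimuth resolution of SAR: lambda * R / (2 * L)."""
    return wavelength_m * range_m / (2.0 * synthetic_aperture_m)

# X-band (3 cm wavelength) at 30 km: a 1 m physical antenna versus a
# 300 m synthetic aperture flown out by the aircraft.
rb = real_beam_resolution(0.03, 30000.0, 1.0)
sar = sar_resolution(0.03, 30000.0, 300.0)
```

The hundreds-of-meters real-beam cell versus the meter-class SAR cell is exactly the gap between an RBGM display and a SAR map.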

User inputs will describe the size and location of the desired map as well as pertinent radar characteristics such as wavelength. The creation of SAR maps is subject to errors from a variety of sources; the most significant of these will be exposed to the user through parameters that describe the errors and their compensation.

True SAR mode is still in design, but engineers expect its controls to include the location of the center of the SAR map, the range extent of the SAR map, range/azimuth resolution, a quantitative description of the error introduced by each of several error sources (motion compensation, defocusing, etc.), update rates, and PRF.

The user interfaces with RadarWorks in the same way as with Vega. The Vega LynX graphical user interface or a C application programming interface can generate RadarWorks Application Definition Files. The RadarWorks developer's package includes a sample application program demonstrating the C API.

Chris Blasband, William Jorch, and Mark Sigda are engineers at Photon Research Associates Inc., 5720 Oberlin Drive, San Diego, Calif., 92121. They can be reached by e-mail at cbb@photon.com, or by phone at 619-455-9741.
