War Games

Nov. 1, 2006
Increasingly, businesses are tapping commercial gaming technologies to enhance precision and realism for military training, simulation, and mission rehearsal systems.

By Courtney E. Howard

The lines are blurring between commercially available computer games and military training and simulation solutions. “We’re living a technology transfer,” says Paul Kruzsewski, chief technology officer at Engenuity Technologies in Montreal. Although seemingly disparate, the two technologies have a great deal in common, right down to the hardware and software.

From a technological perspective, great similarities exist between commercial gaming businesses and military training and simulation companies. “Their goals are different; their cultures might be different, but when you look under the hood at the tools and pipelines, they are almost identical,” Kruzsewski explains. “They are similar applications: one is for training and one is for entertainment, but a lot of the technology is shared.”

The needs of military training and simulation applications traditionally have been addressed by high-end, specialized software and proprietary hardware systems. Times have changed, however, and today’s training and simulation professionals are looking to access, incorporate, and accelerate the advanced technologies prevalent in commercial computer graphics.

“Army personnel working with very large and expensive simulators during the day were going home to see their kids playing on a Microsoft Xbox, Sony PlayStation, or desktop PC, and doing things that their simulators couldn’t even do, and for a much lower cost,” says Andrew Elvish, director of marketing communications at Engenuity.

Software engineers at Krauss-Maffei Wegmann (KMW), a provider of armored wheeled and tracked vehicle systems based in Munich and Kassel, Germany, are using Engenuity’s AI.implant middleware in the development of realistic military simulation solutions.

At the same time, computer gaming has been accessible to the general public for roughly 30 years. An entire generation has grown up completely digital; older gamers in their late 30s and early 40s are now decision-makers in the military, notes Kruzsewski. Today’s mid-level managers are accustomed to computer games, and they are trying to attract, recruit, and train high-school graduates, who largely play video games in their spare time. The latest recruits expect a training experience with the quality and immersion they are used to in games.

Military training and simulation solution providers are investigating the hardware and software innovations employed by the commercial computer graphics industry, with the goal of making more capable, higher-quality, and less expensive systems. Another goal is to develop and release new versions and innovations at a faster pace, something the commercial gaming industry does very well and the simulation industry wants to do better.

COTS computer graphics

In general, when it comes to real-time computer graphics solutions and technologies, the game industry is five years ahead of the simulation industry, notes Kruzsewski. “The game market is so large that technology development in that market is happening at a speed that we’re not seeing in the pure traditional simulation space,” he says. “The money being invested in gaming is such that the technology is accelerating there much more so than in the military space.”

Kruzsewski is not alone in that view. Commercial off-the-shelf (COTS) solutions have become dominant in military training and mission rehearsal applications, notes Ross Smith, president of Quantum3D in San Jose, Calif. “With few exceptions would you see someone not use COTS in today’s world. All of the key providers, such as Lockheed Martin, Boeing, CAE, FlightSafety, everyone, have gone COTS. It would take almost an act of Congress to get a non-COTS training device through an acquisition process today.”

Engenuity’s AI.implant artificial-intelligence engine aids developers in perfecting the reactions of crowds, ranging from an entire community affected by an incoming aircraft to select warfighters responding to a perceived threat.

The PC is the vehicle that has enabled COTS to take over, according to Smith. Military customers and their technology partners are on a quest for high-end visuals on standard off-the-shelf PC hardware. Rather than investing in specialized hardware that is costly to acquire and maintain, they want PC-based hardware and software that is commercially available and maintained by the independent software vendors.

The benefits of using COTS products in training and simulation applications abound. Among them are high product availability, low cost, open architectures for seamless integration, and advanced technologies with rapid refresh rates and easy upgradeability. In fact, the low cost and accessibility of the technology have brought about a phenomenon within the military training and simulation space. Gone are the days when training on a simulation system was reserved for the highest-ranking commanding officers, such as for strategic planning purposes. Technology is enabling simulation to proliferate across the ranks, down to warfighters on the front lines.

True to life

Commercial gaming companies are always striving to attain a higher level of realism and better immerse the user in the overall game experience. The same is true when it comes to the world of military training and simulation. In fact, the emphasis today in visual simulation for military purposes is on a higher level of realism, and with good reason.

“If you’re trying to emulate the behavior of an F-16 or an Abrams tank, the fidelity of that simulation has to match the real world,” Smith says. “Otherwise, the pilot or the tank crew doesn’t feel that it’s realistic, and they lose what’s called ‘suspension of disbelief’ and the training value goes down precipitously. Even worse, if you introduce artifacts into the training application that aren’t real, that negative training can be carried into the battlefield and cause people to make terrible mistakes. The whole purpose of training is to simulate to the degree that’s possible and essential what goes on in the real world. The ultimate quest is where you can’t distinguish the simulated world from the real world.”

Military training and simulation, in fact, requires far more realism than commercial games. “Customers are looking for simulations based on real, rather than just synthetic, data,” says Robert Kopersiewich, Engenuity’s director of product management. “We’re seeing the inclusion of operational data more and more in simulation. It’s about being able to take in data and making sure our products can interface with the real world, so that the training and the real world converge.”

Quantum3D technology simulates a cargo drop from an Airbus A400 cargo lifter while airborne. Particle effects mimic the dust produced on impact with the ground, while detachable entities allow the pallet to separate from the aircraft.

In fact, training and simulation can help prepare for potentially catastrophic real-world events. Users can infuse mission rehearsal systems with the latest forecasted hurricane data and, as a result, begin training immediately for a hurricane that is expected to hit in three days. Increasingly in defense and government applications, satellite data is being tapped before an operation, enabling everyone from warfighters to first responders to simulate a scenario in advance of the anticipated event. In this way, explains Kopersiewich, the use of real-world data contributes to the notion of just-in-time training.
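As a simple illustration of how forecast data might parameterize such a rehearsal scenario, consider the hypothetical sketch below; the class, fields, and scenario keys are illustrative assumptions, not any vendor’s actual format.

from dataclasses import dataclass

@dataclass
class HurricaneForecast:
    # Illustrative fields a forecast feed might carry (assumed for this sketch).
    name: str
    landfall_lat: float
    landfall_lon: float
    max_wind_kts: float
    hours_to_landfall: float

def build_rehearsal_scenario(forecast: HurricaneForecast) -> dict:
    # Map the forecast into environment settings a simulation engine could load.
    return {
        "scenario": f"evacuation-{forecast.name.lower()}",
        "weather": {"wind_kts": forecast.max_wind_kts, "visibility_nm": 1.0},
        "area_of_operations": (forecast.landfall_lat, forecast.landfall_lon),
        "start_offset_hours": -forecast.hours_to_landfall,  # begin rehearsing now, hours before landfall
    }

scenario = build_rehearsal_scenario(HurricaneForecast("Example", 29.9, -90.1, 95, 72))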

Just as operational data aids users in preparing for what is to come, it is also an effective tool for learning from past experiences. An Engenuity client running air-based simulations on the company’s STAGE end-to-end solution for developing simulation applications, for example, is taking full advantage of real-time information gleaned from fighter jets flying training exercises. That data is input into STAGE and contributes to the simulation scene that is generated by the software.
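In broad strokes, that kind of ingest amounts to replaying recorded aircraft state inside the generated scene. The sketch below is a generic, hypothetical example of turning flight-telemetry records into scripted entities; it does not use STAGE’s actual API, and the record fields are assumed for illustration.

import csv

def telemetry_to_entities(path: str) -> list[dict]:
    # Read recorded flight telemetry (time, position, altitude, heading) and
    # group it by aircraft so each track can be placed as a scripted entity.
    tracks: dict[str, list[tuple]] = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            tracks.setdefault(row["aircraft_id"], []).append(
                (float(row["t"]), float(row["lat"]), float(row["lon"]),
                 float(row["alt_ft"]), float(row["heading_deg"])))
    return [{"entity": aid, "type": "fighter", "waypoints": sorted(pts)}
            for aid, pts in tracks.items()]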

Whereas the Engenuity customer applies real operational data to train for certain real-world scenarios, clients of General Dynamics Information Technology in Fairfax, Va., use such information to learn from past mistakes.

General Dynamics Information Technology instruments vehicles and individual soldiers with various components, ranging from cameras to targets to control systems, so that their actions can be monitored during a mission rehearsal scenario on Bradley ranges, Stryker ranges, and the like. “We put cameras on the ranges and instrumentation on the vehicles themselves to collect real-time data off the internal bus of those vehicles,” explains Raymond Shepherd, senior director of integrated instrumentation at General Dynamics Information Technology. “In other words, you can sit in our control room and actually see what the gunner in a Bradley is seeing when he pulls the trigger.” Real-time video offers the ability to view the crew inside the vehicle, watch their actions and reactions, and listen to their conversations as they engage the enemy target.
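Conceptually, the instrumentation side reduces to stamping every message pulled off the vehicle bus with a common clock so it can later be lined up with range video. The snippet below is a generic, hypothetical sketch of that logging step; the record format, file name, and example event are assumptions, not details of the General Dynamics system.

import json
import time

def log_bus_message(log_file, source: str, payload: dict) -> None:
    # Stamp each bus message (a trigger pull, a turret azimuth, and so on) with
    # a shared timestamp so it can be aligned with range video afterward.
    record = {"t": time.time(), "source": source, "data": payload}
    log_file.write(json.dumps(record) + "\n")

with open("bradley_run3.events", "a") as log:
    log_bus_message(log, "gunner_station", {"event": "trigger", "azimuth_deg": 214.5})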

A primary purpose of General Dynamics Information Technology’s Digital Multipurpose Training Range is to assemble an after-action review for the crew or platoon at the completion of their scenarios or runs, Shepherd says. Those involved in training can play back the video, see how well they did on a particular run, and determine which areas need improvement. In this way, after-action review during simulation and training endeavors can contribute greatly to increased mission effectiveness, improved preparedness, and lives saved, today and into the foreseeable future.
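The after-action review then becomes a matter of pulling the recorded events back out and mapping them onto the video timeline. A minimal sketch, assuming the hypothetical event-log format from the previous example and a known video start time:

import json

def events_for_review(log_path: str, video_start: float, window_s: float = 30.0):
    # Return (video_offset_seconds, event) pairs for the first window_s seconds
    # of a run, so reviewers can jump straight to them during playback.
    hits = []
    with open(log_path) as f:
        for line in f:
            rec = json.loads(line)
            offset = rec["t"] - video_start
            if 0 <= offset <= window_s:
                hits.append((offset, rec))
    return sorted(hits, key=lambda pair: pair[0])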

Future finds

Commercial and military computer graphics, simulation, and training technologies continue to advance and deliver a better experience and more value to the user. The Department of Defense and its technology partners are readily applying these technologies in innovative ways. For example, companies are investing research and engineering time, money, and effort into making networking technology robust enough to support networked training in joint-force exercises. The goal is to have the Army, Marines, Navy, and Air Force all working together in a training scenario, contributing to the Department of Defense’s overall network-centric vision.

Another future endeavor, and one in which Shepherd credits the Army with having come a long way, is to combine virtual and physical training in a single, live experience. He envisions a platoon or company training on a live-fire range and the rest of the battalion in a simulation exercise, with the two tied together seamlessly. Such an accomplishment enables a company, for example, to exercise its command and control over a much larger force without putting that force at risk in the field, something the military as a whole has been moving toward for years, Shepherd continues.

The quest for increasing fidelity with real-world scenarios has brought about another emerging trend: embedded training. The philosophy behind embedded training is to integrate training and simulation technologies (providing command and control information, C4ISR information, synthetic vision, avionics, and vetronics) into the actual combat vehicle itself, rather than building simulators in a fixed training facility, or “schoolhouse,” as has been done traditionally.

Quantum3D portrays a cargo drop shown from the aircraft exterior. The company’s real-time Shadow plug-in for Mantis generates shadows under each entity. Volumetric clouds and cloud shadows are also active.

Smith says embedded training is an amalgamation of advanced embedded visual computing technology and sensor- and visual-simulation software. “If you take into account the long deployments soldiers have now, it’s impractical to bring them back for training,” Smith admits. “Having this kind of capability in the field in the actual vehicles they are going to use makes perfectly good sense and provides extremely high fidelity. When warfighters are deployed in Iraq or Afghanistan, for example, they are able to practice gunnery, driving, tank commanding, and more while in their actual tank.”

From a hardware perspective, advancements in mobile computing have facilitated the emergence of embedded training technologies: a notebook computer with a fast processor, considerable memory, and a powerful graphics chip, for example. Smith’s company, Quantum3D, currently harnesses those technologies in the construction of tactical computers, which lend themselves to being embedded in a vehicle.

Embedded training, although promising, is not ideal for all military training applications. Among the applications that are difficult to do in an embedded environment is flight simulation. “When I’m in the cockpit flying an F-16 or a helicopter, my field of view might be 180 degrees by 80 degrees,” Smith notes. “No single graphics chip on the planet can deal with that, not with the kind of fidelity that people need for a proper training environment.”
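A back-of-the-envelope calculation shows why. Assuming an eye-limiting resolution of roughly one arcminute per pixel (an assumption for illustration, not a figure from the article), a 180-by-80-degree field of view implies a pixel count far beyond any single graphics chip or display of the day:

# Rough pixel budget for a 180 x 80 degree out-the-window view at an assumed
# eye-limiting resolution of about 1 arcminute per pixel.
h_deg, v_deg = 180, 80
arcmin_per_pixel = 1.0
h_px = h_deg * 60 / arcmin_per_pixel   # 10,800 pixels across
v_px = v_deg * 60 / arcmin_per_pixel   # 4,800 pixels high
print(f"{h_px * v_px / 1e6:.1f} Mpixels")   # roughly 51.8 Mpixels per frame

Hence the multichannel approach described next: several image-generator channels, each driving its own projector or display, stitched into one continuous out-the-window scene.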

Such a feat requires several projectors, displays, and computers to drive it, all of which add power, weight, size, and cost. Four racks of equipment are necessary to power a high-fidelity image generator for the pilot and the weapons system officer in an F-15 Eagle jet fighter, with a power requirement of roughly 10,000 watts. A warfighter in a Bradley Fighting Vehicle, by contrast, is effectively looking through portholes and at displays, calling for approximately 120 watts. That is roughly a factor-of-100 difference between the power required for an aviation simulator and for a ground-vehicle trainer, Smith says.
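Using the article’s own wattage figures, the ratio works out close to the factor Smith cites:

# Power ratio between an aviation image generator and a ground-vehicle channel,
# using the wattages quoted in the article.
aviation_watts = 10_000   # F-15 pilot and weapons-system-officer image generator
ground_watts = 120        # Bradley porthole/display channel
print(round(aviation_watts / ground_watts))   # about 83, on the order of 100x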

A ship is shown underway, with Quantum3D’s Advanced Ocean plug-in generating the sea surface. Wake and bow-spray effects are active and adjust dynamically based on forward velocity, pitch, roll, and yaw. A particle effect is used for the ship’s stack exhaust plume. A view of the same ship is shown in infrared mode (bottom), using the Quantum3D Vixsen Sensor Simulation plug-in to Mantis and Quest Sensor Simulation postprocessing hardware. The heat signature is generated by the exhaust-plume particle effect. The terrain is Quantum3D’s Southern California demonstration database.

Embedded training is useful not only in maintaining warfighter skills through practice, but also in upgrading their skills by teaching them new strategies. “Let’s say that the Army discovers a new tactic to deal with improvised explosive devices (IEDs),” Smith suggests. “You can’t send people back to Fort Knox to practice it, and you certainly don’t want them to learn it on the road in Fallujah. With embedded training, it’s with them all the time; whenever they are not on a mission, they can practice for the next mission. If they are going to conduct a mission that they’ve never done before, with an embedded trainer they can practice that mission before they attempt it, and do so in theater.”

Embedded training technologies enable personnel to learn and practice a new set of lessons, which can be sent to vehicles equipped with embedded trainers. This trend enables remote, just-in-time training, as well as time and cost savings from not having to transport personnel to physical training, simulation, and mission rehearsal facilities. “Users can import new lessons into the embedded trainer easily, and hopefully save lives and improve mission effectiveness in the process.”
