Military eyes new Fakespace virtual environment

Jan 1st, 1997

By J.R. Wilson

ORLANDO, Fla.-Fakespace Inc. of Mountain View, Calif., is designing a new projected-image display system that has broad potential application in military and aerospace simulation and training, system design, and mission rehearsal.

The Immersive Workbench from Fakespace (http://www.fakespace.com) displays computer-generated 3-D imagery that offers, quite literally, hands-on interaction: viewers can reach into the display and pick up, move, or simply gather information about any object they see.

This new display approach combines primarily commercial off-the-shelf (COTS) hardware and software with a minimum of new components - themselves designed to become COTS across a broad range of user communities - then adds specialized experience to create a unified system, explains David Eggleston, vice president of Fakespace.

"You're trying to convince someone that something artificial is real, so it's not just the engineering but also some psychology, dealing with how real human beings react with virtual objects," Eggleston says. "We do a lot of testing, having people try different things, and learning how to modify it to make it better."

Experts already envision applications of the Immersive Workbench for military command and control, medical surgical training, automotive design, and a host of other uses in which the translation of a database into a manipulable 3-D visual presentation is an advantage.

First introduced at Siggraph '96 in August as a commercial product, with a significant enhancement that premiered at SuperComputing '96 three weeks later, the new system is based on original work at the German National Computer Science and Mathematics Research Institute and follow-up developments by Stanford University, the University of Illinois, NASA Ames Research Lab, and the Naval Research Lab.

The basic structure of the system is deceptively simple. The initial design, which is scalable to fit a variety of requirements, comprises a work table about eight feet wide and 10 feet long, standing 36 to 40 inches high. The actual viewing screen is six feet wide by four and a half feet long, which is roughly the aspect ratio of a 35 mm slide.

The projector sits behind the table, aimed at a mirror under the table and angled at 45 degrees. The image then bounces up to the translucent screen directly above the mirror. The table is more than twice the length of the screen so it can provide the projector with the distance necessary to generate the final 6-by-4.5-foot image.
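The folded optical path comes down to simple arithmetic. A minimal sketch, assuming a hypothetical throw ratio of 2.0 (throw distance divided by image width; the article does not state the projector's actual optics):

```python
def required_throw_ft(throw_ratio: float, image_width_ft: float) -> float:
    """Distance the projected light must travel to form an image of the given width."""
    return throw_ratio * image_width_ft

# With an assumed 2.0 throw ratio, a 6-foot-wide image needs a 12-foot light
# path. Folding the path with a 45-degree mirror lets that distance fit under
# a table only about 10 feet long.
path = required_throw_ft(2.0, 6.0)
print(path)  # 12.0
```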

The first customers of the 20-person Silicon Valley company - the customers who dictated that size - are the automotive industry, the military, and NASA Ames, where engineers are developing the medical simulation. Ames designers, for example, wanted an image roughly the size of a human patient, while military designers want a "virtual sand table" on which commanders can see an entire battlefield in reasonable detail.

The display can be angled anywhere from full horizontal, where the image appears to float atop the table, to near vertical, where the image seems to come out of the screen toward the viewer. In that mode, leaving the frame tilted a few degrees back allows for better structural support. Multi-positioning permits a wide range of uses, from placing a model or battle scene on a table to creating a "movie theater" approach for demonstrations to large groups.

But there also is a significant psychological difference between horizontal and vertical displays.

"When we see a vertical surface, even 3-D, we tend to think in terms of a movie theater - and we're so used to movies we think of it as make-believe," Eggleston says. "But when we look down at a table, we're used to looking at actual physical models, so when you see a virtual model on this table, it's easier to think of it as real and interact with it, so the wall between the virtual and real world melts away very easily."

In addition to providing a natural physical reference in the form of a table, the Immersive Workbench approach also avoids the distractions - and often disorientation - experienced with wearing a virtual reality (VR) helmet, which cuts the user off from the outside world, Eggleston says. The glasses used in the VR table approach enable the user to see the real world around him as well as the display.

The resolution of the display typically is 1 million pixels in the visible area; the resolution of the database depends on the power of the computer that generates the visible display.

"If you are going to have a very detailed set of imagery, it will require a bigger machine, but there are a lot of tricks to deal with this," Eggleston explains. "From a god's-eye view, you have less detailed images that become sharper as you zoom in. This allows you to represent a lot of information in a responsive way. But the resolution is there - on the table - to display all they can give us."
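The zoom trick Eggleston describes is essentially level-of-detail selection. A minimal sketch, with distance thresholds invented purely for illustration:

```python
def select_lod(viewer_distance: float, thresholds=(10.0, 50.0, 200.0)) -> int:
    """Return a detail level: 0 = full detail close to the viewer,
    higher numbers = coarser models for the god's-eye view."""
    for level, limit in enumerate(thresholds):
        if viewer_distance < limit:
            return level
    return len(thresholds)

print(select_lod(5.0))    # 0 - full detail when zoomed in
print(select_lod(500.0))  # 3 - coarsest model from far away
```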

While a variety of customer-furnished software can be used for the database, Fakespace generally works with Coryphaeus Software of Los Gatos, Calif. For military displays, for example, Coryphaeus has modeled a variety of equipment, such as tanks and helicopters, and placed them on a database of California's China Lake Naval Weapons Center near Ridgecrest, Calif. The models are activated with a logger file, which basically controls their movements.

But the 3-D display system has the potential to go far beyond training simulations or design; with the proper data feed, it also could provide a real-time view of an actual battlefield, cutting through such obstructions as smoke and dust, to give commanders a clear, life-like look at what is happening at that moment with real tanks, helicopters, soldiers, and terrain.

"From our standpoint, we're just displaying images - whether they are a preprogrammed routine or coming from a JSTARS doesn't matter," Eggleston says. "Provided the computer can take real-time input and incorporate it into the simulation, you would be able to see it on the Immersive Workbench, which is the back end of the process, the display and imaging. If the simulation software can incorporate real-time input, we'll be able to view it. You provide the database expertise, we provide the display expertise."

While of obvious use for the new interlinked training exercises that combine real equipment and simulators onto a single virtual battlefield, this also could enable battle commanders to quickly identify enemy or allied forces and equipment, determine strategies based on comparative capabilities and locations - even experiment with different maneuvers by moving the virtual representations of real equipment before ordering any change of position for the actual combatants.

This capability also could avoid situations such as the U.S. military barracks explosion in Saudi Arabia. By creating a 3-D view of a facility, security experts could check their needs against real-world threats. For example, they could simulate potential car- or truck-bomb vehicles, along with a variety of explosives they could be carrying.

They could simulate how such an explosion could impact a given location using known blast parameters and structural dynamics. Experts could then move their virtual security fence and threat vehicle and perform other experiments until they determine an adequate safety factor.
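The standoff experiments described above are commonly based on Hopkinson-Cranz scaling, in which blast effects correlate with a scaled distance Z = R / W^(1/3). A minimal sketch; the safety threshold used here is an illustrative placeholder, not a published criterion:

```python
def scaled_distance(standoff_ft: float, charge_lb: float) -> float:
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3)."""
    return standoff_ft / charge_lb ** (1.0 / 3.0)

def fence_is_adequate(standoff_ft: float, charge_lb: float,
                      min_scaled_distance: float = 40.0) -> bool:
    # 40.0 is a placeholder threshold for illustration only.
    return scaled_distance(standoff_ft, charge_lb) >= min_scaled_distance

# A 100-foot fence against a 1,000-lb charge gives Z of roughly 10 -
# moving the virtual fence outward raises Z toward the chosen threshold.
print(scaled_distance(100.0, 1000.0))  # ~10.0
```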

"The data becomes more real and accessible as you are able to see things you may have actually known were there, but it didn't become so obvious until you could actually visualize it," Eggleston says.

The rest of a standard Immersive Workbench system includes a Silicon Graphics engine to generate the required RGB sequential video at up to 1600-by-1200-pixel resolution at 60 Hz (monoscopic) and 1280-by-1024-pixel resolution at 120 Hz (stereoscopic). This feeds an Electrohome Marquee 9500 Series projector (with others optional), and users view the 3-D image through StereoGraphics CrystalEyes glasses (other shutter glasses also optional).
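The two video modes carry comparable pixel throughput; in the stereoscopic mode the 120 Hz field rate is split between the eyes, so each eye sees 60 updates per second. A quick comparison:

```python
def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixel throughput of a given video mode."""
    return width * height * refresh_hz

mono = pixels_per_second(1600, 1200, 60)     # monoscopic mode
stereo = pixels_per_second(1280, 1024, 120)  # frame-sequential stereo
per_eye_hz = 120 // 2                        # each eye gets every other field

print(mono)        # 115200000
print(stereo)      # 157286400
print(per_eye_hz)  # 60
```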

Users can manipulate objects on the table with a Fakespace-built Pinch glove. Tracking choices include Ascension, Polhemus, or Logitech (other brands optional). A software interface also is included to link to the client's application. The power requirement is standard 120-volt AC at approximately 7 amps.

The Fakespace Pinch glove system takes a different approach to object manipulation in VR. Most VR gloves are designed to be fully articulated and mimic the movements of the human hand, which means recalibrating for every user because hands differ in size and movement.

"We figured the movement of the hand doesn't matter; what you really want to do is grab things and control them. So if you have a gesture-based glove, it doesn't have to be calibrated. Just that process made the solution much easier to implement," Eggleston explains.

"With the Pinch gloves, you can highlight models, get information about them from the database (type, range, fuel, weapons, condition, unit) - or actually pick it up and move it - by just touching the model, which changes color or otherwise indicates you've captured it," Eggleston says. "You could program the action to suspend when you do that or continue while you deal with the tank, for example."

Sensors mount in each glove fingertip to detect contact between the digits of either hand. When sensors detect and verify contact between two or more digits, a signal goes to the host computer to take the appropriate programmed action, including lifting a 3-D image off the table. To anyone else wearing the system glasses, the person would then appear to be holding the object in his hand.

Using a standard RS232 interface and dip switch-selectable baud rates (from 9,600 to 19,200), the Pinch gloves are compatible with almost any computer system. The glove system includes two tracker mounts, two black medium gloves (left and right), driver software, and interface. Additional gloves, including custom sizes, colors, and designs, can be purchased separately.
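The host must be configured to match the glove's dip-switch baud setting. A minimal sketch of that lookup; the switch-to-rate mapping is an illustrative guess, since the article states only that rates from 9,600 to 19,200 are selectable:

```python
# Hypothetical dip-switch positions mapped to the stated baud-rate range.
DIP_TO_BAUD = {
    (0, 0): 9600,
    (0, 1): 19200,
}

def host_baud(switch1: int, switch2: int) -> int:
    """Baud rate the host serial port should use for a given switch setting."""
    try:
        return DIP_TO_BAUD[(switch1, switch2)]
    except KeyError:
        raise ValueError("unsupported dip-switch setting") from None

print(host_baud(0, 1))  # 19200
```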

The new development announced in late November is called the Dual User Option (DUO), a unique Fakespace capability that allows different people to see different views while looking at the same display.

"I could have one view of the battlefield given to commanders and another view to lower-level personnel - say a classified and a non-classified view at the same time. Both would see the same battlefield, but some would see more data than others, all controllable by whoever was running the simulation," Eggleston explains, adding DUO requires special glasses from Fakespace.

"Or if two people are standing in different locations, they can both see the best view because each is getting a unique view. This means everyone can see the same perspective, no matter where they are standing, or you can control it so you see everything in true perspective - one sees the front of a tank, one sees the back. In a simulation you'd probably want to see the back of the tank, but in a design application - say working on a new turret - you'd probably want everyone to see the same view," he says.
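Per-user views on one screen imply time-multiplexing display fields among the users' shutter glasses: with two stereo viewers, the display cycles through four fields, and each user's glasses open only on that user's left- and right-eye fields. Fakespace has not published how DUO actually works, so this schedule is a plausible sketch only:

```python
from itertools import cycle

def duo_field_schedule(users=("A", "B"), eyes=("left", "right")):
    """Yield (user, eye) pairs in the order the display shows fields."""
    return cycle((u, e) for u in users for e in eyes)

sched = duo_field_schedule()
frames = [next(sched) for _ in range(5)]
print(frames)
# [('A', 'left'), ('A', 'right'), ('B', 'left'), ('B', 'right'), ('A', 'left')]
```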

For the most part, the component technologies have been around a while. What the Immersive Workbench does is bring them all together into a system.

"We do have some things that are unique to us and some patents have been applied for, such as DUO," Eggleston says. "We think we've developed some good expertise in the entire area of virtual environments other companies just don't have. The thing about the virtual environment is it still seems to have a little magic involved - how people react to these environments is not always predictable; solutions are not always obvious."

"This system is disarmingly simple - but being simple doesn't mean it's easy. There are a lot of different ways to do a translucent surface, for example. What it basically comes down to is having a lot of experience working in these environments and knowing where you can be flexible and where you must be spot on in your specifications," Eggleston says.

The complete Immersive Workbench system costs from $65,000 to $85,000 and the DUO is $25,000. That range represents different workbench sizes and the power of the projection system used, which is an environmental decision. In a location where the user can cut out all ambient light, a lower-power projector will suffice.
