Air Force looks to the next generation of avionics networking

Dec. 1, 2004

HANSCOM AFB, Mass. - In September, Air Force and industry leaders ran a time-critical targeting exercise at Hanscom Air Force Base, Mass. Their goal was to reduce the sensor-to-shooter timeline by combining data from disparate sensors, air platforms, and ground stations.

Aircraft such as this E-8C Joint Surveillance Target Attack Radar System (Joint STARS) collect vast amounts of sensor data as they perform real-time surveillance. Their future avionics systems will have enough bandwidth to download that data through mobile battlefield Internets.

The test was a glimpse of Air Force leaders’ plans for Constellation Net, a battlefield Internet that will connect sensor platforms in land, air, and space as part of the Pentagon’s Global Information Grid.

Constellation will reach far beyond mere modems and radios, and network every aircraft’s avionics. The weakest link in that chain today is the airborne communication web, experts say. In response, Air Force planners are building a network to cover everything from aircraft avionics to smart bombs and worldwide command centers.

“With the Air Force setting the vision and the major industry players providing domain expertise, we’ve developed a working architecture that is truly open and all-inclusive,” says Doug Barton, director of network-centric systems for Lockheed Martin Integrated Systems and Solutions in Gaithersburg, Md. Air Force leaders awarded Lockheed Martin a contract in August 2003 to build the Constellation architecture.

“Today’s network looks like an organizational chart, with stovepiped systems reporting vertically up the decision chain but unable to communicate horizontally,” Barton says. “The Constellation architecture will look more like a round table, with dozens or even hundreds of systems, sensors, and warfighters logging in to a common network. Establishing that ‘battlefield Internet’ is key to the Air Force’s vision for net-centric operations.”

Lockheed Martin is building that battlefield Internet for the Air Force, along with partners including Boeing in St. Louis, Mo.; Gestalt LLC in Camden, N.J.; IBM in Armonk, N.Y.; L-3 Communications in New York; MITRE Corp. in Bedford, Mass.; and Raytheon Co. in Waltham, Mass. That team ran the recent test at Hanscom Air Force Base.

“We successfully demonstrated that the Constellation architecture can bridge traditional stovepipes, enable machine-to-machine data exchange, work across security domains, generate automated courses of action, and ultimately reduce the timeline for finding and engaging targets. This is a tremendous step forward for Air Force transformation,” says Denny Agin, Raytheon’s Integrated Product Team leader for the demonstration.

COTS is not enough

The Constellation effort will depend on commercial off-the-shelf (COTS) components, as well as a burst of new research into wireless and optical networks for avionics platforms.

“I don’t think COTS components will satisfy airborne networking as a part of the flight platform, and the extension of the global information grid,” says Wayne Bonser, technology advisor in the connectivity branch of the Air Force Research Laboratory (AFRL) Information Directorate in Rome, N.Y.

“One area that needs attention is how to do network management across the entire Constellation Net, which is terrestrial, air, and space. It’s difficult, because those areas have been treated separately for many years,” he says.

“The airborne network is the weakest, without question. The only existing ability is Link 16, and then we’ll have the Joint Tactical Radio System when that launches. But there will still be legacy systems and stovepipes by 2020, so it’s a significant challenge.”

One possible solution is ORACLE, the optical radio frequency-combined link experiment, being studied by researchers at the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va. Another new solution is DLARS, the data link automated reporting system. Troops tested this airborne data link in the Joint Expeditionary Force Experiment (JEFX) at Nellis Air Force Base, Nev., in August.

Networking challenges

Engineers face many challenges in their quest to forge a free-space, high-capacity link between ground stations and airborne platforms, says Bob Kaminski, technology advisor in the distributed information systems branch at AFRL Rome.

A successful system will have to compensate for atmospheric distortion, probably by using nonlinear optics. Aircraft designers need to create a clear aperture in the airplane, using specialized window coatings.

Once an aircraft is linked to the ground, it will need faster onboard avionics to handle all the new data. “We’re currently using the 1553 databus, which runs at just a couple of megabits per second. But we want to get to gigabits per second,” Kaminski says.
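
To put those numbers in perspective, the following back-of-the-envelope sketch in Python compares how long one sensor image would take to cross a roughly 1-megabit-per-second legacy MIL-STD-1553 bus versus a notional gigabit-class onboard link. The 50-megabyte frame size and 80 percent protocol efficiency are illustrative assumptions, not program figures.

def transfer_time_seconds(payload_bytes, link_bits_per_second, efficiency=0.8):
    # Time to move a payload, assuming a fixed protocol-overhead efficiency.
    usable_rate = link_bits_per_second * efficiency
    return (payload_bytes * 8) / usable_rate

image_bytes = 50 * 1024 * 1024  # assumed 50-MB synthetic-aperture-radar frame
for name, rate in [("MIL-STD-1553 bus (1 Mbit/s)", 1e6),
                   ("Notional gigabit avionics link", 1e9)]:
    print(f"{name}: {transfer_time_seconds(image_bytes, rate):,.1f} s")
# Legacy bus: roughly 500 seconds per frame; gigabit link: about half a second.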

“We need optical-to-electronic conversion to process the data. So we will use highly integrated photonics circuits. There’s some fiber on existing planes, but we must do more, like routing in optics, and amplification in optics.”

That technology could come from Navy HIP, a DARPA program that uses highly integrated photonics (HIP) to manage communications for the sensor suite on an aircraft. The technique runs radio frequency communications over fiber, and is targeted for use on the EA-6B Prowler electronic warfare jet.

Security is another challenge in linking avionics to a global battlefield Internet.

“The Air Force wants to network the battlefield, including bombs, airborne assets, UAVs, and satellites, in a seamless, global, high capacity, Internet Protocol-based link. But we must also make it LPI and LPJ,” Kaminski says. LPI means low probability of intercept, and LPJ means low probability of jamming.
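
The article’s sources do not describe specific waveforms, but the general idea behind LPI and LPJ signaling can be sketched generically: a transmitter and receiver that share a secret seed hop across channels in a pseudo-random pattern that an eavesdropper or jammer cannot predict. The seed, channel count, and hop count below are arbitrary illustrations, not any fielded waveform.

import random

def hop_sequence(shared_seed, n_hops, channels):
    # A receiver holding the same seed reproduces this schedule exactly;
    # without the seed, the next channel is unpredictable.
    rng = random.Random(shared_seed)
    return [rng.choice(channels) for _ in range(n_hops)]

channels = range(64)                        # arbitrary channel set for illustration
tx_schedule = hop_sequence(0x5EED, 8, channels)
rx_schedule = hop_sequence(0x5EED, 8, channels)
assert tx_schedule == rx_schedule           # both ends hop in lockstep
print(tx_schedule)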

This battlefield network would also boost demand for precious bandwidth.

Air Force leaders envision a future in which swarms of UAVs and smart weapons communicate with each other. When they strike, they will send images of bomb damage back to commanders in distant control centers. But current bandwidth is too scarce to support this vision. Before that vision can be realized, engineers must develop algorithms for data compression and spectrum management.
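
A rough way to size the compression problem: given an image, a link rate, and a deadline, how much must the image shrink? The 8-megabyte image, 64-kilobit-per-second channel, and one-minute deadline below are assumptions chosen only to illustrate the arithmetic.

def required_compression_ratio(image_bytes, link_bps, deadline_s):
    # Bytes that fit through the link before the deadline, versus the raw image.
    deliverable_bytes = (link_bps * deadline_s) / 8
    return image_bytes / deliverable_bytes

raw_image = 8 * 1024 * 1024   # assumed 8-MB bomb-damage image
link_rate = 64e3              # assumed 64-kbit/s tactical channel
deadline = 60.0               # one minute to reach the control center
print(f"Roughly {required_compression_ratio(raw_image, link_rate, deadline):.0f}:1 compression needed")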

An interim solution could be DRIER, dial-up rate IP over existing radios. Created by engineers at Northrop Grumman’s Information Technology sector in Herndon, Va., this technique enables airborne weapons systems to exchange data with ground stations or other aircraft. The method sends data over existing radio channels, such as high-frequency single-sideband radio, UHF, VHF, and satellite links. Airmen tested it on an E-8C Joint STARS in December 2003.
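
The article does not describe DRIER’s internals, but the general pattern of carrying IP over a byte-oriented radio channel is well established. The sketch below uses the SLIP framing convention from RFC 1055 purely as a stand-in: each datagram is delimited by a frame byte, and any delimiter bytes inside the data are escaped.

END, ESC, ESC_END, ESC_ESC = 0xC0, 0xDB, 0xDC, 0xDD   # RFC 1055 special bytes

def slip_encode(datagram):
    out = bytearray([END])                 # leading END flushes any line noise
    for b in datagram:
        if b == END:
            out += bytes([ESC, ESC_END])   # escape the frame delimiter
        elif b == ESC:
            out += bytes([ESC, ESC_ESC])   # escape the escape byte itself
        else:
            out.append(b)
    out.append(END)                        # close the frame
    return bytes(out)

# Hypothetical datagram bytes; note the embedded 0xC0 gets escaped.
print(slip_encode(b"\x45\x00\x00\x54payload\xc0data").hex())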

“We could integrate this capability over existing aircraft - to bring an aircraft into the 21st century without replacing all the cables and antennas,” says Warren Debany, technology advisor in the information-grid division at AFRL Rome.

Another way to build a battlefield Internet for avionics is to place portable data nodes in the airspace, flying onboard fueling tankers. ROBE, short for Roll-on Beyond Line-of-Sight Enhancement, is a set of portable satellite antennas that airmen can quickly install in KC-135 tankers. They are built by Modern Technology Corp. in Dayton, Ohio, and Northrop Grumman Information Technology. Air Force leaders plan to use Global Hawk unmanned aerial vehicles and Milstar satellites as flying Internet nodes in the future, but ROBE gets the job done faster.

Networking the JSF

Even before they have built the first F-35 Joint Strike Fighter (JSF) on an assembly line, planners at prime contractor Lockheed Martin Aeronautics Co. in Fort Worth, Texas, are looking for ways to improve it.

“It’s too early to start having conversations about block 4 and block 5. But technology development takes a long time, so we’re writing down a vision roadmap with 50 or 60 things that we’d like to do if the technology can support it,” says David Jeffreys, senior manager for product and technology roadmapping for the F-35 program at Lockheed Martin.

Jeffreys’ group has twin goals. First they build a product roadmap, identifying products that users want. Then they build a science and technology roadmap, finding ways to support those products with new research.

“I can’t share details of the list, but it includes lots of networking and communications, particularly for force transformation,” he says. “The program was built on four pillars: lethality, survivability, affordability, and supportability. Now we’re thinking of adding one more: interoperability.”

For example, a flight of fighters may communicate over their own local-area network (LAN) during a mission. That presents a challenge when they need to share data or voice traffic with a different group of planes. With better interoperability, those planes could communicate through a third node.
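
One way to picture the relay problem is as a path search over whatever links happen to be up. In the sketch below, the call signs and link topology are invented for illustration; the point is that two flights with no direct link can still reach each other through an intermediate node such as a tanker.

from collections import deque

links = {                                  # hypothetical connectivity graph
    "Viper-1": {"Viper-2", "Tanker"},
    "Viper-2": {"Viper-1"},
    "Tanker":  {"Viper-1", "Eagle-1"},
    "Eagle-1": {"Tanker", "Eagle-2"},
    "Eagle-2": {"Eagle-1"},
}

def relay_path(src, dst):
    # Breadth-first search for the shortest chain of relays from src to dst.
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

print(relay_path("Viper-2", "Eagle-2"))
# ['Viper-2', 'Viper-1', 'Tanker', 'Eagle-1', 'Eagle-2']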

Jeffreys’ group will try to solve those problems with COTS components, to save time and money.

“We are very much in a mode to leverage commercial work; the Department of Defense does not drive technology any more, so we use commercial standards,” he says. “We use Fibre Channel in the aircraft to move data between different boxes, as opposed to MIL-STD-1553, which the commercial world never picked up.”

By the same token, F-35 designers chose the 6U VME form factor to maintain open architecture and commonality.

“That’s one of our first questions to suppliers; if a new technology doesn’t survive the ‘Betamax vs. VHS’ wars, then we don’t want to get stuck with the Betamax,” Jeffreys says.

That is the same reason the F-35 does not use wireless links to move data within the aircraft. They get “more bits for the buck” by using fiber and copper wiring, and also avoid electromagnetic interference inside the airplane.

Lockheed Martin designers are now testing a wireless data link for ground communications. Researchers on the company’s F-16 jet fighter program are completing a technology demonstration that uses the 802.11 wireless standard with an amplifier and repeater arrangement.

“No one ever expected fighter airplanes to use that kind of commercial technology, but we’re trying it,” he says.

Controlling innovation

The group’s main challenge in buying COTS parts is installing them.

“When commercial industry comes out with a next-generation Fibre Channel that has 10 times the capacity, we’ll want to fit that into the aircraft without ripping everything apart,” Jeffreys says.

“There does come a point when you have to rip everything out and start over, but we want to minimize breaking it apart. We would tell industry that we want your innovations, but not if they’re disruptive to existing infrastructure. It’s good to have disruptive technology, but it’s not good to have it every six months.”

Finally, the researchers will turn to government laboratories for components they cannot buy in the commercial market.

Jeffreys’ group will share its technology roadmap with domestic scientists at the Air Force Research Lab, Naval Research Lab, Naval Air Systems Command at Patuxent River Naval Air Station, Md., the Office of Naval Research, and U.S. Space and Naval Warfare Systems Command (SPAWAR) in San Diego.

They will work with scientists overseas at the United Kingdom’s Defence Science and Technology Laboratory (DSTL), Australia’s Defence Science and Technology Organisation (DSTO), the Netherlands National Aerospace Laboratory (NLR), the Netherlands Organisation for Applied Scientific Research (TNO), and the Norwegian Defence Research Establishment (FFI).

The new technology is not just for the F-35. To get the most return from their investment - and boost interoperability - they will share technology roadmaps and new products with existing jet fighter platforms such as the F-16 and F-22. Air Force leaders will focus more attention on the F-22, since they predict a shorter lifetime for the F-16, he says.

In fact, the three aircraft already share common elements, such as the electronically scanned radar array. The part numbers are different because the radar needs a different shape to fit in the nose of each jet, but the essential technology is the same.

The driving force behind this wave of avionics work is a common need by military leaders to integrate combat aircraft into network-centric warfare, Merluzeau says.

Their top priority is data communication systems, mostly datalinks with other aircraft in the wing, tankers, UAVs, and accelerated sensor-to-shooter links. A second priority is improving onboard databuses for aircraft ranging from the F-22 to the F-35 and UCAV.

Onboard networking is crucial because sensors will soon generate huge quantities of data on reconnaissance aircraft - from forward-looking infrared (FLIR) and synthetic aperture radar (SAR) data capture to visual imagery. The flood of information would overwhelm basic coaxial cable, so engineers are rushing to install fiber optics.

“We’re not at fly by light yet - that is many years away - but fiber optics show great promise, and will grow substantially,” Merluzeau says.

Advances in navigation and displays will also drive the avionics market. Pilots use displays for cockpit instrumentation in aircraft from the C-5 and C-130 to the F-16, Tornado, F-5, and T-38. More specifically, F-16 and F-22 pilots use helmet-mounted displays. Transport pilots on the C-17 and Airbus A400M depend on head-up displays as their primary interface during takeoff, ascent, descent, and landing.

Avionics engineers will feed those displays with data from advanced vision systems, such as infrared imagery projected onto head-up displays, enabling pilots to see at night or in poor visibility. That is particularly important for military pilots landing at remote airfields with mountainous terrain and poor flight aids - such as in Pakistan, Afghanistan, and Iraq, he says.

Engineers will also link cockpit displays to synthetic vision systems. In this scenario, an onboard global positioning system feeds the aircraft’s location to an onboard server, which searches its terrain database for topographical data about the nearest landing strip, and projects the enhanced image on the pilot’s display.
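
A minimal sketch of that data flow, with an invented two-entry terrain database: a GPS position comes in, the nearest stored airstrip comes out, and the result is handed to the display. Real systems use proper geodetic math and far richer databases; this shows only the lookup step.

import math

TERRAIN_DB = {                       # hypothetical onboard terrain entries
    "Strip Alpha": {"lat": 34.52, "lon": 69.18, "elevation_ft": 5877},
    "Strip Bravo": {"lat": 31.50, "lon": 65.85, "elevation_ft": 3337},
}

def nearest_strip(gps_lat, gps_lon):
    # Flat-earth distance approximation, good enough for a nearest-entry lookup.
    def dist(entry):
        d_lat = entry["lat"] - gps_lat
        d_lon = (entry["lon"] - gps_lon) * math.cos(math.radians(gps_lat))
        return math.hypot(d_lat, d_lon)
    name = min(TERRAIN_DB, key=lambda n: dist(TERRAIN_DB[n]))
    return name, TERRAIN_DB[name]

name, info = nearest_strip(34.0, 69.0)    # aircraft position from onboard GPS
print(f"Display overlay: {name}, field elevation {info['elevation_ft']} ft")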

An IP address in each cockpit

Improved avionics is pointless unless it includes better networking, says Bruce Carmichael, vice president of Air Force programs for L-3 Communications, in the company’s Communications Systems West division in Salt Lake City.

“Communications is the enabler of the passage of information and data elements through the kill chain, ultimately to the guy who pulls the trigger,” he says.

“Hopefully that flows machine-to-machine, so it doesn’t matter how the data got there. You need communications that can pass data from various and distributed sources to one point. The bottom line is these things have to be IP-enabled. To have a common thread, you need to have an IP address in each cockpit.”

Designers at L-3 build wideband wireless communications for military avionics, particularly the common data link (CDL), with data rates spanning 10 megabits per second to 300 megabits per second.

“We used to do point-to-point networking, but we are now doing IP-based, network-enabled systems, so we’re moving toward net-centric. We provide a wideband backbone for the theater, with the ability to connect with other IP networks at more tactical levels - such as future IP-enabled radios - and also provide reach-back, for out-of-theater communications,” he says.

This capability is crucial for surveillance aircraft, which collect massive amounts of sensor data and need a way to pass it back to main agencies for analysis.

L-3 designers build the multiplatform common datalink for the E-10A aircraft’s battle management command-and-control (BMC2) system. They also built data links for surveillance planes used recently in Afghanistan and Iraq, including the U-2, Global Hawk, and Predator, he says.

That data is not truly useful until it reaches warfighters on the ground. “I think if you read the tea leaves, with constrained budgets and the size of force structures, what we need to do is make each shooter a lot smarter, through the information infrastructure,” Carmichael says.

To reach that point, communications engineers will need to develop six major new technologies:

• high-performance antennas, where the typical blade antenna or omnidirectional antenna is not good enough - the system demands a steerable, pointable antenna for high-capacity, long-distance communication that can handle any link, whether airborne, line-of-sight, air-to-air, or satellite communications;

• modular system architectures, where L-3 planners want FPGA-based common module sets, so they can configure a family of modules to match any application, then fit a powerful system inside a tiny UAV by splitting it into components tucked into different corners of the chassis;

• SCA compliance, where Pentagon leaders have mandated that suppliers comply with the Software Communications Architecture, allowing them to port waveforms from one hardware platform to another;

• software-definable modems that can act as a computer, loading new waveforms instantaneously so soldiers can switch from a line-of-sight network to a satcom network while they’re in the field;

• wideband mobile routers, where net-centric communication depends on Internet Protocol-based networks so data can flow from point A to point B through any number of intermediate nodes - whether radio frequency, microwave, or optical links - as long as the trip is managed by a smart router on board the airplane (see the sketch after this list); and

• network systems management, where the network must be self-forming to keep itself healthy, adding and dropping users dynamically, checking each aircraft’s technical capability, and steering data around antenna blockages.
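
The last two items - mobile routing over heterogeneous links and a self-forming network - can be sketched together. In the toy example below, node names and link costs are invented; the onboard router simply picks the cheapest available path, and when a node drops out (an antenna blockage, or a tanker leaving its orbit) it reroutes over what remains.

import heapq

topology = {                                  # hypothetical nodes and link costs
    "strike-jet":   {"tanker-node": 5, "uav-relay": 2},
    "uav-relay":    {"strike-jet": 2, "satellite": 8},
    "tanker-node":  {"strike-jet": 5, "ground-entry": 3},
    "satellite":    {"uav-relay": 8, "ground-entry": 4},
    "ground-entry": {"tanker-node": 3, "satellite": 4},
}

def cheapest_path(src, dst):
    # Dijkstra's algorithm over whatever links are currently up.
    frontier, visited = [(0, src, [src])], set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, link_cost in topology.get(node, {}).items():
            if nbr not in visited:
                heapq.heappush(frontier, (cost + link_cost, nbr, path + [nbr]))
    return None

print(cheapest_path("strike-jet", "ground-entry"))   # route via the tanker node
# Drop the tanker node and let the network re-form the path.
topology.pop("tanker-node")
for nbrs in topology.values():
    nbrs.pop("tanker-node", None)
print(cheapest_path("strike-jet", "ground-entry"))   # now via UAV relay and satellite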
