Industry knocks Air Force evaluations of COTS software

HANSCOM AFB, Mass. - A project of the U.S. Air Force Electronic Systems Center (ESC) at Hanscom Air Force Base, Mass., that aims at evaluating the effectiveness of commercial off-the-shelf (COTS) and so-called government off-the-shelf (GOTS) software for military command posts is drawing criticism from software vendors who claim the evaluations have been poorly planned and executed.

Dec 1st, 1997

By Wilson Dizard III

ESC officials defended their program, but an ESC official responsible for the project says he intends to investigate the criticisms leveled by the software vendors.

The two software evaluations in question were carried out at the National Product Line Asset Center (NPLACE), which is managed by the West Virginia High Technology Consortium Foundation in Fairmont, W.Va., and sponsored by ESC.

Vendors of the evaluated software say NPLACE officials lacked the expertise to carry out useful evaluations, did not understand the products under test, and were unfamiliar with the needs of the Air Force.

NPLACE was formed in response to software systems that took too long to develop, with costs and schedules consistently exceeding available resources. NPLACE officials, who seek to highlight the commonality and variability inherent in families of systems, favor COTS software to reduce development time and costs, as well as to improve software quality and reliability.

Executives of NPLACE declined to comment for this story.

Bob Lencewicz, ESC program manager for the NPLACE software evaluation project, says the software evaluation process builds on years of similar experience within his organization, and he insists it is effective.

He asked for details of the vendors' objections to the NPLACE software evaluations so he could gather more information. ESC officials have funded NPLACE for about 18 months at a level equivalent to $10 million over five years.

NPLACE officials have carried out the controversial evaluations of COTS and GOTS software in coordination with ESC's Comprehensive Approach to Reusable Software (CARDS) and Portable Reusable Integrated Software Modules (PRISM) programs. All three efforts aim at capturing the cost and mission-effectiveness benefits of increasing the use of COTS software in the Air Force.

NPLACE is working to develop a "product line" approach for the creation of Air Force command post software. The process assumes a generic software architecture and a collection of reusable COTS components.

NPLACE's role is to identify COTS and GOTS assets for systems developers, test the software against predefined product line criteria, and then pull the entire process together by developing the acquisition channels for the purchase of those products.

In recent months, NPLACE officials have been evaluating software in accordance with the ESC Generic Command Center Architecture developed by specialists in the PRISM program. To do so, NPLACE experts have sought out five types of COTS and GOTS software products: network managers; database managers; mapping services, or geographic information systems; automated message handling systems; and word processors/desktop publishers.

NPLACE leaders have asked software vendors to submit their products for evaluation. This, they say, will help systems developers save time in finding COTS components; reduce the costs and time of getting systems to market; and help COTS suppliers expose their product to new markets.

NPLACE officials have already evaluated several network management systems and geographic information systems. One vendor of network management tools involved in the NPLACE evaluations was Objective Systems Integrators Inc. (OSI) of Folsom, Calif. OSI leaders submitted their NetExpert network management framework for evaluation by NPLACE.

NetExpert helps manage large-scale networks by enabling its users to build a network management system, tailor it to the needs of a specific enterprise such as a command post, and then to monitor the network.

"It`s based on open systems, and it is object-oriented," says OSI Director of Sales Ray Marra. "It allows users to get a top-down view of a network, so they can monitor it for faults, network performance, security, and to make it easy to simplify provisioning of new services. It`s used in client-server networks, and it works with legacy systems."

NPLACE experts judged eight components of the NetExpert system on a test network running on one Sun Ultra 1 system and two Sun SPARC computers, including the operator workstation and packages for network servers and clients. The test network NPLACE built used a database package from Oracle Corp.

NPLACE reported three types of results for each criterion: Yes, which meant that the test team confirmed that the software met the criterion; Yes/E, which indicated that the product met the criterion under certain conditions or with restrictions; and No, which meant that the test team "was unable to confirm that the product met the criterion."

In dozens of cases, NPLACE evaluators said they awarded Yes/E ratings because "a database product, such as Oracle, Informix, or Sybase (we used Oracle) is required." Other Yes/E results frequently referred to the need to use pre-packaged rule sets from OSI.

Marra vehemently rejected the NPLACE report, test methods, and results. "This is not a piece of shrink-wrapped software," he explained. "Anybody testing it would need some guidance."

Marra contends that NPLACE evaluators failed to do a professional job. "The fact that they stated a conditional approval means nothing. Clearly, they didn't understand the product. Either they don't understand network management, or they don't understand software products. They didn't understand Unix, and they didn't understand network systems.

"We told them we were not happy with their results [in the first release of the test report]. They came back with a second draft saying everything is conditional. It doesn`t tell you anything."

OSI sent a sales manager and an engineer to NPLACE to help the contractor use NetExpert, Marra says. "They say it needed a database," Marra notes. "All toolsets require databases."

Marra emphasized, "We want anyone looking at the web page where this is posted to understand that this information is flawed. The way the testing was done was inconclusive and meaningless. The people they had doing the test were not experienced with network applications."

Marra says OSI executives had hoped to gain exposure for the NetExpert product to federal agencies through the evaluation process. "It had such potential but it was a big disappointment ... Maybe they should be exposed so people can be forewarned about this."

NPLACE also evaluated the Mapping Applications ClientServer (MACS) Release 3.3 application, a GOTS geographic information system (GIS). MACS originally was developed at the Air Force's Rome Laboratory in Rome, N.Y.; now it is supported and marketed at the Sterling Software Inc. Information Technology Division in Bellevue, Neb.

MACS is a mapping, charting, geodesy, and imagery system, written in C, that provides geospatial data, applications, and analysis support. It is designed to interface with the Air Force's family of Intelligence Data Handling Systems. MACS is deployed in about 300 command centers worldwide, including organizations in the Air Force and the intelligence community. The biggest customer for MACS is the management of ESC's own Combat Intelligence System, which has used the software since 1990. Since then, the package has been upgraded continuously.

NPLACE evaluators ran MACS on hardware similar to that used to test NetExpert and generated the same types of results for almost 300 criteria. The software earned Yes ratings on 166 criteria, Yes/E ratings on 33, and No ratings on 96.

Executives of Sterling Software expressed mixed opinions of NPLACE's test results but were not entirely dissatisfied with them, says Gerald Greer, the company's department director.

But Dave Boyd, senior engineer at Sterling Software responsible for MACS, pointed out problems. "We had no opportunity to get the users and testers up to speed," Boyd says. "Because we're a GOTS product, our installation scripts are not as slick as they might be. But we got through.

"When we got a copy of the draft evaluation, we had serious questions with that," Boyd says. "They didn`t understand the concept of our product. Our product is not a conventional geographic information system. It is designed to be integrated with military mission applications. To accomplish the criteria, you have to do the integration. To our customers, we meet the criteria 100 percent."

Boyd continues, "I felt part of the problem with NPLACE was that they were not subject matter experts or domain experts. I didn't catch that there was any command center background or military background there." Boyd notes that successful evaluations tend to be very interactive, but that there was little contact between NPLACE and Sterling.

"We feel the criteria were overly broad," Boyd says. "I`ve done this kind of work for 15 years, and have never seen such a broad range of requirements. I felt it was as if someone went through a GIS textbook and wrote down whatever they could find."

Boyd adds that the NPLACE evaluation team corrected some glaring errors in the evaluation that Sterling Software specialists had pointed out.

The ESC's Lencewicz says he "didn't understand [the software vendors' objections to the NPLACE evaluations] at all. I would like to know more. We have a good process in place here, part of which involves including the companies in looking at the software evaluation results. I am going to check into this," Lencewicz says.

NPLACE's software test results are available on the World Wide Web at http://www.nplace.wvhtf.org/non-frame-index.html.
