Breakthrough technologies

"Guitar groups are on their way out," a Decca Records executive said when turning down the then-unsigned Beatles. That the Fab Four went on to become the world's best-selling musical act only shows the danger in safe thinking. After all, crooners were the moneymakers of 1962.

Technology, like pop music, changes rapidly. Both thrive on constant innovation and fickle audiences. In the government marketplace, it is often up to the systems integrator to divine the Next Big Thing, to look beyond staid thinking to new innovation.

For this issue, Washington Technology has picked its own Fab Four, four technologies that promise to sweep through government IT infrastructures: Internet protocol version 6, dynamic frequency allocation, 64-bit processing and the Semantic Web.

All four technologies answer pressing needs. All four are already being evaluated closely by integrators and agencies. All four will require considerable work to weave into existing systems. And all four promise to break open new markets. With expertise in them, your company could be rocking the charts as well: the sales charts.

As for Decca, after The Beatles shot to fame, the label recognized its error and snapped up The Rolling Stones, showing that even the biggest missteps can be corrected by a quick recovery.

Dynamic frequency allocation

As cellular phone companies, radio stations and civilian and military agencies battle for ever-thinner portions of airwave space, it would appear there just isn't enough spectrum to go around. The demands of modern technology might eventually exhaust the seemingly limited airwave supply.

But not necessarily, according to Jon Peha.

Earlier this year, Peha, an associate director of the Carnegie Mellon Center for Wireless and Broadband Networks in Pittsburgh, conducted an experiment to get an idea of just how much of the airwaves is actually being used. The surprising results could point the way to entirely new ways of using the radio spectrum.

Peha and Marvin Sirbu, a professor, dispatched a few students to set up two lookout posts: one on top of a campus building at Carnegie Mellon, and the other on a bluff overlooking Pittsburgh. From these, they measured how much bandwidth was being used by radio stations, cellular phones and other users of the airwaves. They found that while some bands were heavily used, others weren't used as much as expected, and some weren't used at all.

"Spectrum use was highly sporadic," Peha said. Although the findings were based on a limited sample, they suggest that plenty of spectrum is available if it is used more efficiently, he said. That requires looking at spectrum differently: Rather than treating it as property to be cut up and parceled out to users, it should be treated like highways and roads. Everyone has access to the roads, but people drive at different times.

"Measuring samples of spectrum may be more like observing activity on a highway. You see bursts of cars, you see gaps. There is real value to the fact we share that highway," Peha said.

New technologies are being developed that use the airwaves more efficiently, paving the way for complex integration work for government integrators.
Most notable in the government space is the Joint Tactical Radio System, a next-generation military communications system commissioned by the Army's Communications-Electronics Command. In June, the Boeing Co. won the lead integrator role for the first of four phases of this militarywide system, valued at approximately $2 billion.

JTRS is based on a new concept called software-based radio. A software-based radio decodes and encodes signals in software, rather than relying solely on hardware designed to work only with particular frequencies and waveforms. Such flexibility will allow military units to talk to one another with their own radios, freely choosing the frequencies and waveforms that are most secure.

Software radio also sets the stage for another concept, called cognitive radio, according to pioneering software radio engineer Joseph Mitola, a consulting scientist for Mitre Corp., McLean, Va., and a member of the Defense Science Board task force on wideband radio spectrum. While software radio lets users pick the frequency and waveform, cognitive radio draws on artificial intelligence to let radios automatically negotiate the best transmission path based on factors they evaluate internally, ranging from output power to how heavily a preferred frequency is being used.

"Cognitive radio is a goal-driven framework in which the radio autonomously observes the radio environment, infers context, assesses alternatives, generates plans, supervises multimedia services and learns from its mistakes," Mitola said in a white paper on the subject. "This approach could expand the bandwidth available for conventional uses."
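Strip away the artificial intelligence, and the core loop Mitola describes is simple to sketch: observe how busy each candidate channel is, then transmit on the least occupied one. Below is a toy illustration of that observe-and-choose cycle, with invented power readings standing in for what a real receiver would measure:

```python
import numpy as np

# Invented received-power samples (dBm) per candidate channel; a real
# radio would capture these by scanning the band.
scans = {
    225.0: np.array([-60, -58, -95, -61, -57, -96, -59, -60]),  # busy
    400.0: np.array([-95, -96, -60, -97, -95, -96, -95, -94]),  # light
    960.0: np.array([-97, -96, -95, -97, -96, -95, -97, -96]),  # idle
}
NOISE_FLOOR_DBM = -90.0  # readings above this count as "in use"

def occupancy(samples: np.ndarray) -> float:
    """Fraction of time slots in which the channel was occupied."""
    return float(np.mean(samples > NOISE_FLOOR_DBM))

# "Assess alternatives": rank candidate channels by measured duty cycle.
for freq, samples in sorted(scans.items(), key=lambda kv: occupancy(kv[1])):
    print(f"{freq} MHz: occupied {occupancy(samples):.0%} of the time")

best = min(scans, key=lambda freq: occupancy(scans[freq]))
print(f"Transmit on {best} MHz")
```

A real cognitive radio would weigh many more factors, output power and waveform among them, and Peha's duty-cycle survey is essentially the same calculation run from a hilltop instead of inside the radio.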
Another development in this field comes from ArrayComm Inc., San Jose, Calif., which has developed a "smart antenna" that pinpoints users and sends signals directly to them rather than broadcasting in all directions, reducing the number of base stations needed. The company said telecommunications companies in Japan have found they can let 10 times as many users share a single channel. In June, DynCorp, Reston, Va., signed a partnership agreement with ArrayComm to explore using the company's antenna technology for government markets, particularly public safety.

Despite these advances, shuffling users among different sets of airwaves is anything but assured.

"The problem I see is that we have this huge, embedded infrastructure in the United States that is built on restricted frequencies," said Robert Manchise, chief scientist for Anteon International Corp. in Fairfax, Va. "A lot of military communications, the Federal Aviation Administration, emergency response channels all have their own frequencies. To change these set frequencies would be very expensive and would meet a lot of resistance."

Another problem is the physical constraints of wireless transmission itself, Manchise said. Some frequencies are simply more popular than others because they are better suited as communications conduits. "There is a relationship between how high in the frequency range the signal is and how far it goes and how much data it can carry. The higher the frequency, the shorter the range," he said.

Regardless of the eventual outcome, these technologies should push agencies to look at smarter ways to use the spectrum. "Hopefully, regulators will ask if they are using the spectrum as efficiently as they can, or if there are new technologies that can better use the existing spectrum," said Bradley Holmes, a senior vice president for ArrayComm.

64-bit processing

Could homeland security be the catalyst that ushers in the arrival of 64-bit processors? Hewlett-Packard Co. President Michael Capellas thinks so.

Coordinating the tracking of suspected terrorists across multiple federal agencies will require some hefty computation for data management, analysis and simulation, and Palo Alto, Calif.-based HP is looking to market its supercomputer power to the homeland security agency for the task. "In order to get the returns that people want immediately, you will need enormous amounts of data. This is superscale," Capellas said. "What you're trying to do is take millions of transactions and look for patterns."

That job would not be all that different in scope from the one handled by the $24.5 million HP supercomputer, built from a cluster of 1,388 64-bit servers, that the company sold in April to the Department of Energy's Pacific Northwest Laboratory, where it is used to simulate environmental conditions.

Today, most government desktop computers run on 32-bit processors, meaning the chips take in data 32 bits at a time. That is sufficient for word processing, Web surfing and other basic computing tasks, but if computer users move to more speech-recognition and video-driven programs, 64-bit processors will increasingly creep into the marketplace, companies said.
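What the extra bits buy is easiest to see in the address arithmetic: a 32-bit chip can directly address 2^32 bytes, about 4 gigabytes of memory, while 64-bit addressing multiplies that ceiling by another factor of roughly 4 billion. A quick back-of-the-envelope check:

```python
# Directly addressable memory for a given pointer width is 2**bits bytes.
for bits in (32, 64):
    total_bytes = 2 ** bits
    print(f"{bits}-bit addressing: {total_bytes:,} bytes "
          f"(about {total_bytes / 2**30:,.0f} GiB)")

# 32-bit addressing: 4,294,967,296 bytes (about 4 GiB)
# 64-bit addressing: 18,446,744,073,709,551,616 bytes (about 17,179,869,184 GiB)
```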
On July 8, chipmaker Intel Corp., Santa Clara, Calif., began shipping the second generation of its Itanium 64-bit processors for servers and workstations. Although some vendors, such as Dell Computer Corp., Round Rock, Texas, have taken a wait-and-see attitude, others, such as HP and Silicon Graphics Inc., Mountain View, Calif., are forging ahead with 64-bit strategies for the public sector.

"Itanium 1 version was for scientists and developers to look at. Itanium 2 is really where government will reap a lot of rewards," said Bruce Klein, general manager of HP's federal sales organization. Klein said HP sees 64-bit servers not only handling specialized, heavy-duty tasks, such as those at the Pacific Northwest lab, but also performing mainstream work, such as general-purpose databases, enterprise resource management and customer relationship management. In these markets, 64-bit servers can offer competitive price advantages over 32-bit servers, plus savings on applications licensed per processor.

SGI announced it will support Itanium 2 on its systems alongside its own 64-bit processors. The company offers high-performance, shared-memory servers that can be clustered to handle large amounts of complex data, such as large-scale terrain databases for visual simulation. With the support of the Itanium chip, "we have a prime opportunity to become the leader in scalable Linux environments," said Greg Slabodkin, spokesman for the SGI Federal subsidiary of SGI.

Integrators with seat management contracts shouldn't worry about 64-bit processing yet, however. Intel has no immediate strategy for marketing the Itanium chip for desktop computers, said Mark Margevicius, research director of client computing for the IT research company Gartner Inc., Stamford, Conn. "Itanium is not a desktop play," he said. There are several reasons for the chipmaker's reticence. One is that considerable life remains in the company's 32-bit line, with a 2.5-gigahertz chip recently introduced and a 3-gigahertz chip set for release by the end of the year. Another is that customer demand for improved performance isn't there. When the industry shifted from 16-bit to 32-bit processors in the last decade, it was largely because of the introduction of Microsoft Corp.'s Windows 95 operating system, which let users run multiple applications at once. There is no similar driving need today, Margevicius said.

Others are more optimistic. Capellas said 64-bit processors will reach desktop computers in as little as three years, with the upgrade fueled by multimedia applications such as streaming video and voice-driven computing. "The reason there is no demand [for 64-bit processors] is that we still have a stupid interface," said Capellas, referring to keyboards and mice, which could give way to voice-recognition interfaces. "The 64 bits allow you to move a bigger chunk of [the processed data] to the memory, so speech recognition or video comes off at a much higher resolution or smoother pace," he said. The 64-bit architecture also lets the processor address far more random access memory, beyond the four-gigabyte ceiling of 32-bit chips.

For the integrator, a shift to Intel 64-bit systems on the desktop means today's 32-bit applications will have to be ported to the 64-bit architecture. The chips that Advanced Micro Devices Inc., Sunnyvale, Calif., plans to introduce next year, however, will run 32-bit software without modification, Margevicius said.

Integrators don't anticipate a big problem in the transition. "We don't see it at this point," said Mike Boese, deputy chief technology officer of the Advanced Information Systems unit of General Dynamics Corp., Falls Church, Va. "We've been using 64-bit processors for some time. We've been applying them to information management and imagery. This one, to us, is very much part of the evolutionary approach in transitioning a lot of what we do."

Internet protocol version 6

The Internet is running out of address space. Although no one knows when all potential addresses will be spoken for, most experts see the window closing in five to 10 years. Four billion addresses just won't be enough for every network device on the planet.

To meet the demand, the Internet Engineering Task Force, the controlling standards body for the Internet, has upped the number of bits in an Internet address from 32 to 128. This next-generation Internet protocol, IP version 6 or IPv6, offers roughly 340 trillion trillion trillion addresses (2 to the 128th power), which should solve the shortage for a while.
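The scale of that jump is easy to verify with Python's standard ipaddress module, which can count every address available under each protocol version; the figures are just 2^32 and 2^128:

```python
import ipaddress

# Every possible address under each protocol version.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4: {ipv4_total:,} addresses")    # 4,294,967,296
print(f"IPv6: {ipv6_total:.2e} addresses")  # about 3.40e+38
```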
The IETF also built new features into IPv6, such as enhanced security, quality-of-service measures and autoconfiguration, all of which will open up new services that systems integrators can offer.

However, implementing IPv6 across the Internet will be another "Y2K in the making," said Latif Ladid, president of the IPv6 Forum, which promotes use of the protocol. To upgrade to IPv6, government agencies will have to replace or improve all their network equipment, such as routers and switches. Software in the servers and desktop computers that use the Internet will also need upgrading.

The protocol being replaced has deep roots. In 1978, when the Defense Department mandated the integration of various stovepiped networks, including the Internet's predecessor, ARPAnet, it chose the Internet protocol as the common platform, according to a history of the Internet by the Navy's Spawar Systems Center in Charleston, S.C. Unfortunately, the Defense Department did not predict the Internet would be as widely used as it is.

Today, the number of Internet hosts doubles each year, with the current count pegged at approximately 115 million hosts online. The pace is expected to continue as more developing nations jump online, and the falling cost of microchips all but guarantees that every electronic device will be controllable through the Internet, Ladid said. Many parties are leapfrogging straight to IPv6. For government contractors, the message is clear: The Internet is moving to IPv6, and agencies must follow suit to keep pace with the world.

Earlier this year, government reseller GTSI Corp., Chantilly, Va., opened a lab to test new technologies, and one of the chief areas it is exploring is IPv6, said Sanjay Barthakur, a senior network engineer for the company. Barthakur said GTSI has not yet seen great demand for IPv6-ready gear from agencies, though the company is preparing for it. It purchased hundreds of thousands of dollars worth of IPv6-ready network hardware from Cisco Systems Inc., San Jose, Calif., and Sun Microsystems Inc., Palo Alto, Calif., as well as IPv6-ready software, such as Sun's Solaris 8 and Microsoft Corp.'s Windows XP operating systems. GTSI will test the equipment in various multivendor configurations to anticipate problems agencies may experience.

Ken Albanese, senior systems engineering manager for Cisco's federal unit, said Cisco is seeing a lot of interest in the federal arena, mostly from the Defense Department, but the equipment being purchased is for test environments. "Most [agencies] are watching the research community to see how they react," Albanese said. "We haven't seen mass infrastructures converted yet."

Unlike Y2K, there is no deadline for installing IPv6. The real driver for adoption within U.S. agencies will come from outside the country, Ladid said. Last year, Japan set a 2005 deadline for all businesses and agencies to use IPv6. Korea and China also have national strategic adoption plans for the protocol, and Europe, driven by heavy cell phone use, has taken legislative initiatives of its own. Such countries want to use IPv6 to boost sagging economies, hoping the new features will spark new markets, Ladid said.

Countries late to the Internet revolution are adopting IPv6 simply because there aren't enough addresses to go around, said Jeff Thomas, a product manager specializing in IPv6 products for Compagnie Financière Alcatel, Paris. "India is struggling for IP space," Thomas said, noting that he saw a network in India using network address translation, which lets multiple computers share one IP address, that was "five layers deep."

This global adoption of IPv6 will leave U.S. federal agencies like "lobsters in a pot of slowly boiling water," Ladid said: They might not see a business case for upgrades at first, but if they are not careful, they will be left with outdated networks incapable of handling sophisticated traffic.

The good news for government integrators is that IPv6 will also ease administration tasks and even enable new services. "It's a double-edged sword to us. We take care of various government networks, so in that sense it is just more work [to adopt IPv6]," said General Dynamics' Mike Boese. "The real advantage for us is that there should be a positive cost tradeoff. Even though you have to pay for the cost of switchover, you will recoup that cost over time because it's easier to administer."

Autoconfiguration features of IPv6 will also make it easier to deploy complex networks. "You won't need so much specialized engineering in the background to make it happen," Boese said.
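One concrete piece of that autoconfiguration story is stateless address configuration, in which a host derives its own link-local address from its network card's hardware address with no server involved. A minimal sketch of the modified EUI-64 derivation defined in the IPv6 addressing specification, using a made-up MAC address:

```python
import ipaddress

def eui64_link_local(mac: str) -> ipaddress.IPv6Address:
    """Derive an IPv6 link-local address from a 48-bit MAC address."""
    octets = bytes(int(part, 16) for part in mac.split(":"))
    assert len(octets) == 6, "expected a 48-bit MAC address"
    # Insert ff:fe in the middle and flip the universal/local bit.
    eui64 = bytes([octets[0] ^ 0x02]) + octets[1:3] + b"\xff\xfe" + octets[3:6]
    # Prepend the fixed fe80::/64 link-local prefix.
    return ipaddress.IPv6Address(b"\xfe\x80" + b"\x00" * 6 + eui64)

# Made-up MAC address, for illustration only.
print(eui64_link_local("00:0c:29:3e:1a:7f"))  # fe80::20c:29ff:fe3e:1a7f
```

The fe80:: prefix, the ff:fe filler and the flipped bit are all fixed by the specification; only the MAC address varies, which is why no server has to hand out the address.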
IPv6 also promises a long-sought solution for wireless security, said GTSI's Barthakur, finally assuring that off-the-shelf wireless devices can be used safely for mission-critical jobs, such as police work and mobile networks for military units.

The Semantic Web

In the past year, government integrators have increasingly used an extension of the World Wide Web format called extensible markup language, or XML, to move data more easily between online systems. In coming years, they may employ a further extension of the Web, called the Semantic Web, to enable computers and other electronic devices to make more intelligent decisions about what actions to take.

"The Semantic Web builds on XML," said Jim Skinner, a chief scientist for Computer Sciences Corp., El Segundo, Calif., who authored a company report on emerging technologies. "It will be set up so that intelligent [software] agents will be able to retrieve information and bring it back to you."

Researchers at the University of Maryland-Baltimore County have used a Semantic Web framework to create a smart room that allows a person to operate various electronic devices, such as a lamp or computer, from a handheld computer. While remote controls are nothing new, what is novel here is the attempt to write a common language for remotely controlling all electronic devices: creating, in essence, a World Wide Web for machines. The project received research funding from the Defense Advanced Research Projects Agency and the National Science Foundation.

What makes the Semantic Web different from current device networking protocols, such as the Bluetooth standard and Sun Microsystems Inc.'s Jini, is that it gives developers a framework for writing logic rules that let devices make intelligent decisions, said Timothy Finin, a principal investigator on the University of Maryland project.

Anupam Joshi, a professor and another principal investigator on the project, gives the example of someone in a room who needs to print a document from a PDA in color at 600 dots per inch. A protocol such as Jini would look only for a printer matching that exact specification; if none were available, the document wouldn't be printed. The Semantic Web framework would instead let the PDA evaluate the available printers and offer the best alternative. "It would ask the user, 'Would you prefer a black and white print at 1,200 DPI or a 300 DPI color print?'" Joshi said.
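The difference between the two approaches fits in a few lines: exact matching fails outright when no printer meets the request, while a ranking rule can still offer near-misses. A toy sketch, with invented printer specifications and an arbitrary scoring rule:

```python
from dataclasses import dataclass

@dataclass
class Printer:
    name: str
    color: bool
    dpi: int

# Invented devices "discovered" on the network.
printers = [Printer("lab-bw", color=False, dpi=1200),
            Printer("lobby-color", color=True, dpi=300)]

want_color, want_dpi = True, 600

# Jini-style exact match: all-or-nothing.
exact = [p for p in printers if p.color == want_color and p.dpi >= want_dpi]
print("Exact matches:", exact or "none -- job fails")

# Semantic-Web-style negotiation: rank near-misses instead of failing.
def closeness(p: Printer) -> float:
    color_penalty = 0.0 if p.color == want_color else 0.5
    dpi_penalty = max(0, want_dpi - p.dpi) / want_dpi
    return color_penalty + dpi_penalty

for p in sorted(printers, key=closeness):
    print(f"Offer: {p.name} ({'color' if p.color else 'B&W'}, {p.dpi} DPI)")
```

Here the two candidates happen to score equally, which is precisely the situation in which Joshi's system would hand the choice back to the user.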
For consumers, this may mean a ringing phone signals the DVD player to pause a movie while the viewer answers the call, Joshi said. A government agency might use the protocol to automatically ensure that all its property, from lamps to automobiles, adheres to energy conservation and security policies. It could also allow complex knowledge management systems to share and analyze data with little human intervention.

The creator of the Semantic Web is Tim Berners-Lee, the originator of the protocols used for the World Wide Web. "The Web is good for delivering information to humans. [Berners-Lee] wants to extend that to include information exchanged between machines," said Thanh Diep, a senior technologist for General Dynamics Advanced Information Systems, a business unit of General Dynamics Corp., Falls Church, Va.

The University of Maryland project is part of a Defense Department Semantic Web initiative begun in 2000, the DARPA Agent Markup Language, which seeks to develop a semantic language that would allow a higher level of interoperability, not only between devices, as the University of Maryland project demonstrates, but also between Web sites and databases.

"The market is virtually unlimited for DAML," said Adam Pease, director of knowledge systems for the intelligence software research company Teknowledge Corp., Palo Alto, Calif. "Everywhere we have Web sites written for people, DAML can be included to make those sites understandable to machines. There is a lot of work that has to happen in terms of creating DAML content."

Teknowledge has participated in the DAML project and parlayed that experience into development work for other agencies. In February, the company won a $750,000 contract from the Air Force to develop software that can locate contradictions and inconsistencies in datasets based on metadata specifications.

Other companies also express guarded optimism about the market the Semantic Web, or an offshoot of the technology, may produce.

"The Semantic Web is important to us. Because of the nature of the work we do, we're trying to get different computer systems talking to each other. And right now, we have to add a lot of our own work in there to make it happen," said General Dynamics' Boese.

"It's still in the realm of research, but if the kinks are worked out, the Semantic Web would go a long way in communicating across federal agencies," said Bill Medley, manager of Web solutions for GTSI Corp., Chantilly, Va. "It would offer an order-of-magnitude leap in the ability to parse information and understand what that information means in context."

Bradley Holmes of ArrayComm Inc., which has developed a "smart antenna," is hopeful that regulators will see the need for more efficient use of the airwaves.

(Photo by Darwin Weigel)

Bruce Klein of Hewlett-Packard said 64-bit servers will handle not only specialized supercomputing work, but also mainstream tasks, such as general-purpose databases, enterprise resource management and customer relationship management.

(Washington Technology photo by Henrik G. de Gyor)

Internet Protocol version 6 is one of the new technologies senior network engineer Sanjay Barthakur and others are working on at a new test lab opened by GTSI Corp.

(Washington Technology photo by Henrik G. de Gyor)

University of Maryland-Baltimore County researchers, including professor Anupam Joshi, have used Semantic Web technology to operate electronic devices from a PDA.

(Washington Technology photo by Henrik G. de Gyor)