NRC Reports Review Traffic Management, Technology Trends

By John Makulowich
Contributing Writer

Caught up in the whirlwind of the World Wide Web? Two new documents from the National Research Council are refreshing antidotes to this virtual malady, each a sober reminder that the pace of change often carries a hefty price tag.

"Traffic Management for High-Speed Networks" by H. T. Kung, the Gordon McKay Professor of Electrical Engineering and Computer Science at Harvard University, is the fourth lecture in the International Science Lecture Series. In this 22-page presentation, Kung argues that "network congestion will increase as network speed increases." The rest of his remarks unravel this counterintuitive statement.

Clearly, our continuing reliance on and use of high-speed networks requires assurance that the chance of losing critical data through congestion will be negligible. He argues that traffic management is essential because the so-called "brute-force" way of simply enlarging buffers to avoid data loss will quickly become technically and economically impractical, particularly in asynchronous transfer mode networks and switched Ethernets.

His recommendation is credit flow control, analogous to the use of dams for controlling floods where an upstream dam holds additional water to accommodate downstream congestion points. For the technically alert, this presentation updates Kung's continuing research and is worth reviewing.
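Kung's dam analogy can be sketched as a toy model: a sender may transmit only while it holds credits, one per free cell in the downstream buffer, and each credit flows back upstream as the receiver drains a cell. This is an illustrative sketch of the general credit-based scheme, not code from Kung's lecture; the class and method names are invented for the example.

```python
from collections import deque

class CreditLink:
    """Toy model of hop-by-hop credit-based flow control: the sender
    transmits only while it holds credits, one per free buffer cell."""

    def __init__(self, buffer_cells):
        self.credits = buffer_cells      # initial credits = receiver buffer size
        self.buffer = deque()

    def send(self, cell):
        """Sender side: transmit only if a credit is available."""
        if self.credits == 0:
            return False                 # back-pressure: hold the cell upstream
        self.credits -= 1
        self.buffer.append(cell)
        return True

    def drain(self):
        """Receiver side: forward one cell and return a credit upstream."""
        if not self.buffer:
            return None
        cell = self.buffer.popleft()
        self.credits += 1                # credit flows back to the sender
        return cell

link = CreditLink(buffer_cells=2)
assert link.send("a") and link.send("b")
assert not link.send("c")                # buffer full: sender must wait
link.drain()                             # receiver frees a cell, returns a credit
assert link.send("c")                    # sender may transmit again
```

Because a cell is sent only when buffer space is already reserved for it, congestion produces upstream back-pressure instead of dropped data, which is the property Kung argues brute-force buffering cannot deliver economically at high speeds.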

(Readers interested in exploring the depths of ATM might consider the newly published Wiley-Teubner paperback, "ATM Networks: Principles and Use" by Martin P. Clark (ISBN 0-471-96701-7 or 3-519-06448-0). It is one of the better-written texts on a technical subject for lay audiences.)

The second publication, "Information Systems Trustworthiness: Interim Report," can be read on several levels, not only for an understanding of specific topics but also for the technology trends covered in Chapter 2 and the so-called "non-technical realities" in Chapter 5.

The thrust of this work is the increasingly intertwined nature of our nation's networked infrastructures, whether information systems, electric utilities, public switched telephone network or computer and communications networks, and their potential to evolve into "an interdependent system of fragile and vulnerable subsystems. Understanding how to ensure that they will operate reliably is thus vital," notes the project study.

Initiated at the request of the Defense Advanced Research Projects Agency and the Information Systems Security Research Joint Technology Office, the project focused on developing a research agenda as well as a program of technical activities to strengthen the reliability of information systems. In doing so, it sought to improve society's ability to depend on these systems. The joint office is a collaboration among DARPA, the National Security Agency and the Defense Information Systems Agency.

Part of the work to develop an agenda called for defining the trends in technology, which are covered in three areas: processing, communication and software. As an example, the report notes some features that commodity computers will probably have in 5 to 6 years: 320 MB of RAM with multigigabyte-per-second memory bandwidth; a 40-GB disk with a 60-MB/s transfer rate and 2.5-ms latency; and a gigabit-per-second network interface.

I guess we ain't seen nothing yet, as the saying goes. Just watch out for the speed bumps ahead.

John Makulowich writes, talks and trains on the Internet.
