Microsoft and Open Systems: The Domino Effect

The initial objective behind open systems was to give users the ability to swap products, hardware and software, without worrying about whether they would work together, much like consumers do today with stereo components. With the exception of commodity items such as PCs, though, "that's definitely still a theory," noted Liz Melcher, senior research analyst for open systems at the Gartner Group.

The concept of simple substitutions of infotech products remains a far-fetched dream, but vendors, driven by demands in the marketplace, now tout their wares as open, though interoperable is a more accurate description. In fact, not claiming openness and interoperability these days represents a marketing death sentence.

Benefits such as interoperability and scalability now drive the concept of open systems, although the term still means different things to different people. But as organizations start moving toward client/server systems and extend operations globally, they require the kind of seamless integration that open systems once promised. "That concept of seamless integration is very much alive today, especially because of the Internet," Melcher said.

Yet consumers have become frustrated and disillusioned. People's disenchantment with the unrealized promise of truly open systems grew out of the belief that formally defined standards would end the reign of proprietary systems dominated by mainframe giant IBM Corp. and open the floodgates to non-proprietary alternatives. However, John Mann, senior analyst at the Yankee Group, said, "The hope of standards achieving anything is slim."

History has shown, through IBM's legacy of mainframes still in the market today, that proprietary systems can survive. These days, Microsoft has taken over that role, proving again that proprietary products can win in the marketplace, Mann said.

Microsoft has emerged as the infotech giant and stepped into the driver's seat, guiding the future of open systems in the 21st century. Although standards are not enough to ensure interoperability, a combination of standards and strong industry support for them may have a better chance.

Standards such as Transmission Control Protocol/Internet Protocol and Structured Query Language evolved into marketplace successes because vendors strongly supported them. With Microsoft leading the pack, the future of standards that allow interoperability and distributed computing across different hardware platforms will depend on whether the company lends its support.

For example, the Open Software Foundation, a consortium of major computer vendors, endorses the Distributed Computing Environment (DCE), for some time a leading standard in the open systems movement. Challenged by the foundation, vendors two years ago demonstrated their respective DCE implementations.

The Yankee Group commented on the industry's support for DCE:

  • IBM is more than minimally enthusiastic.

  • Hewlett-Packard is committed to distributed systems and sees DCE as a key foundation.

  • Digital Equipment Corp., true to its decentralized engineering heritage, has a group pursuing DCE and other groups pursuing object-oriented technology.

"We believe DCE will have a somewhat limited long-term role in the future of enterprisewide distributed computing.... We hasten to add that large vendors can hardly afford to ignore the possibility, however small, of losing ground in the open systems market. Insurance is cheaper than disaster," a Yankee Group report stated.

Although the DCE standard has existed for several years, Melcher noted, it never gained much momentum in the market. "That could change because of Microsoft," she said. Microsoft has committed itself to DCE, but the support comes with a price.

Windows NT, due later this month, does not use the foundation's licensed code for DCE. Instead, Microsoft wrote its own code. "That means once Microsoft changes the code," Melcher explained, "they will dictate the standards."

But Mann isn't so sure about that. When it comes to standards, "people are all over the map. Some believe standards are enough; others believe the products themselves will do. Allow all sorts of innovative companies to participate and let the best of breed products dominate," he suggested.

And perhaps the key to the DCE standard's inability to influence the marketplace lies in the customer base. Customers do not have a strong reason to insist on DCE, the Yankee Group report noted.

What customers want, Mann said, is the ability to share data between applications running on legacy mainframes and newer client/server, distributed systems. Ironically, though, the market demand for interoperability or openness between applications has spawned a burgeoning proprietary industry.

Middleware lies between software applications and system software and is designed to integrate applications that run on different hardware. The Yankee Group calls it "one of the most significant enabling technologies for distributed computing in the 1990s."
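
As a rough sketch of the idea, rather than any vendor's actual product, a middleware layer might give applications a single send-and-receive interface while hiding the platform-specific transports underneath. The class and function names below are hypothetical, chosen only for illustration:

    # Illustrative sketch only: a middleware-style layer that exposes one
    # send() interface to applications while hiding platform-specific
    # transports. All names here are hypothetical, not any vendor's API.
    from abc import ABC, abstractmethod


    class Transport(ABC):
        """Platform-specific delivery mechanism hidden from applications."""

        @abstractmethod
        def deliver(self, destination: str, payload: bytes) -> None: ...


    class MainframeTransport(Transport):
        def deliver(self, destination: str, payload: bytes) -> None:
            # A real product might wrap a legacy file-transfer protocol here.
            print(f"[mainframe] {destination} <- {len(payload)} bytes")


    class UnixSocketTransport(Transport):
        def deliver(self, destination: str, payload: bytes) -> None:
            # A real product might open a TCP/IP socket to a server process.
            print(f"[unix] {destination} <- {len(payload)} bytes")


    class MessageBus:
        """The 'middleware' layer: applications call send() and never see transports."""

        def __init__(self) -> None:
            self._routes: dict[str, Transport] = {}

        def register(self, destination: str, transport: Transport) -> None:
            self._routes[destination] = transport

        def send(self, destination: str, payload: bytes) -> None:
            self._routes[destination].deliver(destination, payload)


    if __name__ == "__main__":
        bus = MessageBus()
        bus.register("payroll", MainframeTransport())   # legacy system
        bus.register("orders", UnixSocketTransport())   # client/server system
        bus.send("payroll", b"EMP-4411 update")
        bus.send("orders", b"new order 9021")

In this sketch the application code deals only with the bus; the transports underneath can change without disturbing it, which is the integration role middleware is meant to play.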

Yet most companies use their own homegrown version of middleware designed to meet their specific needs. Because the existing products deliver less than 10 percent of all the functions middleware eventually could provide, "nobody has a dominant role in the marketplace. The market is so ill-defined that people don't know how to pick a winner," Mann said.

So what is the future of middleware, the technology that was developed to solve interoperability problems but has created a whole new set of them? "They'll figure out how to interoperate when they have to," Mann predicted.

