
Interoperability describes how different computer systems, networks, operating systems, and applications work together and share information. It is usually achieved by following published or de facto standards. Interoperability is the opposite of vendor-proprietary solutions, which were pervasive in the days of IBM mainframe and minicomputer systems. When desktop computers appeared in the early 1980s, users were empowered, but interoperability with other systems was a major problem.

Interoperability is a networking issue. In the days of centralized mainframe systems, all the components attached to the system were specifically designed to work together. With networks, it is possible to attach different computer platforms running different operating systems and different applications. Actually, the network itself provides the first level of interoperability. For example, all the workstations attached to an Ethernet network can potentially communicate with one another.
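The point that the network itself provides the first level of interoperability can be sketched in code. The example below is a minimal illustration, not from the original article: two endpoints on the same machine exchange bytes over a TCP socket, the same way two workstations on an Ethernet network could regardless of their operating systems. All names (`run_echo_server`, the message text) are invented for illustration.

```python
import socket
import threading

def run_echo_server(server_sock):
    # Accept one connection and echo whatever bytes arrive.
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# The "server" platform: bind to localhost on an OS-chosen free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=run_echo_server, args=(server,))
t.start()

# The "client" platform: any system that speaks TCP can connect
# and exchange bytes, independent of its operating system.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello from another platform")
reply = client.recv(1024)
client.close()

t.join()
server.close()
```

The shared transport says nothing about what the bytes mean; higher layers (file formats, markup languages, middleware) must agree on that, which is the subject of the rest of this entry.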

Until recently, it was difficult for PC, UNIX, and Macintosh users to simply exchange files, even when connected to the same network. The file storage techniques, formats, and file transfer programs on these systems were different. Users often converted files to simple ASCII text and transmitted them to one another using modems, losing formatting and control codes in the process.

Network servers solved the problem to some extent by providing a single place to store files. Most servers expanded to support a variety of clients, allowing those clients to store files in a single location for other clients to access. Standard file formats such as RTF (Rich Text Format) were developed to preserve formatting information and make documents readable by a variety of applications. Standard markup languages such as SGML provide another solution. Application interoperability was achieved across platforms with client/server computing and Web technologies. Middleware products and component software were developed to enable applications that work across platforms.

Simon Phipps, IBM's evangelist for Java and XML, likes to describe the transition to XML (Extensible Markup Language) as the "last gap" in defining a new world of information sharing. This has been achieved through the following progression of events:

  • TCP/IP has become the near-universal communications protocol for connecting information systems.

  • Browsers have become the common space into which solutions can be loaded.

  • Component technologies such as Java are now established as the standard for platform-neutral computing.

  • Data was the last gap. An open data-formatting specification was needed. XML is that specification.
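The role XML plays in closing that last gap can be shown with a small round-trip sketch. This is an illustrative example only (the element names `book`, `title`, and `year` are invented): one system writes structured data as plain XML text, and any other system with an XML parser can read it back without knowing anything about the sender's platform.

```python
import xml.etree.ElementTree as ET

# One system builds a structured record...
record = ET.Element("book")
ET.SubElement(record, "title").text = "Encyclopedia of Networking"
ET.SubElement(record, "year").text = "2001"

# ...and serializes it to a plain text string that any
# platform can transmit and store.
xml_text = ET.tostring(record, encoding="unicode")

# The "receiving" system parses the text back into structure.
parsed = ET.fromstring(xml_text)
title = parsed.find("title").text
year = parsed.find("year").text
```

Because the wire format is ordinary text with self-describing tags, neither side needs the other's operating system, programming language, or application to interpret the data.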

Copyright (c) 2001 Tom Sheldon and Big Sur Multimedia.
All rights reserved under Pan American and International copyright conventions.