Some of you might remember the Distributed Computing Environment (DCE), but it’s not clear that many industry pundits or venture capitalists do. Or at least they haven’t internalized a principal reason that DCE is, to put it politely, not prevalent today.
DCE is a set of technologies developed by the Open Software Foundation (now called The Open Group – www.opengroup.org) that lets a computer user employ network-based resources to augment his or her local computer. DCE, quoting from an IBM Corp. Web page, “is a comprehensive suite of integrated, yet modular, products which support transparent file access and secure resource sharing in heterogeneous, networked computing environments.”
I’m sure the Open Group will call this simplistic, but in my mind a major reason DCE was developed was to share resources, such as disk space and processor cycles, over a network because giving individuals enough dedicated resources was too expensive. With DCE, a user can access databases without needing a local copy and can get heavy-duty processing done without needing as powerful a computer on his or her desk.
But the DCE proponents did not take into account the continued development of technology. Before the DCE specifications could be completed, disk and processor technology advanced enough to negate many of the assumed advantages of using DCE. DCE was based on the assumption that the cost of managing shared, distributed resources would remain lower than the cost of replicating them. That assumption did not prove to be long-lived.
There just may be a lesson in the history of DCE for those who are considering investing in peer-to-peer networking, storage-as-a-service offerings or maybe even VPNs.
I am leaving out a number of other arguments that were made in the case for DCE – single sign-on, centralized backup, centralized authorization management and more. Some of these arguments are now being made for the newer technologies; they may prove to be just as non-decisive as they were for DCE. I am also leaving out the ego factor that leads network managers to think they should control everything that connects to their networks. That factor is harder to analyze – some of the egos are rather strong.
An undercurrent of Clayton Christensen’s book The Innovator’s Dilemma is that, when evaluating their options, people find it quite hard to take into account the fact that technology does not stand still. It is much too easy to see what you can buy today and assume that it represents what will be available in the future. An example may be the pundits who dismiss using the best-effort Internet for telephony – all they can see is that it would not work well enough for them today. They forget that using today as a guide is what led to DCE’s development.
Bradner is a consultant with Harvard University’s University Information Systems. He can be reached at sob@sobco.com.