(A slightly modified version of this article
appeared in IEEE Computing, Volume I, Number 2, 1997.)

NC: An Acronym Revisited



By Christopher Locke

In the history of computing as it relates to industrial manufacturing, NC stands for "numeric[al] control." Today, NC is also being used to represent a new class of device dubbed by Oracle and others the Network Computer. While these two acronymic instantiations are unrelated in any intentional sense, they do bear an interesting resemblance.

Numerical control involves the programming of machine operations that were once carried out under the manual oversight of skilled craftspeople. David F. Noble tells this story in his excellent book, "Forces of Production: A Social History of Industrial Automation" (1986). It is highly recommended reading even for those with little interest in manufacturing per se, for the moral is that automation deskills, and absolute automation deskills absolutely. Once skill-based techniques are committed to computer programs, they tend to ossify almost immediately. Gone is the human understanding that enables flexibility in the face of changing conditions. While this form of NC was viable in mass-production scenarios, it is far less workable for smaller runs of a broader range of products.

This is a story I have told often, most recently in a talk at the Aspen Institute a couple of years ago. "Nets and Mirrors: How the Internet Both Affects and Reflects the Global Economy" is available online (and nowhere else) at:

http://www.panix.com/~clocke/mirrors.html

That paper is an attempt to bring together the line of reasoning that sees automation as deskilling with the internal logic of the Internet. My conclusion is that the net came along at just the right historical moment -- *as if* by happy accident -- to answer the thorny problems posed by a world rapidly transitioning from economies of scale to economies of scope. Critical to this transition is the empowerment of individuals to create new knowledge and share their insights via a global computer network.

Of course, as we know, this network did not grow out of the mainframe world, but was largely founded on independent workstations, and later personal computers. Networking such machines had a powerful effect -- and an even more powerful motivation -- precisely because each one was being used for a *different* purpose. This was a radical departure from the previous model of dumb terminals enslaved to a central CPU and data repository.

In the notion of NC as "Network Computer," many hear disturbing echoes of this old mainframe mentality. There are many definitions of NC in this context, and none is definitive, but the basic idea is a pared-down workstation running standard software that resides on a centralized network server. One of the primary economic arguments put forward for such devices -- and it is hotly debated -- is the cost savings that would accrue from simplified software maintenance. Related to this is a strong emphasis on locking the physical chassis of the device to keep end users from making modifications. On the software side, users are similarly barred from using non-approved tools.
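
To make that governance model concrete, here is a minimal sketch in Python of the kind of policy such a device implies: the catalog of usable software lives on a central administrative server, and the client will launch nothing that is not on the approved list. Everything in it -- the server address, the catalog format, the function names -- is hypothetical, invented for illustration rather than drawn from any actual Network Computer specification.

    # Illustrative sketch only: the server address, catalog format, and policy
    # below are hypothetical. The point is the model described above --
    # software lives on the server, and the user runs only approved tools.
    import json
    import subprocess
    from urllib.request import urlopen

    CATALOG_URL = "http://admin-server.example.com/approved-apps.json"  # hypothetical

    def fetch_approved_apps(url=CATALOG_URL):
        """Pull the centrally maintained list of approved applications."""
        with urlopen(url) as response:
            # Expected shape (again, hypothetical):
            # {"wordproc": "/srv/apps/wordproc", "mail": "/srv/apps/mail"}
            return json.load(response)

    def launch(app_name, catalog):
        """Run an application only if the central catalog sanctions it."""
        if app_name not in catalog:
            raise PermissionError(f"'{app_name}' is not an approved application")
        subprocess.run([catalog[app_name]], check=True)

    if __name__ == "__main__":
        apps = fetch_approved_apps()
        launch("wordproc", apps)  # anything off the list is simply refused

The mechanism is trivial; what matters is the governance it encodes: every capability the user has is enumerated somewhere else, by someone else.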

If we think of computing tools the same way industrialists thought of machine tools -- as a means of automating standard, repeatable production processes -- then such "rationalization" of tools and procedures makes some sense. But even in the manufacturing world, the logic broke down as product lines expanded to fill a broadening set of market requirements and as competition came to be based more on the diversity of offerings than on the volume output of "monocrop" products. As survival and competitiveness came to depend on a greater scope of market niches, more knowledge was required to drive production processes. Critically, such flexible (i.e., genuine) knowledge could not be adequately delivered by automation, notwithstanding the once-ubiquitous hype about artificial intelligence saving the day.

If computers were standardized data-processing machines, then some of the economic advantages claimed for Network Computers might be compelling. But as we know, computers are not glorified calculators. Instead, they are increasingly tools for knowledge amplification -- as Doug Engelbart suggested decades ago. Many so-called workflow applications do indeed view computing as the rote processing of standardized data -- insurance forms, for instance. But the Internet has radically altered what can be viewed as "standard" in any business today. Market expectations are changing with blinding speed, the available tools change almost daily, and what can be done with those tools to meet those expectations is not something that can be looked up in a company manual.

Companies attempting to insulate themselves from the Internet at large by creating smaller, safer intranets will inevitably lose out in exact proportion to the degree to which they are "protected" from the global marketplace of new ideas, new software, new ways of thinking.

There may seem to be little point in struggling with the quirks and idiosyncrasies of Win95 or whatever OS is coming next; no point in configuring Netscape plug-ins; no earthly reason to check out 15 shareware variations on some new wrinkle in computing capability. But heaven help the company whose people have been "saved" from such costly frustrations. They are also missing critical "Don't Examples" -- mistakes they might otherwise have learned to avoid in their own information management efforts. They are missing software innovations that they may finally realize -- too late -- have reshaped market expectations and demand. They are blind to opportunities that might be had by overcoming these very obstacles.

If the Internet has taught us anything, it is that the design of tools -- and even of information itself -- has become a globally distributed process. No IT department has the omniscience to single-handedly construct THE optimal internal environment. This is a paradox, of course, as it must always struggle to do just that. However, if it achieves perfect control over "end users" by locking out their experience and hard-won insight, it fails in the same moment, and by the same token. Only through the active and near-universal involvement of their people in the anarchy of the Internet -- and in the stochastic vectors by which the net continues to evolve and shape itself -- can companies remain in touch with a world they can no longer specify.

Clearly, though unfortunately, this constitutes preaching to the choir. Those who immediately grasp the argument presented here are probably already convinced. Those to whom this thesis sounds hopelessly abstract will continue to pursue a vision of computing that has more in common with Henry Ford's assembly line than with the potential of a global network of constantly evolving knowledge resources. Only time will tell which path is more productive. But as in so many other cases, time is unlikely to be kind to those who have missed the boat.

###