Making Knowledge Pay

By Christopher Locke


BYTE cover story, June 1992

ABSTRACT

If knowledge is crucial to innovation -- and it seems to be -- it is also distributed throughout organizations, not concentrated in their executive suites. Companies that plan to survive in the new competitive environment must stop making autocratic demands on the organization and begin to learn from it. They must listen more closely to the stories their people constantly tell in the course of their work, and find ways to broaden that discourse. The tools described in this State of the Art section are suggestions for advancing such an agenda.


"Technology is changing the way we do business."
- anonymous industry wisdom

Unexamined belief in this hackneyed sentiment has resulted in innumerable "high-tech" tools that force people to work in unnaturally constrained ways according to age-old assumptions that no longer hold. Inverting the cliche yields a closer approximation to the truth: the way we do business today is beginning to elicit radically new technological responses.

Of course, there is always a reciprocal relationship between how business is pursued and the technological tools used to pursue it, and this reciprocity can lead to fruitless chicken-and-egg arguments about which influenced what first. Practically speaking, it is safe to say that much existing software design is based on obsolete principles stemming from the era of mass production -- an era that is now irrevocably past.

THE AUTOMATION TOLL

In those days of very large, stable -- and largely domestic -- markets, there was little pressure to innovate. A product might remain essentially unchanged for years or even decades. Because enormous quantities of these products were being churned out, the processes that controlled their manufacture could be "rationalized" and eventually automated. A significant economy of scale resulted: the more volume increased, the more unit costs dropped. Because knowledge about the design and control of these repetitive assembly-line procedures tended to concentrate at the top of organizations, the need to train workers was radically reduced. In the typical mass-production scenario, a worker could be taught in minutes or hours to perform some trivial task that he or she might mindlessly repeat for a working lifetime.

The downside of this particular cost-benefit trade-off is often called "de-skilling." Mass production produced fewer and fewer people who understood what they were doing. The worker who inserts the bolt, said Henry Ford, should not be the same worker who tightens down the nut. During this period, sometimes known as the Second Industrial Revolution, pride in craftsmanship and deep learning through apprenticeship quickly gave way to a mind-numbing automatism in the performance of work. Unions, once fierce protectors of the integrity of craft knowledge, came increasingly to support complex systems of legalistic work rules. A spirit of fatalistic resignation -- or often of hostile alienation -- came to characterize the American workforce. While workers resented being treated like appendages of the new machines, their companies were thriving on the rising output that resulted. As computer-controlled automation became available, companies began to replace even more workers with increasingly "intelligent" systems.


There is no magical, sacrosanct or immutable knowledge
that can be packaged and delivered to the corporate doorstep
to protect organizations from their own ignorance.

Sometime around the early '70s, something happened to break this rising spiral of automation, higher output and lower costs. That something was the dawn of global competition. All of a sudden -- as it must have seemed to many U.S. producers who weren't paying very close attention -- Asian and European companies were starting to capture significant market share in areas where American companies had previously enjoyed unchallenged hegemony: consumer electronics, automobiles, chips and computers. Today, markets are irreversibly fragmented, product diversity is far greater, production runs are correspondingly shorter, and economies of scale are therefore getting very hard to find. Mass production is a dead horse; no amount of whipping will make it pull the cart again. However, many companies have been slow to appreciate this as an authentic and permanent change. In fact, it represents a seismic paradigm shift -- and the lag in corporate perception has cost untold billions in lost markets and widespread unemployment.

THAT WAS THEN

What does this whirlwind historical tour have to do with the software that we use to manage our businesses today? Everything. "Intelligent machines" were great at controlling relatively stable mass production operations, but they require quotation marks because they are incapable of innovation. This is bad news for a certain wishful-thinking mentality, because innovation is the key to global competitiveness, and innovation requires learning. The fatal catch here is that only human beings are capable of learning -- in any sense that makes sense. Only through the first-hand experience, deep process insight and rapid adaptability of their people can organizations meet the fundamentally changed expectations of world markets. Today we don't need more questionably "intelligent" automation; instead, we need tools that will support -- not attempt to supplant -- genuine human understanding.

The Central-Mainframe-and-Master-Database business model has long assumed that accurate and adequate knowledge resides only at the top of organizations, and that only this privileged and near-prescient knowledge can, from that lofty height, command and control a hierarchy of subservient business units and their constituents. We've seen that assumption seriously challenged over the past decade with the swift proliferation of personal computers, workstations and local area networks. Still, the Mother-of-All-Databases mentality lives on in the design of many software tools. And many managers, firmly ensconced atop their hierarchical fiefdoms, continue to believe that only by carefully controlling the contents of database fields can they truly define, circumscribe and confer legitimacy upon any given subject. But, to take a common example, where fixed-field items like employee number, pay rate, and mother's maiden name are used to delimit a person's identity and worth, something irreplaceable is being lost.

At base, the problem is that we no longer unequivocally know what it is we need to know. The challenge here is not to increase efficiency -- as in the heyday of mass production -- but to deepen vision. Some companies that automated to the point of "lights-out" operation have since had their lights turned out by Chapter 11 proceedings -- not because they weren't productive, but because they were producing the wrong stuff. So what is the "right" stuff? The bigger question is: Who knows?

GETTING SMART

In the mid-'80s, many hoped that artificial intelligence was the answer. Maybe AI would revive the recurrent management dream (or nightmare) of central control over increasingly complex operations. By capturing essential knowledge and automating the rules by which it was to be applied, AI promised to replicate (read replace) the human intelligence previously required to perform high-level work. But the world -- as it will -- had another idea. Then another and another... In contrast, so-called expert systems largely depend on the notion of a stable world (a "well-bounded domain" is the preferred jargon), one that has already "made up its mind," so to speak. While such stability may have characterized the era of mass production, it has long since been swept aside in most environments by an unrelenting dynamism. Had AI been available to Henry Ford, it might have worked miracles. Today, it simply doesn't work.

Knowledge is not fixed, nor is it the exclusive province of some organizational elite, whatever business schools they may have attended. Despite years of de-skilling automation, intimate knowledge still resides with people close to critical production processes -- whether their collars are blue or white, whether they produce washing machines or financial reports. Genuine knowledge has two irreducible aspects:

1) it is seldom structured in the form of fixed fields or dependable rules, and

2) it is social, i.e., it is distributed as shared understanding among human groups that often have little respect for artificial organizational boundaries.

The articles that follow focus on new kinds of tools for articulating, organizing and sharing such bona fide knowledge. For the most part, they deal with unstructured information, that is, language. While the outline of a book is definitely a structure -- and a useful one -- it does not constrain the content that can be communicated in the same way that a database's record structure limits the content of pre-specified fields.

WHAT'S THE STORY?

A note of explanation is in order here. It is easy to equate language solely with text, and indeed many of the tools described here do deal with textual information. However, many can also accommodate graphics, sound, video: the full range of multimedia communication modalities. What distinguishes language, in the sense the word is being used here, is not mode but modus. Language is narrative. That is, it enables the telling of stories. As John Seely Brown of Xerox PARC has pointed out, such stories are the "real expert systems" that enable organizations to function effectively (see Note 1).

While tools such as databases, spreadsheets and expert systems may be useful for massaging data or automating fairly routine procedures, they are poor vehicles at best for telling stories. First, the data these tools contain are not readable in the sense that a story is readable. How much plot development can you really get out of a field descriptor, a cell formula or an if/then rule? Second, they impose so rigid a context that there is no room for readers to interpret, amplify, reshape and retell their contents -- all the means whereby stories not only propagate but often evolve into something unsuspected at their inception. This nearly genetic potential for recombinant mutation through social transmission points to the enormous value of work-related stories: they express a collective imagination that is often far greater than the sum of their retellers'.

This living knowledge emerges as people share their current state of understanding and de-bug their collective prejudices, blind spots and unfounded assumptions. This collaborative exploration cannot be automated. Nor can it be controlled by the old mechanisms of organizational management: the hierarchy, the chain of command, the delivery of cut-and-dried "marching orders." Although the two are frequently confused, knowledge differs intrinsically from data in that it evolves along a critical path from story to culture to organization to technology.

Too many companies have got this process of natural selection backwards. They are drowning in a sea of high technology without insight or content, and often producing products and services of abominable quality. This is precisely why we are hearing so much today about concepts such as workforce empowerment, employee involvement, participatory management, self-directed work teams, and concurrent engineering. Organizations critically require knowledge they can no longer supply top-down. Ironically perhaps, relinquishing power to the people is also the major factor fueling a new breed of genuinely "knowledge-based" high-performance business organizations (see Sidebar).

If the idea of corporate storytelling still seems a little far-fetched, consider some of the channels through which we practice the art every day: the telephone, the copier, the fax machine. And these are only the older trade routes in the story traffic. The social concept of networking is now almost inseparable in some quarters from the physical networks that support it: LANs, internets, electronic conferencing systems -- even "sneaker-nets" play a significant role. Marriages are being announced every day between computers, fax machines, modems, cellular phones, phone companies, cable networks, and electronic information providers. Moreover, the distinction between hardcopy print and softcopy file is rapidly blurring.

NET WORK

I recently installed a fax modem that enables me to do some interesting things. I can use my copier to capture a page from a book, then fax it to myself. Using the surprisingly capable optical character recognition (OCR) available today, I can convert the fax image to machine-readable text. Voila, I've got a fairly intelligent "distributed" scanner. Then, taking advantage of the modem's "send" facilities, I can dial up the WELL (an online conferencing system in California) using Sprint's or CompuServe's carrier services, and transfer this newly captured file from my PC to my remote Unix account. I can then post it in a conference I'm hosting there (see sidebar) where it will be read -- and can be downloaded -- by people logging in from around the globe. Also, I can send e-mail to Bob Ryan at Byte and include my file: "Hey, Bob, what do you think of this for inclusion in the article we talked about?" If Bob agrees to this hypothetical request, he can download the mail file to his composition software, and you might read it on these pages as hard copy. But, in between, it has been as "soft" as it gets.

Also in between, the original information has probably generated various responses. Someone in the WELL conference may write, "I like where she says xyz, but the other stuff is a crock. Have you seen abc's article on this issue in Zyx Journal? I have attached a few relevant paragraphs and page citations." And someone else may respond to that. Also, Bob may send return e-mail saying: "Well, Byte already covered that pretty thoroughly two years ago in Vol. X, No. 5. Why don't you try reading the magazine once in a while? :-)" Or conversely, he might think it interesting enough to post in a BIX discussion. These opinions, annotations, and pointers to related material -- whether in print or on-line -- constitute an enormously powerful virtual conversation among people who may never meet face to face.

I recently got a fax from a fellow at the National Academy of Science and Engineering, which opened: "I am writing this while waiting for a plane at National [airport]." He composed the message -- in response to a journal paper I had sent him -- on a notebook computer equipped with a modem and a cellular phone. He sent the file as e-mail over the Internet to his office computer in D.C., which then looked up my fax number and transmitted the message to my office outside Chicago. His observations on the article were extremely welcome and helpful.

This form of networked conversation is the electronic analog of the proverbial water-cooler discussion in which serious organizational matters often got communicated -- sometimes even resolved. Part of it is trivial banter; but another part is critical information that individuals might never be aware of without this exposure to a larger community of loosely associated potential collaborators. The collective "net" is like some big mind in which each person's knowledge and intelligence forms a single node. And big thoughts are brewing there -- a kind of massive parallelism, but with a recognizably human face. This point is critical; without the social banter and human connections, none of the rest is possible. People don't tend to trade for very long in fixed-field datapoints.

KNOWLEDGE vs INFOGLUT

When faxes become more than sheets of greasy low-resolution paper, disks are crammed full of CD-ROM and online database downloads, and e-mail boxes begin to overflow, where does all this information go? Usually into directory structures on PCs or workstations: mail from Howard into this subdirectory, OCRed faxes on project X into that subdirectory, and so on. These structures represent attempts (usually fairly primitive) to create personal classification systems -- think of the Dewey Decimal System on a very small scale (see Note 2). But since nobody has all the answers, it is becoming increasingly necessary to share this "personal" information with one or more collaborative work teams, members of which may be distributed throughout a large corporation, or belong to entirely different organizations dispersed over several continents (see Note 3).

Initial responses to this problem include the construction of shared network directory structures, which attempt (though they often fail) to reflect some consensus about the categories of information they are designed to organize. A more systematic and comprehensive response is the adoption of concept-oriented information retrieval systems, such as Topic from Verity, Inc. (see info retrieval article). Topic enables the construction of "semantic networks," complex webs of ideas and their interrelationships. You can navigate the concepts along relational linkages until you find something that seems relevant, then ask for all the documents the system includes on that subject.

Because these conceptual maps are independent of the base data -- usually free-text information having no consistent format -- I think of this approach as a kind of "virtual hypertext." It is possible to empty a Topic textbase of its documentary contents, re-index a completely new, but substantively related, set of documents using the same topic nodes and links, and retain the fundamental relationships among the base information objects. This method differs significantly from the explicit embedding of hypertext links directly in the documents themselves.
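
To make the "virtual hypertext" idea concrete, here is a minimal sketch in Python (the topic names, links and documents are invented, and this is not Verity's actual design): a small concept map exists independently of the documents, you navigate its relational linkages from topic to topic, ask for the documents indexed under a topic, and can later empty the textbase and re-index a completely new document set against the very same map.

    # A minimal sketch of concept navigation and "virtual hypertext".
    # Topic names, links and documents are invented for illustration; a real
    # system such as Topic adds weighted evidence and full-text indexing.

    TOPIC_LINKS = {                       # each topic points to related, narrower topics
        "manufacturing": ["process control", "quality"],
        "quality":       ["inspection", "statistical process control"],
    }
    ALL_TOPICS = set(TOPIC_LINKS) | {t for links in TOPIC_LINKS.values() for t in links}

    class Textbase:
        """Documents indexed against a concept map that exists independently of them."""
        def __init__(self):
            self.index = {}               # topic -> titles of documents about it

        def index_documents(self, docs):
            """Replace the documentary contents; the concept map is untouched."""
            self.index = {}
            for title, text in docs.items():
                for topic in ALL_TOPICS:
                    if topic in text.lower():
                        self.index.setdefault(topic, []).append(title)

        def narrower(self, topic):
            """Navigate the relational linkages outward from a topic."""
            return TOPIC_LINKS.get(topic, [])

        def documents_on(self, topic):
            return self.index.get(topic, [])

    tb = Textbase()
    tb.index_documents({"Memo 12": "Notes on last month's inspection and quality audit."})
    print(tb.narrower("manufacturing"))   # ['process control', 'quality']
    print(tb.documents_on("quality"))     # ['Memo 12']

    # Empty the textbase and re-index new, substantively related documents:
    # same topic nodes and links, entirely new contents.
    tb.index_documents({"Field report 3": "Supplier quality problems on line 4."})
    print(tb.documents_on("quality"))     # ['Field report 3']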

However, explicit hypertext links are not an inferior approach by any means. Often, significant relationships cannot be deduced from conceptual proximity alone. Letters, faxes, e-mail, legal contracts, articles, CD-ROM dumps, online database downloads, source code, technical manuals, drawings and diagrams -- a wealth of document types -- can all be intimately related to a single project. Relying on the co-occurrence of certain key words to link these heterogeneous objects might not always work. Hypertext systems enable such objects to be bound together at their exact point of intersection -- a reference to a contract in a letter, for instance.
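
By way of a hedged illustration (the filenames and anchors below are invented, and no particular hypertext product is implied), an explicit link is simply a small record of its own, binding an exact location in one object to an exact location in another, whether or not the two share a single keyword:

    # A sketch of explicit hypertext links binding heterogeneous objects
    # at their exact points of intersection. All names are invented.
    from dataclasses import dataclass

    @dataclass
    class Link:
        source: str         # e.g. a letter, fax or e-mail message
        source_anchor: str  # where in the source the reference occurs
        target: str         # e.g. a contract, drawing or source file
        target_anchor: str  # the exact point of intersection in the target

    links = [
        Link("letter-0314.txt", "paragraph 2", "contract-A117.sgm", "clause 4.3"),
        Link("assembly-notes.txt", "step 7", "fixture-dwg-22.cgm", "detail B"),
    ]

    # Follow every link out of a given document, whatever it points to.
    for link in links:
        if link.source == "letter-0314.txt":
            print(link.source_anchor, "->", link.target, link.target_anchor)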

SOMETHING LIKE A WEB...

Another form -- what I call "structural hypertext" -- takes advantage of the inherent organization of many documents into logical sections, chapters, paragraphs, figures, tables. This allows people seeking information to browse at a high level, then drop down into the precisely relevant details -- that is, if the Table of Contents reflects a well-designed book in the first place. DynaText from Electronic Book Technologies (see EBT article) provides for both types of hypertext, as well as straight Boolean queries against fully indexed document collections.
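
As a rough sketch of the structural approach (the manual below is invented, and this is not DynaText itself), the document's own hierarchy becomes the navigation aid: scan the high-level headings first, descend only into the branch that matters, or run a flat Boolean query across every leaf section:

    # A sketch of structural hypertext: browse a document by its own hierarchy,
    # then drop into the relevant section, or search all leaf sections flat.
    # The manual contents are invented for illustration.

    manual = ("Maintenance Manual", [
        ("Chapter 1: Safety", [
            ("1.1 Lockout procedures", "Disconnect power before servicing the spindle."),
        ]),
        ("Chapter 2: Calibration", [
            ("2.1 Spindle alignment", "Check spindle runout with a dial indicator."),
            ("2.2 Table leveling",    "Shim the table until it is level within 0.01 mm."),
        ]),
    ])

    def table_of_contents(node, depth=0):
        title, body = node
        print("  " * depth + title)
        if isinstance(body, list):            # an inner node: recurse into its children
            for child in body:
                table_of_contents(child, depth + 1)

    def search(node, *terms):
        """Flat Boolean AND query over every leaf section."""
        title, body = node
        if isinstance(body, str):             # a leaf: body is the section text
            if all(t.lower() in body.lower() for t in terms):
                yield title
        else:
            for child in body:
                yield from search(child, *terms)

    table_of_contents(manual)                        # browse at a high level first
    print(list(search(manual, "spindle", "dial")))   # ['2.1 Spindle alignment']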

Many systems that deal with unstructured information run up against a very large barrier early on: proprietary data formats. For instance, all commercial word processing programs implement some form of "markup" strategy by which a writer can indicate centered headings, bold or italic text, indented paragraphs, enumerated lists, and so on. The way this markup is encoded differs for almost every product on the market. The net result, as we all know, is that Brand X word processing software cannot read files created by Brand Y, Brand Y cannot read files created using Brand Z, and so on. One way around this is to work in a text editor that uses only the ASCII character set. The original Unix facilities (vi, nroff, troff) took this approach, and I'm contrarian enough to believe that it is less backward than some may think. However, I seem to be in a small minority that prefers the power of raw Unix to the friendlier, prettier proprietary applications and GUIs.

Keep in mind that our word processing example is just that. The format/markup problem also pertains to graphics, database, spreadsheet, and just about any other kinds of files you can think of. Presenting relevant bits and pieces of this information to workgroups that need it -- when they need it -- has become a major challenge in most organizations. This is especially true in those companies attempting to break down the organizational walls and functional "chimneys" that have traditionally separated people who, today, critically require the benefit of each other's experience.

BATTLING BABEL

One approach to a solution here is software products that integrate information from multiple sources, irrespective of vendor. Microsoft Windows and the Apple Macintosh accomplish this by bumping the proprietary demand up a notch: you can share information as long as all your applications were designed explicitly for these environments. Sharing information between the two types is another story. While Bill Gates' vision of Information at Your Fingertips -- or still farther out, John Sculley's notion of a Knowledge Navigator -- is not a universal reality, vastly improved "transmigration of data" is certainly looming on the near horizon.

In the manufacturing arena, my own company has developed an environment called LINKAGE that serves as a sort of bridge allowing disparate types of information -- from widely distributed network sources and a broad array of proprietary software packages -- to be brought together in a single form on a Unix workstation or X-terminal screen. In addition to textual information of all kinds, these forms can present live vector graphics, raster images, information returned from SQL queries against multiple databases, even near-real-time video images captured with a very fast frame grabber. These forms can be scripted by people directly associated with production processes, even if they possess little or no programming knowledge.

By assembling all of this in one place, these screens can convey the core information required for self-directed work teams to execute, and continuously improve upon, extremely complex manufacturing tasks. Most important, perhaps, this environment is a two-way street, in that it not only delivers design and manufacturing engineering specifications to the shop floor, but also elicits feedback on potential problems and process improvements from people intimately familiar with actual manufacturing operations.

This kind of cross-functional collaboration is at the heart of what is called in some circles "concurrent engineering" -- simultaneously designing products and their associated manufacturing processes with early input from all groups that have relevant knowledge, regardless of where they may be located in the organization, either physically or hierarchically. With experienced people in the loop, an entire company can begin to learn through such systems, and thereby greatly expand its "organizational memory." In this light, the so-called knowledge acquisition bottleneck -- AI's great nemesis -- can be seen as the kind of problem typically generated by fundamentally flawed design principles.

SGML

While LINKAGE relies on a wide variety of data-type translators to integrate this broad bandwidth of manufacturing information, a more general solution is waiting in the wings -- and about to make a serious debut in corporate computing in this country. This solution goes by the somewhat daunting designation of Standard Generalized Markup Language; SGML to its friends. SGML is a well-established ISO (International Organization for Standardization) standard and has been mandated by the U.S. Department of Defense as the markup methodology to be used in all aspects of the far-reaching Computer-aided Acquisition and Logistics Support (CALS) program. It has also been accepted by the Air Transport Association, which represents the airlines, and a growing body of supporters spanning many industries, government agencies and service organizations throughout the world.

Without going into the details here (see the accompanying articles in this issue by Haviland Wright and Louis Reynolds), SGML enables companies to describe all the essential features of their most complex documents without becoming unwilling hostages to the proprietary "standards" imposed by most software vendors. A single document created in, or converted to, SGML format can be output in many different forms on many different types of systems. Attributes such as font size, typeface and method of emphasis (e.g., bold, italics, small caps) can be quickly and globally changed to suit circumstances.

Far more important, the inherent structure of documents -- the logical organization of their constituent elements such as chapters and sections -- can be described (marked up) in such a way that this information is never lost, no matter which target systems the documents ultimately end up on. Such documents can be transmitted in their "source" form over networks or otherwise exchanged with dissimilar systems, and then be "reconstituted" in whatever format is desired. Because source documents are simple ASCII, they can also be indexed and searched with great facility -- and without cumbersome translation -- by a single text-retrieval engine, even though they may have been authored using many dozens of different software products. The only requirement is that any software that creates, converts or interprets these documents must adhere to the same non-proprietary SGML standard. (FastTag and DynaText, featured later in this section, are prime examples.) The payoff that most industries are going after here is universally sharable information.
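
To give the flavor of what "reconstituting" a document means, here is a minimal sketch. The tiny document instance is invented, and it deliberately sticks to an XML-compatible subset of SGML so that Python's standard xml.etree module can parse it; real SGML tools handle DTDs and markup minimization that this toy ignores. The point is only that a single ASCII source, because its structure is explicit, can be rendered in more than one target format:

    # A sketch of the single-source, multiple-output idea behind SGML.
    # The document instance is invented and uses an XML-compatible subset
    # of SGML so the standard library can parse it.
    import xml.etree.ElementTree as ET

    source = """
    <report>
      <title>Line 4 Quality Review</title>
      <section>
        <heading>Findings</heading>
        <para>Spindle runout exceeded tolerance on <emph>three</emph> shifts.</para>
      </section>
    </report>
    """

    doc = ET.fromstring(source)

    def as_plain_text(doc):
        """One rendering: indented ASCII, structure shown purely by layout."""
        title = doc.findtext("title")
        lines = [title.upper(), "=" * len(title)]
        for section in doc.findall("section"):
            lines.append(section.findtext("heading"))
            for para in section.findall("para"):
                lines.append("  " + "".join(para.itertext()))
        return "\n".join(lines)

    def as_troff(doc):
        """Another rendering of the same source: troff ms-style markup."""
        lines = [".TL", doc.findtext("title")]
        for section in doc.findall("section"):
            lines += [".SH", section.findtext("heading")]
            for para in section.findall("para"):
                lines += [".PP", "".join(para.itertext())]
        return "\n".join(lines)

    print(as_plain_text(doc))
    print(as_troff(doc))

Either output could in turn be regenerated with different typefaces or layouts without ever touching the source, which is the practical meaning of keeping structure and presentation separate.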

THE HUMAN ELEMENT

However, there is another, greater payoff for all the technologies outlined above. Though it is extremely hard to nail down and evaluate in terms an accountant might immediately appreciate, it nonetheless promises enormous strategic advantage. This advantage is the ability to create entire organizations capable of learning, both from their hardest-won insights and most egregious errors. Paradoxically, this potential advantage is inextricably linked with the perceived drawbacks of these same technologies: none are particularly easy to implement. Concept-based text retrieval, hypertext, document assembly, SGML -- even CIMLINC's own "applications synthesis" tools -- are simply not plug-and-play sorts of things. All require knowledge they cannot themselves supply. And this is, simultaneously, their best and worst aspect: they make people think.

There seems to be a widespread misconception in American culture that not knowing precisely what you are doing at all times and under all circumstances is a great sin, or at least a serious character flaw. We therefore see organizations chock-a-block with self-appointed experts -- organizations that are rapidly going out of business in the most authoritative and self-assured of styles. Occasionally -- say once a day -- it seems worthwhile to question such certain knowledge, and to wonder deeply whether anyone, starting with oneself, really understands what's going on. Painful perhaps, but incredibly effective. Especially considering the alternatives.

The search for push-button solutions has yielded little of lasting value. Promises of intelligent systems have often been responsible for the perpetration of grotesque stupidities. There is no magical, sacrosanct or immutable knowledge that can be packaged and delivered to the corporate doorstep to protect organizations from their own ignorance. Ignorance results solely from a failure to learn, for which there is no known technological cure. The de-skilling that resulted from much "advanced" automation serves as a good example: learning quickly atrophied when corporations didn't seem to care any longer what workers knew -- or only wanted to know enough to automate their jobs away.

Companies facing the intense pressures of international competition must begin caring again -- and soon -- what everybody knows. They must encourage the open exchange of information by every means possible and forge new corporate cultures based on collegial discourse and collaborative exploration. This isn't easy. Just necessary. Because without such thoughtful discourse and engaged collaboration, innovation will stop. And if innovation stops, so will America.

The tools described in the following articles substantially increase access to a multifaceted constellation of narrative information resources. They can enable valuable old stories to be told through new channels, and vigorous new stories to emerge where only silence has reigned since the Second Industrial Revolution. Listen to what the people are saying; you can't afford not to. For good or ill, a third revolution is in the air.

NOTES

1) John Seely Brown, "Research That Reinvents the Corporation," Harvard Business Review, Jan.-Feb. 1991. For an extended discussion of the same theme, see John Seely Brown and Paul Duguid, "Organizational Learning and Communities-of-Practice: Toward a Unified View of Working, Learning and Innovation," Organization Science, Feb. 1991.

2) The reference here to libraries is no accident. For more on the potential contributions of Library Science, see Christopher Locke, "The Dark Side of Document Image Processing," Byte, April 1991.

3) Ray Grenier and George Metes, Enterprise Networking: Working Together Apart, Digital Press, 1992. This excellent book takes both the social and technological dimensions of networking into account, and describes how they strongly affect each other. See also Lee Sproull and Sara Kiesler, Connections: New Ways of Working in the Networked Organization, MIT Press, 1991.

SIDEBAR

The author has worked for Fujitsu International Engineering, Ricoh Software Research Center and the Fifth Generation Computer Systems project -- the Japanese government's primary artificial intelligence research initiative. He also served as director of corporate communications for two Pittsburgh-based AI firms: Carnegie Group and Intelligent Technology. Before joining CIMLINC in a similar capacity last year, he was director of industrial relations for the Robotics Institute at Carnegie Mellon University.

From June 1-3 in San Francisco, Locke will chair Collaboration '92, a new conference dealing with many of the issues raised in this article. Haviland Wright and Louis Reynolds, whose articles appear later in this section, will also present there. Subtitled "Learning to Team, Teaming to Learn," the conference will feature keynote talks by Michael Schrage of the Los Angeles Times, author of Shared Minds: The New Technologies of Collaboration (Random House, 1990), and Charles Garfield, author of Second to None: How Our Smartest Companies Put People First (Business One Irwin, 1992). Other speakers include John Seely Brown, Director of the Xerox Palo Alto Research Center, Etienne Wenger of the Institute for Research on Learning, and Terry Winograd, Professor of Computer Science at Stanford University.

These and many other speakers from consulting organizations, Fortune 100 companies and the national business media will address the factors influencing the paradigm shift that has taken place in corporate perspectives and priorities in recent years. They will put current organizational change into historical context by examining why previous corporate structures and styles of management are inadequate to present demands, and will describe the advantages accruing to organizations based on human initiative, local autonomy and collaborative learning. These difficult but necessary cultural changes taking place in business organizations can be either nurtured or obstructed by the computer and communications systems adopted today. This intersection of corporate culture and information technology constitutes a primary theme.

In addition, Locke is hosting an electronic version of this conference (collab) on the WELL, an online conferencing system which, though based in Sausalito, CA, connects a virtual community spanning many continents.