Ten Themes

Here is the Visions of Hypertext idea broken down into ten themes:

» 1. Originators: Paul Otlet, Vannevar Bush, Douglas Engelbart, Ted Nelson, and others (Andries van Dam).

Ever since W. Boyd Rayward's explorations of Paul Otlet, the documentalist has regained his place as the founder of the documentation movement, creator of the UDC (with Henri La Fontaine) and the FID, and influential thinker in the field of Information Science. Yet Rayward (1994) also suggested that Paul Otlet was the original hypertext theorist. Surely, many well-known intellectuals put into use or developed the ideas of associative links (inherent in the auxiliary tables of the UDC), collective intelligence (Wells' World Brain and others, of course), and text "chunking" and rearranging (anyone who has used index cards to jot ideas; Walter Benjamin, Will Durant, etc.), but it was Paul Otlet who brought them all together in hopes of forming a repository of information available to all. His ideas touch every hypertext system that has come into being, either implicitly or explicitly (Ted Nelson takes and furthers the idea of chunked text with StretchText).

It's no mystery that Vannevar Bush's article "As We May Think" is the most widely disseminated article regarded as the founding vision of hypertext. The creation of trails of associative links is discussed at length in Randall Trigg's article in the book From Memex to Hypertext: Vannevar Bush and the Mind's Machine, which considers the influence that Bush's ideas have had on the creation of hypertext. Seeing that a whole book is dedicated to the topic, I will not go into it at length here. My notes on the article copy and occasionally explicate and connect thoughts regarding how "As We May Think" relates to hypertext and other current ideas in computing.

Douglas Engelbart is known not just for creating the computer mouse and other physical interface components, but for developing and successfully demoing the oNLine System (NLS), a very early (the first?) hypertext system using links. Engelbart's thoughts were not merely on creating computer systems, but on using technology to better intelligence - this may lead to a greater discussion of how, if, and with what that has been done. His success led to an embrace of the ideas of hypertext in certain circles, and to continued funding for his lab at the Stanford Research Institute (SRI); his work on augmenting human intelligence continues to this day at the Bootstrap Institute.

Many other hypertext systems and people helped bring generalized visions of hypertext into their heyday between 1985 and 1993, before everything was subsumed by the WWW. A closer look at the systems developed at Brown University will be useful in the future for a better sense of the wider historical context. Furthermore, other than Otlet, who is only theoretically related to hypertext development, there is a lack of international perspective/input here. Surely something was going on across the pond dealing with similar ideas; this is another area that should be explored.

» 2. The importance of Networking.

Though the technology of the internet grew up around the same time as hypertext systems were sprouting in various places, the two were not adequately combined. Ted Nelson always envisioned Xanadu as a distributed system, but no working implementation followed, so Xanadu is still "in the works." Other implementations were research projects or commercial products sold on storage media like floppy disks, laser discs, or compact discs. Apple's HyperCard became a hit, achieving unprecedented penetration for a hypertext system because it came bundled with Mac computers. It was with early systems like Gopher, Teletext, and Minitel (see Gillies & Cailliau (2000), esp. "Bits & PCs" (pgs. 91-141)) that the potential of growth through network effects (see Shapiro & Varian (1998) for in-depth discussion of these economic principles) was seen. Now the results of wide networking combined with hypertext are obvious, whereas during the hypertext boom the combination went unexplored. One problem was the scalability of the systems; clients were complex, supporting many link types and particular features. A related problem was the lack of standards - in fact, until the WWW became the de facto standard and the W3C began recommending standards, hypertext was a relatively open market. Though useful in its own right, the hypertext functionality of the WWW as proposed by Tim Berners-Lee (1989) would have missed the point had networking not been central to the design. Still, it must be repeated that probably the most significant contribution of networking to hypertext is how it allowed a system to spread. The subsequent innovations and collective-brain creations all owe themselves to the popularity of the system on which they are implemented. Without the 'network,' the collective brain is not very collective.
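The economics behind this can be illustrated with a toy calculation. Under Metcalfe's often-cited rule of thumb (my illustration, not from Shapiro & Varian directly), a network's potential value grows roughly with the number of possible pairwise connections, so each new user compounds the value of a shared hypertext system far faster than it would a stand-alone product sold on floppy disks. The numbers below are purely illustrative:

```python
# Illustrative sketch of network effects (not from the source text):
# Metcalfe's rule of thumb values a network by its possible pairwise
# connections, which grow as n*(n-1)/2 for n users.

def network_value(n_users: int) -> int:
    """Number of possible pairwise links among n users."""
    return n_users * (n_users - 1) // 2

# Doubling the user base roughly quadruples the potential connections.
print(network_value(1_000))   # 499500
print(network_value(2_000))   # 1999000
```

A stand-alone hypertext system's value, by contrast, grows only linearly with copies sold - which is one way to frame why networked distribution mattered more than any single feature.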

» 3. The primacy of Links.

If there is one feature that separates hypertext from a regular computerized word-processing document, it is the presence of links. Today we have a simplified version of what linking means; due to issues like scalability, time constraints, and standardization, Tim Berners-Lee programmed only one type of link into the WWW: a one-way referential link limited to a single destination. Yet at the outset, numerous link types were present in every hypertext system. One way to open up the field is to note the different options available for links. Though two-way self-updating links, links to more than one destination, or links specifically created to follow a hierarchy sound esoteric in terms of current WWW implementations - or are even hard to envision - when we step out of the WWW paradigm to understand, for example, the workings of the Xanadu or NoteCards systems, different link types are not only possible but somewhat obvious. The emerging link glossary gives some idea of the types of links that have been around.
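To make the contrast concrete, here is a minimal sketch of how a richer link record might be modeled, compared with the WWW's single-destination, one-way anchor. All the names and fields here are my own illustration, loosely inspired by pre-WWW systems, not an actual API from Xanadu or NoteCards:

```python
from dataclasses import dataclass

# Hypothetical sketch of a generalized link record; the field names are
# illustrative, not taken from any real hypertext system.

@dataclass
class Link:
    source: str                      # node/anchor the link starts from
    destinations: list[str]          # more than one destination allowed
    link_type: str = "referential"   # e.g. "referential", "hierarchical"
    bidirectional: bool = False      # two-way links are known at both ends

# The WWW's one-way <a href="..."> collapses to the most restricted case:
www_link = Link(source="page.html#anchor",
                destinations=["other-page.html"])

# A NoteCards- or Xanadu-style link is simply less constrained:
rich_link = Link(source="card-12",
                 destinations=["card-7", "card-33"],
                 link_type="hierarchical",
                 bidirectional=True)
```

Seen this way, the WWW's link is not a different kind of thing from the older systems' links - it is the same structure with every optional capability switched off.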

Another way to look at links is not through their technical characteristics, but through their rhetoric. What are certain links trying to signify by the way they exist? I created a taxonomy of basic WWW links (Kagan, 2008), breaking down links annotating a poem into three categories: informational, wordplay, and emphasis. These categories are specific, in this case, to the author's poem annotations, but the concept that links carry implications far beyond merely technical or navigational conventions is clear. Surely many writers have gone down this road, but the scope of the current work has not yet been expanded to include that. It is, however, something to keep in mind.

» 4. Features of hypertext systems

Many of the features we take for granted, and/or are amazed by, in today's browsers are not, in fact, new innovations; bookmarks, versioning, tabbed browsing (to an extent), history, search (though fundamentally different; see keyword links in the link glossary), and others were essential parts of hypertext systems. Of course, while the WWW has shifted from knowledge management toward an open-ended purpose, other systems, like Glasgow Online, HyperTIES, and the Perseus Project, among others, were built for education, or tourist information, or, as in the case of gIBIS, to solve "wicked" problems. This specialization made them better than any other software at carrying out their purpose, but probably hindered overall acceptance and popularity. Lack of standardization was also a result of makers pushing their own unique features, which could not, without major overhauls, be adopted across the board. A major feature of the original conception of the WWW that is overlooked is that it was supposed to be easy for the average person to write documents for the web; the Amaya browser is an example of what this would look like.

» 5. Functions: Learning and Problem Solving.

Much excitement was generated by the possibility of using hypertext to revolutionize the way students learn. The idea of links began to take on mythic proportions as a method for self-directed study. Students were to be placed at a hypertext system and, in effect, absorb the knowledge available by navigating through the various links in the system. The integrity of this idealistic scene has been studied thoroughly by psychologists and other scientists to determine whether the use of hypertext does indeed increase retention of knowledge. Most of the studies this writer has read have not found a significant improvement in comprehension and retention merely from exposure to hypertext. Certainly, as a learning tool, hypertext systems provide another way for educators to present information to students. The systems allow far greater depth to be added to the material, giving only those who are interested in learning more the opportunity to do so. However, in a closed hypertext system, absent dialectic, there is no way for students to go beyond reading and clicking. Socrates' point that one cannot argue with a book raises a similar concern.

In organizing information to be learned, hypertext does have advantages over regular documents. The ability to layer lessons in a non-hierarchical format allows students to visualize relationships between concepts in a different, non-linear way. Hypertext and hypermedia exhibit properties of new media - to name two, they are easily modified and potentially dynamic in structure. A break from top-to-bottom, left-to-right visual rhetoric lets educators shift how information is displayed, changing it with new developments, as well as encouraging discussion. Several well-known systems were created specifically for the purpose of instruction, and a simple search of the WWW reveals countless hypertext tutorials, delivered as simple documents, as textbooks, or as online exhibits. New ways to educate using hypertext ideas are being developed.

» 6. Functions: Knowledge Management.

Though some systems were designed for specialized purposes, there were also programs like NoteCards, HyperCard, and Guide, and visions like Memex and Xanadu, that were released for the open-ended purpose of knowledge management. Knowledge management here can mean the organization and process of idea-making, writing, and mind-mapping, on a personal or collaborative level, as well as the storage, retrieval, and display of these ideas. Collectively, this subset of hypertext systems should be known as intellectual technology, since its purpose is, as Engelbart and Nelson say, to expand the possibilities of the mind (or memory) for the thinker. It is for this type of software that features like annotation, "trails," and easy read/write capabilities are most important. Today, knowledge management programs have taken several roads - as has been reiterated, the WWW was originally intended as a knowledge management tool at CERN, and expanded into all of the functions here. Eastgate Systems' Tinderbox, FreeMind, and other note-taking, mind-mapping, etc. software all have their roots in hypertext systems developed between 1985 and 1993, and before. Like many of the themes covered, the idea of knowledge management (also expressed in various keywords) draws from many different disciplines, cross-subjects, niches, and ratholes - a full run-down of it is practically impossible.

» 7. Functions: Art.

From the art/humanities world comes much of the theory involving hypertext. It is even fair to say that there is more theorizing about hypertext art (commentary like Steven Johnson's "meta-shows" in Interface Culture (1997)) than there is actual art. From the writer's perspective, the theory is actually more relevant than the art itself, which is often visually ugly and linguistically nonsensical. The bugaboo is that these two characteristics are occasionally considered features of the works. Like much modern art, the digital art landscape is littered with commentators seeing wonders in pedestrian creations - staunchly defending the integrity of the creations they critique. The most widely cited hypertext fiction is Afternoon by Michael Joyce, a multi-dimensional narrative of a man's time. The lauded feature of Afternoon is how the reader becomes the creator of the plot by choosing which links to follow. Arguably the same can be said for the Leisure Suit Larry games, where players click on hot spots leading the irreverent Larry on adventures. The difference is in presentation: visual (LSL) and textual (Afternoon) - a difference bridged in Shelley Jackson's Patchwork Girl. Likely, Michael Joyce and Shelley Jackson take their work more seriously than they would LSL. Eastgate Systems' Reading Room has links to hypertext fictions. Storyspace is a program created by Eastgate Systems to write hypertext fiction, though the WWW has made the actual creation easier if the author can keep track of his own link diagram. George P. Landow's books on hypertext (1992, 1997, 2006; to be annotated) go deep into the theoretical issues brought up by hypertext art.

» 8. Shifting Interfaces: Browsers.

The idea of the "browser," or interface, of hypertext systems changed dramatically as the WWW grew. Before the Mosaic and Viola browsers, interfaces were more like current desktop systems; there would be multiple windows inside the main window of the program. Using that type of interface, it was easier for users to see the layout of information in a more complete way than in the current one-site-at-a-time approach. One must remember, however, that the "nodes" users were looking at in hypertext systems were not necessarily long web pages; many systems were designed to pop up short chunks, or notecards, of information. HyperCard was designed on this premise, that "stuff" would be written on little cards. The Xanadu system differs in that it was supposed to have a more free-flowing interface, though in every demonstration the writer has seen, there are two monolithic documents parallel to each other. The type of interface seen in early hypertext systems is best mirrored by several of today's browsers open on a dual-monitor set-up - if only those windows were made to interact with each other, the environment would be similar. The contention, then, is that "browsing" took a step back when Mosaic became the de facto WWW browser in 1993-1994, followed by the similar Internet Explorer and its peers, reducing the information-scape to one site at a time. Despite the great popularity of the WWW and the wealth of information it serves, many of the good interface features of past hypertext systems were ignored and lost - among them, expansive front-ends (no pun intended).

» 9. Tim Berners-Lee and the World Wide Web.

Envisioning relationships between people at CERN as a web, Tim Berners-Lee created a program called ENQUIRE in the early 1980s to keep track of information about his friends and the projects they were working on. The program was not networked and had limitations, but the idea became the inspiration for the WWW. Relatively simple compared to other hypertext systems and visions (Xanadu), the WWW is basically the HTTP protocol, the HTML language, and browser software. This simplicity, as well as the price (free), gave the WWW a boost over other protocols (Gopher) designed around the same time. Though in itself the WWW does not mean much, its network effects are staggering; the more web pages there are, the more useful the system becomes. Unlike non-networked hypertext systems, each of which may represent one site (since many used the same underlying software to create different learning modules), the WWW allows contents to be accessed by all. Admittedly, the WWW is now bigger than we can encompass in a mere Google search; this leads to questions of control, access, and authority, among others. The W3C is the standards body for the WWW. It is currently working on an "upgrade" for the WWW called the Semantic Web, which attempts to contextualize information on websites, allowing for more efficient metadata generation and, hopefully, better retrieval.
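The Semantic Web's core move - expressing page content as machine-readable statements rather than free text - can be sketched as subject-predicate-object triples. The toy store below is my own illustration of the idea, not actual RDF syntax or tooling; the predicate names are invented:

```python
# Toy triple store illustrating the Semantic Web idea of contextualized
# metadata: facts become queryable statements rather than free-text prose.
# Predicate vocabulary here is invented for illustration.

triples = [
    ("As We May Think", "writtenBy", "Vannevar Bush"),
    ("As We May Think", "publishedIn", "1945"),
    ("NLS", "demonstratedBy", "Douglas Engelbart"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the fields that were fixed."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "Who wrote 'As We May Think'?"
print(query(subject="As We May Think", predicate="writtenBy"))
# [('As We May Think', 'writtenBy', 'Vannevar Bush')]
```

A search engine over plain HTML can only match strings; over statements like these, it can answer structured questions - which is the efficiency gain the Semantic Web effort is after.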

» 10. Hypertext in the Future.

Though the idea of the WWW seems simple, web-page creation is far from that, even for novice computer users. Web 2.0 is supposed to make all of this interaction easier for everyone, but does it? Looking at the whole picture, we must consider that people's interactions with hypertext documents are completely controlled by the programmer who created the infrastructure. While most internet users can easily post a comment, they can only get their own Web 2.0 concoction by using code and design templates invented by others. While HTML, and to a certain extent CSS, are relatively easy to learn, the languages that create innovative hypertext these days have a high barrier to entry. PHP, Flash, and JavaScript are all too advanced for most people to pick up in an afternoon. This means that while new web technologies are allowing more interaction within hypertexts (I'm thinking of complicated applications like Google Docs or ThinkFree Office), these are basically black boxes. In thinking about the future of hypertext, which is currently inextricably tied to the WWW, we must consider how we view the "openness" of our capabilities. It is these capabilities - the traits that help us think - that were envisioned in early hypertext systems and must live on in the future.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License