thinking about tex

Chances are that unless you’re a mathematician or a physicist you don’t know anything about TeX. TeX is a computerized typesetting system begun in the late 1970s; since the 1980s, it’s been the standard way in which papers in the hard sciences are written on computers. TeX preceded PostScript, PDF, HTML, and XML, though it has connections to all of them. In general, though, people who aren’t hard scientists don’t tend to think about TeX any more; designers tend to think of it as a weird dead end. But TeX isn’t simply computer history: it’s worth thinking about in terms of how attitudes towards design and process have changed over time. In a sense, it’s one of the most serious long-term efforts to think about how we represent language on computers.
TeX famously began its life as a distraction. In 1977, Donald Knuth, a computer scientist, wanted a better way to typeset his mammoth book about computer science, The Art of Computer Programming, a book which he is still in the process of writing. The results of phototypesetting a book heavy on math weren’t particularly attractive; Knuth reasoned that he could write a program that would do a better job of it. He set himself to learning how typesetting worked; in the end, what he constructed was a markup language that authors could write in and a program that would translate the markup language into files representing finished pages that could be sent to any printer. Because Knuth didn’t do anything halfway, he created this in his own programming language, WEB; he also made a system for creating fonts (a related program called Metafont) and his own fonts. There was also his concept of literate programming. All of it’s open source. In short, it’s the consummate computer science project. Since its initial release, TeX has been further refined, but it’s remained remarkably stable. Specialized versions have been spun off of the main program; some of the most prominent are LaTeX, for basic paper writing; XeTeX, for non-Roman scripts; and ConTeXt, for more complex page design.
How does TeX work? Basically, the author writes entirely in plain text, working with a markup language, where bits of code starting with a backslash (“\”) are interspersed with the content. In LaTeX, for example, this:

\frac{1}{4}

results in something like this:

[image: the fraction 1/4, typeset with the 1 stacked over the 4]

The “1” is the numerator of the fraction; the “4” is the denominator, and the \frac tells LaTeX to make a top-over-bottom fraction. Commands like this exist for just about every piece of mathematical notation; if commands don’t exist, they can be created. Non-math text works the same way: you type in paragraphs separated by blank lines and the TeX engine figures out how they’ll look best. The same system is used for larger document structures: the \documentstyle{} command tells LaTeX what kind of document you’re making (a book, paper, or letter, for example). \chapter{chapter title} makes a new chapter titled “chapter title”. Style and appearance are left entirely to the program: the author just tells the computer what the content is. If you know what you’re doing, you could write an entire book without leaving your text editor. While TeX is its own language, it’s not much more complicated than HTML and generally makes more sense; you can learn the basics in an afternoon.
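For a sense of what a whole document looks like, here’s a minimal sketch of my own (it uses \documentclass, the newer name for the \documentstyle command mentioned above; the chapter title and text are just placeholders):

\documentclass{book}
\begin{document}
\chapter{chapter title}
A paragraph of ordinary text, with a fraction set inline: $\frac{1}{4}$.
\end{document}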
To an extent, this is a technologically determined system. It’s based around the supposition of limited computing power: you write everything, then dump it into TeX, which grinds away while you go out for coffee, or lunch, or home for the night. Modern TeX systems are more user-friendly: you can type in some content and press the “Typeset” button to generate a PDF that can be inspected for problems. But the underlying concept is the same: TeX is document design that works in exactly the same way as a compiler for computer programs. Or, to go further back, the design process is much the same as metal typesetting: a manuscript is finished, then handed over to a typesetter.
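(For the curious, the compile step looks something like the line below on a typical modern TeX installation – the filename is just an example – and a finished PDF comes out the other end.)

pdflatex mybook.tex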
This is why, I think, there’s the general conception among designers that TeX is a sluggish backwater. While TeX slowly perfected itself, the philosophy of WYSIWYG – What You See Is What You Get – triumphed. The advent of desktop publishing – programs like QuarkXPress and PageMaker – gave anyone with a computer the ability to change the layouts of text and images. (Publishing workflows often keep authors from working directly in page layout programs, but this is not an imposition of the software. While it was certainly a pain to edit text in old versions of PageMaker, there’s little difference between editing text in Word and InDesign.) There’s an immediate payoff to a WYSIWYG system: you make a change and it’s immediately reflected in what you see on the screen. You can try things out. Designers tend to like this, though it has to be said that this working method has enabled a lot of very bad design.
(There’s no shortage of arguments against WYSIWYG design – the TeX community is nothing if not evangelical – but outside the world of scientific writing, it’s an uphill battle. TeX seems likely to hold its ground in the hard sciences. There, perhaps, the separation between form and content is more absolute than elsewhere. Mathematics is an extremely precise visual embodiment of language; created by a computer scientist, TeX understands this.)
As print design has changed from a TeX-like model to WYSIWYG, so has screen design, where the same dichotomy – code vs. appearance – operates. Not that document design hasn’t taken something from TeX: TeX’s separation of content from style becomes a basic principle in XML, but where TeX has a specific task, XML generalizes. But in general, the idea of leaving style to those who specialize in it (the designer, in the case of print book design; the typesetting engine, in the case of TeX) is an idea that’s disappearing. The specialist is being replaced by the generalist. The marketer putting bullet points in his PowerPoint probably formats it himself, though it’s unlikely he’s been trained to do so. The blogger inadvertently creates collages by mixing words and images. Design has been radically decentralized.
The result of this has been mixed: there’s more bad design than ever before, simply because there’s more design. Gramsci comes uncomfortably to mind: “The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.” If TeX comes from an old order, it’s an order with admirable clarity of purpose. Thinking about TeX points out an obvious failure in the educational system that creates the current environment: while most of us are taught, in some form or another, to write, very few of us are taught to visually present information, and it’s not clear that a need for this is generally perceived. If we’re all to be designers, we all have to think like designers.