Borders, in partnership with Lulu.com, has launched a comprehensive personal publishing platform, enabling anyone to design and publish their own (print) book and have it distributed throughout the Borders physical and online retail chain. Beyond the basic self-publishing tools, authors can opt for a number of service packages: simple ISBN registration ($49), the basic package ($299), in which someone designs and formats your book for you, and the premium package ($499), which adds an “editorial evaluation” to all of the above. According to the demo, you can even pay to have your own book tour and readings in actual Borders stores, bringing vanity publishing to a whole new level of fantasy role-playing. Writing and publishing, as the Borders site proclaims in its top banner, is now a “lifestyle.”
A side thought. It’s curious how “vanity publishing” as a cultural category seems to have a very clear relationship with the print book but a far more ambiguous one with the digital. Of course the Web as a whole could be viewed as one enormous vanity press, overflowing with amateur publishers and self-appointed authors, yet for some reason the vanity label is seldom applied – though a range of other, comparable disparagements (“cult of the amateur,” “the electronic mob,” etc.) sometimes are. But these new labels tend to be issued in a reactionary way, not with the confident, sneering self-satisfaction that has usually accompanied noses snobbishly upturned at the self-published.
In the realm of print, there is (or traditionally has been) something vain, pretentious, even delusional, in the laying out of cash to simulate a kind of publication that is normally granted, by the forces of economics and cultural arbitration, to a talented or lucky few. Of course, so-called vanity publishing can also come from a pure impulse to get something out into the world that no one is willing to pay for, but generally speaking, it is something we’ve looked down on. Blogs, MySpace, personal web pages and the like arise out of a different set of socio-economic conditions. The barriers to publication are incredibly low (digital divide notwithstanding), and so authorship online is perceived differently than in print, even if it still arises out of the same basic need to communicate. It feels more like simply taking part in a conversation, participating in a commons. One is not immediately suspicious of the author’s credibility in quite the same way as when the self-financed publication is in print.
This is not to suggest that veracity, trust and quality control are no longer concerns on the Web. Quite the contrary. In fact we must develop better and more sensitive instruments of bullshit detection than ever before to navigate a landscape that lacks the comfortingly comprehensive systems of filtering and quality control that the publishing industry traditionally provided. But “vanity publishing” as a damning label, designed to consign certain types of books to a fixed cultural underclass, loses much of its withering power online. Electronic authorship comes with the possibility of social mobility. What starts as a vanity operation can, with time, become legitimized and respected through complex social processes that we are only beginning to be able to track. Self-publishing is simply a convenient starter mechanism, not a last resort for the excluded.
And with services like Lulu and the new Borders program, we’re seeing some of that social mobility reflected back onto print. New affordances of digital production and the flexibility of print on demand have radically lowered the barriers to publishing in print as well as in bits, and so what was once dismissed categorically as vanity is now transforming into a complex topography of niche markets where unmet readerly demands can finally be satisfied by hitherto untapped authorial supplies.
All the world’s a vanity press and we have to learn to make sense of what it produces.
book machine
Philip M. Parker, a professor at Insead, the international business school based in Fontainebleau, France, has written 85,000 books and counting. He’s like a machine. In fact, he has a machine that writes them for him. The Guardian has more.
Most, if not all, of these books can be found on Amazon. Sifting through them felt like a bad riff on “The Library of Babel.” I felt like I’d stumbled upon a weird new form of bibliographic spam – thousands of machine-generated titles gumming up the works, jamming the signal, eroding the utility of the library. Matt Kirschenbaum, who forwarded the link, said it recalled the book machines in Italo Calvino’s great meta-novel, If On A Winter’s Night a Traveler:
He has you taken into the machine room. “Allow me to introduce our programmer, Sheila.”
Before you, in a white smock buttoned up to the neck, you see Corinna-Gertrude-Alfonsina, who is tending a battery of smooth metallic appliances, like dishwashers. “These are the memory units that have stored the whole text of Around an empty grave. The terminal is a printing apparatus that, as you see, can reproduce the novel word for word from the beginning to the end,” the officer says. A long sheet unrolls from a kind of typewriter which, with machine-gun speed, is covering it with cold capital letters.
Prices are often absurdly inflated, up to many hundreds of dollars. While you can’t peek inside any of the books on Amazon, the product descriptions read like prose recycled from free government business or health leaflets (stuff that usually feels like it was written by a machine anyway). There seem to be a few dozen tropes which are repeated with slight variations ad nauseam. A few sample titles:
– The 2007 Report on Wood Toilet Seats: World Market Segmentation by City (330pp., $795)
– The 2007-2012 Outlook for Lemon-Flavored Bottled Water in Japan (140pp., $495)
– Avocados: A Medical Dictionary, Bibliography, and Annotated Research Guide (108pp., $28.95)
– Brain Injuries – A Medical Dictionary, Bibliography, and Annotated Research Guide to Internet References (244pp., $28.95)
In fact, there’s a whole trope of titles that are guides to “internet references,” which makes me wonder if Parker’s machine is just scraping the entire Web for content.
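The sort of mechanism being speculated about here can be sketched in a few lines. To be clear, Parker’s actual system is proprietary and its workings unknown; the templates and fill-in values below are invented for illustration. The point is simply how a handful of tropes, crossed with long lists of products and regions, multiplies out into tens of thousands of titles:

```python
import itertools

# Hypothetical title templates, modeled loosely on the samples above.
TEMPLATES = [
    "The {year} Report on {product}: World Market Segmentation by City",
    "The {start}-{end} Outlook for {product} in {region}",
    "{product}: A Medical Dictionary, Bibliography, and Annotated Research Guide",
]

def generate_titles(products, regions, year=2007):
    """Cross every template with every product/region combination."""
    titles = []
    for template in TEMPLATES:
        for product, region in itertools.product(products, regions):
            titles.append(template.format(
                year=year, start=year, end=year + 5,
                product=product, region=region,
            ))
    return titles

titles = generate_titles(
    ["Wood Toilet Seats", "Lemon-Flavored Bottled Water"],
    ["Japan", "India"],
)
print(len(titles))  # 3 templates x 2 products x 2 regions = 12
```

Scale the product and region lists up to a few thousand entries each and 85,000 books no longer seems remarkable; the scarce ingredient is not writing but data to pour into the slots.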
Odd.
developing books in networked communities: a conversation with don waters
Two weeks ago, when the blog-based peer review of Noah Wardrip-Fruin’s Expressive Processing began on Grand Text Auto, Bob sent a note about the project to Don Waters, the program officer for scholarly communications at the Andrew W. Mellon Foundation – someone very much at the forefront of developments in the digital publishing arena. He wrote back intrigued but slightly puzzled as to the goals, scope and definitions of the experiment. We forwarded the note to Noah and to Doug Sery, Noah’s editor at MIT Press, and decided that each of us would write some clarifying responses from our different perspectives: book author/blogger (Noah), book editor (Doug), and web editor (myself). The result is an interesting exchange about networked publishing and a useful meta-document about the project. As our various responses, and Don’s subsequent reply, help to articulate, playing with new forms of peer review is only one aspect of this experiment, and maybe not even the most interesting one. The exchange is reproduced below (a couple of names mentioned have been made anonymous).
Don Waters (Mellon Foundation):
Thanks, Bob. This is a very interesting idea. In reading through the materials, however, I did not really understand how, if at all, this “experiment” would affect MIT Press behavior. What are the hypotheses being tested in that regard? I can see, from one perspective, that this “experiment” would result purely in more work for everyone. The author would get the benefit of the “crowd” commenting on his work, and revise accordingly, and then the Press would still send the final product out for peer review and copy editing prior to final publication.
Don
Ben Vershbow (Institute for the Future of the Book):
There are a number of things we set out to learn here. First, can an open, Web-based review process make a book better? Given the inherently inter-disciplinary nature of Noah’s book, and the diversity of the Grand Text Auto readership, it seems fairly likely that exposing the manuscript to a broader range of critical first-responders will bring new things to light and help Noah to hone his argument. As can be seen in his recap of discussions around the first chapter, there have already been a number of incisive critiques that will almost certainly impact subsequent revisions.
Second, how can we use available web technologies to build community around a book, or to bring existing communities into a book’s orbit? “Books are social vectors, but publishers have been slow to see it,” writes Ursula K. Le Guin in a provocative essay in the latest issue of Harper’s. For the past three years, the Institute for the Future of the Book’s mission has been to push beyond the comfort zone of traditional publishers, exploring the potential of networked technologies to enlarge the social dimensions of books. By building a highly interactive Web component to a text, where the author and his closest peers are present and actively engaged, and where the entire text is accessible with mechanisms for feedback and discussion, we believe the book will occupy a more lively and relevant place in the intellectual ecology of the Internet and probably do better overall in the offline arena as well.
The print book may have some life left in it yet, but it now functions within a larger networked commons. To deny this could prove fatal for publishers in the long run. Print books today need dynamic windows into the Web and publishers need to start experimenting with the different forms those windows could take or else retreat further into marginality. Having direct contact with the author – being part of the making of the book – is a compelling prospect for the book’s core audience and their enthusiasm is likely to spread. Certainly, it’s too early to make a definitive assessment about the efficacy of this Web outreach strategy, but initial indicators are very positive. Looked at one way, it certainly does create more work for everyone, but this is work that has to be done. At the bare minimum, we are building marketing networks and generating general excitement about the book. Already, the book has received a great deal of attention around the blogosphere, not just because of its novelty as a publishing experiment, but out of genuine interest in the subject matter and author. I would say that this is effort well spent.
It’s important to note that, despite CHE’s lovely but slightly sensational coverage of this experiment as a kind of mortal combat between traditional blind peer review and the new blog-based approach, we view the two review processes as complementary, not competitive. At the end, we plan to compare the different sorts of feedback the two processes generate. Our instinct is that it will suggest hybrid models rather than a wholesale replacement of one system with another.
That being said, our instincts tell us that open blog-based review (or other related forms) will become increasingly common practice among the next generation of academic writers in the humanities. The question for publishers is how best to engage with, and ideally incorporate, these new practices. Already, we see a thriving culture of pre-publication peer review in the sciences, and major publishers such as Nature are beginning to build robust online community infrastructures so as to host these kinds of interactions within their own virtual walls. Humanities publishers should be thinking along the same lines, and partnerships with respected blogging communities like GTxA are a good way to start experimenting. In a way, the MIT-GTxA collab represents an interface not just of two ideas of peer review but between two kinds of publishing imprints. Both have built a trusted name and become known for a particular editorial vision in their respective (and overlapping) communities. Each excels in a different sort of publishing, one print-based, the other online community-based. Together they are greater than the sum of their parts and suggest a new idea of publishing that treats books as extended processes rather than products. MIT may regard this as an interesting but not terribly significant side project for now, but it could end up having a greater impact on the press (and hopefully on other presses) than they expect.
All the best,
Ben
Noah Wardrip-Fruin (author, UC San Diego):
Hi Bob –
Yesterday I went to meet some people at a game company. There’s a lot of expertise there – and actually quite a bit of reflection on what they’re doing, how to think about it, and so on. But they don’t participate in academic peer review. They don’t even read academic books. But they do read blogs, and sometimes comment on them, and I was pleased to hear that there are some Grand Text Auto readers there.
If they comment on the Expressive Processing manuscript, it will create more work for me in one sense. I’ll have to think about what they say, perhaps respond, and perhaps have to revise my text. But, from my perspective, this work is far outweighed by the potential benefits: making a better book, deepening my thinking, and broadening the group that feels academic writing and publishing is potentially relevant to them.
What makes this an experiment, from my point of view, is the opportunity to also compare what I learn from the blog-based peer review to what I learn from the traditional peer review. However, this will only be one data point. We’ll need to do a number of these, all using blogs that are already read by the audience we hope will participate in the peer review. When we have enough data points perhaps we’ll start to be able to answer some interesting questions. For example, is this form of review more useful in some cases than others? Is the feedback from the two types of review generally overlapping or divergent? Hopefully we’ll learn some lessons that presses like MITP can put into practice – suggesting blog-based review when it is most appropriate, for example. With those lessons learned, it will be time to design the next experiment.
Best,
Noah
Doug Sery (MIT Press):
Hi Bob,
I know Don’s work in digital libraries and preservation, so I’m not surprised at the questions. While I don’t know the breadth of the discussions Noah and Ben had around this project, I do know that Noah and I approached this in a very casual manner. Noah has expressed his interest in “open communication” any number of times and when he mentioned that he’d like to “crowd-source” “Expressive Processing” on Grand Text Auto I agreed to it with little hesitation, so I’m not sure I’d call it an experiment. There are no metrics in place to determine whether this will affect sales or produce a better book. I don’t see this affecting the way The MIT Press will approach his book or publishing in general, at least for the time being.
This is not competing with the traditional academic press peer review, although the CHE article would lead the reader to believe otherwise (Jeff obviously knows how to generate interest in a topic, which is fine, but even a game studies scholar, in a conversation I had with him today, laughingly called the headline “tabloidesque”). While Noah is posting chapters on his blog, I’m having the first draft peer-reviewed. After the peer reviews come in, Noah and I will sit down to discuss them to see if any revisions to the manuscript need to be made. I don’t plan on going over the GTxA comments with Noah, unless I happen to see something that piques my interest, so I don’t see any additional work having to be done on the part of MITP. It’s a nice way for Noah to engage with the potential audience for his ideas, which I think is his primary goal for all of this. So, I’m thinking of this more as an exercise to see what kind of interest people have in these new tools and/or mechanisms. Hopefully, it will be a learning experience that MITP can use as we explore new models of publishing.
Hope this helps and that all’s well.
Best,
Doug
Don Waters:
Thanks, Bob (and friends) for this helpful and informative feedback.
As I understand the explanations, there is a sense in which the experiment is not aimed at “peer review” at all in the sense that peer review assesses the qualities of a work to help the publisher determine whether or not to publish it. What the exposure of the work-in-progress to the community does, besides the extremely useful community-building activity, is provide a mechanism for a function that is now all but lost in scholarly publishing, namely “developmental editing.” It is a side benefit of current peer review practice that an author gets some feedback on the work that might improve it, but what really helps an author is close, careful reading by friends who offer substantive criticism and editorial comments. Most accomplished authors seek out such feedback in a variety of informal ways, such as sending out manuscripts in various stages of completion to their colleagues and friends. The software that facilitates annotation and the use of the network, as demonstrated in this experiment, promise to extend this informal practice to authors more generally. I may have the distinction between peer review and developmental editing wrong, or you all may view the distinction as mere quibbling, but I think it helps explain why CHE got it so wrong in reporting the experiment as a struggle between peer review and the blog-based approach. Two very different functions are being served, and as you all point out, these are complementary rather than competing functions.
I am very intrigued by the suggestions that scholarly presses need to engage in this approach more generally, and am eagerly learning from this and related experiments, such as those at Nature and elsewhere, more about the potential benefits of this kind of approach.
Great work and many thanks for the wonderful (and kind) responses.
Best,
Don
“books are social vectors”
Some choice quotes from Ursula K. Le Guin’s terrific new Harper’s essay, “Staying Awake: Notes on the alleged decline of reading” (unfortunately behind a paywall):
Books are social vectors, but publishers have been slow to see it. They barely even noticed book clubs until Oprah goosed them. But then the stupidity of the contemporary, corporation-owned publishing company is fathomless: they think they can sell books as commodities.
…I keep hoping the corporations will wake up and realize that publishing is not, in fact, a normal business with a nice healthy relationship to capitalism. Elements of publishing are, or can be forced to be, successfully capitalistic: the textbook industry is all too clear a proof of that. How-to books and the like have some market predictability. But inevitably some of what publishers publish is, or is partly, literature – art. And the relationship of art to capitalism is, to put it mildly, vexed. It has not been a happy marriage.
expressive processing meta
To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you’ll see, there’s already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here’s an intro to the project.
amazon reviewer no. 7 and the ambiguities of web 2.0
Slate takes a look at Grady Harp, Amazon’s no. 7-ranked book reviewer, and finds the amateur-driven literary culture there to be a much grayer area than expected:
Absent the institutional standards that govern (however notionally) professional journalists, Web 2.0 stakes its credibility on the transparency of users’ motives and their freedom from top-down interference. Amazon, for example, describes its Top Reviewers as “clear-eyed critics [who] provide their fellow shoppers with helpful, honest, tell-it-like-it-is product information.” But beneath the just-us-folks rhetoric lurks an unresolved tension between transparency and opacity; in this respect, Amazon exemplifies the ambiguities of Web 2.0. The Top 10 List promises interactivity – “How do I become a Top Reviewer?” – yet Amazon guards its rankings algorithms closely…. As in any numbers game (tax returns, elections) opacity abets manipulation.
expressive processing: an experiment in blog-based peer review
An exciting new experiment begins today, one which ties together many of the threads begun in our earlier “networked book” projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, and an open peer review process – and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which for many readers here, probably needs no further introduction. The author is Noah Wardrip-Fruin, a professor of communication at UC San Diego, a writer/maker of digital fictions, and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation – the machines and software we use, the systems and processes we must learn and employ in order to make media – and how these change how and what we create. It’s an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks, how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly inter-disciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out some time next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah’s leading peers in the new media field, but also a slew of non-academic experts – writers, digital media makers, artists, gamers, game designers etc. – who provide crucial alternative perspectives and valuable hands-on knowledge that can’t be gotten through more formal channels. Noah:
Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences — and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that’s just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I’ve seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history), the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere….I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.
Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it) and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The challenge technically was to integrate CommentPress into an existing blog template, applying its functionality selectively – in other words, to make it work for a specific group of posts rather than for all content in the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and technical integration of the review process were inseparable. I’ve since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notions of Noah’s book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal. You can either have a whole site working with paragraph-level commenting, or not at all. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we’ve done – with the help of a talented WordPress developer named Mark Edwards, and invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog) – is make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a new flexibility that it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we’re all ears…
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press – one of the most distinguished and most innovative in the world – toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don’t think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the older order will die off to be sure, but other parts of it will adapt and combine with the new in interesting ways. What’s particularly compelling about this present experiment is that it has the potential to be (perhaps now or perhaps only in retrospect, further down the line) one of these important hybrid moments – a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes leveled on publishing by the Internet, and the ones that get by far the most attention, are in the area of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest to pin down yet most essential areas of publishing – the territory of editorial instinct, reputation, identity, trust, taste, community… These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well – to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, that transcends technological upheaval – you might say the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing, designing those conversations may well become the main work of publishing and of editors.
I could go on, but better to hold off on further speculation and to just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100,000 words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated and the whole thing bundled together into some sort of package for MIT. We’re still figuring out how that part will work. Please go over and take a look and if a thought is provoked, join the discussion.
nominate the best tech writing of 2007
digitalculturebooks, a collaborative imprint of the University of Michigan press and library, publishes an annual anthology of the year’s best technology writing. The nominating process is open to the public and they’re giving people until January 31st to suggest exemplary articles on “any and every technology topic–biotech, information technology, gadgetry, tech policy, Silicon Valley, and software engineering” etc.
The 2007 collection is being edited by Clive Thompson; last year’s was edited by Steven Levy. When complete, the collection is published as a trade paperback and put online in its entirety in clean, fully searchable HTML, so head over and help build what will become a terrific open access resource.
the year of reading dangerously
2008 is going well so far for the Institute in London. I was invited to 10 Downing Street this morning for the launch of the National Year of Reading, which takes place in 2008, as one of a small group including literacy promoters, librarians, teachers, schoolchildren, authors and Richard Madeley – the presenter who, with his partner Judy, has become the British equivalent of Oprah, hosting a hugely influential TV book group that helps the trade sell stacks of the titles it recommends. Prime Minister Gordon Brown has had a rough few months since taking over from Blair, but was at his best today – he’s a genuine enthusiast for reading.
One topic for discussion was the importance of fathers reading to their children, and in particular to their sons. There are so many opportunities for new media here to help reach out to those who don’t think of themselves as ‘book people’.
Ten years ago the first Year of Reading kicked off a lot of activities and alliances which have thrived since, but I don’t remember anyone giving much attention to the internet – except as a place to download resources from. So I was delighted to be there this time representing the Institute, and able to make the point at the outset that any promotion of the importance of literacy skills, reading appetite and the pleasure of literature must recognise the cultural importance of the networked screen and the interconnectedness of different media in the minds of young people and the lives of us all, even those who don’t acknowledge this. Well, I kind of made that point…briefly and perhaps not so clearly. Anyway, I was there and got to speak up for if:book. The year has a different theme each month, ending with the Future of Reading in December, so we are planning all kinds of activities to link with that. Watch this space.
if all the sky was paper
Perhaps the only blog featuring a tag cloud in which ‘Assistant Post Mistress’ looms large, The Travelling Bookbinder in Antarctica somehow seems suitably festive reading for the online book lover.
Book artist and travelling bookbinder Rachel Hazell is currently working and living in the world’s most famous post office at Port Lockroy, Antarctica. For six white and blue months, Rachel will be working as Assistant Post Mistress and Penguin Monitor. Entries so far include ‘thinking of poetry and missing it’ and ‘thinking about Ali Smith’s enthusiasm for the spare and simple’. Oh, and there are instructions on fashioning a book from a sheet of snow-white A4.