Category Archives: publishing

e-read all about it

An article in Publishing News this week suggests that UK publishers are bracing themselves for the imminent arrival on these shores of the Kindle or a rival to it. Much discussion of e-royalties is going on; HarperCollins and Random House US are putting some whole works online for free; meanwhile Francis Bennett, the consultant who has been gazing into the crystal ball on digitisation for the book trade, admits to being “baffled by Amazon – they never do what you expect them to.”
Consultant (and ex-Penguin boss) Anthony Forbes Watson is more definite (maybe): “The competition will be between the best of the closed networks. Perhaps Amazon will rope in Abebooks. Perhaps Barnes & Noble will join up with a partner to combat Amazon, perhaps Amazon will develop something with Apple. But I don’t think the market will be that big. I’d be surprised if it goes above 3%, or 10% tops.”
Well, nothing to worry about there then. Meanwhile we’ve been talking to friends in the booktrade who point out how little publishers will do for their huge slice of the cake these digital days, once printing and physical distribution are out of the picture. Do the e-royalties being offered reflect these changes? Do they hell.

danah boyd’s closed journal boycott

I meant to blog this earlier but it’s still quite relevant, especially in light of other recent activity on the open access front. Last week, danah boyd announced that henceforth she would only publish in open access journals, and urged others – especially tenured faculty, who are secure in their status and have little to lose – to do the same.

I’d be sad to see some of the academic publishers go, but if they can’t evolve to figure out new market options, I have no interest in supporting their silencing practices. I think that scholars have a responsibility to make their work available as a public good. I believe that scholars should be valued for publishing influential material that can be consumed by anyone who might find it relevant to their interests. I believe that the product of our labor should be a public good. I do not believe that scholars should be encouraged to follow stupid rules for the sake of maintaining norms. Given that we do the bulk of the labor behind journals, I think that we can do it without academic publishers…

harvard faculty votes overwhelmingly for open access

The Harvard Crimson:

The motion, which passed easily at yesterday’s Faculty meeting, grants Harvard a non-exclusive copyright over all articles produced by any current Faculty member, allowing for the creation of an online repository that would be “available to other services such as web harvesters, Google Scholar, and the like.”
…English professor Stephen Greenblatt, the editor of what he described as a journal with “a decent reputation and a quite anemic subscription base,” advocated for the motion because he doubted it would accelerate the death of his journal, and because he said he was worried about the currently high cost of many monographs.
“This is one of the only ways we can break the backs of the monopolists who are currently seriously damaging our fields,” he said.

New York Times:

“The chorus of ‘yeas’ was thunderous,” Robert Darnton, the director of the University Library, wrote in an e-mail message. “I hope this marks a turning point in the way communications operate in the world of scholarship.”

harvard faculty cast vote on open access

The U.S. presidential primaries in Virginia, Maryland and D.C. are not the only votes to watch today. The New York Times reports that arts and sciences faculty at Harvard are weighing in today on a proposed measure that would make all scholarly articles available in a free open access repository run by the library immediately following publication.

“In place of a closed, privileged and costly system, it will help open up the world of learning to everyone who wants to learn,” said Robert Darnton, director of the university library. “It will be a first step toward freeing scholarship from the stranglehold of commercial publishers by making it freely available on our own university repository.”
Under the proposal Harvard would deposit finished papers in an open-access repository run by the library that would instantly make them available on the Internet. Authors would still retain their copyright and could publish anywhere they pleased – including at a high-priced journal, if the journal would have them.
What distinguishes this plan from current practice, said Stuart Shieber, a professor of computer science who is sponsoring the faculty motion, is that it would create an “opt-out” system: an article would be included unless the author specifically requested it not be. Mr. Shieber was the chairman of a committee set up by Harvard’s provost to investigate scholarly publishing; this proposal grew out of one of the recommendations, he said.

My fingers are crossed that this vote will go the way of openness. A vote for open access from Harvard would be a huge boost for the movement. Change is more likely to come if people at the top of the heap, whose personal incentive for reform is far less obvious, start making the move on principle – saying, essentially, that it’s not the job of scholars to prop up the journal business.

at o’reilly

Over the next couple of days I’ll be filling up my brain at the O’Reilly Tools of Change for Publishing conference – taking place, conveniently, here in New York. I’m giving a talk today called Books as Conversations, and participating in a panel, Are New Devices Breathing New Life into e-Books?, tomorrow. Many fascinating presentations. More soon.

harpercollins offers free ebooks

The New York Times:

In an attempt to increase book sales, HarperCollins Publishers will begin offering free electronic editions of some of its books on its Web site, including a novel by Paulo Coelho and a cookbook by the Food Network star Robert Irvine.
The idea is to give readers the opportunity to sample the books online in the same way that prospective buyers can flip through books in a bookstore.

developing books in networked communities: a conversation with don waters

Two weeks ago, when the blog-based peer review of Noah Wardrip-Fruin’s Expressive Processing began on Grand Text Auto, Bob sent a note about the project to Don Waters, the program officer for scholarly communications at the Andrew W. Mellon Foundation – someone very much at the forefront of developments in the digital publishing arena. He wrote back intrigued but slightly puzzled as to the goals, scope and definitions of the experiment. We forwarded the note to Noah and to Doug Sery, Noah’s editor at MIT Press, and each of us decided to write some clarifying responses from our different perspectives: book author/blogger (Noah), book editor (Doug), and web editor (myself). The result is an interesting exchange about networked publishing and a useful meta-document about the project. As our various responses, and Don’s subsequent reply, help to articulate, playing with new forms of peer review is only one aspect of this experiment, and maybe not even the most interesting one. The exchange is reproduced below (a couple of names mentioned have been made anonymous).
Don Waters (Mellon Foundation):
Thanks, Bob. This is a very interesting idea. In reading through the materials, however, I did not really understand how, if at all, this “experiment” would affect MIT Press behavior. What are the hypotheses being tested in that regard? I can see, from one perspective, that this “experiment” would result purely in more work for everyone. The author would get the benefit of the “crowd” commenting on his work, and revise accordingly, and then the Press would still send the final product out for peer review and copy editing prior to final publication.
Don
Ben Vershbow (Institute for the Future of the Book):
There are a number of things we set out to learn here. First, can an open, Web-based review process make a book better? Given the inherently inter-disciplinary nature of Noah’s book, and the diversity of the Grand Text Auto readership, it seems fairly likely that exposing the manuscript to a broader range of critical first-responders will bring new things to light and help Noah to hone his argument. As can be seen in his recap of discussions around the first chapter, there have already been a number of incisive critiques that will almost certainly impact subsequent revisions.
Second, how can we use available web technologies to build community around a book, or to bring existing communities into a book’s orbit? “Books are social vectors, but publishers have been slow to see it,” writes Ursula K. Le Guin in a provocative essay in the latest issue of Harper’s. For the past three years, the Institute for the Future of the Book’s mission has been to push beyond the comfort zone of traditional publishers, exploring the potential of networked technologies to enlarge the social dimensions of books. By building a highly interactive Web component to a text, where the author and his closest peers are present and actively engaged, and where the entire text is accessible with mechanisms for feedback and discussion, we believe the book will occupy a more lively and relevant place in the intellectual ecology of the Internet and probably do better overall in the offline arena as well.
The print book may have some life left in it yet, but it now functions within a larger networked commons. To deny this could prove fatal for publishers in the long run. Print books today need dynamic windows into the Web, and publishers need to start experimenting with the different forms those windows could take, or else retreat further into marginality. Having direct contact with the author – being part of the making of the book – is a compelling prospect for the book’s core audience, and their enthusiasm is likely to spread. Certainly, it’s too early to make a definitive assessment of the efficacy of this Web outreach strategy, but initial indicators are very positive. Looked at one way, it certainly does create more work for everyone, but this is work that has to be done. At the bare minimum, we are building marketing networks and generating general excitement about the book. Already, the book has received a great deal of attention around the blogosphere, not just because of its novelty as a publishing experiment, but out of genuine interest in the subject matter and author. I would say that this is effort well spent.
It’s important to note that, despite the Chronicle of Higher Education’s lovely but slightly sensational coverage of this experiment as a kind of mortal combat between traditional blind peer review and the new blog-based approach, we view the two review processes as complementary, not competitive. At the end, we plan to compare the different sorts of feedback the two processes generate. Our instinct is that the comparison will suggest hybrid models rather than a wholesale replacement of one system with another.
That being said, our instincts tell us that open blog-based review (or other related forms) will become increasingly common practice among the next generation of academic writers in the humanities. The question for publishers is how best to engage with, and ideally incorporate, these new practices. Already, we see a thriving culture of pre-publication peer review in the sciences, and major publishers such as Nature are beginning to build robust online community infrastructures so as to host these kinds of interactions within their own virtual walls. Humanities publishers should be thinking along the same lines, and partnerships with respected blogging communities like GTxA are a good way to start experimenting. In a way, the MIT-GTxA collab represents an interface not just of two ideas of peer review but between two kinds of publishing imprints. Both have built a trusted name and become known for a particular editorial vision in their respective (and overlapping) communities. Each excels in a different sort of publishing, one print-based, the other online community-based. Together they are greater than the sum of their parts and suggest a new idea of publishing that treats books as extended processes rather than products. MIT may regard this as an interesting but not terribly significant side project for now, but it could end up having a greater impact on the press (and hopefully on other presses) than they expect.
All the best,
Ben
Noah Wardrip-Fruin (author, UC San Diego):
Hi Bob –
Yesterday I went to meet some people at a game company. There’s a lot of expertise there – and actually quite a bit of reflection on what they’re doing, how to think about it, and so on. But they don’t participate in academic peer review. They don’t even read academic books. But they do read blogs, and sometimes comment on them, and I was pleased to hear that there are some Grand Text Auto readers there.
If they comment on the Expressive Processing manuscript, it will create more work for me in one sense. I’ll have to think about what they say, perhaps respond, and perhaps have to revise my text. But, from my perspective, this work is far outweighed by the potential benefits: making a better book, deepening my thinking, and broadening the group that feels academic writing and publishing is potentially relevant to them.
What makes this an experiment, from my point of view, is the opportunity to also compare what I learn from the blog-based peer review to what I learn from the traditional peer review. However, this will only be one data point. We’ll need to do a number of these, all using blogs that are already read by the audience we hope will participate in the peer review. When we have enough data points perhaps we’ll start to be able to answer some interesting questions. For example, is this form of review more useful in some cases than others? Is the feedback from the two types of review generally overlapping or divergent? Hopefully we’ll learn some lessons that presses like MITP can put into practice – suggesting blog-based review when it is most appropriate, for example. With those lessons learned, it will be time to design the next experiment.
Best,
Noah
Doug Sery (MIT Press):
Hi Bob,
I know Don’s work in digital libraries and preservation, so I’m not surprised at the questions. While I don’t know the breadth of the discussions Noah and Ben had around this project, I do know that Noah and I approached this in a very casual manner. Noah has expressed his interest in “open communication” any number of times, and when he mentioned that he’d like to “crowd-source” Expressive Processing on Grand Text Auto I agreed to it with little hesitation, so I’m not sure I’d call it an experiment. There are no metrics in place to determine whether this will affect sales or produce a better book. I don’t see this affecting the way The MIT Press will approach his book or publishing in general, at least for the time being.
This is not competing with traditional academic press peer review, although the CHE article would lead the reader to believe otherwise. (Jeff obviously knows how to generate interest in a topic, which is fine, but even a game studies scholar, in a conversation I had with him today, laughingly called the headline “tabloidesque.”) While Noah is posting chapters on his blog, I’m having the first draft peer-reviewed. After the peer reviews come in, Noah and I will sit down to discuss them to see if any revisions to the manuscript need to be made. I don’t plan on going over the GTxA comments with Noah, unless I happen to see something that piques my interest, so I don’t see any additional work having to be done on the part of MITP. It’s a nice way for Noah to engage with the potential audience for his ideas, which I think is his primary goal for all of this. So, I’m thinking of this more as an exercise to see what kind of interest people have in these new tools and/or mechanisms. Hopefully, it will be a learning experience that MITP can use as we explore new models of publishing.
Hope this helps and that all’s well.
Best,
Doug
Don Waters:
Thanks, Bob (and friends) for this helpful and informative feedback.
As I understand the explanations, there is a sense in which the experiment is not aimed at “peer review” at all, in the sense that peer review assesses the qualities of a work to help the publisher determine whether or not to publish it. What the exposure of the work-in-progress to the community does, besides the extremely useful community-building activity, is provide a mechanism for a function that is now all but lost in scholarly publishing, namely “developmental editing.” It is a side benefit of current peer review practice that an author gets some feedback on the work that might improve it, but what really helps an author is close, careful reading by friends who offer substantive criticism and editorial comments. Most accomplished authors seek out such feedback in a variety of informal ways, such as sending out manuscripts in various stages of completion to their colleagues and friends. The software that facilitates annotation and the use of the network, as demonstrated in this experiment, promise to extend this informal practice to authors more generally. I may have the distinction between peer review and developmental editing wrong, or you all may view the distinction as mere quibbling, but I think it helps explain why CHE got it so wrong in reporting the experiment as a struggle between peer review and the blog-based approach. Two very different functions are being served, and as you all point out, these are complementary rather than competing functions.
I am very intrigued by the suggestions that scholarly presses need to engage in this approach more generally, and am eagerly learning from this and related experiments, such as those at Nature and elsewhere, more about the potential benefits of this kind of approach.
Great work and many thanks for the wonderful (and kind) responses.
Best,
Don

expressive processing meta

To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you’ll see, there’s already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here’s an intro to the project.

expressive processing: an experiment in blog-based peer review

An exciting new experiment begins today, one which ties together many of the threads begun in our earlier “networked book” projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, and an open peer review process – and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which for many readers here probably needs no further introduction. The author is Noah Wardrip-Fruin, a professor of communication at UC San Diego, a writer/maker of digital fictions and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation – the machines and software we use, the systems and processes we must learn and employ in order to make media – and how this changes how and what we create. It’s an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks: how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly inter-disciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out some time next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah’s leading peers in the new media field, but also a slew of non-academic experts – writers, digital media makers, artists, gamers, game designers etc. – who provide crucial alternative perspectives and valuable hands-on knowledge that can’t be gotten through more formal channels. Noah:

Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences — and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that’s just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I’ve seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history), the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere…. I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.

Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it) and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The technical challenge was to integrate CommentPress into an existing blog template, applying its functionality selectively – in other words, to make it work for a specific group of posts rather than for all content on the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and the technical integration of the review process were inseparable. I’ve since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notions of Noah’s book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal: either an entire site works with paragraph-level commenting, or none of it does. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we’ve done – with the help of a talented WordPress developer named Mark Edwards, and invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog) – is to make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a new flexibility that it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we’re all ears…
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press – one of the most distinguished and most innovative in the world – toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don’t think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the older order will die off to be sure, but other parts of it will adapt and combine with the new in interesting ways. What’s particularly compelling about this present experiment is that it has the potential to be (perhaps now, or perhaps only in retrospect, further down the line) one of these important hybrid moments – a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes the Internet has wrought on publishing, and the ones that get by far the most attention, are in the areas of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest-to-pin-down yet most essential areas of publishing – the territory of editorial instinct, reputation, identity, trust, taste, community… These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well – to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, that transcends technological upheaval – you might say the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing and designing those conversations may well become the main work of publishing and of editors.
I could go on, but better to hold off on further speculation and to just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100,000 words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated, and the whole thing bundled together into some sort of package for MIT. We’re still figuring out how that part will work. Please go over and take a look, and if a thought is provoked, join the discussion.

emergency books

In the course of looking for something else entirely, I just stumbled upon Emergency Books. It’s a (slightly dormant) side project of Litro magazine, a freesheet that publishes and distributes short fiction outside London Underground stations. Emergency Books are, very simply, out-of-print texts taken from Project Gutenberg and dropped wholesale into a PDF template that makes them easy and economical to print on a standard home printer. They’re designed “for when you’ve nothing to read and a standard issue of Litro is too short”, the publisher (is that the right word here?) explains:

Each ‘double page spread’ fits nicely in an Acrobat Reader window, which results in minimal need for scrolling. On- or off-screen, the columns are relatively narrow and short so you don’t get lost in a sea of text (as you would if you simply printed direct from Project Gutenberg). There is little of the blank white space found in standard books – this is to get as much text on the page as possible thereby reducing the total number of pages required (for example, The Call of the Wild by Jack London, at 128 pages in book form, takes only 15 double-side printed A4 sheets as an Emergency Book – while being just as easy to read). This saves on resources as well as making the printed Emergency Book easier to fold and carry around.
If you are a ‘format purist’, you may well hate them. But if you love literature for the content, Emergency Books could be for you.
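The publisher’s arithmetic checks out, at least roughly. A sketch, under assumed figures (The Call of the Wild runs to something like 32,000 words, and a typical paperback page holds around 250; the words-per-A4-side figure is my own guess at what a dense two-column layout achieves):

```python
import math

def sheets_needed(total_words, words_per_side=1100, sides_per_sheet=2):
    """Estimate how many double-side printed A4 sheets a text needs.

    words_per_side is an assumption: a dense two-column layout with
    little white space, as the Emergency Books template describes.
    """
    return math.ceil(total_words / (words_per_side * sides_per_sheet))

# Assumed word count for The Call of the Wild.
words = 32_000

# At ~250 words per paperback page, that is ~128 book pages...
book_pages = math.ceil(words / 250)

# ...but only ~15 double-sided A4 sheets in the dense layout.
sheets = sheets_needed(words)
```

With those assumptions, `book_pages` comes out at 128 and `sheets` at 15, matching the publisher’s claim; the point is that cramming roughly four book pages onto each printed side is what makes home printing economical.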

Of the small number who’ve saved Emergency Books on del.icio.us, one noted that Emergency Books are ‘for reading when you’re caught short. If that ever happens’. I like the idea of literature being, like cigarettes, something one can be ‘caught short’ without – for all that, in this age of information overload, the reverse more often feels true. There aren’t that many texts there at present, and I’m slightly baffled by the selection on offer. But whatever you think of Conan Doyle, Emergency Books shows a refreshingly pragmatic grasp of the relation between digital and paper publishing formats, and represents an interesting attempt at minimising the downsides of each in the interests of guaranteeing the reading addict a regular fix.