The picture is of a Futurizer, based on the kinds of contraptions I built as a child from cardboard, balsa wood and string, which allowed me to communicate with other planets and centuries. It was reconstructed by a group of us at a conference on Transliteracy at the Institute of Creative Technologies at De Montfort University, organised by PART. The aim of the day was to try to make some transliterate objects and, in so doing, consider whether such things can, could or should exist. We had an enjoyable if inconclusive time grappling with this.
Plug headphones into an iPod or Xbox and you will be able to listen to one of a large but finite range of sounds. Plug headphones into a cardboard box and you can (not) hear anything you can possibly imagine. Travelling back through the years to my childhood, these machines allowed me to think across time and space, out of the (cardboard) box. They were also a means of engaging with the TV I loved, in a bygone era when no adult expressed any interest in the way I read my TV21 comic or consumed Thunderbirds and The Man from U.N.C.L.E.
Unlike those friends who screwed together bits of Meccano to build working bridges, or fiddled with circuit boards until bulbs lit up, my games were all about interfaces.
I never worried for a moment about how these things might actually work. Now a lot of inventiveness is once again going into cutting and sticking, playing with Facebook applications and YouTube clips the way we used Corn Flake packets and sticky-backed plastic. Isn’t it great, living here in the future?
By the end of the day the Futurizer had been photographed and uploaded to Second Life – a fitting place for it to end up, really: transmogrified, transliterated, futurized.
To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you’ll see, there’s already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here’s an intro to the project.
Slate takes a look at Grady Harp, Amazon’s no. 7-ranked book reviewer, and finds the amateur-driven literary culture there to be a much grayer area than expected:
Absent the institutional standards that govern (however notionally) professional journalists, Web 2.0 stakes its credibility on the transparency of users’ motives and their freedom from top-down interference. Amazon, for example, describes its Top Reviewers as “clear-eyed critics [who] provide their fellow shoppers with helpful, honest, tell-it-like-it-is product information.” But beneath the just-us-folks rhetoric lurks an unresolved tension between transparency and opacity; in this respect, Amazon exemplifies the ambiguities of Web 2.0. The Top 10 List promises interactivity – “How do I become a Top Reviewer?” – yet Amazon guards its rankings algorithms closely…. As in any numbers game (tax returns, elections), opacity abets manipulation.
The Institute for the Future of the Book has been appointed by Arts Council England to undertake research into digital developments in literature. This is exciting news for us, not least because it marks the official launch of our London office.
Over the next few months Chris Meade and Sebastian Mary Harrington will be talking to a wide range of organisations including Arts Council England literature clients and others whose work could provide useful models to the sector.
We’ll be looking at book publishing and magazines, reader development, writers including collaborative and new media authors and the blurring of distinctions between amateur and professional, live literature and festivals, plus other web activity that could provide inspiration to agencies working to spread the word about the word – and we’ll be posting questions and comments on the ifbook blog as we go along.
Sebastian Mary Harrington’s scarf captured live under construction at the Institute’s London HQ, skillfully knitted in the colours of The Institute for the Future of the Book – and The School of Everything – to celebrate the start of our new research project.
If you’re in the New York City region, this is worth checking out (features Institute fellow Siva Vaidhyanathan):
From Free Culture @ NYU:
In 1998, university professor Kembrew McLeod trademarked the phrase “freedom of expression” – a startling comment on the way that intellectual property law can restrict creativity and the expression of ideas. This provocative and amusing documentary explores the battles being waged in courts, classrooms, museums, film studios, and the Internet over control of our cultural commons. Based on McLeod’s award-winning book of the same title, Freedom of Expression® charts the many successful attempts to push back the assault on free expression by overzealous copyright holders.
In cooperation with the Media Education Foundation and La Lutta, Free Culture @ NYU is screening Freedom of Expression®: Resistance and Repression in the Age of Intellectual Property at 9pm on Thursday, January 31.
Narrated by Naomi Klein, the film features interviews with Stanford Law’s Lawrence Lessig, Illegal Art Show curator Carrie McLaren, Negativland’s Mark Hosler, UVA media scholar Siva Vaidhyanathan, and Free Culture @ NYU co-founder Inga Chernyak, among many others. This 53-minute documentary will be preceded by selections from Negativland’s new DVD, Our Favorite Things, and it will be followed by a Q&A with Freedom of Expression® author and director Kembrew McLeod and co-producer Jeremy Smith.
Freedom of Expression Screening and Q&A with Creators
Sponsored by Free Culture @ NYU, NYU ACM, and WiNC
Free and Open to the Public (bring ID if non-NYU)
Thursday, January 31, 2008
NYU’s Courant Institute
251 Mercer Street b/w Bleecker and W. 4th
On the film’s site, I found this very clever (if slightly frenetic) DVD extra, “A Fair(y) Use Tale”:
An exciting new experiment begins today, one which ties together many of the threads begun in our earlier “networked book” projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, and an open peer review process – and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which for many readers here probably needs no further introduction. The author is Noah Wardrip-Fruin, a professor of communication at UC San Diego, a writer/maker of digital fictions, and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation – the machines and software we use, the systems and processes we must learn and employ in order to make media – and how this changes how and what we create. It’s an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks: how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly interdisciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out sometime next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah’s leading peers in the new media field, but also a slew of non-academic experts – writers, digital media makers, artists, gamers, game designers etc. – who provide crucial alternative perspectives and valuable hands-on knowledge that can’t be gotten through more formal channels. Noah:
Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences — and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that’s just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I’ve seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history) the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere….I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.
Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it) and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The technical challenge was to integrate CommentPress into an existing blog template, applying its functionality selectively – in other words, to make it work for a specific group of posts rather than for all content on the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and technical integration of the review process were inseparable. I’ve since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notions of Noah’s book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal: either your whole site works with paragraph-level commenting, or none of it does. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we’ve done – with the help of a talented WordPress developer named Mark Edwards, and invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog) – is to make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a flexibility it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we’re all ears…
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press – one of the most distinguished and most innovative in the world – toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don’t think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the old order will die off, to be sure, but other parts of it will adapt and combine with the new in interesting ways. What’s particularly compelling about this present experiment is that it has the potential to be (perhaps now, or perhaps only in retrospect, further down the line) one of these important hybrid moments – a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes the Internet has leveled at publishing, and the ones that get by far the most attention, are in the area of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest-to-pin-down yet most essential areas of publishing – the territory of editorial instinct, reputation, identity, trust, taste, community… These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well – to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, that transcends technological upheaval – you might say the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing, designing those conversations may well become the main work of publishing and of editors.
I could go on, but better to hold off on further speculation and just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100,000 words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated, and the whole thing bundled together into some sort of package for MIT. We’re still figuring out how that part will work. Please go over and take a look, and if a thought is provoked, join the discussion.
We’ve finally squashed the bug that made CommentPress incompatible with the latest version of WordPress (2.3), so anyone out there with a CP installation can now go ahead and upgrade:
CommentPress 1.4.1 »
Other than the compatibility fix, 1.4.1 is exactly the same as 1.4. Of course, there are numerous improvements we’d still like to make, and plans for that are underway. Stay tuned.
Also: tomorrow I’m going to be announcing an exciting new CommentPress publishing experiment that will suggest possible future directions for the tool’s development.
I’ve been meaning to post something for a while about The Reprover, or Le Reprobateur, a hugely impressive work of digital fiction by François Coulon, a Paris-based digital writer. It includes excellent cartoons, live video of the main character, and a witty text in French and elaborate English which expands and contracts – the same sentence blooming different additional clauses each time you pass the mouse across it. This is a deeply disconcerting effect at first, but once you’ve got used to it, a whole new kind of three-dimensional reading emerges. It’s a fascinating idea which could only work on the web.
I’ve been meaning to post… but haven’t got round to it. That’s why I need a Reprobateur, “someone who would be there simply to give us a bad conscience.” Part psychoanalyst, part priest, part bloke in a suit, the Reprover is a wonderful creation. The story is set in the 80s and you can navigate around it by spinning a 3D polyhedron. “It’s literature plus electricity!” says Coulon.
It’s also plus so many tricks and distractions that it’s hard to settle into – there’s too much fun to be had clicking, spinning and adjusting the layers of soundtrack to actually immerse oneself in the story. The Reprover is beautifully produced and costs real money: €16, or €160 for institutions, but you can get an excellent taster by going to http://www.totonium.com.
I’ve been going back to this one several times for more. Once you’re signed up you can contact the narrator for free advice from your very own Reprover. You’ll wonder how you coped all those years without one.
An interesting experiment on Vimeo. Can you see what’s going on?
Via IT IN place.