Monthly Archives: January 2008

emergency books

In the course of looking for something else entirely, I just stumbled upon Emergency Books. It’s a (slightly dormant) side project of Litro magazine, a freesheet that publishes and distributes short fiction outside London Underground stations. Emergency Books are, very simply, out-of-print texts taken from Project Gutenberg and dropped wholesale into a PDF template that makes them easy and economical to print on a standard home printer. They’re designed “for when you’ve nothing to read and a standard issue of Litro is too short”, the publisher (is that the right word here?) explains:

Each ‘double page spread’ fits nicely in an Acrobat Reader window, which results in minimal need for scrolling. On- or off-screen, the columns are relatively narrow and short so you don’t get lost in a sea of text (as you would if you simply printed direct from Project Gutenberg). There is little of the blank white space found in standard books – this is to get as much text on the page as possible thereby reducing the total number of pages required (for example, The Call of the Wild by Jack London, at 128 pages in book form, takes only 15 double-side printed A4 sheets as an Emergency Book – while being just as easy to read). This saves on resources as well as making the printed Emergency Book easier to fold and carry around.
If you are a ‘format purist’, you may well hate them. But if you love literature for the content, Emergency Books could be for you.
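Out of curiosity, here is a rough sketch of how one might approximate the format with Python and the reportlab library. To be clear, this is my reconstruction of the effect described above – the column count, margins, point size and file names are guesses, not Litro’s actual template:

# Flow a Project Gutenberg plain-text file into narrow columns on A4,
# in the spirit of an Emergency Book. Layout values are guesses.
from xml.sax.saxutils import escape

from reportlab.lib.pagesizes import A4
from reportlab.lib.styles import ParagraphStyle
from reportlab.lib.units import mm
from reportlab.platypus import BaseDocTemplate, Frame, PageTemplate, Paragraph

PAGE_W, PAGE_H = A4
MARGIN = 10 * mm   # little blank space: more text per page, fewer sheets
GUTTER = 5 * mm
COLS = 3           # narrow, short columns so you don't get lost in the text

col_w = (PAGE_W - 2 * MARGIN - (COLS - 1) * GUTTER) / COLS
frames = [
    Frame(MARGIN + i * (col_w + GUTTER), MARGIN, col_w, PAGE_H - 2 * MARGIN)
    for i in range(COLS)
]

body = ParagraphStyle("body", fontName="Times-Roman", fontSize=8, leading=9.5)

doc = BaseDocTemplate("emergency_book.pdf", pagesize=A4)
doc.addPageTemplates([PageTemplate(id="columns", frames=frames)])

# pg215.txt: the Gutenberg plain text of The Call of the Wild (ebook #215).
with open("pg215.txt", encoding="utf-8") as f:
    text = f.read()

# One flowable per source paragraph; escape() keeps stray & and < from
# confusing reportlab's mini-markup parser.
story = [Paragraph(escape(p.strip().replace("\n", " ")), body)
         for p in text.split("\n\n") if p.strip()]
doc.build(story)

Squeezing the type and multiplying the columns is where the page savings come from: more words per A4 sheet means fewer sheets to print, fold and carry.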

Of the small number who’ve saved Emergency Books on del.icio.us, one noted that Emergency Books are ‘for reading when you’re caught short. If that ever happens’. I like the idea of literature being, like cigarettes, something one can be ‘caught short’ without – for all that, in this age of information overload, the reverse more often feels true. There aren’t many texts there at present, and I’m slightly baffled by the selection on offer. But whatever you think of Conan Doyle, Emergency Books shows a refreshingly pragmatic grasp of the relation between digital and paper publishing formats, and represents an interesting attempt at minimising the downsides of each in the interests of guaranteeing the reading addict a regular fix.

nominate the best tech writing of 2007

digitalculturebooks, a collaborative imprint of the University of Michigan Press and Library, publishes an annual anthology of the year’s best technology writing. The nominating process is open to the public, and they’re giving people until January 31st to suggest exemplary articles on “any and every technology topic – biotech, information technology, gadgetry, tech policy, Silicon Valley, and software engineering” – and so on.
The 2007 collection is being edited by Clive Thompson; last year’s was edited by Steven Levy. When complete, the collection will be published as a trade paperback and put online in its entirety in a clean, fully searchable HTML edition, so head over and help build what will become a terrific open access resource.

youtube purges: fair use tested

Last week there was a wave of takedowns on YouTube of copyright-infringing material – mostly clips from television and movies. MediaCommons, the nascent media studies network we help to run, felt this rather acutely. In Media Res, an area of the site where media scholars post and comment on video clips, uses YouTube and other free hosting sites like Veoh and blip.tv to stream its video. The upside of this is that it’s convenient, free and fast. The downside is that it leaves In Media Res, which is quickly becoming a valuable archive of critically annotated media artifacts, vulnerable to the copyright purges that periodically sweep fan-driven media sites, YouTube especially.
In this latest episode, a full 27 posts on In Media Res suddenly found themselves with gaping holes where video clips once had been – the biggest single takedown we’ve yet experienced. Fortunately, since we regard these sorts of media quotations as fair use, we make it a policy to rip backups of every externally hosted clip so that we can remount them on our own server in the event of a takedown. And so, with a little work, nearly everything was restored – there were a few clips that, for various reasons, we had failed to back up, and we’re still trying to scrounge up other copies.
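Our backup routine, for what it’s worth, is nothing exotic. Here’s a sketch of the idea in Python, assuming the youtube-dl command-line tool and a plain-text list of clip URLs – the file and directory names here are illustrative, not our actual setup:

# Rip a local backup of every externally hosted clip, so it can be
# remounted on our own server if the original is taken down.
import subprocess
from pathlib import Path

BACKUP_DIR = Path("clip-backups")
BACKUP_DIR.mkdir(exist_ok=True)

# in_media_res_clips.txt: one clip URL per line (hypothetical filename).
with open("in_media_res_clips.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Save each clip under the host's video id; youtube-dl's -o flag
    # takes an output filename template.
    subprocess.run(
        ["youtube-dl", "-o", str(BACKUP_DIR / "%(id)s.%(ext)s"), url],
        check=False,  # keep going even if one clip has already vanished
    )

The tool matters less than the habit: grab a local copy the day a clip is posted, not the day it disappears.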
The MediaCommons fair use statement reads as follows:

MediaCommons is a strong advocate for the right of media scholars to quote from the materials they analyze, as protected by the principle of “fair use.” If such quotation is necessary to a scholar’s argument, if the quotation serves to support a scholar’s original analysis or pedagogical purpose, and if the quotation does not harm the market value of the original text — but rather, and on the contrary, enhances it — we must defend the scholar’s right to quote from the media texts under study.

The good news is that In Media Res carries on relatively unruffled, but these recent events serve as a sobering reminder of the fragility of the media ecology we are collectively building, of the importance of the all too infrequently invoked right of fair use in non-textual media contexts, and of the need for more robust, legally insulated media archives. They also supply us with a handy moral: keep backups of everything. Without a practical contingency plan, fair use is just a bunch of words.
Incidentally, some of these questions were raised in a good In Media Res post last August by Sharon Shahaf of the University of Texas at Austin: The Promises and Challenges of Fan-Based On-Line Archives for Global Television.

poem for no one

Just came across something lovely: the video for “Jed’s Other Poem (Beautiful Ground)” by the now-disbanded Grandaddy, from their great album The Sophtware Slump (2000). Jed is a character who weaves in and out of the album, a forlorn humanoid robot made of junk parts who eventually dies, leaving behind a few mournful poems.

Creator Stewart Smith: “I programmed this entirely in Applesoft BASIC on a vintage 1979 Apple ][+ with 48K of RAM — a computer so old it has no hard drive, mouse, or up/down arrow keys, and only types in capitals. First open-source music video, code available on website. Cinematography by Jeff Bernier.” A nice detail of the story is that this was originally a fan vid but was eventually adopted as the “official” video for the song.
Thanks to Alex Itin for the link!

no longer separated by a common language

LibraryThing now interfaces with the British Library and loads of other UK sources:

The BL is a catch in more than one way. It’s huge, of course. But, unlike some other sources, BL data isn’t normally available to the public. To get it, our friends at Talis, the UK-based library software company, have granted us special access to their Talis Base product, an elephantine mass of book data. In the case of the BL, that’s some twelve million unique records, including two copies of the Gutenberg Bible and two copies of the Magna Carta.

reading between the lines?

The NEA claims it wishes to “initiate a serious discussion” over the findings of its latest report, but the public statements from representatives of the Endowment have had a terse or caustic tone, such as in Sunil Iyengar’s reply to Nancy Kaplan. Another example is Mark Bauerlein’s letter to the editor in response to my December 7, 2007 Chronicle Review piece, “How Reading is Being Reimagined,” a letter in which Bauerlein seems unable or unwilling to elevate the discourse beyond branding me a “votary” of screen reading and suggesting that I “do some homework before passing opinions on matters out of [my] depth.”
One suspects that, stung by critical responses to the earlier Reading at Risk report (2004), the decision this time around was that the best defense is a good offense. Bauerlein chastises me for not matching data with data – that is, for failing to provide any quantitative documentation in support of various observations about screen reading and new media (unable to resist the opportunity for insult, he also suggests such indolence is only to be expected of a digital partisan). Yet data wrangling was not the focus of my piece, and I said as much in print: rather, I wanted to raise questions about the NEA’s report in the context of the history of reading, questions which have also been asked by Harvard scholar Leah Price in a recent essay in the New York Times Book Review.
If my work is lacking in statistical heavy mettle, the NEA’s description of reading proceeds as though the last three decades of scholarship by figures like Elizabeth Eisenstein, Harvey Graff, Anthony Grafton, Lisa Jardine, Bill Sherman, Adrian Johns, Roger Chartier, Peter Stallybrass, Patricia Crain, Lisa Gitelman, and many others simply does not exist. But this body of work has demolished the idea that reading is a stable or historically homogeneous activity, ripping the support out from under the quaint notion that the codex book is the simple, self-consistent artifact the reports present it as, and documenting the many varieties of cultural anxiety that have attended the act of reading – including recurrent worries over whether we’re reading too little or too much.
It’s worth underscoring that the academic response to the NEA’s two reports has been largely skeptical. Why is this? After all, in the ivied circles I move in, everyone loves books, cherishes reading, and wants people to read more, in whatever venue or medium. I also know that’s true of the people at if:book (and thanks to Ben Vershbow, by the way, for giving me the opportunity to respond here). And yet we bristle at the data as presented by the NEA. Is it because, as academics, eggheads, and other varieties of bookwormish nerds and geeks, we’re all hopelessly ensorcelled by the pleasures of problematizing and complicating rather than accepting hard evidence at face value? Herein lies the curious anti-intellectualism to which I think at least some of us are reacting: an anti-intellectualism that manifests superficially in the rancorous and dismissive tone that Bauerlein and Iyengar have brought to the very conversation they claim they sought to initiate, but which, at its root, is – just possibly – about a frustration that the professors won’t stop indulging their fancy theories and footnotes and ditzy digital rhetoric. (Too much book larnin’ going on up at the college? Is that what I’m reading between the lines?)
Or maybe I’m wrong about that last bit. I hope so. Because as I said in my Chronicle Review piece, there’s no doubt it’s time for a serious conversation about reading. Perhaps we can have a portion of it here on if:book.
Matthew Kirschenbaum
University of Maryland

Related: “the NEA’s misreading of reading”

the year of reading dangerously

2008 is going well so far for the Institute in London – I was invited to 10 Downing Street this morning for the launch of the National Year of Reading, as one of a small group including literacy promoters, librarians, teachers, schoolchildren and authors, plus Richard Madeley, the presenter who, with his partner Judy, has become the British equivalent of Oprah, hosting a hugely influential TV book club that helps the trade sell stacks of the titles it recommends. Prime Minister Gordon Brown has had a rough few months since taking over from Blair, but was at his best today – he’s a genuine enthusiast for reading.
One topic for discussion was the importance of fathers reading to their children, and in particular to their sons. There are so many opportunities for new media here to help reach out to those who don’t think of themselves as ‘book people’.
Ten years ago the first Year of Reading kicked off a lot of activities and alliances which have thrived since, but I don’t remember anyone giving much attention to the internet – except as a place to download resources from. So I was delighted to be there this time representing the Institute, and able to make the point at the outset that any promotion of literacy skills, reading appetite and the pleasure of literature must recognise the cultural importance of the networked screen and the interconnectedness of different media – in the minds of young people, and in the lives of us all, even those who don’t acknowledge it. Well, I kind of made that point… briefly, and perhaps not so clearly. Anyway, I was there and got to speak up for if:book. The year has a different theme each month, ending with the Future of Reading in December, so we are planning all kinds of activities to link with that. Watch this space.

NEA reading debate round 2: an exchange between sunil iyengar and nancy kaplan

Last week I received an email from Sunil Iyengar of the National Endowment for the Arts responding to Nancy Kaplan’s critique (published here on if:book) of the NEA’s handling of literacy data in its report “To Read or Not to Read.” I’m reproducing the letter followed by Nancy’s response.
Sunil Iyengar:
The National Endowment for the Arts welcomes a “careful and responsible” reading of the report, To Read or Not To Read, and the data used to generate it. Unfortunately, Nancy Kaplan’s critique (11/30/07) misconstrues the NEA’s presentation of Department of Education test data as a “distortion,” although all of the report’s charts are clearly and accurately labeled.
For example, in Charts 5A to 5D of the full report, the reader is invited to view long-term trends in the average reading score of students at ages 9, 13, and 17. The charts show test scores from 1984 through 2004. Why did we choose that interval? Simply because most of the trend data in the preceding chapters – starting with the NEA’s own study data featured in Chapter One – cover the same 20-year period. For the sake of consistency, Charts 5A to 5D refer to those years.
Dr. Kaplan notes that the Department of Education’s database contains reading score trends from 1971 onward. The NEA report also emphasizes this fact, in several places. In 2004, the report observes, the average reading score for 17-year-olds dipped back to where it was in 1971. “For more than 30 years…17-year-olds have not sustained improvements in reading scores,” the report states on p. 57. Nine-year-olds, by contrast, scored significantly higher in 2004 than in 1971.
Further, unlike the chart in Dr. Kaplan’s critique, the NEA’s Charts 5A to 5D explain that the “test years occurred at irregular intervals,” and each test year from 1984 to 2004 is provided. Also omitted from the critique’s reproduction are labels for the charts’ vertical axes, which provide 5-point rather than the 10-point intervals used by the Department of Education chart. Again, there is no mystery here. Five-point intervals were chosen to make the trends easier to read.
Dr. Kaplan makes another mistake in her analysis. She suggests that the NEA report is wrong to draw attention to declines in the average reading score of adult Americans of virtually every education level, and an overall decline in the percentage of adult readers who are proficient. But the Department of Education itself records these declines. In their separate reports, the NEA and the Department of Education each acknowledge that the average reading score of adults has remained unchanged. That’s because from 1992 to 2003, the percentage of adults with postsecondary education increased and the percentage who did not finish high school decreased. “After all,” the NEA report notes, “compared with adults who do not complete high school, adults with postsecondary education tend to attain higher prose scores.” Yet this fact in no way invalidates the finding that average reading scores and proficiency levels are declining even at the highest education levels.
“There is little evidence of an actual decline in literacy rates or proficiency,” Dr. Kaplan concludes. We respectfully disagree.
Sunil Iyengar
Director, Research & Analysis
National Endowment for the Arts
Nancy Kaplan:
I appreciate Mr. Iyengar’s engagement with issues at the level of data and am happy to acknowledge that the NEA’s report includes a single sentence on pages 55-56 with the crucial concession that over the entire period for which we have data, the average scale scores of 17-year-olds have not changed: “By 2004, the average scale score had retreated to 285, virtually the same score as in 1971, though not shown in the chart.” I will even concede the accuracy of the following sentence: “For more than 30 years, in other words, 17-year-olds have not sustained improvements in reading scores” [emphasis in the original]. What the report fails to note or account for, however, is that there actually was a period of statistically significant improvement in scores for 17-year-olds from 1971 to 1984. Although I did not mention it in my original critique, the report handles data from 13-year-olds in the same way: “the scores for 13-year-olds have remained largely flat from 1984-2004, with no significant change between the 2004 average score and the scores from the preceding seven test years. Although not apparent from the chart, the 2004 score does represent a significant improvement over the 1971 average – a four-point increase” (p. 56).
In other words, a completely accurate and honest assessment of the data shows that reading proficiency among 17-year-olds has fluctuated over the past 30 years, but has not declined over that entire period. At the same time, reading proficiency among 9-year-olds and 13-year-olds has improved significantly. Why does the NEA not state the case in the simple, accurate and complete way I have just written? The answer Mr. Iyengar proffers is consistency, but that response may be a bit disingenuous.
Plenty of graphs in the NEA report show a variety of time periods, so there is at best a weak rationale for choosing 1984 as the starting point for the graphs in question. Consistency, in this case, is surely less important than accuracy and completeness. Given the inferences the report draws from the data, then, it is more likely that the sample of data the NEA used in its representations was chosen precisely because, as Mr. Iyengar admits, that sample would make “the trends easier to read.” My point is that the “trends” the report wants to foreground are not the only trends in the data: truncating the data set makes other, equally important trends literally invisible. A single sentence in the middle of a paragraph cannot excuse the act of erasure here. As both Edward Tufte (The Visual Display of Quantitative Information) and Jacques Bertin (Semiology of Graphics), the two most prominent authorities on graphical representations of data, demonstrate in their seminal works on the subject, selective representation of data constitutes distortion of that data.
Similarly, labels attached to a graph, even when they state that the tests occurred at irregular intervals, do not substitute for representing the irregularity of the intervals in the graph itself (again, see Tufte and Bertin). To do otherwise is to turn disinterested analysis into polemic. “Regularizing” the intervals in the graphic representation distorts the data.
The NEA report wants us to focus on a possible correlation between choosing to read books in one’s leisure time, reading proficiency, and a host of worthy social and civic activities. Fine. But if the reading scores of 17-year-olds improved from 1971 to 1984 but there is no evidence that during the period of improvement these youngsters were reading more, the case the NEA is trying to build becomes shaky at best. Similarly, the reading scores of 13-year-olds improved from 1971 to 1984 but “have remained largely flat from 1984-2004…” Yet during that same period, the NEA report claims, leisure reading among 13-year-olds was declining. So what exactly is the hypothesis here – that sometimes declines in leisure reading correlate with declines in reading proficiency, but sometimes such a decline is not accompanied by a decline in reading proficiency? I’m skeptical.
My critique is aimed at the management of data (rather than the ahistorical definition of reading the NEA employs, a somewhat richer and more potent issue joined by Matthew Kirschenbaum and others) because I believe that a crucial component of contemporary literacy, in its most capacious sense, is the ability to understand the relationships between claims, evidence and the warrants for that evidence. The NEA’s data need to be read with great care and its argument held to a high scientific standard lest we promulgate worthless or wasteful public policy based on weak research.
I am a humanist by training and so have come to my appreciation of quantitative studies rather late in my intellectual life. I cannot claim to have a deep understanding of statistics, yet I know what “confounding factors” are. When the NEA report chooses to claim that the reading proficiency of adults is declining while at the same time ignoring the NCES explanation of the statistical paradox that explains the data, it is difficult to avoid the conclusion that the report’s authors are not engaging in a disinterested (that is, dispassionate) exploration of what we can know about the state of literacy in America today but are instead cherry-picking the elements that best suit the case they want to make.
Nancy Kaplan, Executive Director
School of Information Arts and Technologies
University of Baltimore
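Kaplan’s point about “regularizing” the intervals is easy to see if you plot the same series both ways. Below is a minimal sketch in Python with matplotlib; the test years are the irregular NAEP long-term trend assessment years, but the scores are placeholder values for illustration only – consistent with the endpoints quoted above (285 in both 1971 and 2004, with a rise in between), not the actual published data:

# Plot one series two ways: test years at their true, irregular positions
# versus the same years forced into equal intervals.
import matplotlib.pyplot as plt

years = [1971, 1975, 1980, 1984, 1988, 1990, 1992, 1994, 1996, 1999, 2004]
# Placeholder scores -- substitute the published NAEP averages for
# 17-year-olds before drawing any real conclusions.
scores = [285, 286, 285, 289, 290, 290, 290, 288, 288, 288, 285]

fig, (ax_true, ax_even) = plt.subplots(2, 1, figsize=(6, 6))

# Top panel: years plotted where they actually fall on the timeline.
ax_true.plot(years, scores, marker="o")
ax_true.set_title("Test years at their actual positions")

# Bottom panel: the same scores treated as evenly spaced categories,
# which is the "regularizing" Kaplan objects to.
positions = list(range(len(years)))
ax_even.plot(positions, scores, marker="o")
ax_even.set_xticks(positions)
ax_even.set_xticklabels(years)
ax_even.set_title("Test years forced into equal intervals")

for ax in (ax_true, ax_even):
    ax.set_ylabel("average scale score")

fig.tight_layout()
plt.show()

Plotted against real years, the 1971-1984 rise and the long 1984-2004 plateau stand out; spaced evenly, the same numbers read as a smoother, less eventful line.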

the year of the author

Natalie Merchant, one of my favorite artists, was featured in The New York Times today. She is back after a long hiatus, but if you want to hear her new songs you’d better stand in line for a ticket to one of her shows, because she doesn’t plan to release an album anytime soon. She appeared this weekend at the Hiro Ballroom in New York City. According to the Times, when a voice in the crowd asked when Ms. Merchant would release a new album, she said with a smile that she was awaiting “a new paradigm for the recording industry.”
Hmm. Well, the good news is that the paradigm is shifting, fast. But we don’t yet know if this will be a good thing or a bad thing. It’s certainly a bad thing for the major labels, who are losing market share faster than polar bears are losing their ice (sorry for the awful metaphor). But as they continue to shrink, so do the services and protections they offer to artists. And the more content moves online, the less customers are willing to pay for it. Radiohead’s recent pay-what-you-want experiment suggests as much.
But artists are still embracing new media and using it to take matters into their own hands. In the music industry, a long-tail entrepreneurial system supported by online networks and e-commerce is beginning to emerge. Sites like Nimbit empower artists to manage their own sales and promotion, bypassing iTunes, which takes a hefty 50% off the top and, unlike record labels, does nothing to shape or nurture an artist’s career.
Now, indulge me for a moment while I talk about the Kindle as though it were the iPod of ebooks. It’s not, for lots of reasons. But it does have one thing in common with its music industry counterpart: it allows authors to upload their own content and sell it on Amazon. That is huge. That alone might be enough to start a similar paradigm shift in publishing. In this week’s issue of Publishers Weekly, Mike Shatzkin predicts it will.
So why have I titled this “the year of the author”? (I borrowed that phrase from Mike Shatzkin’s prediction #3, by the way.) I’m not trying to say it will be a great year for authors. New media is going to squeeze them just as it is squeezing musicians and striking Writers Guild members. It is the year of the author because authors will be the ones who drive the paradigm shift. They may begin to use online publishing and distribution tools to bypass traditional publishers and put their work out there en masse. Or they may opt out of the internet’s “give-up-your-work-for-free” model and create a new model altogether. Natalie Merchant is opting to (temporarily, I hope) bring back the troubadour tradition in the music biz. It will be interesting to see what choices authors make as the publishing industry’s ice begins to shift.