Category Archives: academic

major news: IFB and NYU libraries to collaborate

A couple of weeks ago, I alluded to a new institutional partnership that’s been in the works for some time. Well, I’m thrilled to officially announce that we are joining forces with the NYU Division of Libraries!
From Carol A. Mandel, dean of the NYU Libraries: “IFB is a thought leader in the future of scholarly communication. We will work together to develop new software and new options that faculty can use to publish, review, share, and collaborate at NYU and in the larger academic community.”
Read the full press release: NYU Libraries & Institute for the Future of the Book Announce Partnership to Develop Tools for Digital Scholarly Research
A basic breakdown of what this means:
- NYU is now our technical home. All IFB sites now run on NYU servers, with IT support from the NYU Libraries’ top-notch team.
- Bob, Dan and I will serve as visiting scholars at NYU.
- With recently secured NEH digital humanities start-up funding (along with other monies yet to be raised), we will work with the NYU digital library team, headed by James Bullen, to develop social networking tools and infrastructure for MediaCommons. This will serve as applied research for digital tools and frameworks that NYU is presently developing.
- We will work with NYU librarians, with the digital library team, and with Monica McCormick, the Libraries’ program officer for digital scholarly publishing, to create forums for collaboration and to develop specific projects and digital initiatives with NYU faculty, and possibly NYU Press.
Needless to say, we’re tremendously excited about this partnership. Things are still being set up, but expect more news in the weeks and months ahead.

danah boyd’s closed journal boycott

I meant to blog this earlier, but it’s still quite relevant, especially in light of other recent activity on the open access front. Last week, danah boyd announced that henceforth she would only publish in open access journals and urged others (especially tenured faculty, who are secure in their status and have little to lose) to do the same.

I’d be sad to see some of the academic publishers go, but if they can’t evolve to figure out new market options, I have no interest in supporting their silencing practices. I think that scholars have a responsibility to make their work available as a public good. I believe that scholars should be valued for publishing influential material that can be consumed by anyone who might find it relevant to their interests. I believe that the product of our labor should be a public good. I do not believe that scholars should be encouraged to follow stupid rules for the sake of maintaining norms. Given that we do the bulk of the labor behind journals, I think that we can do it without academic publishers…

harvard faculty votes overwhelmingly for open access

The Harvard Crimson:

The motion, which passed easily at yesterday’s Faculty meeting, grants Harvard a non-exclusive copyright over all articles produced by any current Faculty member, allowing for the creation of an online repository that would be “available to other services such as web harvesters, Google Scholar, and the like.”
…English professor Stephen Greenblatt, the editor of what he described as a journal with “a decent reputation and a quite anemic subscription base,” advocated for the motion because he doubted it would accelerate the death of his journal, and because he said he was worried about the currently high cost of many monographs.
“This is one of the only ways we can break the backs of the monopolists who are currently seriously damaging our fields,” he said.

New York Times:

“The chorus of ‘yeas’ was thunderous,” Robert Darnton, the director of the University Library, wrote in an e-mail message. “I hope this marks a turning point in the way communications operate in the world of scholarship.”

harvard faculty cast vote on open access

The U.S. presidential primaries in Virginia, Maryland and D.C. are not the only votes to watch today. The New York Times reports that arts and sciences faculty at Harvard are weighing in today on a proposed measure that would make all scholarly articles available in a free open access repository run by the library immediately following publication.

“In place of a closed, privileged and costly system, it will help open up the world of learning to everyone who wants to learn,” said Robert Darnton, director of the university library. “It will be a first step toward freeing scholarship from the stranglehold of commercial publishers by making it freely available on our own university repository.”
Under the proposal Harvard would deposit finished papers in an open-access repository run by the library that would instantly make them available on the Internet. Authors would still retain their copyright and could publish anywhere they pleased, including at a high-priced journal, if the journal would have them.
What distinguishes this plan from current practice, said Stuart Shieber, a professor of computer science who is sponsoring the faculty motion, is that it would create an “opt-out” system: an article would be included unless the author specifically requested it not be. Mr. Shieber was the chairman of a committee set up by Harvard’s provost to investigate scholarly publishing; this proposal grew out of one of the recommendations, he said.

My fingers are crossed that this vote will go the way of openness. A vote for open access from Harvard would be a huge boost for the movement. Change is more likely to come if people at the top of the heap, whose personal incentive for reform is far less obvious, start making the move on principle, saying, essentially, that it’s not the job of scholars to prop up the journal business.

developing books in networked communities: a conversation with don waters

Two weeks ago, when the blog-based peer review of Noah Wardrip-Fruin’s Expressive Processing began on Grand Text Auto, Bob sent a note about the project to Don Waters, the program officer for scholarly communications at the Andrew W. Mellon Foundation and someone very much at the forefront of developments in the digital publishing arena. He wrote back intrigued but slightly puzzled as to the goals, scope and definitions of the experiment. We forwarded the note to Noah and to Doug Sery, Noah’s editor at MIT Press, and decided that we would each write some clarifying responses from our different perspectives: book author/blogger (Noah), book editor (Doug), and web editor (myself). The result is an interesting exchange about networked publishing and a useful meta-document about the project. As our various responses, and Don’s subsequent reply, help to articulate, playing with new forms of peer review is only one aspect of this experiment, and maybe not even the most interesting one. The exchange is reproduced below (a couple of names mentioned have been made anonymous).
Don Waters (Mellon Foundation):
Thanks, Bob. This is a very interesting idea. In reading through the materials, however, I did not really understand how, if at all, this “experiment” would affect MIT Press behavior. What are the hypotheses being tested in that regard? I can see, from one perspective, that this “experiment” would result purely in more work for everyone. The author would get the benefit of the “crowd” commenting on his work, and revise accordingly, and then the Press would still send the final product out for peer review and copy editing prior to final publication.
Don
Ben Vershbow (Institute for the Future of the Book):
There are a number of things we set out to learn here. First, can an open, Web-based review process make a book better? Given the inherently inter-disciplinary nature of Noah’s book, and the diversity of the Grand Text Auto readership, it seems fairly likely that exposing the manuscript to a broader range of critical first-responders will bring new things to light and help Noah to hone his argument. As can be seen in his recap of discussions around the first chapter, there have already been a number of incisive critiques that will almost certainly impact subsequent revisions.
Second, how can we use available web technologies to build community around a book, or to bring existing communities into a book’s orbit? “Books are social vectors, but publishers have been slow to see it,” writes Ursula K. Le Guin in a provocative essay in the latest issue of Harper’s. For the past three years, the Institute for the Future of the Book’s mission has been to push beyond the comfort zone of traditional publishers, exploring the potential of networked technologies to enlarge the social dimensions of books. By building a highly interactive Web component to a text, where the author and his closest peers are present and actively engaged, and where the entire text is accessible with mechanisms for feedback and discussion, we believe the book will occupy a more lively and relevant place in the intellectual ecology of the Internet and probably do better overall in the offline arena as well.
The print book may have some life left in it yet, but it now functions within a larger networked commons. To deny this could prove fatal for publishers in the long run. Print books today need dynamic windows into the Web, and publishers need to start experimenting with the different forms those windows could take, or else retreat further into marginality. Having direct contact with the author, being part of the making of the book, is a compelling prospect for the book’s core audience, and their enthusiasm is likely to spread. Certainly, it’s too early to make a definitive assessment about the efficacy of this Web outreach strategy, but initial indicators are very positive. Looked at one way, it certainly does create more work for everyone, but this is work that has to be done. At the bare minimum, we are building marketing networks and generating general excitement about the book. Already, the book has received a great deal of attention around the blogosphere, not just because of its novelty as a publishing experiment, but out of genuine interest in the subject matter and author. I would say that this is effort well spent.
It’s important to note that, despite the lovely but slightly sensational coverage in the Chronicle of Higher Education (CHE), which cast this experiment as a kind of mortal combat between traditional blind peer review and the new blog-based approach, we view the two review processes as complementary, not competitive. At the end, we plan to compare the different sorts of feedback the two processes generate. Our instinct is that the comparison will suggest hybrid models rather than a wholesale replacement of one system with another.
That being said, we expect that open blog-based review (or other related forms) will become increasingly common practice among the next generation of academic writers in the humanities. The question for publishers is how best to engage with, and ideally incorporate, these new practices. Already, we see a thriving culture of pre-publication peer review in the sciences, and major publishers such as Nature are beginning to build robust online community infrastructures so as to host these kinds of interactions within their own virtual walls. Humanities publishers should be thinking along the same lines, and partnerships with respected blogging communities like GTxA are a good way to start experimenting. In a way, the MIT-GTxA collaboration represents an interface not just between two ideas of peer review but between two kinds of publishing imprints. Both have built a trusted name and become known for a particular editorial vision in their respective (and overlapping) communities. Each excels in a different sort of publishing, one print-based, the other online community-based. Together they are greater than the sum of their parts, and they suggest a new idea of publishing that treats books as extended processes rather than products. MIT may regard this as an interesting but not terribly significant side project for now, but it could end up having a greater impact on the press (and hopefully on other presses) than they expect.
All the best,
Ben
Noah Wardrip-Fruin (author, UC San Diego):
Hi Bob –
Yesterday I went to meet some people at a game company. There’s a lot of expertise there – and actually quite a bit of reflection on what they’re doing, how to think about it, and so on. But they don’t participate in academic peer review. They don’t even read academic books. But they do read blogs, and sometimes comment on them, and I was pleased to hear that there are some Grand Text Auto readers there.
If they comment on the Expressive Processing manuscript, it will create more work for me in one sense. I’ll have to think about what they say, perhaps respond, and perhaps have to revise my text. But, from my perspective, this work is far outweighed by the potential benefits: making a better book, deepening my thinking, and broadening the group that feels academic writing and publishing is potentially relevant to them.
What makes this an experiment, from my point of view, is the opportunity to also compare what I learn from the blog-based peer review to what I learn from the traditional peer review. However, this will only be one data point. We’ll need to do a number of these, all using blogs that are already read by the audience we hope will participate in the peer review. When we have enough data points perhaps we’ll start to be able to answer some interesting questions. For example, is this form of review more useful in some cases than others? Is the feedback from the two types of review generally overlapping or divergent? Hopefully we’ll learn some lessons that presses like MITP can put into practice – suggesting blog-based review when it is most appropriate, for example. With those lessons learned, it will be time to design the next experiment.
Best,
Noah
Doug Sery (MIT Press):
Hi Bob,
I know Don’s work in digital libraries and preservation, so I’m not surprised at the questions. While I don’t know the breadth of the discussions Noah and Ben had around this project, I do know that Noah and I approached this in a very casual manner. Noah has expressed his interest in “open communication” any number of times, and when he mentioned that he’d like to “crowd-source” Expressive Processing on Grand Text Auto, I agreed to it with little hesitation, so I’m not sure I’d call it an experiment. There are no metrics in place to determine whether this will affect sales or produce a better book. I don’t see this affecting the way The MIT Press will approach his book or publishing in general, at least for the time being.
This is not competing with traditional academic press peer review, although the CHE article would lead the reader to believe otherwise. (Jeff obviously knows how to generate interest in a topic, which is fine, but even a game studies scholar, in a conversation I had with him today, laughingly called the headline “tabloidesque.”) While Noah is posting chapters on his blog, I’m having the first draft peer-reviewed. After the peer reviews come in, Noah and I will sit down to discuss them to see if any revisions to the manuscript need to be made. I don’t plan on going over the GTxA comments with Noah, unless I happen to see something that piques my interest, so I don’t see any additional work having to be done on the part of MITP. It’s a nice way for Noah to engage with the potential audience for his ideas, which I think is his primary goal for all of this. So, I’m thinking of this more as an exercise to see what kind of interest people have in these new tools and/or mechanisms. Hopefully, it will be a learning experience that MITP can use as we explore new models of publishing.
Hope this helps and that all’s well.
Best,
Doug
Don Waters:
Thanks, Bob (and friends) for this helpful and informative feedback.
As I understand the explanations, there is a sense in which the experiment is not aimed at “peer review” at all, in the sense that peer review assesses the qualities of a work to help the publisher determine whether or not to publish it. What the exposure of the work-in-progress to the community does, besides the extremely useful community-building activity, is provide a mechanism for a function that is now all but lost in scholarly publishing, namely “developmental editing.” It is a side benefit of current peer review practice that an author gets some feedback on the work that might improve it, but what really helps an author is close, careful reading by friends who offer substantive criticism and editorial comments. Most accomplished authors seek out such feedback in a variety of informal ways, such as sending out manuscripts in various stages of completion to their colleagues and friends. The software that facilitates annotation and the use of the network, as demonstrated in this experiment, promise to extend this informal practice to authors more generally. I may have the distinction between peer review and developmental editing wrong, or you all may view the distinction as mere quibbling, but I think it helps explain why CHE got it so wrong in reporting the experiment as a struggle between peer review and the blog-based approach. Two very different functions are being served, and as you all point out, these are complementary rather than competing functions.
I am very intrigued by the suggestions that scholarly presses need to engage in this approach more generally, and I am eagerly learning more about the potential benefits of this kind of approach from this and related experiments, such as those at Nature and elsewhere.
Great work and many thanks for the wonderful (and kind) responses.
Best,
Don

expressive processing meta

To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you’ll see, there’s already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here’s an intro to the project.

expressive processing: an experiment in blog-based peer review

An exciting new experiment begins today, one which ties together many of the threads begun in our earlier “networked book” projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, an open peer review process, and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which, for many readers here, probably needs no further introduction. The author is Noah Wardrip-Fruin, a professor of communication at UC San Diego, a writer/maker of digital fictions and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation (the machines and software we use, the systems and processes we must learn and employ in order to make media) and how this changes how and what we create. It’s an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks: how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly inter-disciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out sometime next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah’s leading peers in the new media field, but also a slew of non-academic experts (writers, digital media makers, artists, gamers, game designers, etc.) who provide crucial alternative perspectives and valuable hands-on knowledge that can’t be gotten through more formal channels. Noah:

Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences — and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that’s just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I’ve seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history) the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere….I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.

Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it), and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The technical challenge was to integrate CommentPress into an existing blog template, applying its functionality selectively: in other words, to make it work for a specific group of posts rather than for all content on the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and technical integration of the review process were inseparable. I’ve since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notions of Noah’s book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal: either a whole site works with paragraph-level commenting, or none of it does. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we’ve done, with the help of a talented WordPress developer named Mark Edwards, and with invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog), is make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a new flexibility that it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
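To make the theme-versus-plugin distinction concrete for readers who don’t live in WordPress, here is a minimal, purely illustrative sketch of the plugin approach, assuming a reasonably recent WordPress install. The tag name, function names and markup are hypothetical stand-ins, not the actual CommentPress code:

<?php
/*
Plugin Name: Selective Paragraph Anchors (illustrative sketch only)
Description: Adds numbered paragraph anchors to posts tagged "manuscript",
leaving every other post on the blog untouched. A theme would restructure
the whole site; a plugin can act selectively, on demand, like this.
*/

// Callback: give each opening <p> tag a numbered id that a
// paragraph-level commenting interface could point at. A static
// counter is safe here because the guard below limits the filter
// to single-post pages, where only one post body is rendered.
function sketch_anchor_paragraph( $matches ) {
    static $count = 0;
    $count++;
    return '<p id="paragraph-' . $count . '">';
}

// Filter the post body, but only for single posts carrying the
// (hypothetical) "manuscript" tag.
function sketch_number_paragraphs( $content ) {
    if ( ! is_single() || ! has_tag( 'manuscript' ) ) {
        return $content; // ordinary posts pass through unchanged
    }
    // Naive match on bare <p> tags -- good enough for a sketch.
    return preg_replace_callback( '/<p>/', 'sketch_anchor_paragraph', $content );
}
add_filter( 'the_content', 'sketch_number_paragraphs' );

The point of the sketch is the guard clause: a plugin can decide, post by post, whether to apply the new behavior, whereas a theme commits the entire site to it.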
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we’re all ears…
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press, one of the most distinguished and most innovative in the world, toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don’t think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the old order will die off, to be sure, but other parts of it will adapt and combine with the new in interesting ways. What’s particularly compelling about this present experiment is that it has the potential to be (perhaps now, or perhaps only in retrospect, further down the line) one of these important hybrid moments: a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes the Internet has wrought on publishing, and the ones that get by far the most attention, are in the area of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest-to-pin-down yet most essential areas of publishing: the territory of editorial instinct, reputation, identity, trust, taste, community… These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well: to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, the part that transcends technological upheaval; you might say it is the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing and designing those conversations may well become the main work of publishing and of editors.
I could go on, but it’s better to hold off on further speculation and just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100,000 words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated, and the whole thing will be bundled together into some sort of package for MIT. We’re still figuring out how that part will work. Please go over and take a look, and if a thought is provoked, join the discussion.

nominate the best tech writing of 2007

digitalculturebooks, a collaborative imprint of the University of Michigan Press and Library, publishes an annual anthology of the year’s best technology writing. The nominating process is open to the public, and they’re giving people until January 31st to suggest exemplary articles on “any and every technology topic–biotech, information technology, gadgetry, tech policy, Silicon Valley, and software engineering.”
The 2007 collection is being edited by Clive Thompson; last year’s was edited by Steven Levy. When complete, the collection is published as a trade paperback and put online in its entirety in a clean, fully searchable HTML edition, so head over and help build what will become a terrific open access resource.

youtube purges: fair use tested

Last week there was a wave of takedowns on YouTube of copyright-infringing material, mostly clips from television and movies. MediaCommons, the nascent media studies network we help to run, felt this rather acutely. In Media Res, an area of the site where media scholars post and comment on video clips, uses YouTube and other free hosting sites like Veoh and blip.tv to stream its video. The upside of this is that it’s convenient, free and fast. The downside is that it leaves In Media Res, which is quickly becoming a valuable archive of critically annotated media artifacts, vulnerable to the copyright purges that periodically sweep fan-driven media sites, YouTube especially.
In this latest episode, a full 27 posts on In Media Res suddenly found themselves with gaping holes where video clips once had been: the biggest single takedown we’ve yet experienced. Fortunately, since we regard these sorts of media quotations as fair use, we make it a policy to rip backups of every externally hosted clip so that we can remount them on our own server in the event of a takedown. And so, with a little work, nearly everything was restored, though there were a few clips that for various reasons we had failed to back up. We’re still trying to scrounge up other copies.
The MediaCommons fair use statement reads as follows:

MediaCommons is a strong advocate for the right of media scholars to quote from the materials they analyze, as protected by the principle of “fair use.” If such quotation is necessary to a scholar’s argument, if the quotation serves to support a scholar’s original analysis or pedagogical purpose, and if the quotation does not harm the market value of the original text — but rather, and on the contrary, enhances it — we must defend the scholar’s right to quote from the media texts under study.

The good news is that In Media Res carries on relatively unruffled, but these recent events serve as a sobering reminder of the fragility of the media ecology we are collectively building, of the importance of the all too infrequently invoked right of fair use in non-textual media contexts, and of the need for more robust, legally insulated media archives. They also supply us with a handy moral: keep backups of everything. Without a practical contingency plan, fair use is just a bunch of words.
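For what it’s worth, the contingency plan needn’t be elaborate. Here is a minimal sketch, in PHP, of the sort of backup pass such a policy implies. It assumes a plain list of direct media-file URLs, which is hypothetical: clips hosted on streaming sites like YouTube must first be ripped to files by a separate tool, and the URLs and paths below are made up for illustration:

<?php
// Minimal sketch of a clip-backup pass (illustrative only; the URL,
// directory and filenames below are hypothetical). Assumes simple,
// direct file URLs and requires PHP's allow_url_fopen to be enabled.

$clips = array(
    // 'http://example.com/clips/in-media-res-0123.flv',
);
$backup_dir = '/var/backups/inmediares';

foreach ( $clips as $url ) {
    $data = file_get_contents( $url );
    if ( $data === false ) {
        echo "Failed to fetch: $url\n";
        continue;
    }
    // Keep the original filename so the clip can be remounted
    // on our own server under the same name after a takedown.
    file_put_contents( $backup_dir . '/' . basename( $url ), $data );
    echo "Backed up: $url\n";
}

Run routinely, something this simple is the difference between remounting a clip in minutes and scrounging around for other copies.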
Incidentally, some of these questions were raised in a good In Media Res post last August by Sharon Shahaf of the University of Texas, Austin: The Promises and Challenges of Fan-Based On-Line Archives for Global Television.

reading between the lines?

The NEA claims it wishes to “initiate a serious discussion” over the findings of its latest report, but the public statements from representatives of the Endowment have had a terse or caustic tone, such as in Sunil Iyengar’s reply to Nancy Kaplan. Another example is Mark Bauerlein’s letter to the editor in response to my December 7, 2007 Chronicle Review piece, “How Reading is Being Reimagined,” a letter in which Bauerlein seems unable or unwilling to elevate the discourse beyond branding me a “votary” of screen reading and suggesting that I “do some homework before passing opinions on matters out of [my] depth.”
One suspects that, stung by critical responses to the earlier Reading at Risk report (2004), the Endowment decided this time around that the best defense is a good offense. Bauerlein chastises me for not matching data with data, that is, for failing to provide any quantitative documentation in support of various observations about screen reading and new media (unable to resist the opportunity for insult, he also suggests such indolence is only to be expected of a digital partisan). Yet data wrangling was not the focus of my piece, and I said as much in print: rather, I wanted to raise questions about the NEA’s report in the context of the history of reading, questions which have also been asked by Harvard scholar Leah Price in a recent essay in the New York Times Book Review.
If my work is lacking in statistical heavy mettle, the NEA’s description of reading proceeds as though the last three decades of scholarship by figures like Elizabeth Eisenstein, Harvey Graff, Anthony Grafton, Lisa Jardine, Bill Sherman, Adrian Johns, Roger Chartier, Peter Stallybrass, Patricia Crain, Lisa Gitelman, and many others simply does not exist. But this body of work has demolished the idea that reading is a stable or historically homogeneous activity, thereby ripping the support out from under the quaint notion that the codex book is the simple, self-consistent artifact the reports present it as, while also documenting the numerous varieties of cultural anxiety that have attended the act of reading, including perennial questions over whether we’re reading too little or too much.
It’s worth underscoring that the academic response to the NEA’s two reports has been largely skeptical. Why is this? After all, in the ivied circles I move in, everyone loves books, cherishes reading, and wants people to read more, in whatever venue or medium. I also know that’s true of the people at if:book (and thanks to Ben Vershbow, by the way, for giving me the opportunity to respond here). And yet we bristle at the data as presented by the NEA. Is it because, as academics, eggheads, and other varieties of bookwormish nerds and geeks, we’re all hopelessly ensorcelled by the pleasures of problematizing and complicating rather than accepting hard evidence at face value? Herein lies the curious anti-intellectualism to which I think at least some of us are reacting: an anti-intellectualism that manifests superficially in the rancorous and dismissive tone that Bauerlein and Iyengar have brought to the very conversation they claim they sought to initiate, but which, at its root, is, just possibly, about a frustration that the professors won’t stop indulging their fancy theories and footnotes and ditzy digital rhetoric. (Too much book larnin’ going on up at the college? Is that what I’m reading between the lines?)
Or maybe I’m wrong about that last bit. I hope so. Because as I said in my Chronicle Review piece, there’s no doubt it’s time for a serious conversation about reading. Perhaps we can have a portion of it here on if:book.
Matthew Kirschenbaum
University of Maryland

Related: “the NEA’s misreading of reading”