Author Archives: ben vershbow

a fork in the road III: fork it over

Another funny thing about Larry Sanger’s idea of a progressive fork off of Wikipedia is that he can do nothing, under the terms of the Free Documentation License, to prevent his expert-improved content from being reabsorbed by Wikipedia. In other words, the better the Citizendium becomes, the better Wikipedia becomes — but not vice versa. In the Citizendium (the name still refuses to roll off the tongue), forks are definitive: the moment a new edit is made, an article’s course is forever re-charted away from Wikipedia. So, assuming anything substantial comes of the Citizendium, feeding well-checked, better-written content to Wikipedia could end up being its real value. But could it sustain itself under such uninspiring circumstances? The result might be that the experts themselves fork back as well.

a fork in the road II: shirky on citizendium

Clay Shirky has some interesting thoughts on why Larry Sanger’s expert-driven Wikipedia spinoff Citizendium is bound to fail. At the heart of it is Sanger’s notion of expertise, which rests largely on institutional warrants like academic credentials, yet Citizendium lacks the institutional framework that gives such warrants their force. In other words, experts are “social facts” that rely on culturally manufactured perceptions and deferences, which may not be transferable to an online project like the Citizendium. Sanger envisions a kind of romance between benevolent academics and an adoring public that feels privileged to take part in a distributed apprenticeship. In reality, Shirky says, this hybrid of Wikipedia-style community and top-down editorial enforcement is likely to collapse under its own contradictions. Shirky:

Citizendium is based less on a system of supportable governance than on the belief that such governance will not be necessary, except in rare cases. Real experts will self-certify; rank-and-file participants will be delighted to work alongside them; when disputes arise, the expert view will prevail; and all of this will proceed under a process that is lightweight and harmonious. All of this will come to naught when the citizens rankle at the reflexive deference to editors; in reaction, they will debauch self-certification…contest expert prerogatives, raising the cost of review to unsupportable levels…take to distributed protest…or simply opt-out.

Shirky makes a point at the end of his essay that I found especially insightful. He compares the “mechanisms of deference” at work in Wikipedia and in the proposed Citizendium. In other words, how in these two systems does consensus crystallize around an editorial action? What makes people say, ok, I defer to that?

The philosophical issue here is one of deference. Citizendium is intended to improve on Wikipedia by adding a mechanism for deference, but Wikipedia already has a mechanism for deference — survival of edits. I recently re-wrote the conceptual recipe for a Menger Sponge, and my edits have survived, so far. The community has deferred not to me, but to my contribution, and that deference is both negative (not edited so far) and provisional (can always be edited.)
Deference, on Citizendium, will be for people, not contributions, and will rely on external credentials, a priori certification, and institutional enforcement. Deference, on Wikipedia, is for contributions, not people, and relies on behavior on Wikipedia itself, post hoc examination, and peer review. Sanger believes that Wikipedia goes too far in its disrespect of experts; what killed Nupedia and will kill Citizendium is that they won’t go far enough.

My only big problem with this piece is that it’s too easy on Wikipedia. Shirky’s primary interest is social software, so the big question for him is whether a system will foster group interaction — Wikipedia’s has proven to do so, and there’s reason to believe that Citizendium’s will not. Fair enough. But Shirky doesn’t acknowledge that Wikipedia suffers from some of the same problems he claims will inevitably plague Citizendium, the most obvious being insularity. Like it or not, there is in Wikipedia de facto top-down control by self-appointed experts: the cliquish inner core of editors that over time has become increasingly hard to penetrate. It’s not part of Wikipedia’s policy, and it certainly goes against the spirit of the enterprise, but it exists nonetheless. These may not be experts as defined by Sanger, but they certainly are “social facts” within the Wikipedia culture, and they’ve even devised semi-formal credential systems like barnstars to adorn their user profiles and perhaps cow more novice users. I still agree with Shirky’s overall prognosis, but it’s worth thinking about some of the problems that Sanger is trying to address, albeit in a misconceived way.

a fork in the road for wikipedia

Estranged Wikipedia cofounder Larry Sanger has long argued for a more privileged place for experts in the Wikipedia community. Now his dream may finally be realized. A few days ago, he announced a new encyclopedia project that will begin as a “progressive fork” off of the current Wikipedia. Under the terms of the GNU Free Documentation License, anyone is free to reproduce and alter content from Wikipedia on an independent site as long as the new version is made available under those same terms. Like its antecedent, the new Citizendium, or “Citizens’ Compendium”, will rely on volunteers to write and develop articles, but under the direction of self-nominated expert subject editors. Sanger, who is currently recruiting startup editors and assembling an advisory board, says a beta of the site should be up by the end of the month.

We want the wiki project to be as self-managing as possible. We do not want editors to be selected by a committee, which process is too open to abuse and politics in a radically open and global project like this one is. Instead, we will be posting a list of credentials suitable for editorship. (We have not constructed this list yet, but we will post a draft in the next few weeks. A Ph.D. will be neither necessary nor sufficient for editorship.) Contributors may then look at the list and make the judgment themselves whether, essentially, their CVs qualify them as editors. They may then go to the wiki, place a link to their CV on their user page, and declare themselves to be editors. Since this declaration must be made publicly on the wiki, and credentials must be verifiable online via links on user pages, it will be very easy for the community to spot false claims to editorship.
We will also no doubt need a process where people who do not have the credentials are allowed to become editors, and where (in unusual cases) people who have the credentials are removed as editors. (link)

Initially, this process will be coordinated by “an ad hoc committee of interim chief subject editors.” Eventually, more permanent subject editors will be selected through a process yet to be determined.
Another big departure from Wikipedia: all authors and editors must be registered under their real name.
More soon…
Reports in Ars Technica and The Register.

GAM3R 7H30RY found in online library catalog

Here’s a wonderful thing I stumbled across the other day: GAM3R 7H30RY has its very own listing in North Carolina State University’s online library catalog.
[screenshot: the NCSU catalog record for GAM3R 7H30RY]
The catalog is worth browsing in general. Since January, it’s been powered by Endeca, a fantastic library search tool that, among many other things, preserves some of the serendipity of physical browsing by letting you search the shelves around your title.
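As a rough illustration of that “virtual shelf” idea (my sketch, not Endeca’s actual implementation; the catalog entries below are invented): keep the records sorted by call number, find your title, and return its neighbors.

```python
import bisect

# Toy catalog sorted by call number; entries invented for illustration.
# Real call-number collation is subtler than plain string comparison.
SHELF = sorted([
    ("GV1469.15 .J86", "Half-Real"),
    ("GV1469.3 .W37", "Gamer Theory"),
    ("QA76.76 .C672", "Rules of Play"),
    ("Z674.75 .S63", "Social Software in Libraries"),
])

def browse_nearby(call_number: str, radius: int = 2) -> list:
    """Return the records shelved just before and after a call number."""
    i = bisect.bisect_left(SHELF, (call_number,))
    return SHELF[max(0, i - radius): i + radius + 1]
```

A sorted index is all it takes to recover physical adjacency: the books you would have glimpsed next to your title on the shelf come back as neighbors in the result.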
(Thanks, Monica McCormick!)

perelman’s proof / wsj on open peer review

Last week got off to an exciting start when the Wall Street Journal ran a story about “networked books,” the Institute’s central meme and very own coinage. It turns out we were quoted in another WSJ item later that week, this time looking at the science journal Nature, which over the summer has been experimenting with opening up its peer review process to the scientific community (unfortunately, this article, like the networked books piece, is subscriber-only).
I like this article because it smartly weaves in the story of Grigory (Grisha) Perelman, which I had meant to write about earlier. Perelman is a Russian topologist who last month shocked the world by turning down the Fields Medal, the highest honor in mathematics. He was awarded the prize for unraveling a famous geometry problem that had baffled mathematicians for a century.
There’s an interesting publishing angle to this, which is that Perelman never submitted his groundbreaking papers to any mathematics journals, but posted them directly to ArXiv.org, an open “pre-print” server hosted by Cornell. This, combined with a few emails notifying key people in the field, guaranteed serious consideration for his proof, and led to its eventual warranting by the mathematics community. The WSJ:

…the experiment highlights the pressure on elite science journals to broaden their discourse. So far, they have stood on the sidelines of certain fields as a growing number of academic databases and organizations have gained popularity.
One Web site, ArXiv.org, maintained by Cornell University in Ithaca, N.Y., has become a repository of papers in fields such as physics, mathematics and computer science. In 2002 and 2003, the reclusive Russian mathematician Grigory Perelman circumvented the academic-publishing industry when he chose ArXiv.org to post his groundbreaking work on the Poincaré conjecture, a mathematical problem that has stubbornly remained unsolved for nearly a century. Dr. Perelman won the Fields Medal, for mathematics, last month.

(Warning: obligatory horn toot.)

“Obviously, Nature’s editors have read the writing on the wall [and] grasped that the locus of scientific discourse is shifting from the pages of journals to a broader online conversation,” wrote Ben Vershbow, a blogger and researcher at the Institute for the Future of the Book, a small Brooklyn, N.Y., nonprofit, in an online commentary. The institute is part of the University of Southern California’s Annenberg Center for Communication.

Also worth reading is this article by Sylvia Nasar and David Gruber in The New Yorker, which reveals Perelman as a true believer in the gift economy of ideas:

Perelman, by casually posting a proof on the Internet of one of the most famous problems in mathematics, was not just flouting academic convention but taking a considerable risk. If the proof was flawed, he would be publicly humiliated, and there would be no way to prevent another mathematician from fixing any errors and claiming victory. But Perelman said he was not particularly concerned. “My reasoning was: if I made an error and someone used my work to construct a correct proof I would be pleased,” he said. “I never set out to be the sole solver of the Poincaré.”

Perelman’s rejection of all conventional forms of recognition is difficult to fathom at a time when every particle of information is packaged and owned. He seems almost like a kind of mystic, a monk who abjures worldly attachment and dives headlong into numbers. But according to Nasar and Gruber, both Perelman’s flouting of academic publishing protocols and his refusal of the Fields medal were conscious protests against what he saw as the petty ego politics of his peers. He claims now to have “retired” from mathematics, though presumably he’ll continue to work on his own terms, in between long rambles through the streets of St. Petersburg.
Regardless, Perelman’s case is noteworthy as an example of the kind of critical discussions that scholars can now orchestrate outside the gate. This sort of thing is generally more in evidence in the physical and social sciences, but it ought also to be of great interest to scholars in the humanities, who have only just begun to explore the possibilities. Indeed, these are among our chief inspirations for MediaCommons.
Academic presses and journals have long functioned as the gatekeepers of authoritative knowledge, determining which works see the light of day and which ones don’t. But open repositories like ArXiv have utterly changed the calculus, and Perelman’s insurrection only serves to underscore this fact. Given the abundance of material being published directly from author to public, the critical task for the editor now becomes that of determining how works already in the daylight ought to be received. Publishing isn’t an endpoint, it’s the beginning of a process. The networked press is a guide, a filter, and a discussion moderator.
Nature seems to grasp this and is trying with its experiment to reclaim some of the space that has opened up in front of its gates. Though I don’t think they go far enough to effect serious change, their efforts certainly point in the right direction.

carleton roadtrip begins at the institute

On Wednesday, we had the pleasure of spending an afternoon with a group of 22 students from Carleton College who are spending a trimester studying and making digital media under the guidance of John Schott, a professor in the Dept. of Cinema & Media Studies specializing in “personal media” production and network culture. This year, his class is embarking on an off-campus study, a ten-week odyssey beginning in Northfield, Minnesota and taking them to New York, London, Amsterdam and Berlin. At each stop, they’ll be visiting with new media producers, attending festivals and exhibitions, and documenting their travels in a variety of forms. Needless to say, we’re deeply envious.
The Institute was the first stop on their trip, so we tried to start things off with a flourish. After a brief peek at the office, we brought the class over to Monkeytown, a local cafe and video club with a fantastic cube-shaped salon in the back where gigantic projection screens hang on each of the four walls.
[photo: Bob Stein presenting at Monkeytown]
Hooking our computers up to the projectors, we took the students on a tour of what we do: showed them our projects, talked about networked books (it was surreal to see GAM3R 7H30RY blown up 20 feet wide, wavering slightly in the central AC), and finished with a demo of Sophie. John Schott wrote a nice report about our meeting on the class’s blog. Also, for a good introduction to John’s views on personal media production and media literacy, take a look at this interview he gave on a Minnesota video blog back in March.
This is a great group of students he’s assembled, with interests ranging from film production to philosophy to sociology. They also seem to like Macs. This could be an ad for the new MacBook:
[photo: a line of laptops]
We’ve invited them back toward the end of their three weeks in New York to load Sophie onto their laptops before they head off to Europe.
(photos by John Schott)

wikipedia-britannica debate

The Wall Street Journal the other day hosted an email debate between Wikipedia founder Jimmy Wales and Encyclopedia Britannica editor-in-chief Dale Hoiberg. Irreconcilable differences, not surprisingly, were in evidence. But one thing that was mentioned, which I had somehow missed recently, was a new governance experiment just embarked upon by the German Wikipedia that could dramatically reduce vandalism, though some say at serious cost to Wikipedia’s openness. In the new system, live pages will no longer be instantaneously editable except by users who have been registered on the site for a certain (as yet unspecified) length of time, “and who, therefore, [have] passed a threshold of trustworthiness” (CNET). All edits will still be logged, but they won’t be reflected on the live page until that version has been approved as “non-vandalized” by more senior administrators. One upshot of the new German policy is that Wikipedia’s front page, which has long been completely closed to instantaneous editing, has effectively been reopened, at least for these “trusted” users.
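To make the mechanics concrete, here is a minimal sketch in Python of how such a two-tier review queue might be modeled. All names are hypothetical, as is the 30-day threshold (the German Wikipedia had not yet specified one), and the real implementation is of course far more involved.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

TRUST_THRESHOLD = timedelta(days=30)  # hypothetical; actual length unspecified

@dataclass
class User:
    name: str
    registered: datetime

@dataclass
class Page:
    live_text: str = ""
    pending: list = field(default_factory=list)  # logged edits awaiting review

    def is_trusted(self, user: User, now: datetime) -> bool:
        """Has the user 'passed a threshold of trustworthiness'?"""
        return now - user.registered >= TRUST_THRESHOLD

    def edit(self, user: User, text: str, now: datetime) -> None:
        """Trusted users' edits go live at once; all others are logged but held."""
        if self.is_trusted(user, now):
            self.live_text = text
        else:
            self.pending.append((user.name, text))

    def approve(self, reviewer: User, index: int, now: datetime) -> None:
        """A senior user marks a held version 'non-vandalized', publishing it."""
        if not self.is_trusted(reviewer, now):
            raise PermissionError(f"{reviewer.name} has not passed the trust threshold")
        _, text = self.pending.pop(index)
        self.live_text = text
```

Note that every edit is preserved either way; what the policy changes is only which version the public sees first.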
In general, I believe that these sorts of governance measures are a sign not of a creeping conservatism, but of the growing maturity of Wikipedia. But it’s a slippery slope. In the WSJ debate, Wales repeatedly assails the elitism of Britannica’s closed editorial model. But over time, Wikipedia could easily find itself drifting in that direction, with a steadily hardening core of overseers exerting ever tighter control. Of course, even if every single edit were moderated, it would still be quite a different animal from Britannica, but Wales and his council of Wikimedians shouldn’t stray too far from what made Wikipedia work in the first place, and from what makes it so interesting.
In a way, the exchange of barbs in the Wales-Hoiberg debate conceals a strange magnetic pull between their respective endeavors. Though increasingly seen as the dinosaur, Britannica has made small but not insignificant moves toward openness and currency on its website (Hoiberg describes some of these changes in the exchange), while Wikipedia is to a certain extent trying to domesticate itself in order to attain the holy grail of respectability that Britannica has long held. Think what you will about Britannica’s long-term prospects, but it’s a mistake to see this as a clear-cut story of violent succession, of Wikipedia steamrolling Britannica into obsolescence. It’s more interesting to observe the subtle ways in which the two encyclopedias cause each other to evolve.
Wales certainly has a vision of openness, but he also wants to publish the world’s best encyclopedia, and this includes releasing something that more closely resembles a Britannica. Back in 2003, Wales proposed the idea of culling Wikipedia’s best articles to produce a sort of canonical version, a Wikipedia 1.0, that could be distributed on discs and printed out across the world. Versions 1.1, 1.2, 2.0 etc. would eventually follow. This is a perfectly good idea, but it shouldn’t be confused with the goals of the live site. I’m not saying that the “non-vandalized” measure was constructed specifically to prepare Wikipedia for a more “authoritative” print edition, but the trains of thought seem to have crossed. Marking versions of articles as non-vandalized, or distinguishing them in other ways, is a good thing to explore, but not at the expense of openness at the top layer. It’s that openness, crazy as it may still seem, that has lured millions into this weird and wonderful collective labor.

amazon looks to “kindle” appetite for ebooks with new device

Engadget has uncovered details about a soon-to-be-released (or old, or bogus?) Amazon ebook reading device called the “Kindle,” which appears to have an e-ink display and will presumably compete with the Sony Reader. From the basic specs they’ve posted, it looks like Kindle wins: it’s got more memory, it’s got a keyboard, and it can connect to the network (update: though only through the EV-DO wireless standard, which connects Blackberries and some cellphones; in other words, no basic wifi). This is all assuming that the thing actually exists, which we can’t verify.
Regardless, it seems the history of specialized ebook devices is doomed to repeat itself. Better displays (and e-ink is still a few years away from being really good) and more sophisticated content delivery won’t, in my opinion, make these machines much more successful than their discontinued forebears like the Gemstar or the eBookMan.
Ebooks, at least the kind Sony and Amazon will be selling, dwell in a no man’s land of misbegotten media forms: pale simulations of print that harness few of the possibilities of the digital (apparently, the Sony Reader won’t even have searchable text!). Add highly restrictive DRM and vendor lock-in through proprietary formats and device-specific stores, and you’ve got something truly depressing.
Publishers need to get out of this rut. The future is in networked text, multimedia and print on demand. Ebooks and their specialized hardware are a red herring.
Teleread also comments.

wall street journal on networked books and GAM3R 7H30RY

The Wall Street Journal has a big story today on networked books (unfortunately, behind a paywall). The article covers three online book experiments, Pulse, The Wealth of Networks, and GAM3R 7H30RY. The coverage is not particularly revelatory. What’s notable is that the press, which over the past decade-plus has devoted so much ink and so many pixels to ebooks, is now beginning to take a look at a more interesting idea. (“The meme is launched!” writes McKenzie.) Here’s the opening section:

Boundless Possibilities
As ‘networked’ books start to appear, consumers, publishers and authors get a glimpse of publishing to come
“Networked” books — those written, edited, published and read online — have been the coming thing since the early days of the Internet. Now a few such books have arrived that, while still taking shape, suggest a clearer view of the possibilities that lie ahead.
In a fairly radical turn, one major publisher has made a networked book available free online at the same time the book is being sold in stores. Other publishers have posted networked titles that invite visitors to read the book and post comments. One author has posted a draft of his book; the final version, he says, will reflect suggestions from his Web readers.
At their core, networked books invite readers online to comment on a written text, and more readers to comment on those comments. Wikipedia, the open-source encyclopedia, is the ultimate networked book. Along the way, some who participate may decide to offer up chapters translated into other languages, while still others launch Web sites where they foster discussion groups centered on essays inspired by the original text.
In that sense, networked books are part of the community-building phenomenon occurring all over the Web. And they reflect a critical issue being debated among publishers and authors alike: Does the widespread distribution of essentially free content help or hinder sales?

If the Journal would make this article available, we might be able to debate the question more freely.

“a duopoly of Reuters-AP”: illusions of diversity in online news

Newswatch reports a powerful new study by the University of Ulster Centre for Media Research that confirms what many of us have long suspected about online news sources:

Through an examination of the content of major web news providers, our study confirms what many web surfers will already know – that when looking for reporting of international affairs online, we see the same few stories over and over again. We are being offered an illusion of information diversity and an apparently endless range of perspectives, when in fact what is actually being offered is very limited information.

The appearance of diversity can be a powerful thing. Back in March, 2004, the McClatchy Washington Bureau (then Knight Ridder) put out a devastating piece revealing how the Iraqi National Congress (Ahmad Chalabi’s group) had fed dubious intelligence on Iraq’s WMDs not only to the Bush administration (as we all know), but to dozens of news agencies. The effect was a swarm of seemingly independent yet mutually corroborating reportage, edging American public opinion toward war.

A June 26, 2002, letter from the Iraqi National Congress to the Senate Appropriations Committee listed 108 articles based on information provided by the INC’s Information Collection Program, a U.S.-funded effort to collect intelligence in Iraq.
The assertions in the articles reinforced President Bush’s claims that Saddam Hussein should be ousted because he was in league with Osama bin Laden, was developing nuclear weapons and was hiding biological and chemical weapons.
Feeding the information to the news media, as well as to selected administration officials and members of Congress, helped foster an impression that there were multiple sources of intelligence on Iraq’s illicit weapons programs and links to bin Laden.
In fact, many of the allegations came from the same half-dozen defectors, weren’t confirmed by other intelligence and were hotly disputed by intelligence professionals at the CIA, the Defense Department and the State Department.
Nevertheless, U.S. officials and others who supported a pre-emptive invasion quoted the allegations in statements and interviews without running afoul of restrictions on classified information or doubts about the defectors’ reliability.
Other Iraqi groups made similar allegations about Iraq’s links to terrorism and hidden weapons that also found their way into official administration statements and into news reports, including several by Knight Ridder.

The repackaging of information goes into overdrive with the internet, and everyone, from the lone blogger to the mega news conglomerate, plays a part. Moreover, it’s in the interest of aggregators and portals like Google and MSN to emphasize cosmetic or brand differences, so as to bolster their claims as indispensable filters for a tidal wave of news. So whether it’s Bush-Cheney-Chalabi’s WMDs or Google News’s “4,500 news sources updated continuously,” we need to keep a skeptical eye.
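As a back-of-the-envelope way to test the diversity claim yourself (a sketch of standard near-duplicate detection, not the Ulster centre’s actual methodology): compare the word-shingle overlap of two stories. Lightly re-headlined wire copy scores close to 1.0, while independently reported stories do not.

```python
def shingles(text: str, k: int = 5) -> set:
    """Break a story into overlapping k-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(0, len(words) - k + 1))}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of two stories' shingle sets (1.0 = verbatim copy)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa or sb) else 0.0
```

Run that across the international sections of a few “different” portals and, if the Ulster findings hold, you would see dozens of mastheads sharing a handful of distinct shingle sets.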
Related: myths of diversity in book publishing and large-scale digitization efforts.