Monthly Archives: February 2007

feeling random

Following HarperCollins’ recent Web renovations, Random House today unveiled its publisher-driven alternative to Google: a new, full-text search engine of over 5,000 new and backlist books, including browsable samples of select titles. The most interesting thing here is that book samples can be syndicated on other websites through a page-flipping browser widget (Flash 9 required) that you embed with a bit of cut-and-paste code (like a YouTube clip). It’s a nice little tool, though it comes in only two sizes: one that’s too small to read, and one that, embedded, would take up most of a web page (plus it keeps crashing my browser). Compare below with HarperCollins’ simpler embeddable book link:



Worth noting here is that both the search engine and the sampling widget were produced by Random House in-house. Too many digital forays by major publishers are accomplished by hiring an external Web shop, meaning, of course, that little ends up being learned within the institution. It’s an old mantra of Bob’s that publishers’ digital budgets would be better spent throwing 20 grand at a bright young editor or assistant editor a few years out of college and charging them with doing something interesting than pouring huge sums into elaborate revampings from the outside. Random House’s recent home improvements were almost certainly more expensive, and more focused on infrastructure and marketing than on genuinely reinventing books, but they indicate a do-it-yourself approach that could, maybe, lead in new directions.

were the fears of big brother overstated?

The NY Times published an article yesterday about Stewart Brand’s embrace of nuclear energy and genetically engineered foods. Here is a quote:

He thinks the fears of genetically engineered bugs causing disaster are as overstated as the counterculture’s fears of computers turning into Big Brother. “Starting in the 1960s, hackers turned computers from organizational control machines into individual freedom machines,” he told Conservation magazine last year. “Where are the green biotech hackers?”

So what do you think? Were the fears of Big Brother overstated? Did hackers successfully turn computers into individual freedom machines?

who knew?

The following exchange occurred this morning during a long IM session with a close friend and colleague:
[image: dripping sarcasm]
Turns out there is an actual punctuation mark in French to indicate irony, which you can read about in this Wikipedia article.
[image: irony marks]
I don’t actually use emoticons because I find them so aesthetically uninteresting, so I love the idea of a new class of punctuation marks evolving to take the place of the smiley face in all its saccharine implementations.

gift economy or honeymoon?

There was some discussion here last week about the ethics and economics of online publishing following the Belgian court’s ruling against Google News in a copyright spat with the Copiepresse newspaper group. The crux of the debate: should creators of online media — whether major newspapers or small-time blogs, TV networks or tiny web video impresarios — be entitled to a slice of the pie on ad-supported sites in which their content is the main driver of traffic?
It seems to me that there’s a difference between a search service like Google News, which shows only excerpts and links back to original pages, and a social media site like YouTube, where user-created media is the content. There’s a general agreement in online culture about the validity of search engines: they index the Web for us and make it usable, and if they want to finance the operation through peripheral advertising then more power to them. The economics of social media sites, on the other hand, are still being worked out.
For now, the average YouTube-er is happy to generate the site’s content pro bono. But this could just be the honeymoon period. As big media companies begin securing revenue-sharing deals with YouTube and its competitors (see the recent YouTube-Viacom negotiations and the entrance of Joost onto the web video scene), independent producers may begin to ask why they’re getting the short end of the stick. An interesting thing to watch out for in the months and years ahead is whether (and if so, how) smaller producers start organizing into bargaining collectives. Imagine a labor union of top YouTube broadcasters threatening a freeze on new content unless moneys get redistributed. A similar thing could happen on community-filtered news sites like Digg, Reddit and Netscape in which unpaid users serve as editors and tastemakers for millions of readers. Already a few of the more talented linkers are getting signed up for paying gigs.
Justin Fox has a smart piece in Time looking at the explosion of unpaid peer production across the Net and at some of the high-profile predictions that have been made about how this will develop over time. On the one side, Fox presents Yochai Benkler, the Yale legal scholar who last year published a landmark study of the new online economy, The Wealth of Networks. Benkler argues that the radically decentralized modes of knowledge production that we’re seeing emerge will thrive well into the future on volunteer labor and non-proprietary information cultures (think open source software or Wikipedia), forming a ground-level gift economy on which other profitable businesses can be built.
Less sure is Nicholas Carr, an influential skeptic of most new Web crazes, who insists that it’s only a matter of time (about a decade) before new markets are established for the compensation of network labor. Carr has frequently pointed to the proliferation of governance measures on Wikipedia as a creeping professionalization of that project and evidence that the hype of cyber-volunteerism is overblown. As creative online communities become more structured and the number of eyeballs on them increases, so this argument goes, new revenue structures will almost certainly be invented. Carr cites Internet entrepreneur Jason Calacanis, founder of the for-profit blog network Weblogs, Inc., who proposes the following model for the future of network publishing: “identify the top 5% of the audience and buy their time.”
Taken together, these two positions have become known as the Carr-Benkler wager, an informal bet sparked by their critical exchange: that within two to five years we should be able to ascertain the direction of the trend, whether it’s the gift economy that’s driving things or some new distributed form of capitalism. Where do you place your bets?

mashups made easy

Yahoo! recently announced a new service called Pipes that aims to bring the ability to create “mash-ups” to the common folk.
As always, Tim O’Reilly has a very good description:

Yahoo!’s new Pipes service is a milestone in the history of the internet. It’s a service that generalizes the idea of the mash-up, providing a drag and drop editor that allows you to connect internet data sources, process them, and redirect the output. Yahoo! describes it as “an interactive feed aggregator and manipulator” that allows you to “create feeds that are more powerful, useful and relevant.” While it’s still a bit rough around the edges, it has enormous promise in turning the web into a programmable environment for everyone.
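
To make the idea concrete, here’s a rough, hand-rolled sketch of the kind of feed mash-up Pipes lets you assemble by dragging boxes around: pull in a few data sources, filter them, merge and re-sort them, and emit the result. It’s written in Python using the third-party feedparser library; the feed URLs and the keyword are placeholders of my own, not anything drawn from Pipes itself.

```python
# A Pipes-style mash-up done by hand: fetch several feeds, keep only the
# items whose titles mention a keyword, merge them, and sort newest-first.
# The URLs below are placeholders; feedparser is a third-party library.
import time
import feedparser

SOURCES = [
    "http://example.com/feed1.rss",
    "http://example.com/feed2.rss",
]
KEYWORD = "books"

items = []
for url in SOURCES:
    feed = feedparser.parse(url)                    # fetch and parse one source
    for entry in feed.entries:
        if KEYWORD.lower() in entry.get("title", "").lower():
            items.append(entry)                     # filter step

# Merge the surviving entries and sort them by publication date, newest first.
items.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0), reverse=True)

for entry in items:                                 # output step
    print(entry.get("title"), "-", entry.get("link"))
```

Pipes wires the equivalent steps together visually and republishes the result as a new feed, which is where both the real power and, as discussed below, the real dependency problem come in.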

While undeniably exciting, this technology reminds me of a concern I had and wrote about just a few months ago: the ethics of software in the networked world.
The basic problem is that having data spread across large and unreliable networks can lead to a chain reaction of unintended consequences when a service is interrupted. For example, imagine Google Maps changed the way a fundamental part of its mapping tool worked: since the changes apply immediately to everyone using the network, serious problems can arise as dependence on these tools increases.
Also, responsibility for managing problems becomes a lot harder to track down when the network of dependencies grows complex, and adding a new layer of abstraction, as in Yahoo! Pipes, can potentially exacerbate the problem if there is no clear agreement of expectations between the parties involved.
I think one of the reasons that licenses like the GPL and the Creative Commons licenses are popular is that they clearly communicate to the parties involved what their rights are, without ever having to explain the complexities of copyright law. It would make sense to come up with similar agreements between nodes in a network on the issues I raised above as we move more of our crucial applications to the web. The problem is, who would ever want to take responsibility for problems that appear far removed? Would there be any interest in creating a network collective of small pieces, closely joined?

monkeybook

[image: Monkeytown sketch] New York readers, save the date!
Next Wednesday the 28th the Institute is hosting the first of what we hope will be a monthly series of new media evenings at Brooklyn’s premier video salon and A/V sandbox, Monkeytown. We’re kicking things off with a retrospective of work by our longtime artist in residence, Alex Itin. February 15th marked the second anniversary of Alex’s site IT IN place, which we’re preparing to relaunch with a spruced-up design and a gorgeous new interface to the archives (design of this interface chronicled here and here). We’d love to see you there.
For those of you who don’t know it, Monkeytown is unique among film venues in New York — an intimate rear room with a gigantic screen on each of its four walls, low comfy sofas and fantastic food. A strange and special place. If you think you can come, be sure to make a reservation ASAP as seating will be tight.
More info about the event here.

an encyclopedia of arguments

I just came across this, though apparently it’s been up and running since last summer. Debatepedia is a free, wiki-based encyclopedia where people can collaboratively research and write outlines of arguments on contentious subjects — stem cell research, same-sex marriage, how and when to withdraw from Iraq (it appears to be focused in practice if not in policy on US issues) — assembling what are essentially roadmaps to important debates of the moment. Articles are organized in “logic trees,” a two-column layout in which pros and cons, fors and againsts, yeas and nays are placed side by side for each argument and its attendant sub-questions. A fairly strict citations policy ensures that each article also serves as a link repository on its given topic.
[image: Debatepedia] This is an intriguing adaptation of the Wikipedia model — an inversion, you could say, in that it effectively raises the “talk” pages (discussion areas behind an article) to the fore. Instead of “neutral point of view,” with debates submerged, you have an emphasis on the many-sidedness of things. The problem of course is that Debatepedia’s format suggests that all arguments are binary. The so-called “logic trees” are more like logic switches, flipped on or off, left or right — a crude reduction of what an argument really is.
I imagine they used the two-column format for simplicity’s sake — to create a consistent and accessible form throughout the site. It’s true that representing the full complexity of a subject on a two-dimensional screen lies well beyond present human capabilities, but still there has to be some way to present a more shaded spectrum of thought — to triangulate multiple perspectives and still make the thing readable and useful (David Weinberger has an inchoate thought along similar lines w/r/t NPR stories and research projects for listeners — taken up by Doc Searls).
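Just as a thought experiment, here’s what a less binary “logic tree” might look like as a data structure: each question carries any number of positions rather than a fixed pro/con pair, and each position can open sub-questions of its own. This is a sketch of my own (in Python), not anything Debatepedia actually uses.

```python
# A toy model of a non-binary "logic tree": a question holds any number of
# positions (not just pro and con), and each position can branch into further
# sub-questions. Names and sample content are purely illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Position:
    summary: str
    sources: List[str] = field(default_factory=list)        # citation links
    sub_questions: List["Question"] = field(default_factory=list)


@dataclass
class Question:
    text: str
    positions: List[Position] = field(default_factory=list)


tree = Question(
    text="How and when should troops withdraw from Iraq?",
    positions=[
        Position(summary="Withdraw on a fixed timetable"),
        Position(summary="Withdraw as conditions on the ground allow"),
        Position(summary="Redeploy regionally rather than withdraw outright"),
    ],
)
print(len(tree.positions), "positions, not two")
```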
I’m curious to hear what people think. Pros? Cons? Logic tree anyone?

try to take some time from your busy day

to read the recent interview with humanist and computer scientist Alan Kay. Here’s a sample . . .

The things that are wrong with the Web today are due to this lack of curiosity in the computing profession. And it’s very characteristic of a pop culture. Pop culture lives in the present; it doesn’t really live in the future or want to know about great ideas from the past. I’m saying there’s a lot of useful knowledge and wisdom out there for anybody who is curious, and who takes the time to do something other than just executing on some current plan. Cicero said, “Who knows only his own generation remains always a child.” People who live in the present often wind up exploiting the present to an extent that it starts removing the possibility of having a future.

google library dominoes

Princeton is the latest university to partner up with the Google library project, signing an agreement to have 1 million public domain books scanned over the next six years. Over at ALA Techsource Tom Peters voices the growing unease among librarians worried about the long-term implications of commercial enclosure of the world’s leading research libraries.

atomisation, part two

In the last few weeks a number of people have sent me a link to Michael Wesch’s video meditation on the evolution of media and its likely impact on all aspects of human interaction. One of Wesch’s main points is that the development of XML enables the separation of form from content, which in turn is fueling the transition to new modes of communication.
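
Here’s a tiny illustration of that separation, just to make the point concrete: a single item of content held as bare XML, then poured into two different presentational forms. The markup and the renderings are generic examples of my own, not anything taken from Wesch’s video.

```python
# A minimal sketch of "form separated from content": the same XML item
# rendered two different ways. The markup is generic, made-up RSS-style XML.
import xml.etree.ElementTree as ET

item_xml = """
<item>
  <title>Sample post</title>
  <description>The same content can be poured into many forms.</description>
</item>
"""

item = ET.fromstring(item_xml)
title = item.findtext("title")
description = item.findtext("description")

# Form 1: plain text, roughly as an RSS reader would show it.
print(f"{title}\n{description}\n")

# Form 2: HTML, as a styled web page might present the very same content.
print(f"<h1>{title}</h1>\n<p>{description}</p>")
```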

Paradoxically, Wesch’s video works precisely because of the integration of form and content . . . possibly one of the best uses of animated text and moving images in the service of a new kind of expository essay. If you simply read the text in an RSS reader it wouldn’t have anywhere near the impact it does. Although Wesch’s essay depends on the unity of form and content, he is certainly right about the increasing trend on the web to decontextualize content by making it independent of form. If McLuhan was right about the medium being a crucial part of the message, then, when we encounter the same content in different forms, are we getting the same message? If not, what does this mean for social discourse going forward?