If you like Mitchell Stephens's book-blog about the history of atheism, you might want to compare Mitchell's approach to that of "The Long Tail," a book-blog written by Chris Anderson of Wired Magazine. Like Stephens, Anderson is trying to work out his ideas for a future book online: his book looks at the technology-driven atomization of our economy and culture, a phenomenon Anderson (and Wired) doesn't seem particularly troubled by.
On December 18, Anderson wrote a post about what he saw as the real reason people are uncomfortable with Wikipedia: according to Anderson, we're unable to reconcile ourselves to the "alien logic" of probabilistic and emergent systems, which produce "correct" answers on the macro-scale because "they are statistically optimized to excel over time and large numbers," even though no one is really minding the store.
On the one hand, Anderson's been saying what I (and lots of other people) have been saying repeatedly over the past few weeks: acknowledge that sometimes Wikipedia gets things wrong, but also pay attention to the overwhelming number of times the open-source encyclopedia gets things right. At the same time, I'm not comfortable with Anderson's suggestion that our unease exists only because we can't "wrap our heads around" the essential rightness of probabilistic engines, as though skepticism about Wikipedia were nothing more than a failure of intuition. This call for greater faith in the algorithm also troubles Nicholas Carr, who responds agnostically:
Maybe it's just the Christmas season, but all this talk of omniscience and inscrutability and the insufficiency of our mammalian brains brings to mind the classic explanation for why God's ways remain mysterious to mere mortals: "Man's finite mind is incapable of comprehending the infinite mind of God." Chris presents the web's alien intelligence as something of a secular godhead, a higher power beyond human understanding… I confess: I'm an unbeliever. My mammalian mind remains mired in the earthly muck of doubt. It's not that I think Chris is wrong about the workings of "probabilistic systems." I'm sure he's right. Where I have a problem is in his implicit trust that the optimization of the system, the achievement of the mathematical perfection of the macroscale, is something to be desired…. Might not this statistical optimization of "value" at the macroscale be a recipe for mediocrity at the microscale – the scale, it's worth remembering, that defines our own individual lives and the culture that surrounds us?
Carr's point is well-taken: what is valuable about Wikipedia to many of us is not that it is an engine for self-regulation, but that it allows individual human beings to come together to create a shared knowledge resource. Anderson's call for faith in the system swings the pendulum too far in the other direction: while other defenders of Wikipedia have pointed out ways to tinker with the encyclopedia's human interface, Anderson implies that the human interface, at the individual level, doesn't quite matter. I don't find this particularly comforting: in fact, this idea seems much scarier than Seigenthaler's warning that Wikipedia is a playground for "volunteer vandals."