This week in Wired, Chris "Long Tail" Anderson argues that the scientific method is losing steam. Now that we are awash in data, he claims, connections in those giant pools of data are useful on their own; attaching models, theories, and meaning to those correlations has become superfluous (and even impossible, in some cases). He came to this conclusion after considering Google's success with ad placement and translation tools based on pure statistics. The Google servers don't understand the content of the ads they are placing on pages, nor the French text they are converting to Klingon. They don't have to. To take another example, Newton's model for gravitation works great most of the time. But give us some data on ultra-fast and super-tiny things, and it breaks. So we toss the broken model and replace it with relativity. Model-breaking data might appear every few hundred years in a discipline, prompting a revision of its core models (prompting a "scientific revolution," if you are into Kuhn). But the rate at which we can find, store, and use data is quickly accelerating. What happens when these model-breakers hit every decade, or every year, or faster? How could you possibly formulate models to keep up? Unthinking databases are looking better and better.
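To make the Newton-breaks-at-the-extremes point concrete, here's a toy sketch (mine, not Anderson's): compare Newtonian kinetic energy against the relativistic version as speed climbs toward light speed. At everyday velocities the old model is indistinguishable from the new one; feed it fast enough data and it falls apart.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def newton_ke(m, v):
    """Newtonian kinetic energy: (1/2) m v^2 -- fine at everyday speeds."""
    return 0.5 * m * v**2

def einstein_ke(m, v):
    """Relativistic kinetic energy: (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

# The old model agrees with the new one at low speed, then diverges badly.
for frac in (0.001, 0.1, 0.9):
    v = frac * C
    ratio = newton_ke(1.0, v) / einstein_ke(1.0, v)
    print(f"v = {frac:>5}c   Newton/Einstein = {ratio:.4f}")
```

At a thousandth of light speed the ratio is essentially 1; at 0.9c Newton's formula recovers less than a third of the true energy. Same data-generating universe, broken model.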
This week in Ars Technica, John Timmer discusses why he thinks Chris Anderson is an idiot child.
This week in IEEE Spectrum, the focus is on the Singularity. The Singularity is a vague, hotly disputed, and probably indescribable concept. For now, let's go with: the Singularity is the point where technological progress radically changes the nature of technological progress itself. Side effects may include: the elimination of scarcity, human immortality, total submission to brutal machine overlords, and televisions the size of the moon. Here's a gallery of random smart folk weighing in, roughly in the style of the Onion's What Do You Think?