… is perhaps what the new dogma should be called.
I’ve just reread this piece by Chris Anderson in the hope that I misread it the first time. I don’t think I did. Among other things it suggests:
“Google’s founding philosophy is that we don’t know why this page is better than that one: If the statistics of incoming links say it is, that’s good enough. No semantic or causal analysis is required.”
“Petabytes [of data being crawled by algorithms] allow us to say: “Correlation is enough.” We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.”
The piece is titled The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. My fear is that the claim may actually be right. And this is not a fear rising out of any deep and meaningful relationship with old-school scientific method – I’d consider myself an interpretivist, I suppose, if pushed to declare a broad allegiance. But to see feral, untrammelled positivism taking things by the throat like this is unnerving, to put it mildly. Is this really all we can now look forward to if we turn to research to unpack the conditions of our times and lives?
It’s all too reminiscent of the story told about Mortimore and the research he and his colleagues did in London in the 1970s. It was one of those Big Science pieces where stats and figures played linchpin roles in the findings. Wearing anoraks in classrooms was – we were told in deadly earnest – clearly and demonstrably causal when it came to misbehaviour. (And sod the lack of safe coat storage and faulty heating in run-down buildings.)
If it is, then as a signed-up, paid-up advocate of these digital times I think I want my money back…