The always top-notch 3:AM Magazine, "Whatever it is, We're Against it", just posted an interview with none other than Simon Blackburn. Of special note are his reading suggestions at the close of the interview.
An article in the Times Higher Education (UK) discusses recent books on the subversion of academia by industry and the response from honest scholars. From the article,
Doubt is the lifeblood of the academy. Historians and political scientists try never to take on trust any public statement that cannot be independently verified. Scientists look for every possible alternative factor and explanation before claiming that there is a causal link between A and B. Philosophers have even been known not to take their own existence for granted. An attitude of radical scepticism is essential for most serious research.
Yet there is also a point at which such scepticism becomes pathological and irresponsible. Whole industries have an interest in casting doubt on the overwhelming evidence that smoking damages health, that nuclear energy imposes substantial risks, that climate change is taking place and that the pre-credit crunch banking system was a house of cards. Academics who cultivate the art of spreading doubt - what one scholar calls "agnotology" - are often de facto protecting corporate profits and discouraging governments and individuals from taking action. They also give authority to views that would be taken with a large pinch of salt if put forward by journalists, lawyers or public relations firms.
One particularly soul-crushing tidbit from the article: the need to control for industry funding when conducting meta-analyses.
As the semester gets rolling, inexorably crushing the opportunities one might have had for sleep, it is important to remember that good sleep habits are as necessary as good nutrition for thinking clearly, as this Boing Boing post discusses.
According to this report in Bloomberg News, economist and Federal Reserve chair Ben Bernanke has suggested that economists might do well to also consider happiness and well-being in their calculations.
The New Yorker's satirist Andy Borowitz describes the less edifying side of science:
The landing of the Mars science rover Curiosity does not qualify as a significant scientific achievement and should not be getting so much of the public’s attention, says the team of scientists who discovered the Higgs boson last month.
“People see these beautiful pictures from outer space and they’re inclined to think that something amazing has been achieved,” a spokesperson for the Higgs-boson team said. “Let the Mars rover do something of genuine value, like, say, discover how the universe was created. Then I’ll be impressed.” As for the NASA scientists behind the Mars rover, the Higgs-boson spokesman said,
“I don’t think we should be too quick to use the word ‘scientist’ here. Honestly, anyone can grow a Mohawk and put on a headset and look cool and all, but that hardly makes you a scientist. Let’s see some of these dudes discover a particle or something along those lines. I mean, come on.”
From the Jet Propulsion Laboratory, in Pasadena, response to the Higgs-boson team’s comments was swift and irate, as a NASA spokesman called the remarks “an unacceptable diss.”
“You know the difference between the Mars rover and the Higgs boson?” said a NASA spokesman, his face red with anger. “You can actually see the Mars rover.” The NASA official went on to say that “I can understand why the Higgs people think they found something that’s real and all, but as far as I can tell their so-called ‘boson’ is about as real as a leprechaun or a Smurf.”
The Times' Stone series has an essay by Xavier University's Richard Polt explaining why the human being resists analysis. From the article,
So why have we been tempted for millenniums to explain humanity away? The culprit, I suggest, is our tendency to forget what Edmund Husserl called the “lifeworld” — the pre-scientific world of normal human experience, where science has its roots. In the lifeworld we are surrounded by valuable opportunities, good and bad choices, meaningful goals, and possibilities that we care about. Here, concepts such as virtue and vice make sense. Among our opportunities are the scientific study of ants or the construction of calculating machines. Once we’ve embraced such a possibility, it’s easy to get so absorbed in it that we try to interpret everything in terms of it — even if that approach leaves no room for value and meaning. Then we have forgotten the real-life roots of the very activity we’re pursuing. We try to explain the whole in terms of a part.
The NY Times has a piece on change blindness and its epistemic implications. From the article,
What are the neural correlates of these cognitive hiccups? One possible answer comes from studies of the so-called face test, in which a volunteer is shown two faces in quick succession. Normally, just about anyone can distinguish the faces provided they’re shown within about half a second. But if the person is distracted by a task like counting, or by a flashing light, the faces start to look the same.
Here’s where it gets interesting, though. Scientists have found a way to induce change blindness, with a machine called a transcranial magnetic stimulator, which uses a magnetic field to disrupt localized brain regions. In one experiment, a T.M.S. was used to scramble the parietal cortex, which controls attention. Subjects were then given the face test. With the machine turned off, they did fine. But when the T.M.S. was on, most failed the test. Conclusion? Misdirection paralyzes part of your cortex.
Such blind spots confirm what many philosophers have long suspected: reality and our perception of it are incommensurate to a far greater degree than is often believed. For all its apparent fidelity, the movie in our heads is a “Rashomon” narrative pieced together from inconsistent and unreliable bits of information. It is, to a certain extent, an illusion.
The BBC has a series of essays this week on A.M. Turing in honor of his centenary, beginning with this essay by Vint Cerf. From the essay,
This year, in the centenary of his birth, there is one man in particular who is deservedly the focus of attention: Alan Turing.
Turing was born into a world that was very different, culturally and technologically, yet his contribution has never been more important.
His is a story of astounding highs and devastating lows. A story of a genius whose mathematical insights helped save thousands of lives, yet who was unable to save himself from social condemnation, with tragic results. Ultimately though, it's a story of a legacy that laid the foundations for the modern computer age.
In a lengthy and fascinating interview, Gila Sher discusses the nature of philosophy, logic, and epistemology. From the interview,
In my sophomore year as an undergraduate at the Hebrew University of Jerusalem two of my professors were engaged in a very lively and passionate debate about how philosophy should be done: should it follow the traditional, especially Kantian, template or should it follow the contemporary analytic template? We, the students, had a special place in this debate: each side wanted to pull us in its direction, and we played the role of both interrogators and jury.
So from the very start I was taught to have an active attitude to the question “What is philosophy?” Philosophy was not something you just learn; it was something you always approach with questions like “what philosophical method is used here? is it problematic? can it be improved? is there an idea for a new philosophical method here?” in the back of your mind.
Finally, consider the anti-philosophical strictures of Richard Feynman. “Cocktail party philosophers,” he said in a lecture, think they can discover things about the world “by brainwork” rather than by experiment (“the test of all knowledge”). But in another lecture, he announced that the most pregnant hypothesis in all of science is that “all things are made of atoms.” Who first came up with this hypothesis? The ancient philosophers Leucippus and Democritus. And they didn’t come up with it by doing experiments.
Today the world of physics is in many ways conceptually unsettled. Will physicists ever find an interpretation of quantum mechanics that makes sense? Is “quantum entanglement” logically consistent with special relativity? Is string theory empirically meaningful? How are time and entropy related? Can the constants of physics be explained by appeal to an unobservable “multiverse”? Philosophers have in recent decades produced sophisticated and illuminating work on all these questions. It would be a pity if physicists were to ignore it.
And what about the oft-heard claim that philosophy, unlike science, makes no progress? As Bertrand Russell (himself no slouch at physics and mathematics) observed, philosophy aims at knowledge, and as soon as it obtains definite knowledge in a specific area, that area ceases to be called “philosophy.” And scientific progress gives philosophers more and more to do. Allow me to quote Nietzsche (although I know that will be considered by some to be in bad taste): “As the circle of science grows larger, it touches paradox at more places.” Physicists expand the circle, and philosophers help clear up the paradoxes. May both camps flourish.