29 December 2005

Pain and Anxiety

Word on DefenseTech (see also MilitaryDOTcom and Wonkette) is that the Army wants to deploy the “Active Denial System” to Iraq almost immediately. Pausing for a moment to penetrate the Orwellian name, I remind the reader that the ADS is a microwave system that looks like a prop from a Godzilla movie and creates pain without (supposedly) actually cooking the target. The system is described at length at GlobalSecurity. The plan (dubbed Project Sheriff) is to send 15 truck-mounted systems to Iraq while a more deployable airborne version is developed.

The system operates at 95 GHz. (That’s a frequency roughly 1,000 times higher than the FM band.) The beam is absorbed within the upper millimeter or so of skin; a 2-second burst raises the skin temperature to roughly 50°C. The burst is too short to burn, but it stimulates the nociceptors (more specifically, I am told, the C polymodal nociceptors) to produce an intense sensation of pain and a strong reflex to withdraw. The pain of a 5-second exposure is said to be intolerable.
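
Those numbers can be sanity-checked with a back-of-the-envelope calculation. The sketch below treats the burst as adiabatic heating of the absorbing skin layer; every parameter value is my own assumption (chosen to be physically plausible), not a published ADS specification:

```python
# Estimate the skin temperature rise from a short millimeter-wave burst,
# assuming all the energy stays in the thin absorbing layer (no conduction,
# no blood flow). Parameter values are illustrative assumptions.

flux = 10e3        # absorbed power density, W/m^2 (assumed)
t_burst = 2.0      # exposure time, s (from the article)
depth = 0.4e-3     # absorption depth at 95 GHz, m (upper fraction of a mm)
rho = 1000.0       # density of skin, kg/m^3 (roughly that of water)
c_p = 3500.0       # specific heat of skin, J/(kg*K) (approximate)

# Deposited energy per unit area, spread through the absorbing layer:
delta_T = flux * t_burst / (rho * c_p * depth)
T_final = 33.0 + delta_T   # starting from ~33 C skin surface temperature

print(f"rise: {delta_T:.1f} C -> final {T_final:.1f} C")  # rise: 14.3 C -> final 47.3 C
```

Ignoring conduction and blood perfusion is defensible for a burst this short; the assumed flux of 10 kW/m² is simply a value that lands the arithmetic near the quoted 50°C.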

The decision to get Project Sheriff under way was made by Army Col. Robert Lovett, the project manager for the Rapid Equipping Force, which tries to substantially reduce the time that it normally takes to get newly developed technology into the field. In an October 11 memo to the Deputy Undersecretary of Defense for Advanced Systems and Concepts, Col. Lovett asked to stop the ADS-1 ACTD (advanced concept technology demonstration – basically, a set of experiments to show that the system is not so risky as to threaten the budgets of future users), and redirect it to immediate deployment to Iraq. “System 1 capabilities have, to date, been sufficiently demonstrated in the ACTD to prove its value to the soldier,” Lovett says in the memo. “System 1 current operating constraints can be mitigated if used in Iraq during the November through March time frame without modification.”

Operating constraints? Ah, yes. You’ve got a microwave oven, don’t you? When you cook a TV dinner, the instructions tell you to wait a minute after cooking, to allow the temperature to even out. Reflections within the oven produce zones of constructive interference where the microwave intensity is significantly above the average, and so the food gets extra hot. Of course, the bad guys aren’t in an oven, but there are still lots of opportunities for reflections and focusing. The operator has to be able to judge the power level required given the distance of the targets, while avoiding circumstances that could lead to unhealthy reflections (e.g., too many cars, shop awnings, etc.).
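
The worst case is easy to quantify. Where an incident wave and a fully reflected copy arrive in phase, their field amplitudes add, so the intensity (amplitude squared) can reach four times that of the incident beam alone. A minimal sketch, with an assumed (worst-case) reflection coefficient:

```python
import numpy as np

# Superpose an incident wave of unit amplitude with a reflected copy at a
# range of relative phases. r = 1 models a perfect reflector -- a deliberate
# worst case, not a measured property of any particular surface.
r = 1.0
phase = np.linspace(0, 2 * np.pi, 1000)             # relative phase at different spots
intensity = np.abs(1 + r * np.exp(1j * phase))**2   # incident intensity = 1

print(f"peak intensity: {intensity.max():.2f}x the incident beam")  # 4.00x
```

A spot bathed in four times the intended dose is the field equivalent of the extra-hot pocket in the TV dinner.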

The ADS has been tested on human volunteers for several years. Details of tests in 2003 and 2004 were obtained under the Freedom of Information Act by Edward Hammond, director of the Sunshine Project – an organization campaigning against the use of biological and non-lethal weapons. The tests were set up to prevent, as much as possible, any reflection or focusing – by excluding contact lenses and eyeglasses, and various types of buttons and zippers. Oh, and they took metallic objects (keys, coins, etc.) away from the test subjects, to avoid hot spots – the same reason that you don’t leave a fork in the microwave. Even so, one volunteer was burned when the operator used the wrong power setting.

I have personal interest in this, because of my background in radio astronomy. These frequencies have been of interest to students of interstellar chemistry since the 1970s, because a number of the relatively simple molecules found in star-forming clouds have measurable emission in this range. (For example, the fundamental emission of carbon monoxide is at 115 GHz.) We observe them with telescopes equipped with fancy versions of FM receivers, mixing the celestial radiation with a nearby frequency generated by a local oscillator to produce much lower frequency signals that are easier to handle. Because the frequencies are so high, you can’t use wires, so the sky and local oscillator signals are piped together via hollow waveguides.
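
The mixing step can be illustrated numerically. In the sketch below the frequencies are scaled down by a factor of 10⁹ so the toy runs at ordinary sample rates, and the 113 GHz local oscillator is an illustrative choice, not a real receiver setting. The product of the two cosines contains components at the sum and difference frequencies; the difference is the much lower “intermediate frequency” that the back end actually processes:

```python
import numpy as np

fs = 1000.0                      # sample rate, arbitrary units
t = np.arange(0, 2.0, 1 / fs)    # 2 "seconds" of signal, 2000 samples
f_sky, f_lo = 115.0, 113.0       # stand-ins for 115 GHz (CO) and a 113 GHz LO

# The mixer multiplies the two signals; trigonometry turns the product into
# components at f_sky - f_lo and f_sky + f_lo.
mixed = np.cos(2 * np.pi * f_sky * t) * np.cos(2 * np.pi * f_lo * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
f_if = freqs[np.argmax(spectrum[freqs < 50])]  # low-pass: keep the difference term

print(f"intermediate frequency: {f_if}")  # 2.0, i.e. 2 GHz at real scale
```

The sum-frequency term (228 here, 228 GHz at real scale) is filtered away; everything downstream of the mixer works at the tractable difference frequency.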

In the earliest receivers, the feed pipe carrying the sky signal was open to the outside, which meant that bugs could fly in and create excess noise. As the last step in tuning the receiver, we would often squint down the feed to check that it was clear. That stopped, though, when an engineer pointed out that half the radiation from the local oscillator was coming out the feed – right into the inquiring eye. He noted that prolonged exposure, even at such low power levels, might have long-term impacts, like a tendency to cataracts.

Now, I’m not saying that those volunteers at Kirtland AFB are in trouble. I’m just saying that part of the infrastructure development in Iraq ought to involve training optometrists.

27 December 2005

A Biology Teacher I Wish I’d Had

Science News Online is mostly a subscription-access site, but some articles are posted for free access. Fortunately, one of the freebies from the year-end issue is a delightfully snarky take on teaching the controversy, by Bruce Bower. Thus:
Old-school evolution often occurs too slowly for an observer to see. That's inconvenient for those who limit reality to anything that can be captured on their digital video cameras. For those interested in seeing for themselves, ponder artificial evolution. Consider, for example, dog breeding over the past century or Michael Jackson's face over the past 25 years.

And on the subject of missing links:
Although the intelligent-design people put a lot of stock in missing links, those wacky creatures tell you squat about evolution. So what if we never stumble over the remains of, say, the last common ancestor of apes and people? … Since nobody knows what the common ancestor looked like, scientists in their prickly way may never agree that they've found it. Many questions remain about the ways in which fundamental shape changes arise and foster the evolution of new types of animals. These aren't signs that evolution never happened. They're signs that fascinating turns in evolutionary biology lie ahead for the intellectually curious.

I commend the full article to the reader.

26 December 2005

Monday Quote

Robinson Jeffers was the poet of non-sunny California, of the foggy, rainy places where people had yet to make inroads on nature. He seemed to have drawn from his observations of science and war the inference that humanity would soon remove itself from the world – to the world’s advantage. In Roan Stallion, he interrupts a story of an unhappy farmwife (sorry, I know that’s a terrific simplification) with a rant that gives some sense of his argument.

Humanity is the start of the race; I say
Humanity is the mould to break away from, the crust to
break through, the coal to break into fire,
The atom to be split.
Tragedy that breaks man’s face
and a white fire flies out of it; vision that fools him
Out of his limits, desire that fools him out of his limits,
unnatural crime, inhuman science,
Slit eyes in the mask; wild loves that leap over the walls
of nature, the wild fence-vaulter science,
Useless intelligence of far stars, dim knowledge of the
spinning demons that make an atom,
These break, these pierce, these deify, praising their God
shrilly with fierce voices: not in a man’s shape
He approves the praise, he that walks lightning-naked on
the Pacific, that laces the suns with planets,
The heart of the atom with electrons: what is humanity
in this cosmos? For him, the last
Least taint of a trace in the dregs of the solution; for
itself, the mould to break away from, the coal
To break into fire, the atom to be split.

Yes, I know it's Tuesday. Blogger wasn't working for me yesterday.

21 December 2005


I am not much of a self-Googler, but I cannot resist playing with statistics. Hence, I tested the sluttishness of this blog by means of this calculator. Among the results tabulated recently, we find From the Rachel at 3.48%: sluttier than CNN (at 1.46%), far more innocent than Pharyngula (25.31%) or Daily Kos (53.41%), and flush up against Wikipedia (3.5%).

Curiously, while the etymology of ‘slut’ is imperfectly established (apparently, it is not significant that it is one of several Swedish words meaning ‘end’), it seems to have been drawn into English in the 15th century from a German word meaning an untidy or slovenly person. Could it be that higher Slut-o-Meter readings reflect sites that are more free-wheeling in their expression?

From the Rachel: as sluttish as Wikipedia. Something to be proud of.

Temper and Mood in Blogging

I have yet to satisfy myself that I have struck the right tone in my writing. Many of the blogs that I otherwise enjoy often adopt an impatient and confrontational aspect that makes me uncomfortable. Perhaps I am archaic, but I think Kant would say that a world of such voices would be an unhappy place, and devoid of pleasant dinner conversation.

Still, there are times when intemperance is appropriate. I recently read Cotton Mather’s sermon, A Man of Reason, in which he argues that “a World of Evil would be prevented in the World, if Men would once become so Reasonable.” Of course, Reason supports Faith in his view, but the argument is well made. And in addressing what he calls “a very unreasonable Thing to put me upon the Proof of this Assertion; Or, demand of me a Reason, why a Man should hearken to Reason,” Mather includes a defense of the sarcastical and satirical response to foolishness:
The Man who does not Hearken to Reason, does the part of a Brute, yea, he does worse than a Brute, that is destitute of Reason. We read of Bruitish Men; and of those who are, Jude 10. As Brute Beasts: Men, who as far as they can, quit the order of Men, & rank themselves with Brutes. Verily, A Man who does not Hearken to Reason, so far Unmans himself; Transforms himself so far into a Brute. It is a Bruitish Thing, to refuse the direction of Reason. A Man who abandons the Rules of Reason in what he does, – Pardon me, wretch, that I miscall’d thee, A Man; I will Recall it – such a Brute is worthy to be addressed with nothing but Sarcasm, and Satyr; Go, Thou Brute, Get a little more Hair, and crawl upon all Four, and come not among the Children of Reason any more.

I believe that this sermon was preached in 1709. It is in the anthology American Sermons.

19 December 2005

Monday Quote Troublemaker

In the spirit of "Let's you and her fight," an excerpt from the first page of Patricia Churchland's textbook Brain-Wise:
The weight of evidence now implies that it is the brain, rather than some nonphysical stuff, that feels, thinks, and decides. That means there is no soul to fall in love. We do fall in love, certainly, and passion is as real as it ever was. The difference is that now we understand those important feelings to be events happening in the physical brain. It means that there is no soul to spend its postmortem eternity blissful in Heaven or miserable in Hell.

I encourage religious folk to go bother the neurobiologists, and leave off the evolutionary biologists for a while.

12 December 2005

Monday Quote Frenzy - Classic Chides for Bloggers

Milton, in Areopagitica, is disdainful of the arguments left half composed, unpublished:
I cannot praise a fugitive and cloistered virtue, unexercised and unbreathed, that never sallies out and sees her adversary, but slinks out of the race, where that immortal garland is to be run for, not without dust and heat.

And Hamlet (Act IV, Scene IV) agrees that we should not avoid intellectual engagement with the world:
Sure he that made us with such large discourse,
Looking before and after, gave us not
That capability and god-like reason
To fust in us unus’d.

11 December 2005

About Steve Fuller

As the tumult over Steve Fuller appears to be ebbing (e.g., here and here), I thought I would add a few comments on aspects that I feel have been addressed incompletely.

Steve Fuller, for those unaware, is a professor of sociology at Warwick University, and is rapidly becoming one of the most recognized exponents of the field of sociology of science – which is too bad for the field of sociology of science. He has been in the public eye recently as an expert witness at the Dover, PA, school board trial, where he testified in favor of teaching Intelligent Design in the high schools. Surprised that critics linked Fuller’s argument to the excesses of postmodernism, Michael Bérubé encouraged him to respond, and the resulting post spawned many comments and much discussion elsewhere.

The first thing that distresses me about Dr. Fuller is his condescension toward the readers.
You guys are getting better! The level of debate has been definitely raised on these matters. But there are still problems…

Again, my apologies if your brilliant ripostes failed to move me. Try harder!

I find the periodic claims to superior intelligence (not ID, mind you!) on the part of some of the posters truly touching. Rather than reminding me how smart you are, you should try to demonstrate it in what you say.

It reminds me of some of the awful teaching assistants I’ve seen over the years, who treat the students worse than dogs because they are less likely to bite when teased.

Fuller affects weariness at the reaction to his stand on ID, as if he were fully expecting that we great unwashed would not “get it”. I’m not particularly surprised that Fuller is not particularly surprised about the intensity of the reaction to his opinions, because I suspect that it comes from the same lack of empathy that plays such a strong role in his coming to those opinions. History of science, philosophy of science, sociology of science – these are all fine disciplines. But they teach as much about the doing of science as music appreciation does about performing.

There is real mental, and often physical, labor in learning how to be a scientist, in doing creative science, and in teaching how to do science. Much hard work leads to unproductive, or inconclusive, or negative results. Many exciting projects turn out to be rooted in misunderstandings that are exposed by your peers in very public forums. The process of self-correction has sharp edges. With no understanding of the real life of science, Fuller sees only behavior in the form of empty ritual, meaningful only for what it says about the power structure that choreographs it. With no understanding of the inner life of scientists, how could he understand that he is insulting them?

An aside: from the critic’s elevated perspective, the strange behaviors to be observed in a kung fu dojo may look very like a cult. But such people can still kick your butt.

My second source of distress emerges from Fuller’s offhand remarks on historical examples. I am not an expert in this area, but I have done some reading. Thus, when I see such remarks as:
The US has always had a ‘difficult’ relationship with religion because of the traumatic origins of the nation. The original British settlers, especially in what became the liberal northern establishment, were wealthy dissenters (including Catholics and Jews) who were prohibited from political participation in their homeland. Henceforth, all attempts to impose a religious orthodoxy would be prohibited – in the name of protecting religious freedom, of course.
(This comment is actually in Fuller’s post on ID in the UK, which was spotlighted by Tony Jackson in the Pharyngula commentary on the Bérubé discussion.)

Even a shallow reading of the early history of the US gives the lie to this sweeping assessment. In the ‘liberal north’, the original British settlers were not particularly wealthy and, where dissenters, were primarily dissenters from Anglicanism. Many attempted to impose their dissenting faiths as orthodoxy within their own colonies, prompting further diffusion of dissenters from the dissenters. Where they were in any sense unified against ‘religious orthodoxy’, it was in opposing the imposition of Anglican bishops by the Church in England. But five of the post-Revolutionary states had tax-supported established churches.

I have belabored this point before. Among the victims of ID politics is the practice of history. Why are there not more historians on the barricades?

Closer to the topic of evolution, Fuller says, in one of his responses to the Bérubé thread,
But even evolution’s staunchest defenders have remarked on the strong iconic role that Darwin continues to play in this field, which is quite unusual in the natural sciences. An important reason is the politically correct lesson that his life teaches: the idea that science causes you to lose your faith. Newton, unfortunately, thought his theory confirmed his reading of the Bible. Not very politically correct.

Historian Brian Ogilvie took Fuller to task over his interpretation of Newton (comments 27 and 84), and PZ Myers stood up for Darwin (comment 21). It is important to note that professional historians such as Janet Browne, working from letters and diaries, and Darwin’s great-great-grandson Randal Keynes, working from family papers, have shown that it was the death of his 10-year-old daughter Annie that precipitated Darwin’s change from conflicted believer to conflicted non-believer.

And thus my final complaint: When I write for this blog, it takes me a painfully long time to finish. It’s easy to get ideas of what to write about, but as I get into the composition, I keep stopping every few lines because I feel the need to check the accuracy of some assertion, or because I need to check whether someone else has already posted a similar comment, or because I need to track down the correct version of some half-remembered quote, or just because I want to find a clearer expression. If I could just rant on about whatever idea comes to mind, without worrying about it being gibberish or conflicting with well-known facts, then I could write a lot more a lot faster.

But then, I fear, I might end up sounding like Steve Fuller.

05 December 2005

Monday Quote Frenzy – Galileo and Lec

By now, the term “Galileo Gambit” seems to be entrenched as a description of the ploy by which an adherent of some very non-mainstream idea argues that, just as people scoffed at Galileo, so they now scoff at him, ergo he will eventually be proven right. This has had a lot of play in the recent press about Intelligent Design. Thus, in the September 12 Guardian interview with Michael Behe, John Sutherland asks:
JS: Has the National Academy of Science taken an interest?
MB: It takes a position strongly condemning it. The recently retired president, Bruce Albert, sent a letter to all 2,000 members of the NAS essentially naming me.
JS: Did Galileo come to mind?
MB: Yeah. In a way it's flattery.

This prompted a slapdown from Jerry Coyne in the September 19 letters:
Behe … appears proud of his ostracism from the scientific community, drawing analogies with Galileo. It seems that everyone with a crackpot theory compares themselves to Galileo once their theory is criticised. But the fact that a theory receives a drubbing from the scientific community does not mean that it is correct.

I have two favorite versions of the Galileo Counter-Gambit. One is from Peter Cook’s screenplay for the original (1967) Bedazzled, as Cook (as Mr. Spiggott, aka Lucifer) is attempting to convince Dudley Moore (as Stanley Moon) that he is, in fact, The Hornèd One:
Stanley Moon: You're a nutcase! You're a bleedin' nutcase!
George Spiggott: They said the same of Jesus Christ, Freud, and Galileo.
Stanley Moon: They said it of a lot of nutcases too.
George Spiggott: You're not as stupid as you look are you, Mr. Moon?

The other is from the great Polish aphorist, Stanisław Jerzy Lec (pronounced ‘Letz’):
Every stink that fights the ventilator thinks it is Don Quixote.

Lec, by the way, was a remarkable fellow. Born in Lwów in 1909, he was a leftist poet and was interned by the Nazis in 1941. He escaped the camp (and extermination) in 1943 in a stolen German uniform. He spent the rest of the war fighting as part of the Communist resistance movement in Warsaw. After the war, he was a member of the Polish mission in Vienna. Life under Stalinist influence eventually drove him to emigrate to Israel in 1950, but he returned to Warsaw two years later.

His prose aphorisms were provocative and poetical. He called them fraszki (“trifles”), highlighting their linkage to the short, satirical poetical form called fraszki by Jan Kochanowski. They were first collected in 1957 in Myśli nieuczesane (Unkempt Thoughts); other collections followed. The English translations are not currently in print, but there is a good online collection, translated by Jacek Galazka.

01 December 2005

I Know It When I See It

I’ve been working my way through John Searle’s The Construction of Social Reality. In so doing, I was reminded of a distinction that he has made elsewhere, and that strikes me as very useful outside its original realm of application (theory of mind).

Searle has argued that a lot of confusion about theories of mind comes from failing to recognize that both ontology (what is) and epistemology (what is known) include objective and subjective things. Thus, your ontology, your set of real things, can include both objective things, such as the neurons in your brain, and subjective things, such as the pain you feel. Similarly, your epistemology, your set of things that can be known, can include both objective things, such as the statement “That flower is red”, and subjective things, such as “That flower is beautiful”.

The distinction is important for Searle, because theories of mind attempt to collect epistemologically objective statements about ontologically subjective things, and he wants to be sure that that is not an empty set. Thus, he rejects the presumption that ontologically subjective things are epistemologically subjective by necessity, and he chides us (nicely) that, having established the status of something ontologically, we still have work to do to say anything about its epistemological status.

The mental state “I think that picture is beautiful” is real. It’s ontologically subjective, but it’s real. Fortunately, we generally recognize that it is also epistemologically subjective, and so we generally try to avoid defining objective tests for its beauty. But substitute “pornographic” for “beautiful” and see what trouble we get into. There are a lot of people who think they can define what is “pornographic” in an objective way suitable for legislation, and the courts are full of defense attorneys arguing epistemological subjectivity.

Recall the words of Justice Potter Stewart in the 1964 Supreme Court decision Jacobellis v. Ohio, 378 U.S. 184:
It is possible to read the Court's opinion in Roth v. United States and Alberts v. California, 354 U.S. 476, in a variety of ways. In saying this, I imply no criticism of the Court, which in those cases was faced with the task of trying to define what may be indefinable. I have reached the conclusion, which I think is confirmed at least by negative implication in the Court's decisions since Roth and Alberts, that under the First and Fourteenth Amendments criminal laws in this area are constitutionally limited to hard-core pornography. I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that.

The phrase “I know it when I see it” is the red flag that someone is attempting to treat something that may well be epistemologically subjective as if it were epistemologically objective.

Scientific questions are about epistemologically objective entities. This is not to say, of course, that things cannot be moved from one category to the other. I would argue, for example, that the question “What distinguishes living matter from nonliving matter?” is scientific, with the caveat that the scientific work involves establishing what elements of the question can be moved into the epistemologically objective category.

But when Michael Behe says
[W]e infer design when we see that parts appear to be arranged for a purpose.
(Proceedings of Kitzmiller et al. v. Dover Area School District et al., 10/17/05, AM session, p. 90)

then I think we have a case of “I know it when I see it,” with all the problems that entails.

Oh, dear. I seem to have drawn a connection between Intelligent Design and obscenity.

My bad.