Wednesday, August 29, 2007

Letter to Prospect: a response

My column for the June issue of Prospect (available in the archives here) can be seen as somewhat sceptical about the value of the Large Hadron Collider, so it is right that Prospect should publish a letter defending it. But the one that appears in the September issue is a little odd:

“Philip Ball (June) says that "the only use of the LHC [Large Hadron Collider] that anyone ever hears about is the search for the Higgs boson." But this is not so. Physicists may look crazy, but they are not crazy enough to build such a complicated and technically demanding installation just to hunt down one particle. The LHC will be the world's most powerful instrument in particle physics for the next ten to 20 years, and it has been built to help us understand more about the 96 per cent of our universe that remains a mystery. The first thing physicists will be looking for is the Higgs boson, but this is just the beginning of a long journey into the unknown. As with earlier accelerators, there will be surprises.”

I’m glad that the author, Reinhard Budde, quoted my remark, because it reveals his non-sequitur. I did not say, as he implies, “all the LHC will do is look for the Higgs boson.” As a writer, I will make factual mistakes and no doubt also express opinions that are not wholly fair or justified. But I do try to choose my words carefully. Let me repeat them more fully:

“Particle physicists point out that because it will smash subatomic particles into one another with greater energy than ever before, it will open a window on a whole new swathe of reality. But the only use of the LHC that anyone ever hears about is the search for the Higgs boson… The LHC may turn up some surprises—evidence of extra dimensions, say, or of particles that lie outside the standard model.”

(It’s interesting that even Dr Budde doesn’t enlighten us about what else, exactly, the LHC might do, but I was happy to oblige.)

It’s a small point, but it does frustrate me; as I found out as a Nature editor, scientists seem peculiarly bad at comprehending the written word (they have many other virtues to compensate).

For the record, I support the construction of the LHC, but with some reservations, as I stated in my piece. And by the way, I am a physicist, and I do not feel I look particularly crazy. Nor do I feel this is true of physicists as a whole, although many do have a tendency to look as though they belong in The Big Lebowski (this is a good thing). And the LHC was not built by “physicists” – it was built at the request of a rather small subsection of the global physics community. Not all physicists, or even most, are particle physicists.

Tuesday, August 28, 2007

Check out those Victorian shades, dude

For people interested in the cultural histories of materials, there is a lovely paper by Bill Brock in the latest issue of the Notes and Records of the Royal Society on the role of William Crookes in the development of sunglasses. Bill has written a new biography of Crookes (William Crookes (1832-1919) and the Commercialization of Science, in press with Ashgate), who was one of the most energetic and colourful figures in nineteenth-century British science. Shortly to be made the octogenarian president of the Royal Society, Crookes became involved in the 1900s in a search for forms of glass that would block out infrared and ultraviolet radiation. This search was stimulated by the Workman’s Compensation Act of 1897, which allowed workers to claim compensation for work-related injuries. Glassworkers were well known to suffer from cataracts, and it was hoped by the Home Office that prevention of eye damage by tinted glass would obviate the need for compensation. Crookes began to look into the question, and presented his results to the Royal Society in 1913: a glass formulation that was opaque to UV and reduced IR by 90 per cent. Always with an eye on commercial possibilities, he suggested that lenses made of this stuff could have other applications too, for example to prevent snow-blindness. “During the brilliant weather of the late summer [of 1911]”, he said, “I wore some of these spectacles with great comfort; they took off the whole glare of the sun on chalk cliffs, and did not appreciably alter the natural colours of objects. Lady Crookes, whose eyes are more sensitive to glare or strong light than are my own, wore them for several hours in the sun with great comfort.” Before long, these spectacles were being considered by London opticians, although commercialization was hindered by the war. Soon the original aim of cataract prevention in glassmakers was forgotten.

Friday, August 24, 2007

Spider-Man’s buddies and other elites
[This is the pre-edited version of my latest article for]

Marvel Universe reflects some of the undesirable properties of our social webs, while suppressing others for moral ends.

In which society do powerful males form a dominant, elitist network while the women are relegated to peripheral roles?

In which society are all the villains portrayed as warped loners while the heroes are a fraternal team united by their fight against evil?

Banish those wicked thoughts. This isn’t the real world, you cynics, but pure fantasy. We’re talking about the Marvel Universe.

This, as any comic-book geek will tell you, is where Spider-Man, Captain America and the Hulk do their thing. As indeed does the Thing. But no, not Batman or Superman – they are part of the DC Universe, and not on speaking terms with the Marvelites. Do keep up.

The story so far: in 2002 Spanish mathematician Ricardo Alberich and colleagues in Mallorca analysed the social network formed by the Marvel comic characters according to whether they have appeared together in the same story [1]. There are around 6,500 of these characters in nearly 13,000 Marvel comic books, so there’s plenty of data to reveal patterns and trends. Indeed, even the world of classical Greek and Roman mythology looks puny by comparison, with just 1,600 or so named characters in its pantheon.

What’s more, the Marvel Universe has become as incestuous as Hollywood in the way the stars hang out with one another – particularly after the relaunch of Marvel Comics in 1961, which spawned well-known gangs such as the Fantastic Four and the X-Men. Take the character Quicksilver, for instance, who first appeared as a member of Magneto’s Brotherhood of Evil Mutants (he is Magneto’s son). He later became a member of the Avengers, and then of the X-Factor, and finally the Knights of Wundagore. His twin sister is the Scarlet Witch, and his wife is Crystal, who was previously dating the Fantastic Four’s Human Torch and was a member of the Inhumans. Are you following this?

Perhaps fortunately, Alberich and his team did not have to read every Marvel comic published since the company began (as Timely Comics) in 1939, for all these connections have been gathered into a database called the Marvel Chronology Project. They deduced that the Marvel network looks in many ways remarkably like those formed in real-world collaborations. Not only is the Marvel Universe a small world, where just about any character can be linked to any other by just a few ‘degrees of separation’, but it is a so-called scale-free world, where the distribution of links has a characteristic form that includes a few very highly connected hubs. By comparison, a random distribution of links would create no such network superstars.

This scale-free structure seems to arise when networks grow in a particular way: each new node forms links to existing nodes in a manner that is probabilistic but biased, so that nodes that are already highly connected are more likely to receive the new links. In this way, the rich get richer (where richness is measured in the number of links a node has).
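This growth rule is usually called preferential attachment. A minimal sketch in Python of a hypothetical network grown this way (the node counts and random seed are my own illustrative choices, not the Marvel data):

```python
import random

def preferential_attachment(n_nodes, links_per_node=2, seed=42):
    """Grow a network in which each new node links to existing nodes
    with probability proportional to their current degree."""
    rng = random.Random(seed)
    edges = [(0, 1), (1, 2), (0, 2)]  # small fully connected seed network
    # 'ends' repeats each node once per link it has, so a uniform
    # draw from it picks nodes in proportion to their degree.
    ends = [v for e in edges for v in e]
    for new in range(3, n_nodes):
        targets = set()
        while len(targets) < links_per_node:
            targets.add(rng.choice(ends))
        for t in targets:
            edges.append((new, t))
            ends.extend([new, t])
    return edges

edges = preferential_attachment(2000)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

mean = sum(degree.values()) / len(degree)
hub = max(degree.values())
print(f"mean degree {mean:.1f}, biggest hub {hub}")
```

The biggest hub ends up with many times more links than the average node – the network analogue of Spider-Man – whereas in a randomly wired network the largest degree stays close to the mean.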

The Marvel Universe, like our own societies, is unplanned: it has grown from the work of many comic-book story-writers who have made no attempt to engineer any overall social network. It seems that this joint effort guided them not towards a random network, as might have been expected, but towards one that (somewhat) mirrors reality. The same thing seems to have happened in classical mythology, another ‘multi-author’ story cycle that turns out to share the scale-free social network structure [2].

But Marvel Universe isn’t a perfect match for the way real people interact. In particular, a few of the most popular characters, such as Spider-Man and Captain America, have more connections, and thus more influence, than anyone would have in the real social world. The Marvel writers appear to have succumbed to an unrealistic degree of favouritism.

The scale-free properties of social and other networks were discovered several years ago, but it has become increasingly clear that looking simply at the statistics of linkages is a fairly crude way of characterizing a web’s structure. Researchers are now keen to ferret out the ways in which a network is divided into distinct communities – friendship circles, say, or professional collaborators – which might then be woven together by a more tenuous web of links. Social networks are, in this sense, hierarchically organized.

A key characteristic of human social networks, identified by Mark Newman of the University of Michigan, is that highly connected (‘rich’) nodes are more likely to be connected to other rich nodes than would be expected by chance – and likewise for ‘poor’ nodes [3]. In other words, the hub members are more likely to be pals with each other than with an individual selected at random. This is called assortative mixing, and contrasts with the structure of scale-free technological and biological networks, such as the Internet and food webs, for which the opposite is true.
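Newman’s measure is simply the Pearson correlation between the degrees at the two ends of every edge: positive when well-connected nodes link to each other, negative when hubs link mostly to poorly connected nodes. A sketch on two toy graphs of my own devising (not the Marvel or Internet data):

```python
def degree_assortativity(edges):
    """Pearson correlation between the degrees of the two endpoints
    of each edge (Newman's measure), counting each edge in both directions."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Assortative: a triangle of degree-2 nodes plus a separate degree-1 pair,
# so like always links to like.
clubby = [(0, 1), (1, 2), (0, 2), (3, 4)]
# Disassortative: a star, where the hub links only to degree-1 leaves.
star = [(0, i) for i in range(1, 6)]
print(round(degree_assortativity(clubby), 2),
      round(degree_assortativity(star), 2))
```

The first graph scores +1.0 (perfectly assortative), the star -1.0.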

Pablo Gleiser of the Centro Atómico Bariloche in Argentina has now delved into the community structure of the Marvel Universe, and shows that it too has assortative ‘rich clubs’ [4], whose well-connected members provide the ‘glue’ that binds distinct communities into a cohesive net. That in itself suggests another way in which the Marvel Universe is a pretty good mimic of reality.

But who is in these rich clubs? Their members are all heroes, and they’re all male. In this universe, women don’t bring communities together but sit on the fringes. That’s an old story – with a smattering of honourable exceptions, women have never fared well in comic books.

Yet the bad guys are no good at forming teams either. Why is that? Gleiser thinks the answer lies in the code created by the Comics Magazine Association of America in 1954 to govern the moral framework of the stories. It stipulates that criminals should not be glamorized, and that evil should be portrayed only to make a moral point: “In every instance good shall triumph over evil and the criminal punished for his misdeeds.”

This means, says Gleiser, that “villains are not destined to play leading roles”, and as a result they aren’t going to become hub characters. He thinks that the predestined victory of the good guys meanwhile encourages collaborations so as to avoid the impression that they are omnipotent and can do their job easily.

But in a world where the CIA is devoting considerable effort to understanding the structures of organized-crime and terrorist networks, it seems that Marvel Universe has become outdated in insisting that its villains work alone (and equally, one might add, that ‘heroes’ prefer collaboration over unilateralism). And sadly, in the real world there is no one to insist that good shall triumph over evil. Not even Captain America.

1. Alberich, R. et al. preprint (2002).
2. Choi, Y.-M. & Kim, H.-J. Physica A 382, 665-671 (2007).
3. Newman, M. E. J. Phys. Rev. Lett. 89, 208701 (2002).
4. Gleiser, P. M. preprint (2007).

Wednesday, August 22, 2007

“Here lies one whose name was writ in water…”

[It’s been pointed out to me that my commentary on the Homeopathy special issue on the memory of water, posted on the Nature news site, is now available only to subscribers. For shame. So here it is. This is the version I returned to the editors, but I’ve not checked what final small changes they might have added subsequently.]

A survey of evidence for the ‘memory’ of liquid water casts little light on its putative role in homeopathy.

I suspect it will be news to most scientists that Elsevier publishes a peer-reviewed journal called Homeopathy. I also suspect that many, on discovering this, would doubt there is anything published there that it would profit them to read. But I propose that such prejudices be put aside for the current special issue, released this Friday, which collects a dozen papers devoted to the ‘memory of water’ [1]. It’s worth seeing what they have to say – if only because that reveals this alleged phenomenon to be as elusive as ever.

The inability of water to act as a memorial was a well-known poetical trope before the poet John Keats chose as his epitaph the quotation that serves as a headline here; its ephemerality was noted by Heraclitus in the fifth century BC. But ‘the memory of water’ is a phrase now firmly lodged in the public consciousness – it even supplied the title for a recent play in London’s West End. Scientists, though, tend to side with the poets in rejecting any notion that water can hold lasting impressions. Indeed, Homeopathy’s editor, Peter Fisher of the Royal London Homeopathic Hospital, admits that the memory of water “casts a long shadow over homeopathy and is just about all that many scientists recall about the scientific investigation of homeopathy, equating it with poor or even fraudulent science.”

The term was coined by the French newspaper Le Monde in the wake of the 1988 Nature paper [2] that kicked off the whole affair. The lead author was the late Jacques Benveniste, head of a biomedical laboratory in Clamart run by the French National Institute of Health and Medical Research (INSERM).

Benveniste’s team described experiments in which antibodies stimulated an allergic response in human white blood cells called basophils, even when the antibody solutions were diluted far beyond the point where they would contain a single antibody molecule. The activity seemed to disappear and then reappear periodically during serial dilutions.
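The dilution arithmetic behind that statement is easy to check. A quick sketch, assuming a hypothetical 1 millimolar starting solution and serial 1:100 dilution steps (the ‘C’ potencies of homeopathy):

```python
AVOGADRO = 6.022e23  # molecules per mole

def molecules_per_litre(molar_conc, steps, factor=100):
    """Expected number of solute molecules left in a litre after
    repeated serial dilutions by the given factor."""
    return molar_conc * AVOGADRO / factor ** steps

c0 = 1e-3  # hypothetical 1 mM starting concentration
for steps in (6, 12, 30):  # the 6C, 12C and 30C potencies
    print(f"{steps}C: ~{molecules_per_litre(c0, steps):.2g} molecules per litre")
```

By 12C the expected count is already well below one molecule per litre; at 30C it is of order 10^-40, so a typical dose contains nothing of the original substance at all.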

The results seemed to offer some experimental justification for the use of such high-dilution remedies in homeopathy. But they defied conventional scientific understanding, specifically the law of mass action, which demands that the rates of chemical reactions be proportional to the concentrations of the reagents. How could this be? Benveniste and colleagues suggested that perhaps the antibody activity was ‘imprinted’ in some fashion on the structure of liquid water, and transferred with each dilution.

The idea made no sense in terms of what was known about the structure of water – but what prevented it from being dismissed straight away was that liquid water has a complicated molecular-scale structure that is still not perfectly understood. Water molecules associate by means of weak chemical bonds called hydrogen bonds. Although these mostly form and break on timescales of about a trillionth of a second, they appear to offer a vague possibility that water might form clusters of molecules with specific shapes and behaviours.

Benveniste’s experiments were investigated by a team of ‘fraud-busters’ led by Nature’s then editor John Maddox, who demanded that the studies be repeated under careful observation. Although Benveniste acquiesced (and the results proved utterly inconclusive), he complained of a witch-hunt. Certainly, it was an unprecedented act of scrutiny that not even the proponents of cold fusion a year later – another water-related pathology – had to endure.

In any event, the results were never unambiguously repeated by others. Benveniste, however, progressed from high-dilution experiments to the claim that the activity of biomolecules could be ‘digitally recorded’ and imprinted on water using radio waves. Until his death in 2004, he insisted that this would lead to a new age of ‘digital biology.’

There are many good reasons – too many to fit in this column – to doubt that water molecules in the liquid state could mimic the behaviour of antibodies or other complex biomolecules in a way that persists through dilution after dilution. As water expert José Teixeira, who bravely contributes a sceptic’s perspective to the Homeopathy special issue, says, “Any interpretation calling for ‘memory’ effects in pure water must be totally excluded.” But the idea won’t be squashed that easily, as some of the other papers show.

They report several experimental results that, at face value, are intriguing and puzzling. Louis Rey, a private researcher in Switzerland, reports that salt solutions show markedly different thermoluminescence signals, for different homeopathic dilutions, when frozen and then rewarmed. Bohumil Vybíral and Pavel Vorácek of the University of Hradec Králové in the Czech Republic describe curious viscosity changes in water left to stand undisturbed. And Benveniste’s collaborator Yolène Thomas of the Institut Andre Lwoff in France reports some of the results of radiofrequency ‘programming’ of water with specific biomolecular behaviour, including the induction of E. coli-like ‘signals’, the inhibition of protein coagulation, and blood vessel dilation in a guinea pig heart.

The volume is, in other words, a cabinet of curiosities. There is rarely even a token effort to explain the relevance of these experiments to the supposed workings of homeopathy, with its archaic rituals of shaking (‘succussion’) and ‘magic-number’ dilutions (one must always use factors of ten, and generally only specific ones, such as 100^6, 100^12 and 100^30). The procedures and protocols on display here are often unusual if not bizarre, because it seems the one thing you must not do on any account is the simplest experiment that would probe any alleged ‘memory’ effect: to look for the persistent activity of a single, well-defined agent in a simple reaction – say an enzyme or an inorganic catalyst – as dilution clears the solution of any active ingredient.

If that sounds bad, it is nothing compared with the level of theoretical discussion. This ‘field’ has acquired its own deus ex machina, an unsubstantiated theory of ‘quantum coherent domains’ in water proposed in 1988 [3] that is vague enough to fit anything demanded of it. Aside from that, the ‘explanations’ on offer seem either to consider that water physics can be reinvented from scratch by replacing decades of careful research with wishful thinking, or they call on impurities to perform the kind of miraculous feats of biomolecular mimicry and replication that chemists have been striving to achieve for many years.

The French philosopher Gaston Bachelard once wrote: “We attribute to water virtues that are antithetic to the ills of a sick person. Man projects his desire to be cured and dreams of a compassionate substance.” On this evidence, that dream is as strong as ever.

1. Homeopathy 96, 141-226 (2007).
2. Davenas, E. et al. Nature 333, 816 (1988).
3. Del Giudice, E. et al. Phys. Rev. Lett. 61, 1085 (1988).

Tuesday, August 21, 2007

After the flood

[This is the pre-edited version of my Lab Report column for the September issue of Prospect.]

Can there be a pub in the country that has not witnessed some sage shaking his head over his pint and opining “Well, if you will build on a flood plain…”? These bar-room prophets are, as usual, merely parroting the phrases they have heard from Westminster, where flood plains have become the talk of the House. “Gordon Brown has to accept the inconvenient truth that if you build houses on flood plains it increases the likelihood that people will be flooded”, says shadow local government secretary Eric Pickles. But the chief executive of the National Housing Federation counters that “there’s simply no way we can’t build any more new homes because of concerns about flood plains… much of the country is a flood plain.”

But what exactly is a flood plain? Perhaps the most pertinent answer is that it is a reminder that rivers are not, like canals, compelled to respect fixed boundaries. They are, in fact, not things at all, but processes. Surface water flow from rain or snowmelt, erosion, and sediment transport combine to produce a river channel that constantly shifts, redefining its own landscape. The meanders gradually push back the surrounding hill slopes and smooth out a broad, flat valley floor, thick with fertile sediment: the perfect setting for agrarian settlements, or so it seems. The catch is that when the river waters rise above the banks, there is nothing to hold them back from washing across this wide plain. Levees may try, but they struggle against the fact that a river’s curves are always slowly on the move: the Mississippi shifts its course by up to 20 m a year. One of the fundamental problems of building near rivers is that buildings stay put, but rivers don’t.

What’s the solution? To judge from recent events, it hasn’t changed much in a hundred years: you pile up sandbags. But some precautions are still little heeded: replacing soil with concrete exacerbates the dangers by increasing runoff, and the inadequacies of Britain’s Victorian drainage system are no secret. There’s nothing particularly sophisticated about flood defence: it’s largely a question of installing physical barriers and gates. But permanent walls can create conflicts with access and amenity – no one would tolerate a three-foot wall all along the Thames. And some areas are simply impossible to protect this way. So there’s no real call for new science or technology: it’s more a matter of recognizing that flood threats now have to be considered routine, not once-in-a-lifetime risks.

The UK floods were the worst for 60 years, and claimed at least nine lives. But the tribulations of a soggy summer in Gloucester are put in perspective by the situation in Asia. In China, heavy rainfall in the north brought flooding to the Yangtze, and the combined effects of storms have affected one tenth of the population. In a reversal of the usual situation, in which the parched north envies the moist south, a heatwave in the southern provinces has left more than a million people short of drinking water. Meanwhile, an unusually intense monsoon has devastated parts of India and Bangladesh, killing more than 2,000 people, displacing hundreds of thousands from their homes and affecting millions more. A map of the flooded areas of Bangladesh is almost surreal, showing more than half the country ‘under water’.

There’s little new in this, however. Low-lying Bangladesh floods to some degree most years. The Yellow River, commonly known as China’s Sorrow, has brought recurrent catastrophe to the country’s Great Plain well over a thousand times in history, despite herculean efforts to contain its flow with dikes. A flood in 1887-8 created a lake the size of Lake Ontario and, one way or another, killed an estimated six million people.

But perhaps surprisingly, some in China have been more ready than in the West to blame the recent events on global warming. Dong Wenjie, director-general of the Beijing Climate Centre, claims that the frequency and intensity of extreme weather events are increasing, and that this “is closely associated with global warming.” Well, maybe. No single event can itself be interpreted one way or the other. The most one can really say is that it is in line with what global warming predicts, as the hydrological cycle that moves water between the seas and skies intensifies – although that by no means implies more rain everywhere. That regional variation, in fact, was a central component of the recent claim by scientists to have detected the influence of global warming on 20th-century rainfall: computer models predict that this influence has a particular geographical fingerprint, which has now been identified in the data. It’s a clear sign that predictions of more extreme weather to come – droughts as well as floods – need to be taken seriously.

One question so far given rather little consideration is what this implies for the major hydraulic engineering projects underway in Asia. Ten years ago, specialists in water-resource management were predicting that the problems evident with existing big projects, such as the Aswan Dam on the Nile, might curtail the era of mega-dams and suchlike. Now that looks unlikely: China’s Three Gorges dam is basically complete, and both China and India seem set on ambitious and controversial schemes to transfer waters between their major rivers. The South-North Water Diversion Project in China is scheduled to deliver water to Beijing in time for the Olympics from over 1,000 km away, while the massive Interlinking Rivers project in India would convert the entire country into a grid of waterways controlled by dams, with the aim of alleviating both flooding and drought.

Both of these schemes are already fraught with economic, environmental, social and scientific questions. The prospect of greater variability and more extremes of rainfall can only make the issues more uncertain, and prompts us to shed the illusion that we understand what rivers can and will do.

Sunday, August 19, 2007

The Hydra lives:
more on homeopathy

There’s no rest for the wicked, it seems. My wickedness was to voice criticisms, here and on the Nature site, of a collection of papers on the ‘memory of water’ published in the journal Homeopathy, and I return from holiday to find many responses (see Nature’s weblog and the comments on my article below) to attend to. So here goes.

I am gratified that I found the right metaphor: the ‘memory of water’ does indeed seem to be a many-headed Hydra on which new heads appear as fast as you can lop them off. I’ve discussed several of the papers in the journal, but it seems that I’m being called upon to address them all. Peter Fisher complains that I don’t discuss the experiments at all, but surely he must now know about my Nature column, in which I say:
“These papers report several experimental results that, at face value, are intriguing and puzzling. Louis Rey, a private researcher in Switzerland, reports that salt solutions show markedly different thermoluminescence signals, for different homeopathic dilutions, when frozen and then rewarmed. Bohumil Vybíral and Pavel Vorácek of the University of Hradec Králové in the Czech Republic describe curious viscosity changes in water left to stand undisturbed. And Benveniste's collaborator Yolène Thomas, of the Andre Lwoff Institute in Villejuif, outside Paris, reports some of the results of radiofrequency 'programming' of water with specific biomolecular behaviour, including the induction of Escherichia coli -like 'signals', the inhibition of protein coagulation, and blood-vessel dilation in a guinea pig heart.”

To do a thorough analysis of all the papers would require far more words than I can put into a Nature news article, or could reasonably post even on my own blog (the original piece below already ran to over 2000 words). The problem is that, as I’ve said before, the devil is in the details – and there are a lot of details.

Let me illustrate that with reference to Rustum Roy’s paper (Rao et al.), which Martin Chaplin, Dana Ullman (apologies for the gender confusion) and Rustum himself all seem keen that I talk about. I’m all too happy to acknowledge Rustum’s credentials. I have the highest respect for his work, and in fact I once attempted to organize a symposium for a Materials Research Society meeting with him on the ethics of that topic (something that was shamefully declined by the MRS, of which I am otherwise a huge fan, on the grounds that it would arouse too much controversy).

The paper is hard to evaluate on its own – it indicates that the full details will be published elsewhere. The key experimental claim is that the UV-Vis spectra of different remedies (Natrum muriaticum and Nux vomica) are distinguishable not only from one another but also among the different potencies (6C, 12C, 30C) of each remedy. That is surprising if, chemically speaking, the solutions are all ‘identical’ mixtures of 95% ethanol in water. But are they? Who knows. There is no way of evaluating that here. There is no analysis of chemical composition – it looks as though the remedies were simply bought from suppliers and not analysed by any means other than those reported. So I find this a really odd set of experiments: in effect, someone hands you a collection of bottles without any clear indication of what’s in them, you conduct spectroscopy on them and find that the spectra are different, and then you conclude, without checking further, that the differences cannot be chemical. If indeed these solutions are all nominally identical ethanol solutions that differ only in the way they have been prepared, these findings are hard to explain. But this paper alone does not make that case – it simply asks us to believe it. One does not have to be a resolute sceptic to demand more information.

There is a troubling issue, however. In searching around to see what else had been written about this special issue, I came across a comment on Paul Wilson’s web site suggesting that the comparisons shown in Figures 1 and 2 are misleading. In short, the comparisons of spectra of Nat mur and Nux vom in Figure 1 are said to be “representative”. Figure 2, meanwhile, shows the range of variation for 10 preparations of each of these two remedies. But the plot for Nat mur in Figure 1 corresponds to the lowest boundary of the range shown in Figure 2, while the plot for Nux vom corresponds to the uppermost boundary. In other words, Figure 1 shows not representative spectra at all, but the two that are the most different out of all 10 samples. I have checked this suggestion for myself, and found it to be true, at least for the 30C samples. I may simply be misunderstanding something here, but if not, it’s hard not to see this aspect of the paper as very misleading, whatever the explanation for it. Why wasn’t it picked up in peer review?

I’m not going to comment at any length on the hypotheses put forward in Rao et al., because they aren’t in any way directly connected to the experiments, and so there’s simply no support for them at all in the results. I don’t see for a moment, however, either how these hypotheses can be sustained in their own right, or (less still) how they can explain any physiological effects of the remedies.

I don’t, as Rustum implies, demand an explanation for alleged ‘memory of water’ effects before I will accept them as genuine – I agree that experiment should take primacy. I merely want to point out that the ‘explanations’ on offer do not offer much cause to think that a great deal of critical thinking is going on here. Rustum is perhaps right to suggest that I may have been too acquiescent to the Nature news editor’s erudite suggestion for the title of my column. I’m not sure, however, that Keats really meant to imply that his name would so soon be forgotten…

On the Nature site, George Vithoulkas gives me great delight, for it seems that homeopaths aren’t even agreed enough about how their remedies are supposed to work to distinguish ‘evidence’ for a mechanism from its opposite. My only other comment in this regard is to use Vithoulkas’s comment to point out that the common attribution of this ‘like cures like’ notion, as a general principle of medicine, to Paracelsus is wrong (not that this would give it any greater credibility!).

OK, am I excused now?

The Faculty of Homeopathy has just issued the following rejoinder to Richard Dawkins' TV programme last night in which he exposed the lack of scientific credibility of homeopathy. This special issue of Homeopathy on the memory of water has been cited as 'evidence' that there is some scientific weight to the field after all. This is exactly what I knew would happen: the mere fact of the papers' existence will now be used to defend homeopathy as a science. Let's hope that some people, at least, will be moved to examine the quality of that evidence.
I hope to comment on Richard's series in a later post. It was nice to see him being more charming and less bristling than he tends to be when talking about religion - his points carry much more force this way.

Statement in response to The Enemies of Reason - “The Irrational Health Service” Channel 4, Monday 20 August

The Faculty of Homeopathy and British Homeopathic Association support an easily understood approach to difficult scientific issues. However, Professor Richard Dawkins’ Channel 4 programme “The Irrational Health Service” presented an unbalanced and biased picture of the facts and evidence about homeopathy.

Contrary to the impression given by the programme, there has never been more evidence for the effectiveness of homeopathy than now: This comes from audits and outcome studies, cost effectiveness studies, narrative medicine and statistical overviews (or meta-analyses). Four out of five meta-analyses of homeopathy as a whole show positive effect for homeopathy, as do several focusing on specific conditions.

There is also an increasing body of work about the scientific properties of highly diluted substances, which Professor Dawkins dismissed. The most recent issue of the Faculty of Homeopathy’s journal Homeopathy contains articles by scientists from around the world, which are a timely reminder about how much there is still to learn about the science of these dilutions. The outright dismissal of any potential activity of these substances is increasingly untenable.

Thursday, August 09, 2007

Chemistry in pictures

Joachim Schummer and Tami Spector have just published in Hyle a paper based on their presentation at the 2004 meeting ‘The Public Images of Chemistry’ in Paris. This was one of the most interesting talks of the conference, looking at how the images used to portray chemists and their profession both by themselves and by others over the past several centuries have influenced and been influenced by public perceptions. They look at tropes drawn (often subconsciously) from aesthetics in the visual arts, and at how the classic ‘brochure’ photos of today often still allude to the images of flask-gazing cranks found in depictions of alchemists and derived from uroscopy. (See, for example, the logo for my ‘Lab Report’ column in Prospect.) I shamelessly plagiarize these ideas at every opportunity. Recommended reading.

Monday, August 06, 2007

A wardrobe for Mars

[This is my Material Witness column for the September issue of Nature Materials.]

No one has a date booked for a party on the moon or on Mars, but that hasn’t stopped some from thinking about what to wear. One thing is clear: there is nothing fashionably retro about the Apollo look. If, as seems to be the plan, we are going out there this time to do some serious work, the bulky gas bags in which Alan Shepard and his buddies played golf and rode around in buggies aren’t up to the job. Pressurized with oxygen, the suits could be bent at the arm and leg joints only with considerable effort. A few hours of lunar hiking and you’d be exhausted.

In comparison, the fetching silver suits worn for the pre-Apollo Mercury missions look almost figure-hugging. But that’s because they were worn ‘soft’ – the astronauts didn’t venture outside their pressurized cabins, and the suits would have inflated only in the event of a pressure loss. In the vacuum of space, high pressure is needed to prevent body fluids from boiling.

But pressurization is only part of the problem. Space-suit design presents a formidable, multi-faceted materials challenge. The solution has to involve a many-layered skin – sometimes more than a dozen layers, each with a different function. This makes the suit inevitably bulky and expensive.

While the Mercury suits were basically souped-up high-altitude pilots’ suits, made from Neoprene-coated and aluminized nylon, today’s spacewear tends to follow the Apollo principle of several distinct garments worn in layers. A liquid cooling and ventilation garment (LCVG) offers protection from temperatures that can reach 135 °C in the Sun’s glare, while allowing body moisture to escape; a pressure suit (PS) acts as a gas-filled balloon; and a thermomechanical garment (TMG) protects against heat loss, energetic radiation, puncture by micrometeoroids, and abrasion.

These suits initially made use of the materials to hand, but inevitably this produced a degree of ‘lock-in’, with ‘tradition’ dominating materials choices rather than their being reconsidered with each redesign. Some Apollo materials, such as the polyimide Kapton and the polyamide Kevlar, are still used – Kapton’s rigidity and low gas permeability recommend it for containing the ballooning PS, while Kevlar’s strength is still hard to beat for the TMG. But not all the choices are ideal: a Spandex LCVG has rather poor wicking and ventilation properties. Indeed, a reanalysis from scratch suggests superior replacements for most of the ‘traditional’ materials (J. L. Marcy et al., J. Mater. Eng. Perform. 13, 208; 2004; see paper here).

Space suits have increased in mass since Apollo, because they are now used in zero rather than lunar gravity. But martian gravity is a third that of Earth. To improve suit flexibility and reduce mass, Dava Newman and coworkers at the Massachusetts Institute of Technology are reconsidering the basic principles: using tight-fitting garments rather than gas to exert pressure, while strengthening with a stiff skeleton along lines of non-extension. These BioSuits are several years away from being ready for Mars – but there’s plenty of time yet to prepare for that party.

Friday, August 03, 2007

A bad memory

I have just read all the papers on ‘the memory of water’ published in a special issue of the journal Homeopathy, which will be released in print on 10 August. Well, someone had to do it. I rather fear that my response, detailed below, will potentially make some enemies of people with whom I’ve been on friendly terms. I hope not, however. I hope they will respect my right to present my views as much as I respect theirs. But I felt my patience being eroded as I waded through this stuff. Might we at least put to rest now the tedious martyred rhetoric about ‘scientific heresy’, which, from years of unfortunate experience, I can testify to being the badge of the crank? I once tried to persuade Jacques Benveniste of how inappropriate it was to portray a maverick like John Maddox as a pillar of the scientific establishment – but he wouldn’t have it, I suppose because that would have undermined his own platform. Ah well, here’s the piece, a much shortened version of which will appear in my Crucible column in the September issue of Chemistry World.


I met Jacques Benveniste in 2004, shortly before he died. He had tremendous charm and charisma, and I rather liked him. But I felt then, and still feel now, that in ‘discovering’ the so-called memory of water he lost his way as a scientist and was sucked into a black hole of pseudoscience that was just waiting for someone like him to come along.

This particular hole is, of course, homeopathy. In 1988, Benveniste published a paper in Nature that seemed to offer an explanation for how homeopathic remedies could retain their biological activity even after being diluted so much that not a single molecule of the original ‘active’ ingredients remains [1]. It is common for homeopathic remedies to have undergone up to 200 tenfold dilutions of the original ‘mother tincture’, which is quite sufficient to wash away even the awesome magnitude of Avogadro’s constant.
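The arithmetic behind that claim is simple enough to sketch. Here is a rough back-of-envelope calculation in Python, assuming (very generously) that the mother tincture starts with a full mole of active molecules:

```python
# Back-of-envelope dilution arithmetic: start, generously, with a full mole
# of active molecules; each tenfold dilution divides the expected count by 10.
AVOGADRO = 6.022e23

def expected_molecules(tenfold_dilutions, initial=AVOGADRO):
    """Expected number of active molecules left after n tenfold dilutions."""
    return initial / 10**tenfold_dilutions

# Avogadro's constant is exhausted after a mere two dozen tenfold dilutions:
print(expected_molecules(24))   # already below one molecule
# ...never mind the 200 dilutions used for many remedies:
print(expected_molecules(200))  # effectively zero
```

At 24 tenfold dilutions the expected count already drops below a single molecule; at 200 it is of order 10^-177 – there is, in short, nothing left to have a memory of anything.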

Benveniste and his coworkers studied the effect of dilution of an antibody that stimulates human immune cells called basophils to release histamine – a response that can provoke an allergic reaction. In effect, the antibody mimics an allergen. The researchers reported that the antibody retains its ability to provoke this response even when diluted by 10^60 – and, even more oddly, that this activity rises and falls more or less periodically with increasing dilution.

The paper’s publication in Nature inevitably sparked a huge controversy, which turned into a media circus when Nature’s then editor John Maddox led an investigation into Benveniste’s laboratory techniques. Several laboratories tried subsequently to repeat the experiment, but never with unambiguous results. The experiment proved irreproducible, and came to be seen as a classic example of what US chemist Irving Langmuir christened ‘pathological science’. (The details are discussed in my book on water [2], or you can read Michel Schiff’s book [3] for a deeply partisan view from the Benveniste camp.)

Benveniste remained convinced of his results, however, and continued working on them in a privately funded lab. He eventually claimed that he could ‘programme’ specific biological activity into pure water using electromagnetic radiation. He predicted a forthcoming age of ‘digital biology’, in which the electromagnetic signatures of proteins and other biological agents would be digitally recorded and programmed into water from information sent down phone lines.

Homeopaths have persistently cited Benveniste’s results as evidence that their treatments do not necessarily lack scientific credibility. Such claims have now culminated in a special issue of the journal Homeopathy [4] that presents a dozen scientific papers on the ‘memory of water.’

In at least one sense, this volume is valuable. The memory of water is an idea that refuses to go away, and so it is good to have collected together all of the major strands of work that purport to explain or demonstrate it. The papers report some intriguing and puzzling experimental results that deserve further attention. Moreover, the issue does not duck criticism, including a paper from renowned water expert José Teixeira of CEA Saclay in France that expresses the sceptic’s viewpoint. Teixeira points out that any explanation based on the behaviour of pure water “is totally incompatible with our present knowledge of liquid water.”

But perhaps the true value of the collection is that it exposes this field as an intellectual shambles. Aware that I might hereby be making enemies of some I have considered friends, I have to say that the cavalier way in which ‘evidence’ is marshalled and hypotheses are proposed with disregard for the conventions of scientific rigour shocked even me – and I have been following this stuff for far too long.

Trying to explain homeopathy through some kind of aqueous ‘memory’ effect has plenty of problems created by the traditions of the field itself, in which ‘remedies’ are prepared by serial dilution and vigorous shaking, called succussion. For example, it is necessary not only that the memory exists but that it is amplified during dilution. In his overview paper, guest editor Martin Chaplin, a chemist at South Bank University in London whose web site on water is a mine of valuable information, points to the surprising recent observation that some molecules form clusters of increasing size as they get more dilute. But this, as he admits, would imply that most homeopathic solutions would be totally inactive, and only a tiny handful would be potent.

Another problem, pointed out by David Anick of the Harvard Medical School and John Ives of the Samueli Institute for Information Biology in Virginia, is that if we are to suppose the ‘memory’ to be somehow encoded in water’s structure, then we must accept that there should be many thousands of such stable structures, each accounting for a specific remedy – for several thousand distinct remedies are marketed by homeopathic companies, each allegedly distinct in its action.

Yet another difficulty, seldom admitted by homeopaths, is that the dilutions of the mother tincture must allegedly be made by factors of ten and not any other amount. This is not mentioned in the papers here, presumably because it is too absurd even for these inventive minds to find an explanation. A related issue that is addressed by Anick is the tradition of using only certain dilution factors, such as 10^6, 10^12, 10^30 and 10^200. He offers a mathematical model for why this should be so that masquerades as an explanation but is in fact tantamount to a refutation: “it would be inconceivable”, he says, “that one number sequence would work in an ideal manner for every mother tincture.” Still, he concludes, the convention might be ‘good enough’. So why not test whether it makes any difference at all?

One of the challenges in assessing these claims is that they tend to play fast and loose with original sources, which obliges you to do a certain amount of detective work. For example, Chaplin states that the ability of enzymes to ‘remember’ the pH of their solvent even when the water is replaced by a non-aqueous solvent implies that the hydrogen ions seem to have an effect in their absence, “contrary to common sense at the simplistic level.” But the paper from 1988 in which this claim is made [5] explains without great ceremony that the ionizable groups in the enzyme simply retain their same ionization state when withdrawn from the aqueous solvent and placed in media that lack the capacity to alter it. There’s no mysterious ‘memory’ here.

Similarly, Chaplin’s comment that “nanoparticles may act in combination with nanobubbles to cause considerable ordering within the solution, thus indicating the possibility of solutions forming large-scale coherent domains [in water]” is supported by a (mis-)citation to a paper that proposes, without evidence, the generally discredited idea of ‘ice-like’ ordering of water around hydrophobic surfaces.

One of the hypotheses for water’s ‘memory’, worked out in some detail by Anick and Ives, invokes the dissolution of silicate anions from the glass walls of the vessel used for dilution and succussion, followed by polymerization of these ions into a robust nanostructured particle around the template of the active ingredient initially present. Certainly, silicate does get added, in minute quantities, to water held in glass (this seemed to be one of the possible explanations for another piece of water pathological science, polywater [6]). But how to progress beyond there, particularly when such a dilute solution favours hydrolysis of polysilicates over their condensation?

Well, say Anick and Ives, there are plenty of examples of silicate solutions being templated by solutes. That’s how ordered mesoporous forms of silica are synthesized in the presence of surfactants, which aggregate into micelles around which the silica condenses [7]. This, then, wraps up that particular part of the problem.

But it does nothing of the sort. This templating has been seen only at high silicate concentrations. It happens when the template is positively charged, complementary to the charge on the silicate ions. The templating gives a crude cast, very different from a biologically active replica of an enzyme or an organic molecule. Indeed, why on earth would a ‘negative’ cast act like the ‘positive’ mould anyway? The template is in general encapsulated by the silica, and so doesn’t act as a catalyst for the formation of many replicas. And for this idea to work, the polysilicate structure has to be capable of reproducing itself once the template has been diluted away – and at just the right level of replicating efficiency to keep its concentration roughly constant on each dilution.

The last of these requirements elicits the greatest degree of fantastical invention from the authors: during the momentary high pressures caused by succussion, the silicate particles act as templates that impose a particular clathrate structure on water, which then itself acts as a template for the formation of identical silicate particles, all in the instant before water returns to atmospheric pressure. (Elsewhere the authors announce that “equilibrium of dissolved [silicate] monomers with a condensed silica phase can take months to establish.”) None of this is meanwhile supported by the slightest experimental evidence; the section labelled ‘Experiments to test the silica hypothesis’ instead describes experiments that could be done.

Another prominent hypothesis for water’s memory draws on work published in 1988 by Italian physicists Giuliano Preparata and Emilio Del Giudice [8]. They claimed that water molecules can form long-ranged ‘quantum coherent domains’ by quantum entanglement, a phenomenon that makes the properties of quantum particles co-dependent over long ranges. Entanglement certainly exists, and it does do some weird stuff – it forms the basis of quantum computing, for example. But can it make water organize itself into microscopic or even macroscopic information-bearing domains? Well, these ‘quantum coherent domains’ have never been observed, and the theory is now widely disregarded. All the same, this idea has become the deus ex machina of pathological water science, a sure sign that the researchers who invoke it have absolutely no idea what is going on in their experiments (although one says such things at one’s peril, since these researchers demonstrated a litigious tendency when their theory was criticized in connection with cold fusion).

Such quantum effects on water’s memory are purportedly discussed in the special issue by Otto Weingärtner of Dr Reckeweg & Co. in Bensheim, Germany – although the paper leaves us none the wiser, for it contains neither experiments nor theory that demonstrate any connection with water. The role of entanglement is made more explicit by Lionel Milgrom of Imperial College in London, who says that “the homeopathic process is regarded as a set of non-commuting complementary observations made by the practitioner… Patient, practitioner, and remedy comprise a three-way entangled therapeutic entity, so that attempting to isolate any of them ‘collapses’ the entangled state.” In other words, this notion is not really about quantum mechanics at all, but quantum mysticism.

Benveniste’s long-term collaborator Yolène Thomas of the Institut Andre Lwoff in Villejuif argues, reasonably enough, that in the end experiment, not theory, should be the arbiter. And at face value, the ‘digital biology’ experiments that she reports are deeply puzzling. She claims that Benveniste and his collaborators accumulated many examples of biological responses being triggered by the digitized radiofrequency ‘fingerprints’ of molecular substances – for example, tumour growth being inhibited by the ‘Taxol signal’, the lac operon genetic switch of bacteria being flipped by the signal from the correct enantiomeric form of arabinose, and vascular dilation in a guinea pig heart being triggered by the signal from the classic vasodilator acetylcholine. What should one make of this? Well, first, it is not clear why it has anything to do with the ‘memory of water’, nor with homeopathy. But second, I can’t help thinking that these experiments, however sincere, have an element of bad faith about them. If you truly believe that you can communicate molecular-recognition information by electromagnetic means, there is no reason whatsoever to study the effect using biological systems as complex as whole cells, let alone whole hearts. Let’s see it work for a simple enzymatic reaction, or better still, an inorganic catalyst, where there is far less scope for experimental artefacts. It is hard to imagine any reason why such experiments have not been attempted, except for the reason that success or failure would be less ambiguous.

What emerges from these papers is an insight into the strategy adopted more or less across the board by those sympathetic to the memory of water. They begin with the truism that it is ‘unscientific’ to simply dismiss an effect a priori because it seems to violate scientific laws. They cite papers which purportedly show effects suggestive of a ‘memory’, but which often on close inspection do nothing of the kind. They weave a web from superficially puzzling but deeply inconclusive experiments and ‘plausibility arguments’ that dissolve the moment you start to think about them, before concluding with the humble suggestion that of course all this doesn’t provide definitive evidence but shows there is something worth further study.

One has to conclude, after reading this special issue, that you can find an ‘explanation’ at this level for water’s memory from just about any physical phenomenon you care to imagine – dissipative non-equilibrium structures, nanobubbles, epitaxial ordering, gel-like thixotropy, oxygen free radical reactions… In each case the argument leaps from vague experiments (if any at all) to sweeping conclusions that typically take no account whatsoever of what is known with confidence about water’s molecular-scale structure, and which rarely address themselves even to any specific aspect of homeopathic practice. The tiresome consequence is that dissecting the idea of the memory of water is like battling the many-headed Hydra, knowing that as soon as you lop off one head, another will sprout.

In his original paper in Nature, Jacques Benveniste offered a hypothesis for how the memory effect works: “specific information must have been transmitted during the dilution/shaking process. Water could act as a template for the [antibody] molecule, for example by an infinite hydrogen-bonded network or electric and magnetic fields.” Read these sentences carefully and you will perhaps decide that Benveniste missed his calling as a post-modernist disciple of his compatriot Jacques Derrida. It has no objective meaning that I can discern. It sounds like science, but only because it copies the contours of scientific prose. This, I would submit, is a fair metaphor for the state of ‘water memory’ studies today.

I once read a book supposedly about the philosophy of religion which was in fact an attempt to make a logical case for God’s existence. Having stepped through all of the traditional arguments – the ontological, the argument from design and so forth – the author admitted that all of them had significant flaws, but concluded that collectively they made a persuasive case. This group of papers is similar, implying that a large enough number of flimsy arguments add up to a single strong one. It leaves me feeling about homeopathy much as I do about religion: those who find it genuinely helpful are right to use it, but they shouldn’t try to use scientific reason to support their decision.

1. E. Davenas et al., Nature 333, 816 (1988).
2. P. Ball, H2O: A Biography of Water (Weidenfeld & Nicolson, 1999).
3. M. Schiff, The Memory of Water (Thorsons, 1995).
4. Homeopathy 96, 141-226 (2007).
5. A. Zaks & A. Klibanov, J. Biol. Chem. 263, 3194 (1988).
6. F. Franks, Polywater (MIT Press, Cambridge, MA, 1981).
7. C. T. Kresge et al., Nature 359, 710 (1992).
8. E. Del Giudice et al., Phys. Rev. Lett. 61, 1085 (1988).

Wednesday, August 01, 2007

Pay your money and take your chances
[This is the pre-edited version of my latest muse article for]

Fatalities are an inevitable part of human spaceflight, and space tourism companies will have to face up to that fact.

The tragic deaths of three workers in an explosion at the Mojave Air and Space Port in California, while testing a rocket propulsion system for a privately funded spacecraft, shouldn’t be seen as the first fatalities of commercial spaceflight. This was an industrial accident, not a failure of aerospace engineering.

All the same, the accident will surely provoke questions about the safety of space tourism. The victims worked for Scaled Composites, a company that has been commissioned to make a new spacecraft for Richard Branson’s Virgin Galactic space-tourism enterprise. Virgin has announced the intention of launching the first commercial space flights in 2009.

Scaled Composites is run by entrepreneur Burt Rutan, whose SpaceShipOne became the first privately funded craft to reach space in 2004, winning the $10 m Ansari X Prize created to stimulate private manned spaceflight technology. Virgin Galactic aims to use a successor, SpaceShipTwo, to take space tourists 62 miles up into sub-orbital space at a cost of around £100,000 ($200,000) each.

Other aerospace engineers have been keen to emphasize that the accident (which seems to have been caused by a component of rocket fuel) does not reflect on the intrinsic safety of space flight. They are right in a sense, although the incident seems likely to set back Virgin’s plans. Nevertheless, it is a reminder that rocket science is potentially lethal – and not just in flight. Three US astronauts died in a fire during supposedly routine launch-pad tests for the Apollo 1 mission in 1967.

Virgin insists that “safety is at the heart of the design” of their space tourism programme. Perhaps it is now time to ask what this might mean – or more precisely, how the issue of safety in commercial space travel can be reconciled with its economic viability, accessibility, and projected traffic volume.

These factors make up a complex equation, and it is fair to say that no one yet has shown clearly how it might be solved. What, in short, is the business model for space tourism?

So far, the marketing strategy has relied on rhetoric that sounds stirring but which makes it just as well these companies do not need to seek a start-up bank loan. The vision simply isn’t coherent.

On the one hand, there is the pretence of democratizing space. While governments have jealously kept spaceflight in the hands of a closed elite, says the X Prize Foundation, commercial spacecraft will make it available to everyone. Virgin Galactic is not motivated by quite the same anti-government libertarianism, but does suggest that “safety and cost issues [have] previously made space travel the preserve of the privileged few.”

All of this, of course, sits uneasily with the fact that the only space tourists so far have been multi-millionaires, and that a $200,000-per-head ticket price does not exactly fall within the range of your average family holiday.

Ah, but that will change as the industry grows, says Peter Diamandis, chairman of the X Prize Foundation. “Over the next decade we’ll see the price of seats drop from $200 K to $50 K, and perhaps as low as $25 K per person”, he says. That’s more expensive than a luxury cruise, admittedly, but many might consider it for a once-in-a-lifetime experience.

I’ve yet to see a convincing explanation of the economics, however. Diamandis has outlined the sums on the basis that “the cost of operating a mature transportation system (car, train, plane) is typically three times the cost of the fuel.” But one of the reasons the Space Shuttle is so cripplingly expensive is that the inspections and repairs needed after each flight are on a quite different scale from those of airlines. And, one has sadly to add, even then they are evidently flawed.

Even if the business model can be made to work, it will clearly need to depend initially on rich thrill-seekers. But the early days of every new transportation technology have been hazardous, aviation especially so. Safety has tended to be a luxury afforded only once the industry is established.

The current history of manned spaceflight bears this out. As of 2003, 18 of the 430 humans who had flown in space died in accidents: a fatality rate of about 4 per cent (although the precise figures can be debated because of multiple flights by individuals). That’s comparable to the risk of dying in an Everest expedition. The odds haven’t stopped (mostly rich) people from scaling Everest, but former US astronaut Rick Hauck says that he wouldn’t have flown if he’d known what his chances of coming back alive were.

Looked at another way, manned spaceflight has so far proved to be 45,000 times more dangerous than taking a commercial air flight. It is perhaps unfair to compare craft like SpaceShipTwo with Apollo missions – SpaceShipOne has been compared instead to the US’s experimental X-15 rocket plane, which had only one crash in 199 flights. But however you look at it, Virgin Galactic is inventing a new technology, while Virgin Atlantic had decades of experience to draw on.
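For what it’s worth, the arithmetic behind those figures is easy to reproduce. Here is a quick sketch in Python, taking the 2003 tallies above at face value and crudely treating the per-person death toll as a per-flight risk:

```python
# Reconstructing the quoted risk figures from the 2003 tallies cited above.
deaths, flyers = 18, 430
spaceflight_risk = deaths / flyers
print(f"Spaceflight fatality rate: {spaceflight_risk:.1%}")  # roughly 4 per cent

# If spaceflight is ~45,000 times riskier than a commercial air flight,
# the implied airline fatality risk per flight is of order one in a million:
airline_risk = spaceflight_risk / 45_000
print(f"Implied airline risk per flight: {airline_risk:.1e}")
```

The precise numbers shift depending on whether you count per person or per flight, but the orders of magnitude are what matter here.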

Who cares, advocates of human space travel will respond. Without risk, we’d never achieve anything. “It’s the dreamers, it’s the doers, it’s the furry mammals who are evolved, take the risks, or die”, says Diamandis. “That’s what we stand for.”

But wait a minute. Are you saying that space tourism will put safety first, or that it depends on the bravery of do-or-die pioneers? Either will play well to a particular audience, but you can’t have it both ways. If the argument is that a few foolhardy fat cats must put their lives on the line so that the industry can ultimately become cheap and safe enough to reach a mass market, so be it. But somehow, I can’t see that sales pitch working.