Thursday, December 22, 2011

400 years of snowflakes


Here is the pre-edited version of my In Retrospect piece for Nature celebrating the 400th anniversary of Kepler’s seminal little treatise on snowflakes.
_________________________________________________________________

Did anyone ever receive a more exquisite New Year’s gift than the German scholar Johannes Matthäus Wackher von Wackenfels, four hundred years ago? It was a booklet of just 24 pages, written by his friend Johannes Kepler, court mathematician to the Holy Roman Emperor Rudolf II in Prague. The title was De nive sexangula (On the Six-Cornered Snowflake), and herein Kepler attempted to explain why snowflakes have this striking hexagonal symmetry. Not only is the booklet charming and witty, but it seeded the notion from which all of crystallography blossomed: that the geometric shapes of crystals can be explained in terms of the packing of their constituent particles.

Like Kepler, Wackher was a self-made man of humble origins whose brilliance earned him a position in the imperial court. By 1611 he had risen to the position of privy councillor, and was a man of sufficient means to act as Kepler’s sometime patron. Sharing an interest in science, he was also godfather to Kepler’s son and in fact a distant relative of Kepler himself. It is sometimes said that Kepler’s booklet was in lieu of a regular gift which the straitened author, who frequently had to petition Rudolf’s treasury for his salary, could not afford. In his introduction, Kepler says he had recently noticed a snowflake on the lapel of his coat as he crossed the Charles Bridge in Prague, and had been moved to ponder its remarkable geometry.

Kepler came to the imperial court in 1600 as an assistant to the Danish astronomer Tycho Brahe. When Tycho died the following year, Kepler became his successor, eagerly seizing the opportunity to use Tycho’s incomparable observational data to deduce the laws of planetary motion that Isaac Newton’s gravitational theory later explained.

Kepler’s analysis of the snowflake comes at an interesting juncture. It unites the older, Neoplatonic idea of a geometrically ordered universe that reflects God’s wisdom and design with the emerging mechanistic philosophy, in which natural phenomena are explained by proximate causes that, while they may be hidden or ‘occult’ (like gravity), are not mystical. In Mysterium Cosmographicum (1596) Kepler famously concocted a model of the cosmos with the planetary orbits arranged on the surfaces of nested polyhedra, which now looks like sheer numerology. But unlike Tycho, he was a Copernican and came close to formulating the mechanistic gravitational model that Newton later developed.

Kepler was not by any means the first to notice that the snowflake is six-sided. This is recorded in Chinese documents dating back to the second century BCE, and in the Western world the snowflake’s ‘star-like’ forms were noted by Albertus Magnus in the thirteenth century. René Descartes included drawings of sixfold stars and ice ‘flowers’ in his meteorological book Les Météores (1637), while Robert Hooke’s microscopic studies recorded in Micrographia (1665) revealed the elaborate, hierarchical branching patterns.

“There must be a cause why snow has the shape of a six-cornered starlet”, Kepler wrote. “It cannot be chance. Why always six? The cause is not to be looked for in the material, for vapour is formless and flows, but in an agent.” This ‘agent’, he suspected, might be mechanical, namely the orderly stacking of frozen ‘globules’ that represent “the smallest natural unit of a liquid like water” – not explicitly atoms, but as good as. Here he was indebted to the English mathematician Thomas Harriot, who acted as navigator for Walter Raleigh’s voyages to the New World in 1584-5. Raleigh sought Harriot’s expert advice on the most efficient way to stack cannonballs on the ship’s deck, prompting the ingenious Harriot to theorize about the close-packing of spheres. Around 1606-8 he communicated his thoughts to Kepler, who returned to the issue in De nive sexangula. Kepler asserted that hexagonal packing “will be the tightest possible, so that in no other arrangement could more pellets be stuffed into the same container.” This assertion about maximal close-packing became known as Kepler’s conjecture, which was proved using computational methods only in 1998 (published in 2005) [1].
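
For the record (this gloss is mine, not Kepler’s), the density his conjecture singles out is that of hexagonal or face-centred cubic close packing:

```latex
% Maximum packing fraction of equal spheres (Kepler's conjecture, proved by Hales):
\phi_{\max} = \frac{\pi}{3\sqrt{2}} = \frac{\pi}{\sqrt{18}} \approx 0.7405
```

In other words, no arrangement of identical spheres can fill more than about 74 per cent of space.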

Less commonly acknowledged as a source of inspiration is the seventeenth-century enthusiasm for cabinets of curiosities (Wunderkammern), collections of rare and marvelous objects from nature and art that were presented as microcosms of the entire universe. Rudolf II had one of the most extensive cabinets, to which Kepler would have had privileged access. The forerunners of museum collections, the cabinets have rarely been recognized as having any real influence on the nascent experimental science of the age. But Kepler mentions in his booklet having seen in the palace of the Elector of Saxony in Dresden “a panel inlaid with silver ore, from which a dodecahedron, like a small hazelnut in size, projected to half its depth, as if in flower” – a showy example of the metalsmith’s craft which may have stimulated his thinking about how an emergent order gives crystals their facets.

Yet despite his innovative ideas, in the end Kepler is defeated by the snowflake’s ornate form and its flat, plate-like shape. He realizes that although the packing of spheres creates regular patterns, they are not necessarily hexagonal, let alone as ramified and ornamented as that of the snowflake. He is forced to fall back on Neoplatonic occult forces: God, he suggests, has imbued the water vapour with a “formative faculty” that guides its form. There is no apparent purpose to the flake’s shape, he observes: the “formative reason” must be purely aesthetic or frivolous, nature being “in the habit of playing with the passing moment.” That delightful image, which touches on the late Renaissance debate about nature’s autonomy, remains resonant today in questions about the adaptive value (or not) of some complex patterns and forms in biological growth [2]. Towards the end of his inconclusive tract Kepler offers an incomparably beautiful variant of ‘more research is needed’: “As I write it has again begun to snow, and more thickly than a moment ago. I have been busily examining the little flakes.”

Kepler’s failure to explain the baroque regularity of the snowflake is no disgrace, for not until the 1980s was this understood as a consequence of branching growth instabilities biased by the hexagonal crystal symmetry of ice [3]. In the meantime, Kepler’s vision of crystals as stackings of particles informed the eighteenth-century mineralogical theory of René Just Haüy, the basis of all crystallographic understanding today.

But the influence of Kepler’s booklet goes further. It was in homage that crystallographer Alan Mackay called his seminal 1981 paper on quasicrystals ‘De nive quinquangula’ [4]. Here, three years before the experimental work that won Dan Shechtman this year’s Nobel prize in chemistry, Mackay showed that a Penrose tiling could, if considered the basis of an atomic ‘quasi-lattice’, produce fivefold diffraction patterns. Quasicrystals showed up in metal alloys, not snow. But Mackay has indicated privately that it might indeed be possible to induce water molecules to pack this way, and quasicrystalline ice was recently reported in computer simulations of water confined between plates [5]. Whether it can furnish five-cornered snowflakes remains to be seen.

References
1. Hales, T. C. Ann. Math. 2nd ser. 162, 1065-1185 (2005).
2. Rothenberg, D. Survival of the Beautiful (Bloomsbury, New York, 2011).
3. Ben-Jacob, E., Goldenfeld, N., Langer, J. S. & Schön, G. Phys. Rev. Lett. 51, 1930-1932 (1983).
4. Mackay, A. L. Kristallografiya 26, 910-919 (1981); in English, Sov. Phys. Crystallogr. 26, 517-522 (1981).
5. Johnston, J. C., Kastelowitz, N. & Molinero, V. J. Chem. Phys. 133, 154516 (2010).

Reputations matter

Rather a lot of posts all at once, I fear. Here is the first, which I meant to put up earlier – last Saturday’s column in the Guardian.
_______________________________________________________________
Johannes Stark was a German physicist whose Nobel prize-winning discovery in 1913, the Stark effect (don’t ask), is still useful today. Just the sort of person, then, who you might expect to have scientific institutes or awards named after him.

The fact that there aren’t any is probably because Stark was a Nazi – a bitter and twisted anti-Semite who rejected relativity because Einstein was Jewish.

Scientists concur that, while your discovery should bear your name no matter how despicable (or just plain crazy) you are, you need a little virtue to be commemorated in other ways.

But how little? Everyone knows Isaac Newton was a grumpy and vindictive old sod, but that hardly seems reason to begrudge the naming of the Isaac Newton Institute for Mathematical Sciences in Cambridge. Yet when the Dutch Nobel laureate Peter Debye was accused in a 2006 book of collusion with the Nazis during his career in pre-war Germany, the Dutch government insisted that the Debye Institute at the University of Utrecht be renamed, and an annual Debye Prize awarded in his hometown of Maastricht was suspended.

Reputations matter, then. Two researchers have claimed this week to lay to rest the suggestion that Charles Darwin stole some of his ideas on natural selection from Alfred Russel Wallace, who sent Darwin a letter explaining his own theory in 1858. Darwin passed it on to other scientific authorities as Wallace requested, but it has been suggested that he first sat on it for weeks and revised his theory in the light of it.

No proper Darwin historian ever took that accusation seriously, not least because everything we know about Darwin’s character makes it highly implausible. But Wallace has admirers on the fringe who identify with his image of the wronged outsider and will stop at nothing to see him given priority. And knocking Darwin’s character is a favourite tactic of creationists for discrediting his science.

This isn’t the last word on that matter, not least because the dates of Wallace’s letter still aren’t airtight. Evolutionary geneticist Steve Jones has rightly said that “The real issue is the science and not who did it.” Oh, but we do care who did it. We do care if Einstein nicked his ideas from his first wife Mileva Maric (another silly notion), or if Gottfried Leibniz pilfered the calculus from Newton.

Partly we like the whiff of scandal. Partly we love seeing giants knocked off their pedestals. But in cases like Debye’s there are more profound questions. Debye finally left his physics institute in Berlin and moved to the US in 1940 because he refused to give up his Dutch citizenship and become German, as the Nazis demanded when they commandeered his institute for war research. Into the breach stepped Werner Heisenberg, among others, whose work on the nuclear programme still excites debate about whether or not he tried to make an atom bomb for Hitler.

After the war, Heisenberg encouraged the myth that he and his colleagues purposely delayed their research to deny Hitler such power. It’s more likely that they never in fact had to make the choice, since they weren’t given the resources of the Manhattan Project. In any event, Heisenberg began the war patriotically anticipating a quick victory. Yet he was never a Nazi, and today we have the Werner Heisenberg Institute and Prize.

Unlike Stark, Heisenberg and Debye weren’t terrible people – they behaved in the compromised, perhaps naïve way that most of us would in such circumstances. But engraving their names in stone and bronze creates difficulties. It forces us to make them unblemished icons, or conversely tempts us to demonize them. This rush to beatify brings down a weight of moral expectation that few of us could shoulder – even the deeply humane Einstein was no saint towards Maric. Why not give time more chance to weather and blur the images of great scientists, to produce enough distance for us to celebrate their achievements while overlooking their all-too-human foibles?

Wednesday, December 21, 2011

Happy Christmas to the Godless


This week I had the pleasure of taking part in one of Robin Ince’s Nine Lessons and Carols for Godless People at the Bloomsbury Theatre in London. Fending off the “I am not worthy” feeling amidst the likes of Simon Singh, Alexei Sayle and Mark Thomas, and knowing what a terrible idea it would be to try to make people laugh, I plucked a few things from my forthcoming book on curiosity, in particular Kepler’s treatise on snowflakes (on which, more shortly). But I couldn’t resist poking some fun at a few of the scientifically illiterate snowflakes we always get at Christmas, including the one above from dear Ed Miliband. I wanted to offer Ed a little get-out clause for his pentagonal snowflakes on the basis of quasicrystalline ice, but time did not permit.

Anyway, it’s a great show if you still have time to catch the last ones. I did a little interview for a podcast by New Humanist, which I mention mostly so that you can get a flavour of the other folk in the show.

Friday, December 16, 2011

Unweaving tangled relationships

Here’s the original text of my latest news story for Nature.
___________________________________________
A new statistical method discovers hidden correlations in complex data.

The American humorist Evan Esar once called statistics the science of producing unreliable facts from reliable figures. A new technique now promises to make those facts a whole lot more dependable.

Brothers David Reshef of the Broad Institute of MIT and Harvard in Cambridge, Massachusetts, Yakir Reshef of the Weizmann Institute of Science in Rehovot, Israel, and their coworkers have devised a method to extract, from complex sets of data, relationships and trends that are invisible to other types of statistical analysis. They describe their approach in a paper in Science today [1].

“This appears to be an outstanding achievement”, says statistician Douglas Simpson of the University of Illinois at Urbana-Champaign. “It opens up whole new avenues of inquiry.”

Here’s the basic problem. You’ve collected lots of data on some property of a system that could depend on many governing factors. To figure out what depends on what, you plot them on a graph.

If you’re lucky, you might find that this property changes in some simple way as a function of some other factor: for example, people’s health gets steadily better as their wealth increases. There are well known statistical methods for assessing how reliable such correlations are.

But what if there are many simultaneous dependencies in the data? If, say, people are also healthier if they drive less, which might not bear any obvious relation to their wealth (or might even be more prevalent among the less wealthy)? The conflict might leave both relationships hidden from traditional searches for correlations.

The problems can be far worse. Suppose you’re looking at how genes interact in an organism. The activity of one gene could be correlated with that of another, but there could be hundreds of such relationships all mixed together. To a cursory ‘eyeball’ inspection, the data might then just look like random noise.

“If you have a data set with 22 million relationships, the 500 relationships in there that you care about are effectively invisible to a human”, says Yakir Reshef.

And the relationships are all the harder to tease out if you don’t know what you’re looking for in the first place – if you have no a priori reason to suspect that this depends on that.

The new statistical method that Reshef and his colleagues have devised aims to crack precisely those problems. It can spot many superimposed correlations between variables and measure exactly how tight each relationship is, according to a quantity they call the maximal information coefficient (MIC).

A MIC of 1 implies that two variables are perfectly correlated, but possibly according to two or more simultaneous and perhaps opposing relationships: a straight line and a parabola, say. A MIC of zero indicates that there is no relationship between the variables.
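
To see why a new measure is wanted at all, here is a minimal illustrative sketch (mine, not the authors’ algorithm; the variable names are invented for the example). A relationship can be perfectly deterministic yet score essentially zero on the standard Pearson correlation coefficient, which is exactly the kind of dependency MIC is designed to flag:

```python
import numpy as np

# Illustrative sketch only: Pearson correlation catches a linear trend
# but misses a perfect, non-monotonic (parabolic) relationship.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)

y_linear = 2.0 * x + rng.normal(0.0, 0.1, x.size)  # noisy straight line
y_parabola = x ** 2                                 # noiseless parabola

r_linear = np.corrcoef(x, y_linear)[0, 1]
r_parabola = np.corrcoef(x, y_parabola)[0, 1]

print(f"Pearson r, straight line: {r_linear:+.3f}")   # close to +1
print(f"Pearson r, parabola:      {r_parabola:+.3f}")  # close to 0
```

On the Reshefs’ definition, by contrast, both relationships would receive a high MIC, because in each case one variable can be predicted almost completely from the other, whatever the shape of the dependence.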

To demonstrate the power of their technique, the researchers applied it to a diverse range of problems. In one case they looked at factors that influence people’s health globally in data collected by the World Health Organization. Here they were able to tease out superimposed trends – for example, how female obesity increases with income in the Pacific Islands, where it is considered a sign of status, while in the rest of the world there is no such link.

In another example, the researchers identified genes that were expressed periodically, but with differing cycle times, during the cell cycle of yeast. And they uncovered groups of human gut bacteria that proliferate or decline when diet is altered, finding that some bacteria are abundant precisely when others are not. Finally, they identified which performance factors for baseball players are most strongly correlated to their salaries.

Reshef cautions that finding statistical correlations is only the start of understanding. “At the end of the day you'll need an expert to tell you what your data mean”, he says. “But filtering out the junk in a data set in order to allow someone to explore it is often a task that doesn't require much context or specialized knowledge.”

He adds that “our hope is that this tool will be useful in just about any field that is amassing large amounts of data.” He points to genomics, proteomics, epidemiology, particle physics, sociology, neuroscience, earth and atmospheric science as just some of the scientific fields that are “saturated with data”.

Beyond this, the method should be valuable for ‘data mining’ in sports statistics, social media and economics. “I could imagine financial companies using tools like this to mine the vast amounts of data that they surely keep, or their being used to track patterns in news, societal memes, or cultural trends”, says Reshef.

One of the big remaining questions is about what causes what: the familiar mantra of statisticians is that “correlation does not imply causality”. People who floss their teeth live longer, but that doesn’t mean that flossing increases your lifespan.

“We see the issue of causality as a potential follow-up”, says Reshef. “Inferring causality is an immensely complicated problem, but has been well studied previously.”

Biostatistician Raya Khanin of the Memorial Sloan-Kettering Cancer Center in New York acknowledges the need for a technique like this but reserves judgement about whether we yet have the measure of MIC. “I’m not sure whether its performance is as good as and different from other measures”, she says.

For example, she questions the findings about the mutual exclusivity of some gut bacteria. “Having worked with this type of data, and judging from the figures, I'm quite certain that some basic correlation measures would have uncovered the same type of non-coexistence behavior,” she says.

Another bioinformatics specialist, Simon Rogers of the University of Glasgow in Scotland, also welcomes the method but cautions that the illustrative examples are preliminary at this stage. Of the yeast gene linkages, he says “one would have to do more evaluation to see if they are biologically significant.”


References
1. Reshef, D. N. et al. Science 334, 1518–1524 (2011).

Monday, December 12, 2011

Darwin not guilty: shock verdict

Here’s the pre-edited version of my latest news story for Nature. There’s somewhat more to it than can all be fitted in here, or indeed that I am at liberty to say. It seems that some may still find the authors’ reconstruction of the shipping route of Wallace’s letter open to question, even if they accept (as it seems all serious historians do) that the ‘conspiracy theory’ is bunk.

There was also more to Wallace’s letter to Hooker in September 1858 than I’ve quoted here. He said:
“I cannot but consider myself a favoured party in this matter, because it has hitherto been too much the practice in cases of this sort to impute all the merit to the first discoverer of a new fact or a new theory, & little or none to any other party who may, quite independently, have arrived at the same result a few years or a few hours later.
I also look upon it as a most fortunate circumstance that I had a short time ago commenced a correspondence with Mr. Darwin on the subject of “Varieties,” since it has led to the earlier publication of a portion of his researches & has secured to him a claim of priority which an independent publication either by myself or some other party might have injuriously affected, — for it is evident that the time has now arrived when these & similar views will be promulgated & must be fairly discussed.”

So whatever one thinks of the evidence put forward here, the notion that Darwin pilfered from Wallace really is a non-starter. Not that its advocates will take the slightest notice.
_____________________________________________________
Charles Darwin was not a plagiarist, according to two researchers who claim to have refuted the idea that he revised his own theory of evolution to fit in with that proposed in a letter Darwin received from the naturalist Alfred Russel Wallace.

This accusation has received little support from serious historians of Darwin’s life and work, who concur that Darwin and Wallace came up with the theory of evolution by natural selection independently at more or less the same time. But it has proved hard to dispel, thanks to some vociferous advocates of Wallace’s claim to primacy of the theory of evolution by natural selection.

The charge rests largely on a suggestion that in 1858 Darwin sat on a letter sent from Indonesia by Wallace, including an essay in which he described his ideas, for about two weeks before passing it on to the geologist Charles Lyell as Wallace requested.

After inspecting historical shipping records, John van Wyhe and Kees Rookmaaker, curators of the archives Darwin Online and Wallace Online and historians of science at the National University of Singapore, claim that Wallace’s letter and essay could not in fact have arrived sooner than 18 June, the very day that Darwin told Lyell he had received it [1].

Darwin had begun work on the text that became On the Origin of Species, published in 1859, as early as the 1840s, but had dallied over it. In his letter to Lyell he admitted rueing his own dilatoriness. “I never saw a more striking coincidence”, he said. “If Wallace has my M.S. sketch written out in 1842 he could not have made a better abstract.”

In the event – but not without misgivings about whether it was the honourable thing – Darwin followed the suggestion of Lyell and his friend Joseph Hooker that he write up his own views on evolution so that the papers could be presented side by side to the Linnean Society in London. This took place on 1 July, but Darwin wasn’t present, for he was still devastated by the death of his youngest son from scarlet fever three days earlier.

The controversy about attribution would probably have mystified both Darwin and Wallace, who remained mutually respectful throughout their lives. Darwin was even ready to relinquish all priority to the idea of natural selection after seeing Wallace’s essay, until Lyell and Hooker persuaded him otherwise. And in September 1858 Wallace wrote to Hooker that “It would have caused me such pain & regret had Mr. Darwin’s excess of generosity led him to make public my paper unaccompanied by his own much earlier & I doubt not much more complete views on the same subject.”

Although most historians have accepted that Darwin’s account of the events was honest, others have argued that Wallace’s letter, sent from the island of Ternate in the Moluccas, arrived at Darwin’s house in Down, in southern England, several weeks earlier than 18 June. They suggest that Darwin lied about the date of receipt because he used the intervening time to revise his own ideas in the light of Wallace’s.

The most extreme accusation came in a 2008 book The Darwin Conspiracy: Origins of a Scientific Crime by the former BBC documentary-maker Roy Davies. “Ideas contained in Wallace’s Ternate paper were plagiarised by Charles Darwin”, wrote Davies, who called this “a deliberate and iniquitous case of intellectual theft, deceit and lies.” Others have claimed that Darwin wrote to Hooker on 8 June saying that he had found a ‘missing keystone’ to his theory, and allege that he took this from Wallace’s essay.

“Many conspiracy theorists have made hay because of this unexplained date mystery”, says van Wyhe. He and Rookmaaker have now painstakingly retraced the tracks of the letter. They have discovered the sailing schedules of mail boats operated by Dutch firms in what was then the Dutch East Indies, and claim that these indicate the letter could not have left Ternate sooner than about 5 April. It was carried via Jakarta, Singapore and Sri Lanka, and then overland from Suez to Alexandria. “We found that Wallace’s essay travelled across Egypt on camels”, says van Wyhe. “That was not known before, and it’s a rather charming image to think of this essay that will change the world swaying on the back of a camel for two days.”

The researchers say that the letter was then passed on by boat to Gibraltar and Southampton in England, arriving on 16 June. It was taken by train to London and then on to Down to arrive on the morning of the 18th.

“I'm not sure there really ever has been a controversy over this within the history of science community”, says evolutionary biologist John Lynch of Arizona State University, who has written extensively on cultural responses to evolutionary theory. He says that the claims of plagiarism “have had marginal, if any, influence - the evidence has failed to convince most readers.”

The story “has always seemed unlikely to me given what we know about Darwin’s generally kind and tolerant personality”, agrees geneticist Steve Jones of University College, London, whose 1999 book Almost Like a Whale was an updated version of the Origin of Species.

But van Wyhe says that “these conspiracy stories are very widely believed. Thousands of people have heard that something fishy happened between Darwin and Wallace. I hear these stories very often when I give popular lectures.”

Historian of science James Lennox of the University of Pittsburgh says that “this is an important piece of evidence for Davies’ claim of deceit on Darwin’s part. I think that claim has been undermined.”

But Lennox adds that he doesn’t think it will close the ‘controversy’. “For a variety of different motives, there will, I fear, always be people who see it as their mission to attack Darwin's character as a way of undermining his remarkable scientific achievements.”

References


1. Van Wyhe, J. & Rookmaaker, K. Biol. J. Linn. Soc. 105, 249-252 (2012).

Saturday, December 10, 2011

Creativ thinking

Here’s my latest Critical Scientist column in the Guardian, published today. It now seems that this back page of the Saturday issue is going to be reshuffled for various reasons, so it isn’t clear what the column’s fate will be in the New Year. Enjoy/criticize/excoriate it while you can.
_______________________________________________________________________
The kind of idle pastime that might amuse physicists is to imagine drafting Einstein’s grant applications in 1905. “I propose to investigate the idea that light travels in little bits”, one might say. “I will explore the possibility that time slows down as things speed up” goes another. Imagine what comments those would have elicited from reviewers for the German Science Funding Agency, had such a thing existed. Instead, Einstein just did the work anyway while drawing his wages as a Technical Expert Third Class at the Bern Patent Office. And that’s how he invented quantum physics and relativity.

The moral seems to be that really innovative ideas don’t get funded – indeed, that the system is set up to exclude them. To wring research money from government agencies, you have to write a proposal that gets assessed by anonymous experts (“peer reviewers”). If its ambitions are too grand or its ideas too unconventional, there’s a strong chance it’ll be trashed. So does the money go only to ‘safe’ proposals that plod down well-trodden avenues, timidly advancing the frontiers of knowledge a few nanometres?

There’s some truth in the accusation that grant mechanisms favour mediocrity. After all, your proposal has to specify exactly what you’re going to achieve. But how can you know the results before you’ve done the experiments, unless you’re aiming to prove the bleeding obvious?

To address this complaint, the US National Science Foundation has recently announced a new scheme for awarding grants. From next year – if Congress approves – the Creative Research Awards for Transformative Interdisciplinary Ventures (CREATIV – oh, I get it) will have $24 million to give to “unusually creative high-risk/high-reward interdisciplinary proposals.” In other words, it’s looking for really new ideas that might not work, but which would be massive if they do.

As science funding goes, $24m is peanuts – the total NSF pot is $5.5 bn. And each application is limited to $1m. But this is just a pilot project; more might follow. The real point is that CREATIV has been created at all, because it could be interpreted as an admission of NSF’s failure to support innovation previously. Needless to say, that’s not how NSF would see it. They would argue that the usual funding mechanisms have blind spots, especially when it comes to supporting research that crosses disciplinary boundaries.

This is a notorious problem. Talking up the importance of “interdisciplinarity” is all the rage, but most funds are still marshaled into conventional boundaries – medicine, say, or particle physics – so that if you have an idea for how to apply particle physics to medicine, each agency directs your grant request to the other one.

The problem is all the worse if you want to tackle a really big problem. To make a new drug you need chemists; to tackle Africa’s AIDS epidemic you will require not only drugs but the expertise of epidemiologists, sociologists, virologists and much else. The buzzword for really big solutions and technologies is “transformative” – the Internet is transformative, Viagra is not. This big-picture thinking is in vogue; the European Commission’s Future and Emerging Technologies programme is promising to award €1 bn (now you’re talking) next year for transformational projects under the so-called Flagship Initiative.

Are schemes like CREATIV the way forward? Because the funding will be allocated by individual project managers rather than risking the conservatism of review panels, it could fall prey to cronyism. And who’s to say that those project managers will be any more broad-minded or perceptive? In the end, it’s a Gordian knot: only experts can properly assess proposals, but by definition their vision tends to be narrow. It’s good that CREATIV acknowledges the problem, but it remains to be seen if it’s a solution. Like movie-making or publishing, it’ll need to accept that there will be some duds. It’s a shame there aren’t more scientific problems that can be solved with pen, paper, and a patent clerk’s pay packet.

Saturday, December 03, 2011

Science criticism

My first of an undisclosed number of columns in the Saturday Guardian has appeared today. And got a shedload of online feedback.

I’m grateful for all these comments, good and bad (and indifferent), for giving me some sense of how the aims of this column are being perceived. It would be as premature for me to tell you what it is going to do at this point, as it is for anyone else to judge it. This is an experiment. We don’t know yet quite where it will go (that’s how it is with experiments, right?). No doubt feedback will have an influence on that. But I think I’d better make a few things more clear than I could in the piece itself:

1. This isn’t going to be a science-knocking column. Wouldn’t that be bizarre? Like appointing a theatre critic who hates theatre. (Someone, I am sure, will now come up with a few candidates for that description.) Theatre, art and literary critics almost inevitably think that theatre, art and literature are the most wonderful things: essential, inspiring, and deeply life-affirming. It is precisely caring strongly about their subject that constitutes a necessary (if not sufficient) qualification for the job. Well, ditto here.

2. I’m not going to be peer-reviewing anyone’s work. It’s interesting that some of the comments still seem to evince a notion that this is the full extent of the meaningful evaluation of a piece of scientific work. Look at what Dorothy Nelkin brought to the discussion about DNA and genetics – in my view, important questions that were pretty much off the radar screen of most scientists working on those things. Sadly, the Guardian hasn’t got Dorothy Nelkin, though – it’s got me. She would never have done it for this kind of money.

3. But it’s not necessarily about bringing scientists to task for what they do or don’t do or say – at least, not uniquely. I like the three definitions of “critic” in the Free Dictionary:
i. One who forms and expresses judgments of the merits, faults, value, or truth of a matter. [Mostly what peer reviewers are supposed to do, yes?]
ii. One who specializes especially professionally in the evaluation and appreciation of literary or artistic works: a film critic; a dance critic.
iii. One who tends to make harsh or carping judgments; a faultfinder. [Mostly bores and climate sceptics, yes?]

So (ii) then: I don’t see why it’s just ‘literary or artistic works’ that deserve ‘evaluation and appreciation’. Remember that critics praise as well as pillory (and in my view, the best ones always make an effort to find what is valuable in a work). The critic is also there to offer context, draw analogies and comparisons, point to predecessors. (The sceptic might here scoff “Oh yeah, very valuable in science – the predecessors of E=mc²?” To which my answer is here). I also feel that the best critics don’t try to tell you what to think, but just suggest things it might be worth thinking about.

4. Some of these folks will be disappointed – in particular, those who seem to think that the column is going to be concerned mainly with highlighting why science has lost its way, or ignores deep philosophical conundrums, or fails in its social duty. I really hope to be able to touch on some of those issues (that is, to consider whether they’re really true), and I have much sympathy with some of what Nicholas Maxwell has written. But my themes will generally be considerably less grand and more specific, perhaps even parochial. Weekly critics tend to review what’s just opened at the Royal Court, not the state of British theatre, right? Besides, it’s important that I’m realistic about what can be attempted (let alone achieved) in this format. Remember that this is a weekly column in a newspaper, not an academic thesis. I have 600 words, and then you get Lucy Mangan.

All we want to try for, really, is a somewhat different way of writing about science: not merely explaining who did what and why it will transform our lives (which of course it mostly doesn’t), but writing about science as something with its own internal social dynamics, methodological dilemmas, cultural pressures and drivers, and as something that reflects and is reflected by the broader culture. That’s what I have generally attempted to do in my books already. And I want to make it very clear that I don’t claim any great originality in taking this perspective. Many writers have done it before, and doubtless better. It’s just that there is rarely a chance to discuss science in this way in newspapers, where it is all too often given its own little geeks’ ghetto. Indeed, Ben Goldacre’s Bad Science was one of the first efforts that successfully broke that mould. What’s new(ish) is not the idea but the opportunity.

Friday, December 02, 2011

Diamond vibrations neither here nor there

Here’s the pre-edited version of my latest news story for Nature online.
_________________________________________________

Two objects big enough for the eye to see have been placed in a weirdly connected quantum state.

A pair of diamond crystals has been ‘spookily’ linked by quantum entanglement, researchers working in Oxford, Canada and Singapore report.

This means that vibrations detected in the crystals could not be meaningfully assigned to one or other of them: both crystals were simultaneously vibrating and not vibrating.

Quantum entanglement is well established between quantum particles such as atoms at ultra-cold temperatures. But like most quantum effects, it doesn’t usually survive either at room temperature or in objects large enough to see with the naked eye.

The team, led by Ian Walmsley of Oxford University, found a way to overcome both those limitations – demonstrating that the weird consequences of quantum theory don’t just apply at very small scales.

The result is “clever and convincing” according to Andrew Cleland, a specialist in the quantum behaviour of nanometre-scale objects at the University of California at Santa Barbara.

Entanglement was first mooted by Albert Einstein and two of his coworkers in 1935, ironically as an illustration of why quantum theory could not tell the whole story about the microscopic world.

Einstein considered two quantum particles that interact with each other so that their quantum states become interdependent. If the first particle is in state A, say, then the other must be in state B, and vice versa. The particles are then said to be entangled.

Until a measurement is made on one of the particles, its state is undetermined: it can be regarded as being in both states A and B simultaneously, known as a superposition. But a measurement ‘collapses’ this superposition into just one state or the other.

The trouble is, Einstein said, that if the particles are entangled then this measurement determines which state the other particle is in too – even if they have become separated by a vast distance. The effect of the measurement is transmitted instantaneously to the other particle, via what Einstein called ‘spooky action at a distance’. That can’t be right, he argued.
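
In symbols (a textbook illustration rather than anything taken from the new paper), such an entangled pair can be written as a superposition of the two possibilities:

```latex
% A maximally entangled (Bell-type) state of particles 1 and 2:
|\Psi\rangle = \tfrac{1}{\sqrt{2}}\left( |A\rangle_1 |B\rangle_2 + |B\rangle_1 |A\rangle_2 \right)
% Finding particle 1 in state A collapses the pair to |A>_1 |B>_2,
% instantly fixing the state of particle 2, however far away it is.
```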

But it is, as countless experiments have since shown. Quantum entanglement is not only real but could be useful. Entangled photons of light have been used to transmit information in a way that cannot be intercepted and read without that being detectable – a technique called quantum cryptography.

And entangled quantum states of atoms or light can be used in quantum computing, where the superposition states allow much more information to be encoded in them than in conventional two-state bits.

But superpositions and entanglement are usually seen as delicate states, easily disrupted by random atomic jostling in a warm environment. This scrambling also tends to happen very quickly if the quantum states contain many interacting particles – in other words, for larger objects.

Walmsley and colleagues got round this by entangling synchronized atomic vibrations called phonons in diamond. Phonons – wavelike motions of many atoms, rather like sound waves in air – occur in all solids. But in diamond, the stiffness of the atomic lattice means that the phonons have very high frequencies and energy, and are therefore not usually active even at room temperature.

The researchers used a laser pulse to stimulate phonon vibrations in two crystals 3 mm across and 15 cm apart. They say that each phonon involves the coherent vibration of about 10^16 atoms, corresponding to a region of the crystal about 0.05 mm wide and 0.25 mm long – large enough to see with the naked eye.

There are three crucial conditions for getting entangled phonons in the two diamonds. First, a phonon must be excited with just one photon from the laser’s stream of photons. Second, this photon must be sent through a ‘beam splitter’ which directs it into one crystal or the other. If the path isn’t detected, then the photon can be considered to go both ways at once: to be in a superposition of trajectories. The resulting phonon is then in an entangled superposition too.

“If we can’t tell from which diamond the photon came, then we can’t determine in which diamond the phonon resides”, Walmsley explains. “Hence the phonon is ‘shared’ between the two diamonds.”

The third condition is that the photon must not only excite a phonon but also have part of its energy converted into a lower-energy photon, called a Stokes photon, which signals the presence of the phonon.

“When we detect the Stokes photon we know we have created a phonon, but we can’t know even in principle in which diamond it now resides”, says Walmsley. “This is the entangled state, for which neither the statement ‘this diamond is vibrating’ nor ‘this diamond is not vibrating’ is true.”
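
Schematically (my notation, not the authors’), the shared phonon corresponds to a state in which ‘left diamond vibrating’ and ‘right diamond vibrating’ are superposed:

```latex
% One phonon shared between the left (L) and right (R) diamonds:
|\Psi\rangle = \tfrac{1}{\sqrt{2}}\left( |1\rangle_L |0\rangle_R + |0\rangle_L |1\rangle_R \right)
% |1> means one phonon present, |0> means none: neither crystal alone
% is definitely vibrating or definitely still until a measurement is made.
```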

To verify that the entangled state has been made, the researchers fire a second laser pulse into the two crystals to ‘read out’ the phonon, from which the pulse draws extra energy. All the necessary conditions are satisfied only very rarely during the experiment. “They have to perform an astronomical number of attempts to get a very finite number of desired outcomes”, says Cleland.

He doubts that there will be any immediate applications, partly because the entanglement is so short-lived. “I am not sure where this particular work will go from here”, he says. “I can’t think of a particular use for entanglement that lasts for only a few picoseconds [10^-12 s].”

But Walmsley is more optimistic. “Diamond could form the basis of a powerful technology for practical quantum information processing”, he says. “The optical properties of diamond make it ideal for producing tiny optical circuits on chips.”

1. K. C. Lee et al., Science 334, 1253-1256 (2011).

Thursday, December 01, 2011

Beautiful labs


Here is my latest Crucible column for the December issue of Chemistry World.
_________________________________________________________________

Fresh from visiting some science departments in China, I figure that, in appearance, these places don’t vary much the world over. They have the same pale corridors lined with boxy offices or neutral-hued, cluttered lab spaces; the same wood-clad lecture theatres with their raked seating and projection screens (few sliding blackboards now survive); the same posters of recent research lining the walls. They are unambiguously work places: functional, undemonstrative, bland.

Yet people spend their working lives here, day after day and sometimes night after night. Doesn’t all this functionalist severity and gloom stifle creativity? Clearly it needn’t, but increasingly we seem to suspect that conducive surroundings can offer stimulus to the advancement of knowledge. When the Wellcome Wing of the biochemistry department at Cambridge was designed and built in the early 1960s, its rectilinear modernist simplicity realised in concrete and glass was merely the order of the day, and celebrated by some (notably the influential architectural critic Nikolaus Pevsner) for its precision [1]. Today, stained and weathered, it fares less well, engendering that feeling I get from my old copy of Cotton & Wilkinson that learning chemistry is a dour affair.

Yet no longer are labs and scientific institutions built just to place walls around the benches and fume cupboards. Increasingly, for example, their design takes account of how best to encourage researchers to engage in informal discussions over coffee: comfy seating, daylight and blackboards are supplied to lubricate the exchanges. The notion that all serious work has to take place out of sight behind closed doors has yielded to the advent of open atria and glass walls, exemplified by the new biochemistry laboratory at Oxford, designed by Hawkins/Brown, which opened three years ago at a cost of nearly £50m. Not only does this space take its cue from the open-plan office, but it also follows the corporate habit of adorning the interior with expensive artworks, such as the flock of resin birds that hang suspended in the atrium. Some might grumble that the likes of Hans Krebs and Dorothy Hodgkin did not seem to need art around them to think big thoughts – but the department’s Mark Sansom has eloquently defended the value of the project’s artistic component thus: “if you have a greater degree of visual literacy, you reflect more on both the way you represent things, and also the way that may limit the way you think about them” [2]. Besides, where would you rather work?

The watchword for this new approach to laboratory design is accessibility: physically, visually, intellectually. Jonathan Hodgkin in Oxford’s biochemistry department explains that, in making art a part of the new building’s design, “part of our aim is to humanize the image of science for the public" [2]. Similarly, Terry Farrell, who was behind the dramatic (and controversial) redesign of the Royal Institution in London, says that his aim was to reconfigure the place “not as a museum but as a living, working, lively and engaging institution, which will inspire an enthusiasm for science in future generations” [3]. Even someone like me who loved the dusty, crammed warren that was the old RI has to admire the result, although the compromises to the research lab space contributed to the internal tensions of the project.

Or take the striking glass facades of the new Frick Chemistry Laboratory at Princeton, whose chief architect Michael Hopkins says that "We wanted to inspire the younger students by letting them see the workings of the department.” A common theme is to use the design to echo the science, as for example in the double-helical staircase of the European Molecular Biology Laboratory’s Advanced Training Centre in Heidelberg.

However, not all beautiful labs are new ones, a point illustrated in a recent list of “the 10 most beautiful college science labs” compiled by the US-based OnlineColleges.net. While some of these have been selected for their sleek contemporary feel – the Frick building is one, and the stunning new Physical Sciences Building that abuts Cornell’s neoclassical Baker Laboratory is another – others are more venerable. Who could quibble, for example, with the inclusion of Harvard’s Mallinckrodt Laboratory, an imposing neoclassical edifice built in the 1920s and home to the chemistry department? And then there is the Victorian gothic of the ‘Abbot’s kitchen’ in the Oxford inorganic chemistry labs, a delightful feature that I shamefully overlooked on many a tramp along South Parks Road to coax more crystals out of solution. Or the ivy-coated mock-gothic of Chicago’s Ryerson Physical Laboratory, where Robert Millikan measured the electron’s charge.

Among these ‘beautiful labs’, chemistry seems to be represented disproportionately. Have we chemists perhaps a stronger aesthetic sensibility?

References
1. M. Kemp, Nature 395, 849 (1998).
2. G. Ferry, Nature 457, 541 (2009).
3. T. Farrell, Interiors and the Legacy of Postmodernism (Laurence King, London, 2011).