Category Archive • Science
December 21, 2004
David Tebbutt on blogging: "It's ridiculously like how the brain appears to work"

I am surely going to have more to say here about this gentleman, but for the moment, read this fascinating little aborted tangent, so to speak:

What I have discovered (hence all the links above) is that the intensely networked or linked nature of blogs is what gives them massive value – way beyond that of the internet generally. It's a case of following trusted chains – if A links to B and A is trustworthy, then B is likely to be trustworthy too.

Within an organisation, this is even more likely to be true. The trustworthy people will gain connections and the less trustworthy will be sidelined. It's ridiculously like how the brain appears to work. But I won't go there. Suffice it to say that I've spent a lot of my life thinking about things like this.

I really hope that he does "go there", some time Real Soon Now.
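In the meantime, here is a minimal sketch of that trusted-chains idea – in Python, with blog names, trust scores and a decay factor that are entirely my own inventions rather than anything Tebbutt has written – just to show how little machinery the intuition actually needs.

    # Toy trust propagation over a link graph: trust decays by half per hop,
    # so blogs linked to by trusted blogs inherit some of that trust.
    links = {
        "A": ["B", "C"],   # A links to B and C
        "B": ["C"],
        "C": [],
    }
    trust = {"A": 1.0, "B": 0.0, "C": 0.0}   # we start by trusting only A
    DECAY = 0.5                              # each hop is worth half as much

    for _ in range(5):                       # a few rounds of propagation
        updated = dict(trust)
        for source, targets in links.items():
            for target in targets:
                updated[target] = max(updated[target], trust[source] * DECAY)
        trust = updated

    print(trust)   # {'A': 1.0, 'B': 0.5, 'C': 0.5}

Whether real brains, or real blogrolls, work anything like this is exactly the question I hope he goes on to write about.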

Posted by Brian Micklethwait at 03:25 PM
November 12, 2004
Michael Jennings on scientists getting the credit they deserve

My latest CNE Intellectual Property piece is up. It was triggered by the student who is suing Ground Zero architect David Childs for allegedly nicking one of his student designs to use for the big tower at the heart of the scheme. I then talked about academic idea-stealing in other fields, especially science.

This article, linked to today by A&LD, discusses how the expansion of science may have lowered its ethical standards, a matter also touched on by Michael Jennings (recent picture of him there), in the following email which he sent me in response to my CNE piece:

Whilst academia is indeed full of asymmetric relationships in which more senior academics gain credit for the work of younger people, most scientific fields are small enough, and the participants meet each other at conferences and talk to one another often enough, that in the case of any important work everybody knows who actually did it, regardless of whose name is on the paper. In practice, it is usually a case of figuring out which of the multiple people whose names are on the paper actually did the work. Maybe this is changing as academia gets bigger and more corporate, but I am not so sure. For one thing, scientific research responds to this by breaking up into more and more fields with a relatively small number of individuals in them, and I think this is unlike architecture. It varies from field to field though. Some fields consist of large laboratories with hundreds of people, but most are as I describe.

And the most asymmetric relationship that exists in scientific academia is that between a supervisor/adviser and a PhD student. In most circumstances a supervisor has a de facto veto over whether a student gets a PhD. This can lead to abuses of various kinds, and also to somewhat weird human relationships. Nothing bad happened to me personally in this regard, but I have seen one or two slightly dubious things happen to other people.

Rather amusingly, if you have done a PhD, science fiction writer Vernor Vinge – a former mathematics professor himself – wrote a story a year or so back about a professor who has a virtual reality simulation of one of his students created. The simulated student is told that he has to get a great deal of work done in the next year or he will not be allowed to get his PhD, and the professor runs the simulation over and over again to get this hypermotivated student to do a near infinite amount of work for him.

And as for your final comment about someone suing Nobel Laureates, the interesting issue is that in the sciences the Nobel Prize committees have credibility, and scientific Nobel Prizes are considered such a great honour at least partly because they are seen to have almost invariably been given to the right people, and that means the committee goes to great trouble to see that they are given to the people who actually did the work. In particularly controversial circumstances, there have been a number of incidents where people have not received the Nobel prize until decades after they did the original work, and where the prize was awarded within a year or two of the death of the more senior academic who laid claim to the work. More senior academics are usually older, so waiting for the wrong person to die before giving the award to the right person is a workable strategy.

One thing that comes into play here is that there is no limit on the number of authors that may appear on a paper published in most journals, whereas a Nobel prize in the sciences is never shared by more than three people, which means that if the wrong people are awarded a Nobel prize, the right ones usually miss out. Even within this constraint, though, simply giving the prize to the three people whose names are on the paper is never done. In such circumstances the prize tends to be shared between people doing work in the same or closely related fields at different universities/laboratories rather than between people who worked together.

You can actually tell certain things about who did what by the way in which the prize money is split in a three way award. If the three recipients each get a third of the money, this means either that the three of them did related but separate pieces of work, or that the three of them were involved in doing the same piece of work (either as collaborators or (more often) by coming up with the same results independently). If one of the recipients gets 50% of the money and the others 25% each, then this means that the one who got 50% did a separate but related piece of work to the other two, who were involved in doing the same work, either together or independently.
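That splitting rule is mechanical enough to write down. Here is a minimal sketch of it in Python; the function name and the paraphrased readings are mine, not Michael's, and real Nobel committees obviously publish no such decision procedure.

    # Toy reading of the prize-money splits described above.
    from fractions import Fraction

    def read_split(shares):
        shares = sorted((Fraction(s) for s in shares), reverse=True)
        if shares == [Fraction(1, 3)] * 3:
            return ("Three related but separate pieces of work, or three people "
                    "involved in the same work, together or independently.")
        if shares == [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]:
            return ("The half-share recipient did separate but related work; "
                    "the two quarter-share recipients were involved in the same work.")
        return "No reading suggested for this split."

    print(read_split(["1/2", "1/4", "1/4"]))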

Posted by Brian Micklethwait at 12:33 PM
July 15, 2004
Scientists with model

Another "culture means what I say it means" posting.

While concocting a posting (it will appear tomorrow – link here when it does) for the blog that pays me, I came across this famous photograph, in a particularly clear version (often it is very blurred), here.

I put in my posting that how Crick and Watson communicated their DNA idea didn't matter. It was enough that they got it across somehow.

But I wonder. There is something very beautiful about a helix, and all the more so when the elements that go to make it are complicated and cloudy and confused. The essential helicalness of the combined object is then all the more remarkable. Complexity leading to simplicity, blah blah. I wonder how well Crick and Watson would have done with such primitive modelling technology had the shape they were chasing been less simple and elegant. Try googling for images of "protein". See what I mean?

Well, I don't know. This is really just an excuse to stick up that picture.

Has anyone redone this, and redone it better, as an oil painting? It would make sense if they did. Here is a clue to what that might look like.

Posted by Brian Micklethwait at 03:47 AM
June 28, 2004
The poetry of cloud classification

I continue to enjoy this book by the great Bill Bryson. Here is another snatch from it, about one of the key figures in the history of an activity I greatly admire:


The person most frequently identified as the father of modern meteorology was an English pharmacist named Luke Howard, who came to prominence at the beginning of the nineteenth century. Howard is chiefly remembered now for giving cloud types their names in 1803. …

Howard divided clouds into three groups: stratus for the layered clouds, cumulus for the fluffy ones (the word means heaped in Latin) and cirrus (meaning curled) for the high, thin feathery formations that generally presage colder weather. To these he subsequently added a fourth term, nimbus (from the Latin for cloud), for a rain cloud. The beauty of Howard's system was that the basic components could be freely recombined to describe every shape and size of passing cloud - stratocumulus, cirrostratus, cumulo-nimbus, and so on. It was an immediate hit, and not just in England. Goethe was so taken with the system that he dedicated four poems to Howard.

And how about this?

Howard's system has been much added to over the years, so much so that the encyclopedic if little-read International Cloud Atlas runs to two volumes, but interestingly virtually all the post-Howard cloud types - mammatus, pileus, nebulosis, spissatus, floccus and mediocris are a sampling - have never caught on with anyone outside meteorology and not terribly much within it, I'm told. Incidentally, the first, much thinner edition of that atlas, produced in 1896, divided clouds into ten basic types, of which the plumpest and most cushiony-looking was number nine, cumulo-nimbus. That seems to have been the source of the expression 'to be on cloud nine'.

Howard is not forgotten.
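Howard's scheme is, at bottom, combinatorial: four basic Latin terms, freely recombined. Here is a toy Python sketch of that recombination – the combining forms and the simple two-term pairing rule are my own simplification, not anything from Bryson or from Howard's actual nomenclature.

    # Generate Howard-style compound cloud names from the four basic terms.
    from itertools import permutations

    combining = {"stratus": "strato", "cumulus": "cumulo",
                 "cirrus": "cirro", "nimbus": "nimbo"}

    compounds = sorted(combining[first] + second
                       for first, second in permutations(combining, 2))
    print(compounds)   # includes 'cirrostratus', 'cumulonimbus', 'stratocumulus', ...

Naturally it also spits out a few combinations the meteorologists never adopted.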

Posted by Brian Micklethwait at 01:36 PM
June 17, 2004
Bill Bryson on the History of Nearly Everything – including Max Planck and J. Willard Gibbs

A Short History of Nearly Everything, which is Bill Bryson's latest book, needs no plug from a mere blogger like me. But I am going to plug it anyway. I am in the middle of reading it, and am enjoying it hugely.

I don't just admire the book itself; I further hugely admire Bill Bryson's decision to write it in the first place. This decision now makes perfect sense, but when Bryson first embarked on this project he must have felt that he was taking quite a risk.

After all, this man is not a science writer; he's a travel writer.

But think about it for a moment. How do you now write a book about the history of science? Anyone attempting this faces huge problems. There is just so damn much of it, for starters. And then, there is the problem that however hard you try to explain it all, there are great chunks of it that just won't make sense to most people, no matter what you say, and no matter how perfectly you may understand it all yourself.

Above all, books that attempt to popularise science can be deadly dull. All those abstractions. All those fancy semi- or in-comprehensible ideas and diagrams and graphs and long, long words. How do you keep your reader's attention?

When you think about it, an expert travel writer is the ideal person to tackle all these problems. Rabbiting on about foreign countries you have visited is, notoriously, a habit which can empty rooms which were at first jam-packed with your most devoted friends. Successful travel writers have mastered the subtle art of not being travel bores, because they had to, to even get published. They know how to sprinkle engaging anecdotes into their narrative. They know how to enliven the journey with intriguing little trips down entertaining byways. They know how to keep you interested and amused and diverted.

Faced with the fact that they don't fully understand the place that they are writing about, they don't panic and treat this as a scandalous anomaly. On the contrary, they expect to be somewhat confused, and to be able to tell only some of the story. The Innocent Abroad is just the guy to tell you as much as you are ever going to get about General Relativity or the nuances of the Big Bang, and such a guide to the territory can supply further insights into the nature of science that a more seasoned observer of science might be too close to observe. So Bryson is the ideal man for this job.

It also helps enormously that the job is just what Bryson himself has been needing.

Frankly, Bryson's books have been (a) that fantastically, superbly, insanely great book about small town America, called The Lost Continent (what was insanely great about it for me was probably that it was the first Bryson book I read), and then (b) several other travel books which are pretty great but not quite as great as The Lost Continent. Oh yes, and (c) there was that (those?) quite good book(s?) about the history of language. All good stuff. But, frankly, the travel books in particular were becoming something of a stale formula. With, as I say, the wisdom of hindsight, we can see that in Science, Bryson has at last discovered a new continent worthy of his whole attention. With Science, the Bryson formula is renewed and reinvigorated.

Risk? New, unexplored territory? A step into the unknown and a stab in the dark? This is exactly what Bryson has actually been missing for some time.

On the front cover, John Waller of the Guardian says that this book is a great "rough guide" to science, which shows that Bryson's publishers also entirely understand the travel guide nature of his achievement, and the appropriateness of this way of tackling the subject.

So anyway, here's a chunk from A Short History of Nearly Everything that struck me as especially entertaining.

In 1875, when a young German in Kiel named Max Planck was deciding whether to devote his life to mathematics or to physics, he was urged most heartily not to choose physics because the breakthroughs had all been made there. The coming century, he was assured, would be one of consolidation and refinement, not revolution. Planck didn't listen. He studied theoretical physics and threw himself body and soul into work on entropy, a process at the heart of thermodynamics, which seemed to hold much promise for an ambitious young man. In 1891 he produced his results and learned to his dismay that the important work on entropy had in fact been done already, in this instance by a retiring scholar at Yale University named J. Willard Gibbs.

Gibbs is perhaps the most brilliant person most people have never heard of. Modest to the point of near-invisibility, he passed virtually the whole of his life, apart from three years spent studying in Europe, within a three-block area bounded by his house and the Yale campus in New Haven, Connecticut. For his first ten years at Yale he didn't even bother to draw a salary. (He had independent means.) From 1871, when he joined the university as a professor, to his death in 1903, his courses attracted an average of slightly over one student a semester. His written work was difficult to follow and employed a private form of notation that many found incomprehensible. But buried among his arcane formulations were insights of the loftiest brilliance.

In 1875-8, Gibbs produced a series of papers, collectively titled On the Equilibrium of Heterogeneous Substances, which dazzlingly elucidated the thermodynamic principles of, well, nearly everything – 'gases, mixtures, surfaces, solids, phase changes ... chemical reactions, electrochemical cells, sedimentation, and osmosis', to quote William H. Cropper. In essence, what Gibbs did was show that thermodynamics didn't apply simply to heat and energy at the sort of large and noisy scale of the steam engine, but was also present and influential at the atomic level of chemical reactions. Gibbs's Equilibrium has been called 'the Principia of thermodynamics', but for reasons that defy speculation Gibbs chose to publish these landmark observations in the Transactions of the Connecticut Academy of Arts and Sciences, a journal that managed to be obscure even in Connecticut, which is why Planck did not hear of him until too late.

I can't resist adding this next bit, about Planck, which is only a footnote. I have read a lot of scientific popularisations over the years, but this was totally new to me:

Planck was often unlucky in life. His beloved first wife died early, in 1909, and the younger of his two sons was killed in the First World War. He also had twin daughters whom he adored. One died giving birth. The surviving twin went to look after the baby and fell in love with her sister's husband. They married and two years later she died in childbirth. In 1944, when Planck was eighty-five, an Allied bomb fell on his house and he lost everything – papers, diaries, a lifetime of accumulations. The following year his surviving son was caught in a conspiracy to assassinate Hitler and executed.

How's that for bad luck?

It helps greatly that you really feel that Bryson has done his preparatory reading really thoroughly, and that all these out-of-the-way facts that he has dug up really are facts. I certainly haven't spotted any wrong notes so far.

I've only begun to enumerate the many virtues of this book, which is only appropriate since I have only begun to read it. As I read on, I will doubtless have further praise to heap upon it.

But now that Bryson has finished this book, what next? How about a history of art? Maybe that's a bit too obvious a follow up. But I would love to read it, as I'm sure would millions of others.

Posted by Brian Micklethwait at 11:49 PM
June 07, 2004
A little cat-blogging – Stephen Budiansky on The Character of Cats

There is something of a tradition in the blogosphere of writing about cats. Does their individualism and their take-it-or-leave-it psychological self-sufficiency appeal to the blogging mind? Possibly.

My own motive for wanting to learn more about cats is that not so long ago I finally learned a bit about dogs, from my sister and her husband, who live out in the wondrous wilds of West Wales. I've always liked cats, and we always had a cat in our home when I was young. (Too bad my current home is not cat-appropriate.) Also, I get along with and feel that I understand cats, or like to think that I do, although I'm sure I have much to learn about them too.

And I think I have now found an excellent cat book to enable me to do this, namely The Character of Cats by Stephen Budiansky. (As is my usual habit, I got it for next to nothing in a London remainder shop.)

Happily, this is a book based on science, and not just on down-market chatty anecdote. Books like this, which present the latest scientific evidence and thinking about this or that topic in an accessible yet non-patronising way, are, in my opinion, one of the great unsung glories of our culture. The writing of the best of these books is as good as writing gets, yet there are none of those self-consciously arty affectations which (if my prejudices are anything to go by) disfigure so much of contemporary 'literature'.

This, I think, is because these writers really do have important and interesting stories to tell. For all its faults and disappointments, our civilisation shows no sign of flagging scientifically, with new dramas and excitements being uncovered year by year. Nor is our culture flagging technologically, and new science owes a lot to new technology (for example the computer technology that is now being used to unscramble DNA), as new science always has.

Anyway, here are the first few pages of The Character of Cats. The opening paragraph is especially fine, I think. So did the publishers, because, along with the customary eulogies from other writers, they put it on the back cover of the book.

There are no search-and-rescue cats, guard cats, Seeing Eye cats, bomb-detecting cats, drug-sniffing cats, escaped-convict-tracking cats, sheep cats, sled cats, gun cats, obedience-trained cats, Frisbee-catching cats, or slipper-fetching cats. This is a matter of considerable relief. To tell the scientific story of dogs is to risk bringing down the wrath of legions of myth-soaked animal lovers, saturated as they are with tales of canine derring-do, loyalty, and "unconditional love," whatever that means. No one has any illusions about cats. Cats are cats, and any real cat owner knows it. That constant fraction of the human race that stalwartly admires and enjoys the company of cats long ago realized that they had better accept cats on their own terms, for the cats would have it no other way.

Dog science, inevitably, is about shattering myths. Cat science, rather more happily, is about explaining mysteries.

Of mysteries there is no shortage. Cats, with their shining eyes and silent footfalls, have always eluded explanation. Throughout the several thousand years of shared history between cats and human beings, cats have been a source of wonder and unease, reverence and superstition. Needless to say, given that man in his natural state is a simple and impressionable being, a certain amount of this mystification is the product of nothing more than man's own overworked imagination. Primitive peoples who lacked cats were perfectly capable of finding mystery and magic in rocks, trees, blades of grass, and cargo pallets dropped from Allied bombers.

But, in fact, cats really are mysterious. The ambivalent and superstitious emotions that the cat has evoked over the centuries mirror well the ambivalent and paradoxical place the cat truly occupies in nature and in the world of humans. Cats defy most of the normal rules about how and why animals came to enter the company of humans. The behavior of the cat in its association with human society is extraordinarily varied and complex: adaptive and perverse, affectionate and wary, gregarious and reclusive, dependent and aloof. The intelligence of the cat is an amalgam of extremes, of hard-wired instinct and adaptive learning. Cats have spread over the world in the company of man faster than man himself ever did, all the while keeping one foot in the jungle. Cats are the least tamed and the most successful of domestic species, the least altered within but the most changed in circumstance without.

So these mysteries are real – they are the product of nature, not merely our superstitious or ignorant imaginations – but even so they are our own doing in a way, because until recently science has ignored cats. The domestic cat's wild counterpart, the European, African, and Asiatic wildcat Felis silvestris, is among the least studied of wild felines. It is a small, elusive, mostly forest-dwelling animal, and scientists were not able to find out much about the behavior, ecology, and genetics of small, elusive felines until the tools of molecular genetics and radiotelemetry lately began to change things. There has been a degree of scientific snobbery at work, too. Real wildlife biologists don't study pussycats. They don their safari jackets, clamber aboard their Land Rovers, and plunge down some rough and foreboding dirt track in dangerous pursuit of lions and tigers and bears (oh my). The flawed but longstanding belief held by many zoologists and ethologists that domestic animals are all just a bunch of sappy degenerates unworthy of serious scientific scrutiny has not helped, either. So the kind of insights that only science can offer – to help us understand why cats do the things they do, how they perceive their universe, and how they came to share, with such remarkable success, our homes and lives and hearts – has been notably absent from the considerable literature of the cat.

On the other hand, domestic cats did figure prominently in early studies of intelligence and learning and psychology, largely because they were so readily available and so cooperative. And in part because of that foundation there has been a great deal of new research on cognition and brain perception and the neurochemistry of emotion involving cats in recent years. A newfound recognition that all domestic animals represent a vibrant evolutionary story of adaptation and change has also brought newfound and well-deserved attention to the cat from evolutionary biologists and conservation biologists. And perhaps most of all, there is a cadre of basic research scientists today in fields ranging from the neuroscience of vision to molecular genetics who simply like cats, and who are eager to apply the tools of their trades to understanding what makes them tick. It doesn't hurt that many genetic diseases in humans, including hemophilia, diabetes, and Tay-Sachs disease, also occur in cats and that more than twenty-five genes responsible for such inborn diseases have been found in cats. That gives cat genome research a practical payoff, of interest to the powers that be that dispense research grants. In doing this very practical medical research, however, a lot of other stuff comes tumbling out, for the genome of a species is not just a catalogue of ailments or even a blueprint for an organism but also a recorded history of that species, of its travels and fate over space and time.

Cat science is the biography of a species. It is an exploration of where cats came from and how they flourished in the company of man, how they changed and how they stayed the same; it is about their wants and needs, their thoughts and urges, their rationality and their perversity, their group mores and their individual distinctiveness. Like any good biography, it is a tale worth reading for its own sake, but it is also a story with a moral: Cats are not so much pets as fellow travelers, and we impose our hopes and wishes and expectations upon them to our peril. They have their own biological niche and destiny, their own rules of social interaction, their own ways of ordering and perceiving the world. Their astonishing adaptability has found them a place with us, but that one foot is ever in the jungle. Understanding the true nature of cats, with all that science has to offer, is enlightening to us, and good for cats.

Posted by Brian Micklethwait at 02:02 PM
May 25, 2004
Scott Hillis on John C Wright's The Golden Age

This email has arrived at Samizdata HQ and been sent around to all the Samizdatistas. In case it falls between the cracks, with each of us assuming that someone else will flag it up, I take the liberty of reproducing the email here in its entirety, since it concerns a matter where the interests of Samizdata and Brian's Culture Blog overlap. If there is no response to this email at Samizdata I will perhaps (I know from bitter experience that I am bad at keeping promises so I do not promise this) do a shorter posting there myself.

I have not myself read the book about which Scott Hillis writes so enthusiastically.

Greetings Samizdatistas!

This is to call your attention to a science fiction novel that I believe is one of the most important pieces of libertarian fiction in recent memory.

The book is The Golden Age by John C. Wright. It was published in 2002 and won critical praise for its flowery revival of the romantic space opera. It is one of the finest works of science fiction I have read in at least a decade.

I am writing you because I found no reference to it while searching the Samizdata site (of which I am a regular reader). Please accept my apologies if this perception is mistaken.

Mr. Wright was schooled in classics from Homer to The Federalist Papers, and his erudition shines through on every page. Characters are named after personages from ancient myth. He appears equally passionate about scientific realism. While the book paints incredible advances in computing and nano-scale technology, there are no warp drives or blatant breaking of the known laws of physics.

Yet his scenarios and inventions are so fantastic, so wonderfully fresh and well-crafted, as to send the mind reeling.

All this would be enough to recommend the book on its own, but I believe the book's philosophical merits will make it of particular interest to Samizdata's contributors and readers. In interviews, Mr. Wright states outright that he created his future society to be a libertarian utopia. In fact, he wrote it partly as an explicit rebuttal to certain portrayals of communist utopias.

Of course, there is not much drama in an actual utopia, and the central conflict in the novel arises out of the desire of one man to upset the conventions of his prosperous society in pursuit of a magnificent vision. In one interview, Mr. Wright named his target audience when he said, "I am certainly writing for those who believe in the American dream."

The book is not long, but it took me more than a week to finish it simply because it is so dense. Every page is packed with meaning, and I found myself rereading passages over and over to extract their full meaning. No words are wasted, and readers are rewarded for paying attention to details like names, titles and descriptions of the various factions and elements in Mr. Wright's fabulous future society.

Here are two amazing and revealing interviews with Mr. Wright. I challenge any Samizdata sci-fi fan to read these exchanges and NOT immediately rush to read "The Golden Age" (and its sequel, "The Phoenix Ascendant").

For the record, I have no association with Mr. Wright, his publisher or any of that. I am simply a long-time Samizdata reader who has been deeply affected by a remarkable work of science fiction, and hope the word can be spread.

Thank you for your attention.

Scott Hillis - Beijing - China

Posted by Brian Micklethwait at 03:03 PM
April 14, 2004
Scientists fighting for The Truth on the telly

Yesterday evening I watched two television plays of a very similar sort, which often seems to happen on TV. One channel puts on a Clint Eastwood movie, and to cut into that audience another channel puts on another Clint movie, often at the exact same time. Most aggravating, if you're a Clint fan, which I often am.

BBC2 TV last night showed Hawking, and then later BBC4 TV showed Life Story. But this time the BBC was cooperating with itself, because after Hawking on BBC2 they had another little show about Hawking's work on BBC4, just before they then showed Life Story also on BBC4. There was no Clint style clash this time.

Hawking was about Stephen Hawking, and Life Story was about the cracking of DNA by Watson and Crick. I saw Life Story when it was first shown ages ago (1986?), but like everyone else watching it, I was watching Hawking for the first time.

The trouble with plays about science of this exalted sort is that someone like me has only a very dim idea of what is being talked about by all those brainy people, so I was agreeably surprised by how much incidental information I did manage to gather up, not just about the personalities involved, but about some of the actual key concepts.

Both types of information were very welcome. For example, I have never until now known just where Roger Penrose fits into the larger scientific scheme of things. Penrose: brainbox. That was about the limit of my knowledge of this man and his works. Now I learn that he was the first bloke to propose the existence of Black Holes. And as for Hawking …

Until now I have always been deeply suspicious of the cult of Hawking, suspecting that, had he not been so photogenically crippled and obliged to talk with a machine jammed against his emaciated throat, we would pay him no attention at all. But now I learn that Hawking actually has contributed something of scientific substance to the ongoing debate about what The Universe consists of. By applying Penrose's Black Hole notion to The Entire Universe, while reversing the direction of its occurrence, he has turned a relatively small planets-disappearing-down-a-local-plughole act into The Entire Universe starting out from a single point in a huge explosion. A Big Bang, that is to say. Okay, I am hazy about the proof of all that. I could not cover a blackboard with mathematical equations which meaningfully allude to all this. But, very roughly, I get it. Since I expected to get exactly nothing when I started watching Hawking, that was a real plus.

I now actually want to read this.

To put it another way, I stopped feeling sorry for Hawking and started feeling appropriately envious. He is not the physically ruined object of an idiot modern celeb-cult, or not only that. He actually did get his trembling hands onto a major piece of The Truth, the jammy bastard. His grin of self-satisfaction and self-congratulation as he staggered off into his own version of the sunset – his unthreatened mind trapped inside his ever more unreliable body – was really something to see, and a triumph for all concerned.

And nor did I know that Fred Hoyle was famous for disagreeing with all this Big Bang stuff.

Oh, I sort of knew, in the sense of having read it somewhere, and having then forgotten it. And no doubt I will forget all this stuff again very soon. After all, knowing what Penrose and Hawking and Hoyle all said is of no direct importance to me. I won't have to remember any of this, so presumably I won't. Nevertheless, acquainting oneself with the mysteries of cosmology, which have (and here I completely agree with Hoyle's ferocious atheism and despise the deluded religiosity of Mrs Hawking) now entirely replaced the mysteries of the Christian version of cosmology, is something that all educated people should do from time to time.

Personally, I now think that cosmology is an excuse for more total rubbish than any other ology around, my favourite "you have got to be kidding" piece of "science" these days being all that malarkey that says that there are lots of different multidimensional universes fanning out in all directions from every single moment in time and space, or whatever the hell it says. Now to me that is just these people saying, in highfalutin language: "Well actually we don't know." When multiple universes show up on telescopes and give us better flat screen TV sets, then I'll believe them. Until then, I'm a multi-universe agnostic.

But insofar as the Big Bang has apparently shown up on the telescopes as otherwise inexplicable hissing (as a scientist played by Dempsey from Dempsey and Makepeace explained), then fine: I believe in the Big Bang, and I await the resulting improved TV sets eagerly. I'm a member of the congregation of science, in other words, even if I choose to regard some of the sermons as drivel. Me watching these TV shows approvingly is me nodding towards the altar of my religion.

Life Story caused quite a stir when it was first shown, because it showed scientists not as ego-less priests of The Truth, but as fiercely competitive racers after it. Well, it showed Rosie Franklin as an ego-less priest of The Truth, but the point was, as she herself admitted, she did not crack DNA, while the boy racers Crick and Watson did. When Crick and Watson began their version of the quest, the theory was that cracking DNA would swallow up the lifetimes of all who embarked on it. Crick and Watson had it all up and modelled within a few months, or whatever it was.

This lesson – that, even though the truth is The Truth, scientists are human – now having been thoroughly learned by the sort of people who like me watch TV shows about scientific breakthroughs, I was not at all shocked to learn that cosmology is also a field in which those racing each other for The Truth cover each other in great jets of mud and generally fight like hell to win their various races. Quite right, and good for them.

And good for the BBC. Nobody has much good to say of the BBC in my part of the political landscape, and I often join in with such complaining myself. But this kind of thing justifies the license fee if anything can, I think.

Posted by Brian Micklethwait at 11:14 PM
February 29, 2004
The neuroscience of art

The Telegraph reports on this man and this book, which has a bearing on culture and all that.

Quote:

Newly hatched gulls get their food by pecking at a red spot on their mother's yellow beak. The birds don't even need their mother to be present - they are as happy pestering a disembodied beak as the real thing.

But 50 years ago Niko Tinbergen, an Oxford University scientist, made an extraordinary discovery. When presented with an abstract version of the beak - a yellow stick with three red stripes - the chicks went crazy. The stick excited the baby birds far more than their mothers' bills.

Tinbergen's creation bore no resemblance to a real beak and yet to the birds' brains it was somehow more "real". By exaggerating the reality of a beak, Tinbergen did what all artists strive for - he captured the essence of reality.

The experiment raised intriguing questions about the nature of art. If a hyper-real painting triggered such a reaction in the visual processing regions of a bird's brain, might not art be doing the same in human minds?

Vilayanur Ramachandran, professor of neuroscience and psychology at the University of California, San Diego, author of The Emerging Mind and one of the world's leading neuroscientists, believes the answer is yes.

So Modern Art is the product of Darwinian evolution. Well, cartoons anyway.

Posted by Brian Micklethwait at 11:44 PM
November 11, 2003
The evolutionary biology of music appreciation

I enjoyed this article by Christine Kenneally, linked to today by the indispensable Arts & Letters Daily.

The concluding paragraphs tickled me especially:

No matter how the connection between language and music is parsed, what is apparent is that our sense of music, even our love for it, is as deeply rooted in our biology and in our brains as language is. This is most obvious with babies, says Sandra Trehub at the University of Toronto, who also published a paper in the Nature Neuroscience special issue.

For babies, music and speech are on a continuum. Mothers use musical speech to "regulate infants' emotional states," Trehub says. Regardless of what language they speak, the voice all mothers use with babies is the same: "something between speech and song." This kind of communication "puts the baby in a trance-like state, which may proceed to sleep or extended periods of rapture." So if the babies of the world could understand the latest research on language and music, they probably wouldn't be very surprised. The upshot, says Trehub, is that music may be even more of a necessity than we realize.

That being only the checkmate, so to speak, of a quite extended argument, involving the ways in which animals might appreciate music (the point being that it would have to be their music rather than ours), and much else besides. What I found persuasive was that several times while reading the piece, I found myself asking: but what about …?, only for that exact point to be answered in the next paragraph.

Worth reading all of it, in other words.

Of course, the piece doesn't explain music in its entirety. In particular it doesn't explain how music has changed and developed – and sometimes, I suppose, retreated and regressed – over the centuries. But it does sketch out the biological, species-specific expressive language within the limits of which the human effort to make music has necessarily expressed itself.

In particular, it explains with great finality that music will always be with us.

Posted by Brian Micklethwait at 03:52 PM
September 13, 2003
Yoghurt-on-a-disc – "What would happen if I purposely grew fungi, yeast or bacteria in direct contact with the media, and manipulated their fractal dimensions?"

The Dave Barry blog is an endless source of cultural stimulation. Here's a link from him to this article, about a subject of zero interest to me, namely DJs frigging about with CDs and gramophone records in order to entertain the zonked-out raving masses. Zero interest until now:

PARIS (AFP) – Want to listen to something really different? Smear yoghurt on your favourite CD. Let it dry. Slide the disc into the player. Crank up the volume. And hear that music in a completely fresh, possibly spine-chilling way.

A joke right? Of course a joke. Otherwise Dave Barry's emailer fan club wouldn't have picked it up and sent it in. But serious also:

Jones' pet area of research is how signals can be transmitted through biological cells, which grow in a so-called "fractal" way, like tree branches.

He became intrigued by experimental musicians and DJs who, from the mid-1980s, sanded, varnished or even slapped paint onto CDs to create new sounds to sample.

Yes, that would explain quite a lot.

Music on CDs comes from tiny etched pits in the tracks that represent binary digits, the "0" or "1" that make up a computer code. The code, reflected back by the laser in the CD player, is then processed back into an electronic signal and converted to sound.

Mutilating the surface, so that some of the pits are missed, thus changes the sound.

You don't say. But this is where it gets more interesting.

But Jones found that much subtler sounds could be achieved using fungal or bacterial growth, rather than scraping or coating the disc's surface.

This is because these life forms introduce tiny errors, on a micron or nanoscale level rather than the far bigger millimetric scale.

In addition, the way fungus and bacteria grow can shape the sound in weird ways.

My guess is that various members of the slacker generation have already discovered this phenomenon, but didn't grasp its scientific significance.

Bacteria grow by cell division, while fungi grow by branching. Both processes can be controlled by adding malt extract to the disc as food.

Jones told New Scientist that he came across the discovery quite by accident, when he was DJing in his bar.

"I often change CDs when my hands are wet with beer," he told the British weekly. "One night I must have changed the CDs, touched the data surface, then left them for use on another night."

The following week, he put on a CD by Nine Inch Nails and found that it would not play properly because fungus had grown on it.

Don't you just hate it when that happens? Besides which, when did a CD by "Nine Inch Nails" ever "play properly"? But now I'm showing my culture.

But the fungus had not ruined the disc. …

Of course not. How could noises made by some nine inch nails be ruined? All they could ever be is different.

… The original audio sequence was there, but it would sometimes change in pitch and there were small staccato noises in the background.

And now the eureka moment.

He asked himself: "What would happen if I purposely grew fungi, yeast or bacteria in direct contact with the media, and manipulated their fractal dimensions?"

That's my question of the year so far.

Yoghurt-on-a-disc was born.

Jones says that he has yet to damage any of his discs or players with his pioneering work, but warns that the technique does crash CD players on computers because the software cannot cope.

Ah, the grand tradition of scientists pissing about and calling it research. "Headache, Jones?" "Yes sir. Rather too much fractal fungoid sonic analysis last night, sir." "Take it easy, Jones, you're a valuable man." "Will do, sir."

The internet will soon be awash with these noises, I think. Fungoid fractal sonics. I hate it already.

I just checked that the date is not April 1st. Unsurprisingly, it is not April 1st. It is September 13th. I mean, how could you make this story up?
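For what it is worth, here is a very loose toy analogy, in Python, of the scratch-versus-fungus distinction the article describes – not a model of real CD encoding or error correction, and every number in it is invented.

    # Compare one big contiguous dropout (a scratch) with many tiny scattered
    # perturbations (the "fungal" damage) on a clean sampled tone.
    import math
    import random

    random.seed(0)
    clean = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(2000)]

    scratched = list(clean)
    scratched[800:1200] = [0.0] * 400           # one big millimetre-scale gap

    fungal = list(clean)
    for n in random.sample(range(len(fungal)), 20):
        fungal[n] += random.uniform(-0.3, 0.3)  # tiny, isolated glitches

    def total_damage(original, altered):
        return sum(abs(a - b) for a, b in zip(original, altered))

    print("scratch damage:", round(total_damage(clean, scratched), 1))
    print("fungal damage: ", round(total_damage(clean, fungal), 1))

The scratch obliterates a whole stretch of the signal, while the fungal version leaves the original audio intact but sprinkles it with small scattered errors, which is roughly the "small staccato noises" the article describes.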

Posted by Brian Micklethwait at 09:33 AM
August 20, 2003
Speculations

A great way to edge your profile in the blogosphere in the upwards direction is to do one of those links to a Samizdata posting that turns the bit where it says "TrackBack [0]" to "TrackBack [1]". Noticing such a circumstance (and making it go now to "TrackBack [2]") at the top of Dale Amon's posting about SpaceShipOne (which I have a soft spot for simply because it photographs so prettily), I backtracked my way to a blog called The Speculist, which is about the onward march into the wild blue future yonder of technology. Whenever Samizdata gets too gloomy about the European Union, income tax, UK gun control, etc., this will be one of the places I go for optimistic refreshment about life's possibilities.

My favourite posting there at present, edging the one about DNA computing into second place, is this one about Chinese human-rabbit hybrids.

Hollywood must be told about this. The pitch: The Fly, only instead of a fly it's a bunny. The Bunny! Jeff Goldblum with fur and whiskers (which he has already practised doing in the outstanding Earth Girls Are Easy), winning an Olympic sprinting medal and then disappearing into a hole in the ground. Maybe not.

Posted by Brian Micklethwait at 12:12 PM
July 16, 2003
Terence Kealey on hobby scientists

As all those multitudes who read everything I ever post already know, I've been reading Terence Kealey's The Economic Laws of Scientific Research. While I was reading what follows, I thought, I love this. I also thought, hm, maybe the place for this is the Culture Blog. Because that's what it's about. And then, as if to clinch it for me, along came Kealey's delightful final paragraph.

I wrote a short Libertarian Alliance piece in a similar vein, but about arts funding, at the end of which I say (approximately): for art read life. What Kealey says is: and in particular, for art read science. I've added a couple of links about two of the recent (and Nobel Prize winning) hobby scientists he mentions.

The hobby scientists flourished under laissez faire, but laissez-faire Britain came to an end in 1914. Before 1914 the Government sequestered less than 10 per cent of the nation's wealth in taxes, but between 1918 and 1939 the Government increased this to about 25 per cent of GNP, and since 1945 the Government has spent between 40-50 per cent GNP. Because of the attrition of inherited wealth and of private means, the hobby scientist is now practically extinct. By the 1930s, for example, half of the lecturers in the Department of Biochemistry at Cambridge University still had private incomes, but today's tax structure has dramatically cut the numbers of people who inherit sufficient private means to do science for fun. One rare survivor is Peter Mitchell who won the Nobel Prize in 1978 for discovering the chemiosmotic hypothesis in his private laboratories in Bodmin, Cornwall. The occasional hobby scientist can still be found in a theoretical subject; Albert Einstein, for example, was working as a clerk in the patents office in Zurich when, during his spare time, he conceived of his theories of relativity.

Even dirigiste France could produce hobby scientists, but its harsh taxes so restricted opportunities that, inevitably, its most distinguished hobbyist was a taxman. Lavoisier was indeed a Farmer-General, which ultimately led him to the guillotine (the judge who condemned him to death remarked that 'the revolution has no need for scientists'; Karl Marx would have disagreed).

The loss of the hobby scientists has been unfortunate because the hobby scientists tended to be spectacularly good. They were good because they tended to do original science. Professional scientists tend to play it safe; they need to succeed, which tempts them into doing experiments that are certain to produce results. Similarly, grant-giving bodies which are accountable to government try only to give money for experiments that are likely to work. But experiments that are likely to work are probably boring - indeed, if they are that predictable, they are barely experiments at all; rather, they represent the development of established science rather than the creation of the new (though science is so unpredictable that even so-called predictable experiments will yield unpredictable results on occasion). But the hobby scientist is unaccountable. He can follow the will-o'-the-wisp and he is more likely to do original than unoriginal research, because it is original research that is fun.

Most professional scientists spend much of their time doing repetitive work. Science has become a treadmill, and scientists must be seen to be publishing papers, speaking at conferences, getting grants, teaching undergraduates and training PhD students. These activities will not succeed unless they are predictable, and therefore even boring. The hobby scientist need never be bored. He need only do an experiment if it looks fun. The hobby scientist, therefore, will be attracted to challenging science to the same degree that the professional scientist is attracted to safe science.

The hobby scientist, moreover, will be a different sort of human being from the professional scientist. A professional scientist needs to be tough. It is a harsh, competitive world in a modern university, and if a scientist does not drive himself and his students to write the requisite number of papers and to win enough grants, then that scientist does not survive. But a hobby scientist does not have to be any particular sort of human being. Indeed, many of the great hobby scientists would transparently never have survived a modern university. Peter Mitchell, whose chemiosmotic hypothesis changed the very nature of modern biochemistry, took seven years to complete his PhD. In Britain, PhD grants are only for three years, so Mitchell would never have completed his PhD had he depended on public funds (particularly as he was not a good PhD student, and no one would have fought for him). After his PhD, he obtained a lectureship in the Department of Zoology at Edinburgh University, but he found the job intolerable and left after a few years. He bought a dilapidated country house in Bodmin, in Cornwall, spent two years rebuilding it as a form of psychotherapy, and then started on his researches again, in his own way, on family money - to win a Nobel Prize.

Many of the hobby scientists were decidedly peculiar. Cavendish, a bachelor, only spoke to other human beings on Thursday nights when he dined with a coterie of FRSs. Otherwise he lived in solitude, communicating with his servants by notes and letters. Dinner was served to him through a contraption that shielded the butler from gaze, and if Cavendish ever saw a servant, he dismissed that person instantly. Darwin was also odd. He spent his whole life as a semi-invalid, and although it is claimed he suffered from Chagas' disease, Pickering showed in his Creative Malady that Darwin probably pretended to be ill to shield himself from the strains of everyday life. Neither Cavendish nor Darwin would have survived in a modern university any better than did Mitchell, yet they were scientific giants (Darwin could not even survive undergraduate life, and he left before obtaining a degree). Another academic failure was Albert Einstein, one of the greatest of hobby scientists. Einstein did not do well as an undergraduate at university, and he failed to obtain a PhD position, so he had to get a job; he chose to clerk in a patents office because it left him with spare energy in the evenings.

When science was a vocation, personal poverty did not frustrate potential researchers. Michael Faraday, for example, was the son of a blacksmith, and he was apprenticed to a bookbinder, but science was his hobby and despite his lack of conventional qualifications Sir Humphry Davy was happy to employ him as a technician at the Royal Institution. It did not take long for his genius and passion to be recognised. (Even a chronic grumbler like Thomas Huxley prospered as a gifted career scientist despite his lack of private means.) Occasionally, a contemporary private scientific body will be as enlightened as those earlier institutions. Barbara McClintock, who won the Nobel Prize in 1983 for her discovery of transposable genetic elements, was employed from 1942 by the Carnegie Institution in Washington, DC. All they asked of her was that she wrote an annual report, which is all that she wrote. She could not be bothered with all the fuss and nonsense that it takes to publish papers in peer-reviewed journals, and anyone who wanted to know what she had done only had to read the Carnegie Institution's annual report. Only a private body could behave so unconventionally. A modern university would have found McClintock wanting, because she would not have been conventional enough to spend her days writing grants, sitting on committees, and driving PhD students, technicians and post-doctoral fellows to write their quota of papers.

The hobby scientists were the most romantic of scientists, approaching the poets in their intellectual purity and richly individualistic personalities. Rich or poor, the hobby scientists were driven by a vocation and a love of research. We are lessened by their extinction. Those who argue for more government funding of science, or of anything else, should never forget the cost of government money, namely the taxes that impoverish society to enable government to impose its particular, narrow, harsh vision of a modern university.

Terence Kealey

Posted by Brian Micklethwait at 02:50 PM