Brian Micklethwait's Blog
In which I continue to seek part time employment as the ruler of the world.
Category archive: Bits from books
Goddaughter 2 recently suggested I read this. I now suggest that you read it:
In the afterlife you relive all your experiences, but this time with the events reshuffled into a new order: all the moments that share a quality are grouped together.
You spend two months driving the street in front of your house, seven months having sex. You sleep for thirty years without opening your eyes. For five months straight you flip through magazines while sitting on a toilet. You take all your pain at once, all twenty-seven intense hours of it. Bones break, cars crash, skin is cut, babies are born. Once you make it through, it’s agony-free for the rest of your afterlife.
But that doesn’t mean it’s always pleasant. You spend six days clipping your nails. Fifteen months looking for lost items. Eighteen months waiting in line. Two years of boredom: staring out a bus window, sitting in an airport terminal. One year reading books. Your eyes hurt, and you itch, because you can’t take a shower until it’s your time to take your marathon two-hundred-day shower. Two weeks wondering what happens when you die. One minute realizing your body is falling. Seventy-seven hours of confusion. One hour realizing you’ve forgotten someone’s name. Three weeks realizing you are wrong. Two days lying. Six weeks waiting for a green light. Seven hours vomiting. Fourteen minutes experiencing pure joy. Three months doing laundry. Fifteen hours writing your signature. Two days tying shoelaces. Sixty-seven days of heartbreak. Five weeks driving lost. Three days calculating restaurant tips. Fifty-one days deciding what to wear. Nine days pretending you know what is being talked about. Two weeks counting money. Eighteen days staring into the refrigerator. Thirty-four days longing. Six months watching commercials. Four weeks sitting in thought, wondering if there is something better you could be doing with your time. Three years swallowing food. Five days working buttons and zippers. Four minutes wondering what your life would be like if you reshuffled the order of events. In this part of the afterlife, you imagine something analogous to your Earthly life, and the thought is blissful: a life where episodes are split into tiny swallowable pieces, where moments do not endure, where one experiences the joy of jumping from one event to the next like a child hopping from spot to spot on the burning sand.
This is from Sum, by David Eagleman, which is subtitled “Forty tales from the afterlives”, the above being the first of them, also entitled “Sum”.
I sum- (hah!) -marised this tale as best I could to another friend, who immediately got the point that Eagleman makes at the end, that the mere fact of the variety of life becomes a source of joy, if you compare it with a life from which variety has been drained away. This alone turns humdrumness into hell, and contemplating that hell turns the humdrumness into a kind of heaven.
Count your blessings, but not the same blessings all at the same time.
Okay, this quote is from Chapter One, “A Universal Language?”, of The Story of English: How the English Language Conquered the World by Philip Gooden (pp. 11-12):
English is the closest the world has yet come to a universal language, at least in the sense that even those who cannot speak it - admittedly, the large majority of the world’s population - are likely to be familiar with the odd English expression. One term that is genuinely global as well as genuinely odd is OK (or O.K. or okay), originating in America in the 19th century. An astonishingly adaptable word, it works as almost any part of speech from noun to verb, adjective to adverb, though often just as a conversation-filler - ‘OK, what are we going to do now?’ Depending on the tone of voice, OK can convey anything from fervent agreement to basic acquiescence. It may be appropriate that such a truly universal term has no generally agreed source. Attempts to explain where it came from don’t so much show variety as a high degree of imaginative curiosity. So, OK is created from the initials of a deliberate misspelling, oll korrect, or from a campaign slogan for a would-be US president in the 1840s who was known as Old Kinderhook because he came from Kinderhook in New York State. Or it is a version of a word imported from Finland or Haiti, or possibly one borrowed from the Choctaw Indians. Or it is older than originally thought and derives from West African expressions like o-ke or waw-ke. Enough explanations, OK?
I am reading In Defence of History by Richard J. Evans. The attackers are the post-modernists. In Chapter 3 ("Historians and their facts"), Evans writes about how evidence considered insignificant in one era can become highly significant in a later era:
The traces left by the past, as Dominick LaCapra has observed, do not provide an even coverage of it. Archives are the product of the chance survival of some documents and the corresponding chance loss or deliberate destruction of others. They are also the products of the professional activities of archivists, which therefore shape the record of the past and with it the interpretations of historians. Archivists have often weeded out records they consider unimportant, while retaining those they consider of lasting value. This might mean for example destroying vast and therefore bulky personnel files on low-ranking state employees such as ordinary soldiers and seamen, manual workers and so on, while keeping room on the crowded shelves for personnel files on high state officials. Yet such a policy would reflect a view that many historians would now find outmoded, a view which considered ‘history’ only as the history of the elites. Documents which seem worthless to one age, and hence ripe for the shredder, can seem extremely valuable to another.
Let me give an example from my personal experience. During research in the Hamburg state archives in the 1980s, I became aware that the police had been sending plain-clothes agents into the city’s pubs and bars during the two decades or so before the First World War to gather and later write down secret reports of what was being said in them by socialist workers. The reports I saw were part of larger files on the various organizations to which these workers belonged. Thinking it might be interesting to look at a wider sample, I went through a typewritten list of the police files with the archivist, and among the headings we came across was one which read: ‘Worthless Reports’. Going down into the muniment room, we found under the relevant call-number a mass of over 20,000 reports which had been judged of insufficient interest by the police authorities of the day to be taken up into the thematic files where I had first encountered this material. It was only by a lucky chance that they had not already been destroyed. They turned out to contain graphic and illuminating accounts of what rank-and-file socialist workers thought about almost every conceivable issue of the day, from the Dreyfus affair in France to the state of the traffic on Hamburg’s busy streets. Nobody had ever looked at them before. Historians of the labour movement had only been interested in organization and ideology. But by the time I came to inspect them, interest had shifted to the history of everyday life, and workers’ views on the family, crime and the law, food, drink and leisure pursuits, had become significant objects of historical research. It seemed worth transcribing and publishing a selection, therefore, which I did after a couple of years’ work on them.
The resulting collection showed how rank-and-file Social Democrats and labour activists often had views that cut right across the Marxist ideology in which previous historians thought the party had indoctrinated them, because previous historians had lacked the sources to go down beyond the level of official pronouncements in the way the Hamburg police reports made it possible to do. Thus from ‘worthless reports’ there emerged a useful corrective to earlier historical interpretations. This wonderful material, which had survived by chance, had to wait for discovery and exploitation until the historiographical climate had changed.
In an earlier posting I mentioned that I had ordered Marc Morris’s book about The Norman Conquest, and I have now started reading this. (Although for some reason the version of it that I have seems to be the American one.)
The events depicted in the Tapestry are of course highly dramatic, but as Morris relates, so too was the subsequent history of the Tapestry:
By any law of averages, the Tapestry ought not to exist. We know that such elaborate wall-hangings, while hardly commonplace in the eleventh century, were popular enough with the elite that could afford them, because we have descriptions in contemporary documents. What we don’t have are other surviving examples: all that comes down to us in other cases are a few sorry-looking scraps. That the Tapestry is still with us almost 1,000 years after it was sewn is astonishing, especially when one considers its later history. It first appears in the written record four centuries after its creation, in 1476, when it is described in an inventory of the treasury at Bayeux Cathedral, from which we learn that the clergy were in the habit of hanging it around the nave every year during the first week of July (an annual airing that would have aided its conservation). Its survival through those four medieval centuries, escaping the major hazards of war, fire and flood, as well as the more mundane menaces of rodents, insects and damp, is wondrous enough; that it successfully avoided destruction during the modern era is nothing short of miraculous. When the cathedral’s treasury was looted during the French Revolution, the Tapestry came within a hair’s breadth of being cut up and used to cover military wagons. Carted to Paris for exhibition by Napoleon, it was eventually returned to Bayeux, where for several years during the early nineteenth century it was indifferently stored in the town hall on a giant spindle, so that curious visitors could unroll it (and occasionally cut bits off). During the Second World War it had yet more adventures: taken again to Paris by the Nazis, it narrowly escaped being sent to Berlin, and somehow managed to emerge unscathed from the flames and the bombs. The Tapestry’s post-medieval history is a book in itself - one which, happily, has already been written.
What next for it, I wonder?
I’ve been reading Paul Kennedy’s Engineers of Victory, which is about how WW2 was won, by us good guys. Kennedy, like many others, identifies the Battle of the Atlantic as the allied victory which made all the other victories over Germany by the Anglo-American alliance possible. I agree with the Amazon reviewers who say things like “good overview, not much engineering”. But this actually suited me quite well. At least I now know what I want to know more about the engineering of. And thanks to Kennedy, I certainly want to know more about how centimetric radar was engineered.
Centimetric radar was even more of a breakthrough, arguably the greatest. HF-DF might have identified a U-boat’s radio emissions 20 miles from the convoy, but the corvette or plane dispatched in that direction still needed to locate a small target such as a conning tower, perhaps in the dark or in fog. The giant radar towers erected along the coast of southeast England to alert Fighter Command of Luftwaffe attacks during the Battle of Britain could never be replicated in the mid-Atlantic, simply because the structures were far too large. What was needed was a miniaturized version, but creating one had defied all British and American efforts for basic physical and technical reasons: there seemed to be no device that could hold the power necessary to generate the microwave pulses needed to locate objects much smaller than, say, a squadron of Junkers bombers coming across the English Channel, yet still made small enough to be put on a small escort vessel or in the nose of a long-range aircraft. There had been early air-to-surface vessel (ASV) sets in Allied aircraft, but by 1942 the German Metox detectors provided the U-boats with early warning of them. Another breakthrough was needed, and by late spring of 1943 that problem had been solved with the steady introduction of 10-centimeter (later 9.1-centimeter) radar into Allied reconnaissance aircraft and even humble Flower-class corvettes; equipped with this facility, they could spot a U-boat’s conning tower miles away, day or night. In calm waters, the radar set could even pick up a periscope. From the Allies’ viewpoint, the additional beauty of it was that none of the German systems could detect centimetric radar working against them.
Where did this centimetric radar come from? In many accounts of the war, it simply “pops up”; Liddell Hart is no worse than many others in noting, “But radar, on the new 10cm wavelength that the U-boats could not intercept, was certainly a very important factor.” Hitherto, all scientists’ efforts to create miniaturized radar with sufficient power had failed, and Doenitz’s advisors believed it was impossible, which is why German warships were limited to a primitive gunnery-direction radar, not a proper detection system. The breakthrough came in spring 1940 at Birmingham University, in the labs of Mark Oliphant (himself a student of the great physicist Ernest Rutherford), when the junior scientists John Randall and Harry Boot, working in a modest wooden building, finally put together the cavity magnetron.
This saucer-sized object possessed an amazing capacity to detect small metal objects, such as a U-boat’s conning tower, and it needed a much smaller antenna for such detection. Most important of all, the device’s case did not crack or melt because of the extreme energy exuded. Later in the year important tests took place at the Telecommunications Research Establishment on the Dorset coast. In midsummer the radar picked up an echo from a man cycling in the distance along the cliff, and in November it tracked the conning tower of a Royal Navy submarine steaming along the shore. Ironically, Oliphant’s team had found their first clue in papers published sixty years earlier by the great German physicist and engineer Heinrich Hertz, who had set out the original theory for a metal casement sturdy enough to hold a machine sending out very large energy pulses. Randall had studied radio physics in Germany during the 1930s and had read Hertz’s articles during that time. Back in Birmingham, he and another young scholar simply picked up the raw parts from a scrap metal dealer and assembled the device.
Almost inevitably, development of this novel gadget ran into a few problems: low budgets, inadequate research facilities, and an understandable concentration of most of Britain’s scientific efforts at finding better ways of detecting German air attacks on the home islands. But in September 1940 (at the height of the Battle of Britain, and well before the United States formally entered the war) the Tizard Mission arrived in the United States to discuss scientific cooperation. This mission brought with it a prototype cavity magnetron, among many other devices, and handed it to the astonished Americans, who quickly recognized that this far surpassed all their own approaches to the miniature-radar problem. Production and test improvements went into full gear, both at Bell Labs and at the newly created Radiation Laboratory (Rad Lab) at the Massachusetts Institute of Technology. Even so, there were all sorts of delays - where could they fit the equipment and operator in a Liberator? Where could they install the antennae? - so it was not until the crisis months of March and April 1943 that squadrons of fully equipped aircraft began to join the Allied forces in the Battle of the Atlantic.
Soon everyone was clamoring for centimetric radar - for the escorts, for the carrier aircraft, for gunnery control on the battleships. The destruction of the German battle cruiser Scharnhorst off the North Cape on Boxing Day 1943, when the vessel was first shadowed by the centimetric radar of British cruisers and then crushed by the radar-controlled gunnery of the battleship HMS Duke of York, was an apt demonstration of the value of a machine that initially had been put together in a Birmingham shed. By the close of the war, American industry had produced more than a million cavity magnetrons, and in his Scientists Against Time (1946) James Baxter called them “the most valuable cargo ever brought to our shores” and “the single most important item in reverse lease-lend.” As a small though nice bonus, the ships using it could pick out life rafts and lifeboats in the darkest night and foggiest day. Many Allied and Axis sailors were to be rescued this way.
Here (pp. 143-5) is how Thiel explains the difference between humans and computers, and how they complement one another in doing business together:
To understand the scale of this variance, consider another of Google’s computer-for-human substitution projects. In 2012, one of their supercomputers made headlines when, after scanning 10 million thumbnails of YouTube videos, it learned to identify a cat with 75% accuracy. That seems impressive - until you remember that an average four-year-old can do it flawlessly. When a cheap laptop beats the smartest mathematicians at some tasks but even a supercomputer with 16,000 CPUs can’t beat a child at others, you can tell that humans and computers are not just more or less powerful than each other - they’re categorically different.
The stark differences between man and machine mean that gains from working with computers are much higher than gains from trade with other people. We don’t trade with computers any more than we trade with livestock or lamps. And that’s the point: computers are tools, not rivals.
Thiel then writes about how he learned about the above truths when he and his pals at Paypal solved one of their biggest problems:
In mid-2000 we had survived the dot-com crash and we were growing fast, but we faced one huge problem: we were losing upwards of $10 million to credit card fraud every month. Since we were processing hundreds or even thousands of transactions per minute, we couldn’t possibly review each one - no human quality control team could work that fast.
So we did what any group of engineers would do: we tried to automate a solution. First, Max Levchin assembled an elite team of mathematicians to study the fraudulent transfers in detail. Then we took what we learned and wrote software to automatically identify and cancel bogus transactions in real time. But it quickly became clear that this approach wouldn’t work either: after an hour or two, the thieves would catch on and change their tactics. We were dealing with an adaptive enemy, and our software couldn’t adapt in response.
The fraudsters’ adaptive evasions fooled our automatic detection algorithms, but we found that they didn’t fool our human analysts as easily. So Max and his engineers rewrote the software to take a hybrid approach: the computer would flag the most suspicious transactions on a well-designed user interface, and human operators would make the final judgment as to their legitimacy. Thanks to this hybrid system - we named it “Igor,” after the Russian fraudster who bragged that we’d never be able to stop him - we turned our first quarterly profit in the first quarter of 2002 (as opposed to a quarterly loss of $29.3 million one year before).
There then follow these sentences.
The FBI asked us if we’d let them use Igor to help detect financial crime. And Max was able to boast, grandiosely but truthfully, that he was “the Sherlock Holmes of the Internet Underground.”
The answer was yes.
Thus did the self-declared libertarian Peter Thiel, who had founded Paypal in order to replace the dollar with a free market currency, switch to another career, as a servant of the state, using government-collected data to chase criminals. But that’s another story.
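Thiel's description of Igor is high-level, but the hybrid pattern he describes - software that scores and flags, humans who make the final judgment - can be sketched in a few lines. The scoring heuristic, the threshold, and the transaction fields below are all my own illustrative inventions, not anything from PayPal's actual system:

```python
# A minimal sketch of a hybrid fraud-review pipeline: the computer scores
# every transaction, and only the most suspicious ones are queued for a
# human analyst's final judgment. All features and thresholds here are
# invented for illustration.

def suspicion_score(txn):
    """Crude heuristic score in [0, 1]; a real system would use a trained model."""
    score = 0.0
    if txn["amount"] > 1000:
        score += 0.5
    if txn["new_account"]:
        score += 0.3
    if txn["country_mismatch"]:
        score += 0.2
    return min(score, 1.0)

def triage(transactions, threshold=0.5):
    """Split transactions into auto-approved and human-review queues."""
    auto_ok, review_queue = [], []
    for txn in transactions:
        if suspicion_score(txn) >= threshold:
            review_queue.append(txn)   # flagged for a human operator
        else:
            auto_ok.append(txn)        # processed automatically
    return auto_ok, review_queue

txns = [
    {"id": 1, "amount": 50,   "new_account": False, "country_mismatch": False},
    {"id": 2, "amount": 5000, "new_account": True,  "country_mismatch": True},
]
auto_ok, review_queue = triage(txns)
```

The point of the design, as Thiel tells it, is that the computer handles the volume no human team could, while the human handles the adaptive adversary no fixed algorithm could.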
Here is another bit from a book which I found particularly interesting, having just purchased and started to read the book in question.
In the Preface of A Great and Terrible King: Edward I and the Forging of Britain, Marc Morris writes that the first question everyone asks is: Was that Edward the Confessor? No. He came much earlier, before the Norman Conquest. Question number two was more interesting, because it has a more interesting answer. It concerns evidence:
The second question that has usually been put to me concerns the nature of the evidence for writing the biography of a medieval king, and specifically its quantity. In general, people tend to presume that there can’t be very much, and imagine that I must spend my days poking around in castle muniment rooms, looking for previously undiscovered scraps of parchment. Sadly, they are mistaken. The answer I always give to the question of how much evidence is: more than one person could look at in a lifetime. From the early twelfth century, the kings of England began to keep written accounts of their annual expenditure, and by the end of the century they were keeping a written record of almost every aspect of royal government. Each time a royal document was issued, be it a grand charter or a routine writ, a copy was dutifully entered on to a large parchment roll. Meanwhile, in the provinces, the king’s justices kept similar rolls to record the proceedings of the cases that came before his courts. Miraculously, the great majority of these documents have survived, and are now preserved in the National Archives at Kew near London. Some of them, when unrolled, extend to twenty or thirty feet. And their number is legion: for the thirteenth century alone, it runs to tens of thousands. Mercifully for the medieval historian, the most important have been transcribed and published, but even this printed matter would be enough to line the walls of an average-sized front room with books. Moreover, the quantity is increased by the inclusion of non-royal material. Others besides the king were keeping records during Edward I’s day. Noblemen also drew up financial accounts, issued charters and wrote letters; monks did the same, only in their case the chances of such material surviving were much improved by their membership of an institution. Monks, in addition, continued to do as they had always done, and kept chronicles, and these too provide plenty to keep the historian busy.
To take just the most obvious example from the thirteenth century, the monk of St Albans called Matthew Paris composed a chronicle, the original parts of which cover the quarter century from 1234 to 1259. In its modern edition it runs to seven volumes.
I say all this merely to demonstrate how much there is to know about our medieval ancestors, and not to pretend that I have in some way managed to scale this mountain all by myself. For the most part I have not even had to approach the mountain at all, for this book is grounded on the scholarly work of others. Nevertheless, even the secondary material for a study of Edward I presents a daunting prospect. At a conservative estimate, well over a thousand books and articles have been published in the last hundred years that deal with one aspect or another of the king’s reign. For scholarly works on the thirteenth century as a whole, that figure would have to be multiplied many times over.
Another Bit from a Book, and once again I accompany it with a warning that this Bit could vanish at any moment, for the reasons described in this earlier posting.
This particular Bit is from The Rational Optimist by Matt Ridley (pp. 255-258):
Much as I love science for its own sake, I find it hard to argue that discovery necessarily precedes invention and that most new practical applications flow from the minting of esoteric insights by natural philosophers. Francis Bacon was the first to make the case that inventors are applying the work of discoverers, and that science is the father of invention. As the scientist Terence Kealey has observed, modern politicians are in thrall to Bacon. They believe that the recipe for making new ideas is easy: pour public money into science, which is a public good, because nobody will pay for the generation of ideas if the taxpayer does not, and watch new technologies emerge from the downstream end of the pipe. Trouble is, there are two false premises here: first, science is much more like the daughter than the mother of technology; and second, it does not follow that only the taxpayer will pay for ideas in science.
It used to be popular to argue that the European scientific revolution of the seventeenth century unleashed the rational curiosity of the educated classes, whose theories were then applied in the form of new technologies, which in turn allowed standards of living to rise. China, on this theory, somehow lacked this leap to scientific curiosity and philosophical discipline, so it failed to build on its technological lead. But history shows that this is back-to-front. Few of the inventions that made the industrial revolution owed anything to scientific theory.
It is, of course, true that England had a scientific revolution in the late 1600s, personified in people like Harvey, Hooke and Halley, not to mention Boyle, Petty and Newton, but their influence on what happened in England’s manufacturing industry in the following century was negligible. Newton had more influence on Voltaire than he did on James Hargreaves. The industry that was transformed first and most, cotton spinning and weaving, was of little interest to scientists and vice versa. The jennies, gins, frames, mules and looms that revolutionised the working of cotton were invented by tinkering businessmen, not thinking boffins: by ‘hard heads and clever fingers’. It has been said that nothing in their designs would have puzzled Archimedes.
Likewise, of the four men who made the biggest advances in the steam engine - Thomas Newcomen, James Watt, Richard Trevithick and George Stephenson - three were utterly ignorant of scientific theories, and historians disagree about whether the fourth, Watt, derived any influence from theory at all. It was they who made possible the theories of the vacuum and the laws of thermodynamics, not vice versa. Denis Papin, their French-born forerunner, was a scientist, but he got his insights from building an engine rather than the other way round. Heroic efforts by eighteenth-century scientists to prove that Newcomen got his chief insights from Papin’s theories proved wholly unsuccessful.
Throughout the industrial revolution, scientists were the beneficiaries of new technology, much more than they were the benefactors. Even at the famous Lunar Society, where the industrial entrepreneur Josiah Wedgwood liked to rub shoulders with natural philosophers like Erasmus Darwin and Joseph Priestley, he got his best idea - the ‘rose-turning’ lathe - from a fellow factory owner, Matthew Boulton. And although Benjamin Franklin’s fertile mind generated many inventions based on principles, from lightning rods to bifocal spectacles, none led to the founding of industries.
So top-down science played little part in the early years of the industrial revolution. In any case, English scientific virtuosity dries up at the key moment. Can you name a single great English scientific discovery of the first half of the eighteenth century? It was an especially barren time for natural philosophers, even in Britain. No, the industrial revolution was not sparked by some deus ex machina of scientific inspiration. Later science did contribute to the gathering pace of invention and the line between discovery and invention became increasingly blurred as the nineteenth century wore on. Thus only when the principles of electrical transmission were understood could the telegraph be perfected; once coal miners understood the succession of geological strata, they knew better where to sink new mines; once benzene’s ring structure was known, manufacturers could design dyes rather than serendipitously stumble on them. And so on. But even most of this was, in Joel Mokyr’s words, ‘a semi-directed, groping, bumbling process of trial and error by clever, dexterous professionals with a vague but gradually clearer notion of the processes at work’. It is a stretch to call most of this science, however. It is what happens today in the garages and cafes of Silicon Valley, but not in the labs of Stanford University.
The twentieth century, too, is replete with technologies that owe just as little to philosophy and to universities as the cotton industry did: flight, solid-state electronics, software. To which scientist would you give credit for the mobile telephone or the search engine or the blog? In a lecture on serendipity in 2007, the Cambridge physicist Sir Richard Friend, citing the example of high-temperature superconductivity - which was stumbled upon in the 1980s and explained afterwards - admitted that even today scientists’ job is really to come along and explain the empirical findings of technological tinkerers after they have discovered something.
The inescapable fact is that most technological change comes from attempts to improve existing technology. It happens on the shop floor among apprentices and mechanics, or in the workplace among the users of computer programs, and only rarely as a result of the application and transfer of knowledge from the ivory towers of the intelligentsia. This is not to condemn science as useless. The seventeenth-century discoveries of gravity and the circulation of the blood were splendid additions to the sum of human knowledge. But they did less to raise standards of living than the cotton gin and the steam engine. And even the later stages of the industrial revolution are replete with examples of technologies that were developed in remarkable ignorance of why they worked. This was especially true in the biological world. Aspirin was curing headaches for more than a century before anybody had the faintest idea of how it did so. Penicillin’s ability to kill bacteria was finally understood around the time bacteria learnt to defeat it. Lime juice was preventing scurvy centuries before the discovery of vitamin C. Food was being preserved by canning long before anybody had any germ theory to explain why it helped.