What I'm Reading

Books Read in 2023

The books I've read so far this year.

Published by: Solaris, 2015

I haven't embarked on a proper reading binge for quite a while, but in the case of the Fractured Europe series I found it impossible to resist the temptation and I flew through book 2 in less than 36 hours. I was utterly gripped.

The book takes the same premise as the first novel in the sequence but comes at it from the opposite direction, so to speak. The stakes are, of course, higher; the action even more kinetic; the scenario even wilder. And the writing is still spectacular. Although I'd spotted one plot reveal being set up a couple of chapters in advance, I was still wrong-footed by the plot on many occasions. I love it when that happens.

Describing the plot will inevitably veer into spoiler territory, and as I said in the previous review, I don't want to do that. I'll just say that this sequence of novels is one of the best I've ever read. I'm already a hundred pages into book 3, Europe in Winter.

Published by: Solaris, 2014

Somebody once described this book as "Len Deighton meets Franz Kafka" and that is a perfect take. Written well before the outbreak of Covid, it's a tale set in a Europe which has been ravaged by a pandemic badly enough for many countries to have disintegrated into tiny polities and city states, with varying degrees of success. This book has some serious depth to it; the politics and mores of Eastern Europe are described with an easy knowledge that signifies a master storyteller working at the height of their powers. The characters, the set pieces, the dialogue draw you in effortlessly and the progressively more and more unsettling plot will have you turning the pages to find out just how many more layers of the story you've yet to uncover (it's a lot).

It's best if that's all you know about the plot before you start reading. The first time I read it I had no idea of the ride I'd let myself in for, and finding out what was going on was a delicious experience (which is apt, as Rudi, the principal protagonist, starts out as a chef; there are nods to Bourdain, Blumenthal and Ramsay for gourmets to spot). You should go into it spoiler free too, if you can. Don't even read the blurb on the back. But trust me on this: you should read this novel, because it's something very special. It's thrilling and bold and original and it's going to blow you away.

Put it like this: I decided as soon as I'd finished it that I needed to read the rest of the series, in order, back to back, in one go—and so this evening I unaccountably found that several hours had passed without me noticing and I was already halfway through Europe At Midnight.

Published by: Fourth Estate, 1994

This was another charity shop find, which I discovered sitting on the shelves in the "science" section of the British Lions shop in Thornbury. To start with, I was hopeful that the book's sensationalist title was simply a marketing gambit and that it was actually going to be an academic examination of the conflicting claims made by Thomas Kuhn and Karl Popper about how the field of science determines the fundamental truths of reality.

Scientific progress happens in stages, and according to Kuhn, current schools of thought (which he refers to as "paradigms") may stay in place for centuries until their adherents either die out or are convinced of their error by newer, better theories which predict the behaviour of reality more accurately. Kuhn focused heavily on the "convincing people you're right" part of the process rather than the "providing strong observational or experimental evidence" or the "must be able to accurately predict what an experiment will find before you start" parts. Kuhn's representation of the process through which he believed a theory gained scientific acceptance led many people—usually those without even the most basic scientific knowledge—to think that their pet theory deserved to be regarded on an equal footing with those developed by teams of scientists after years of painstaking research. This is more than a little frustrating for the scientific establishment, and as a broad generalisation, this view of how science makes progress isn't really reflected by the historical evidence. Theories are usually replaced when better data are available, often as a result of advances in technology (the telescope, for Galileo; x-ray crystallography for Crick and Watson; a handy microwave horn antenna for Penzias and Wilson, and so on).

Popper, on the other hand, introduced a splendid way of sorting out scientific claims from non-scientific bunkum that's known as the Falsification Principle. If you can't make any observations or perform an experiment that could prove a theory wrong, then for Popper (and pretty much all modern scientists agree with him) it can't be classed as a scientific theory. For example, the theory that the Earth is flat can be falsified by flying high enough to see the curvature of the Earth. Lots of people have done this, because you don't have to fly very high to do so, and so we know that the theory that the Earth is flat is wrong. It's not rocket science, although rockets do come in useful when making experimental observations in this case.

As you might have guessed, I'm in Team Popper. The amateur theoreticians, not so much.

This book was written a couple of years after the announcement by Martin Fleischmann and Stanley Pons that they had managed to achieve a form of nuclear fusion on a tabletop, using equipment that could be bought at a decent hardware store for around $90. Today, the scientific consensus is that nuclear fusion was not taking place, although the anomalous generation of heat was replicated by other experimenters; this has led to cold fusion being rebranded as low-energy nuclear reactions, and research in this field continues to this day. The initial reaction of the scientific establishment in not accepting Fleischmann and Pons's original claims seems to have been justified. Milton opens his argument with Fleischmann and Pons's case, and develops his thesis along the lines that if cold fusion actually is a real thing—and nearly thirty years later, that's a contentious assertion, remember—then who is to say that (insert any other theory of your choice which was disregarded by mainstream science) might not be true as well?

Which is an interesting argument, for sure. Unfortunately it's also entirely nonsensical.

Milton's style is to represent failed theories (such as the existence of the luminiferous ether) as having failed because—as per Kuhn—the scientists who are better at winning arguments dictate scientific consensus, when in fact—as per Popper—it was because the old theories could be, and were, falsified. Deciding what will be taught to the next generation of scientists is not determined by a sort of popularity contest amongst the scientific community. It's judged by the accuracy of a theory's predictions about how reality behaves. I've ranted about how Kuhn's book successfully wrecked an entire generation of scientific praxis elsewhere on this website, so I'll spare you a repetition here. If your theory is wrong, it doesn't matter how good your public relations team is: you won't win the scientific debate. Kuhn seemed to believe otherwise, and Milton appears to agree with him.

Science does not advance if its practitioners pick and choose the data that fits a theory while ignoring the (significantly larger) pile of observations that contradict it, and Milton provides examples of scientific investigations which appear to have adopted this approach. Observations of fractional charges on the electron have been obtained experimentally over the years, even though at the time they were first recorded the scientific model of the atom predicted that this would be impossible (rather than going away, that particular story has become more and more interesting since this book was published), and Milton quite rightly criticises scientists who have been caught fudging their results to fit their pet theory. However, it's science's behaviour when the opposite takes place and researchers publish their anomalous results that is the issue here.

The idea that science might take a dim view of contradictory results is not without merit. Charles Fort's famously iconoclastic books, written in the 1920s, took pot-shots at science's habit of sweeping any "damned data" under the carpet, and indeed Milton explains that it took years for Scientific American to acknowledge that the Wright Brothers had achieved powered, heavier-than-air flight; the idea that meteorites were rocks that had fallen from the sky was still being ridiculed by prominent thinkers (including Thomas Jefferson) in the 19th century. People were making bold assertions about the impossibility of space travel less than a month before the Russians put Sputnik I into orbit. But here, science's occasional closed-mindedness is exaggerated so that the procedures intended to police our way towards better theories can be applied to truly off-the-wall ideas as well, making them seem just as "true".

Valid criticisms of methodology or tales of injustice are buried under a mess of false equivalences, specious arguments, and seemingly deliberate misrepresentation. So, for instance, we're told that the Michelson-Morley experiment (which sank the idea of a luminiferous ether once and for all) is actually still open to doubt because of experiments that Dayton Miller carried out in the 1920s. Miller did indeed report anomalous results, but Milton fails to mention that (spoiler alert) no other researchers have ever managed to replicate them, despite the technology involved having become much, much more accurate since the 1930s. Nor does Milton take the time to describe how Miller's explanation of what his experiment showed was rapidly and soundly disproved by the Hammar experiment, which was conducted in 1935.

It gets worse; we're encouraged to view Lord Clancarty's famous debate about flying saucers in the House of Lords as some sort of turning point in science's conspiracy to conceal the truth. Milton writes that in his debate, Lord Clancarty referred to flying saucers originating from inside a hollow Earth, the UFOs gaining access to the surface by means of a gigantic hole at the North Pole. But an examination of the parliamentary record in Hansard reveals that he made no mention of the existence of such a hole at any point in his speech whatsoever. And as an aside, the famous satellite image of the North Pole that hollow-Earth proponents like to cite has been misinterpreted. The satellite did not take a photograph of a giant hole at the top of our planet. Instead, the image is a mosaic of many, much smaller photographs joined together. No photographs were available to fill in the details at the pole, because the inclination of the satellite's orbit never took it over the area. With no data to show, the image compilers simply left that part of the mosaic as black. The idea that the Earth is genuinely hollow and can be accessed via a hole at the North (or perhaps the South) Pole is the domain of conspiracy theorists and science fiction authors (Rudy Rucker wrote two rather fun adventure novels with Edgar Allan Poe as his protagonist on this theme, for example, but I don't think Rudy ever intended anyone to take them as factual accounts of a real journey, even if he did write himself into the plot in the second book).

I'd pretty much realised at this point that I was dealing with someone who was exhibiting the same sort of behaviour as that of the "alternative science" theorists with whom the book takes sides.

And just to confirm this, Milton now begins an examination of Immanuel Velikovsky's classic work of crank science, Worlds In Collision. Milton presents Velikovsky as a misunderstood genius who was vilified by the scientific establishment. Milton describes, correctly, how several universities threatened to withdraw their custom from the Macmillan publishing house unless Velikovsky was moved to a different publisher. This does not appear to have dented the book's sales; the resulting notoriety probably increased them. You can't really describe your work as "suppressed" when it sat at the top of the New York Times bestseller list for nine weeks. Milton frames Worlds in Collision as a rational, erudite work of science, which, quite frankly, does the book a disservice, as its nuttier aspects are what make for great (if implausible) reading. Many of Velikovsky's references are taken from the Bible and other religious texts, so it should be clear that WiC was not a work of science. I have my copy of the book right here, so let me give you an example of its true flavour, taken from page 184 of the Abacus Editions paperback:

" When Venus sprang out of Jupiter as a comet and flew very close to the earth, it became entangled in the embrace of the earth. The internal heat developed by the earth and the scorching gases of the comet were in themselves sufficient to make the vermin of the earth propagate at a feverish rate. Some of the plagues, like the plague of the frogs ('the land brought forth frogs') or of the locusts, must be ascribed to such causes. "

You've no doubt already spotted the many flaws in that chain of reasoning. Velikovsky also claimed that the approach of this cometary version of Venus reversed the rotation of the Earth for a day, making the Sun change course in the sky. The mechanical stresses involved in having the Earth slow down and stop like this would have levelled every single tree and man-made structure on the planet. Nor did Velikovsky ever explain the process by which he believed the planet Jupiter managed to spontaneously expel around 1/320th of its total mass to create Venus, which is the central tenet of his story. Milton appears to think that these events might well have actually happened, and that scientists ganged up on poor Immanuel because his ideas went against the current paradigm. Which they most definitely do, and repeatedly so; in his first chapter, Velikovsky dismisses the theory that the planets of the Solar System formed from the gravitational accretion of irregularities in a protoplanetary disk. Nearly seventy years after WiC was first published, this theory is not only universally accepted; science has progressed to the point where we have actually managed to watch the process happening in a disk surrounding another star.

Milton doesn't explain why Velikovsky's ideas go against the current paradigm, and this is the crux of the matter. It's not because they're part of a bold, new paradigm which better explains how reality works; it's because they're silly. This is the common thread which draws together all of the supposedly "suppressed research" in the book. It isn't being suppressed because it goes against current thinking. Indeed, you can hardly describe such work as being suppressed at all if you can read about it in mass-market paperbacks. It's simply being ignored, because it's irrelevant nonsense. There's a big difference.

After reading his Wikipedia page it is very difficult not to draw the conclusion that this book was written as a result of sour grapes. Let's face it: when Richard Dawkins describes your previous book as "twaddle" you're probably best just packing up your schtick and taking up gardening instead. This book reads as if the author is seizing a chance to settle his scores with the scientific establishment. At several points in the book, Milton assures us that he's "Not knocking science," but he certainly seems to be intent on some serious character assassination.

It's an entertaining read. But it's not a scientific one.

Published by: BBC Books, 2013

Yes, discovering two very different books with titles which both begin How To Read... during my recent charity shop adventures amused me, and I couldn't resist reading and reviewing them consecutively.

This book is aimed at budding astronomers, at people who are considering buying a pair of binoculars or a telescope to observe the night sky. It's a laudable aim, and one that was dear to the heart of Sir Patrick Moore, presenter of the BBC show The Sky At Night from its first broadcast on the 24th April 1957 until he made a last, posthumous appearance on the 7th January 2013, a few months prior to the book's publication. He was 89 years old. This book was published by the BBC and bears strong The Sky At Night branding. It also has an introduction by Dr Brian May, who—aside from being the lead guitarist in Queen—is also an astrophysicist, and was a friend of Patrick's. He has appeared on the show several times.

It should be clear therefore that the book's intended audience is viewers of the television show (which is still running today). For the majority of the book, the authors understandably stick to objects which their audience will find easy to observe with the naked eye, binoculars, or a small telescope. These are all to be found within our own solar system, starting with the Sun (with the essential warning that you should NEVER look through a telescope at the Sun) and moving outwards until we reach the Kuiper Belt, the origin of many (but not all) of the comets that occasionally grace our night skies. The authors explain that they're not going to discuss the wider field of cosmology, so other than a passing mention of our own Milky Way galaxy there is no discussion of nebulae or galaxies. I found this rather odd, as the Andromeda Galaxy can be seen with the naked eye in good seeing conditions and it covers a larger part of the sky than the full Moon. In the Southern Hemisphere, the Large and Small Magellanic Clouds are distinctive features, too. The decision seems even dafter once the discussion in the book swings around to exoplanets, which the amateur astronomer stands absolutely no chance whatsoever of seeing from their back garden.

There are some odd omissions. When the principal features of the Moon or Mars are discussed, it rapidly becomes obvious that the book really needed to include decent maps of them rather than the simple astronomical sketches provided. Although the inner planets are described as being "terrestrial", neither the reason for this adjective nor the word itself is ever adequately explained—terra is Latin for earth or ground, and terrestrial planets are simply those which have surfaces you could stand on (if you were wearing appropriate clothing, that is). Other terms such as "convection" are mentioned several times before an attempt is made to explain them properly. The process of sublimation is described in a roundabout way without using the actual word for it at all (perhaps the authors had used up their quota of allowed technical language by that point).

There are lots of weird errors. The astronomical term preceding (as opposed to following, both words that are used to describe which side of a planet is being observed in preference to East or West) appears as "proceeding", which is something else entirely. Two of Saturn's moons are confused: tiny Pan (10 km across) is initially described as being one of the two shepherd moons of the F ring, acting with Prometheus (101 km across); in fact it's the moon Pandora (84 km across) which helps Prometheus to create the weirdly braided F ring, while Pan is responsible for the less pronounced Encke Gap within the A ring. We're told that the Greek word for the Sun, "Helios", means "sun-centred" (at this point, the authors are explaining the word "heliocentric", so this appears to be one of the book's many editorial glitches). One mistake that really annoyed me appears in the chapter on Mars. It perpetuates the long-standing myth that at its closest approach, the planet will appear to be nearly as large as the Moon, giving their relative diameters as we observe them through a telescope as 26 arcseconds and 30 arcseconds respectively. I've ranted on my blog before now about this particular claim, because it's obviously, blatantly false. The Moon is about 30 arcminutes (that's 1,800 arcseconds) across; even when Mars appears at its largest in the night sky, it is roughly seventy times smaller than the Moon. Considering the authors are both professional astronomers, I find it hard to believe that they were responsible for this howler.

In fact I can't believe that they were responsible for large portions of the book, as the standard of writing varies so much. A lot of it reads as if it had been handed to a bunch of undergrads as a project that could earn them a few extra marks, and what they produced was hastily compiled and sent off to the publisher without any proofreading or editorial input at all. How was it that absolutely nobody in the production process noticed that the opening paragraph of the chapter on comets contains the sentence "A bright comet in the night sky can be a glorious sight, a bright majestic tale (sic) sweeping back from the fuzzy nucleus"? That repeated use of the word "bright" is a regular feature of the text, too. There's a lot of redundant verbiage; references to "the skies above" abound. Where else would they be?

The quality of the writing for a lot of the book is distinctly sub-par. There's none of the lyricism of Carl Sagan to be found here. In fact, the grammatical style is all over the place. Phrases such as the old chestnut "quite literally" are used with figures of speech, a rookie mistake if ever there was one. The wrong word gets used more than once (much more, if I'm honest), perhaps as a result of autocorrect getting its wires crossed or poor dictation software; this means that rather than embarking on a quest to understand the laws of nature, the conclusion of one chapter tells how humans have apparently embarked on a question to do so. Later on in the book, we are told that the rings of Saturn when viewed edge-on will appear as "little more than a thin slither (sic) of light". The slither / sliver confusion is repeated a couple of pages later. Ironically, in the final chapter about the possibility of life on other worlds, "planet" is used when the writer clearly meant "plant"! And sadly, there are a fair few typos, so we're introduced to the rather feminine-sounding Giordana Bruno instead of Giordano Bruno, and later we learn about the renowned English astronomer Sir "Issac" Newton, for example.

For me, the more serious issues with the text are the structural ones. The flow of ideas from simple to complex gets jumbled up (and as I'm a professional training designer, that sort of thing really bugs me). A subject's introductory paragraph sometimes ends up as the final paragraph of the preceding section. Wider context is often treated as an afterthought. We're told what Pythagoras did before it's explained who he was, or where he came from. Relatively obscure points are made not once, but several times (for example, we're told that the original name for the telescope was the optical trunke when Galileo's achievements are being discussed, and then again when the authors are examining the work of Hans Lippershey). This may be the result of the book having more than one author, but any editor worth their salt should have been keeping track of who mentioned what.

Things aren't a complete write-off. There are some interesting titbits in the book; why is the planet Mercury associated with Wednesday by so many different cultures, for instance? But I'm afraid that there are much better introductions to astronomy out there, made with more care than was used with this one.

Published by: Bloomsbury, 2013

I wasn't sure I was going to get on with this book, as the author chose to use its very first paragraph to have a go at his kids for not taking the dog out, a chore which had therefore fallen to him. But his dog-walking took him to the local churchyard, and this turned out to have inspired the rest of the book.

And once the grumbling is out of the way, the book turns out to be a bit of a treasure. Stanford weaves together a fascinating mix of archaeology, social anthropology, religious belief, and philosophy as he visits a selection of Europe's graveyards, both ancient and modern. Although the book is predominantly an examination of burial practices, it's never morbid, and there are some great snippets of information for trivia hounds, such as the origin of the word "mausoleum".

Père Lachaise Cemetery in Paris makes an appearance, of course. So do Greyfriars Kirkyard in Edinburgh and the Scavi in Rome. I was slightly surprised that Highgate Cemetery was missing; instead, London gets a chapter on Paddington Old Cemetery. Each is discussed with empathy and respect. Stanford dives into the history of each location in order to explain how it became what it is, and he does so in the context of personal visits which add an element of—well, I guess that "adventure" isn't exactly the word I'm looking for in this context, but nor is it far off the mark. There's a sense of presence in a significant space, of being surrounded by history, and a vivid sense of the societies that used these places, which is just as exciting for me. The book is frequently surprising, too. I'd never considered the economics of a graveyard before, but it turns out to have played a pivotal role in how the ways we bury our dead have changed over time. There are some sobering statistics, too. It turns out that a grave will be tended by other members of the deceased's family for an average of just fifteen years. Perhaps as a result of this, many modern grave markers are made from wood and other biodegradable materials. A century from now, your final resting place may have become completely anonymous...

I did find myself wishing that the illustrative photographs had been better, though. They appear to be the author's own snaps of each location, rather than professionally composed attempts to convey its atmosphere. And as they are reproduced in halftone alongside the text rather than being provided on separate (more expensive) glossy plates, the picture quality suffers considerably. More irritating was the fact that when a photograph appears in the text, it lacks any form of caption to explain its context. To find out what it is you're looking at, you have to consult the list of illustrations provided at the front of the book; this gets rather tedious after the first dozen or so.

But that's a minor niggle. If you've ever considered just how you would like to be remembered after you've passed on, this book will give you plenty of food for thought.

Published by: E-bookarama Editions, 2019

Pyotr Demianovich Ouspensky was an early pupil of the mystic George Gurdjieff, whom he met in Moscow when he was in his late thirties. It seems an odd age to begin a serious attempt at spiritual enlightenment under the tutelage of a guru who was only eleven years his senior, but Ouspensky had been interested in Theosophy for over a decade, travelling in the footsteps of its inventor and greatest publicist, Helena Petrovna Blavatsky. He spent time in Tamil Nadu before the First World War forced his return to Russia, where he ended up working for a succession of newspapers. His motives may indeed have been pure but unfortunately, judging by some of the tales that Gurdjieff tells in his autobiography Meetings With Remarkable Men, it's difficult to avoid drawing the conclusion that the older man had developed his reputation as a great philosopher as a useful asset for separating gullible people from their money. Did Ouspensky see an opportunity to follow the same path as his mentor? HPB, after all, had done very nicely from her tales of talks with the Ascended Masters in the mountains of Tibet and the grab-bag of ideas from Buddhism, Hinduism, Monism, and Freemasonry that she assembled into the Theosophical school of thought, which became popular in the early part of the Twentieth Century. Theosophy aspired to be regarded on an equal footing with science, but it was nothing of the sort; it's mysticism with a dash of philosophy, and not very good philosophy at that.

The thing about claiming to be in possession of secret knowledge is that eventually you have to put up or shut up. Read HPB's writing and you soon realise how wobbly the whole edifice is; Ouspensky would need better material than that if he was going to make a proper go of things. And indeed it seems that he managed to be plausible enough to achieve considerable success. For a time, he could be found delivering lectures to the great and the good (and the credulous) in the salons of Viscount and Lady Rothermere, as well as those of several prominent Theosophists. Ouspensky did not regard the scientific method as useful (perhaps it seemed too much like hard work), nor did he keep abreast of recent developments in the field of physics, despite relying heavily on its findings to shape his arguments. Einstein is acknowledged, but Special and General Relativity are only mentioned in a way that implies spuriously that they support Ouspensky's pronouncement that "Matter, i.e. everything finite, is an illusion in the infinite world."

In fact, much of the book was already out of date by the time the first edition came out. The luminiferous ether was still being presented as a fact by Ouspensky, even though Michelson and Morley's classic experiment proving that it couldn't exist had been performed twenty-five years earlier. Max Planck's paper proposing quantum theory was more than a decade old, but you'll find no mention of it (or Max) whatsoever here. Gurdjieff's tales are of making easy money; actually exerting oneself in order to become better informed about a subject wasn't part of the game. In Ouspensky's A New Model of the Universe the resulting lack of a solid theoretical grounding is abundantly clear, as many of his assertions are made without evidence in a very "the dog ate my homework" fashion. "All the good stuff will be in my forthcoming book," Ouspensky tells us. It never was. You can see this attitude in Tertium Organum, too.

I find it particularly telling that Ouspensky preferred to refer to textbooks on physics written for schoolchildren; for A New Model of the Universe he used one written by Professor Orest Danilovich Khvolson, who was a member of the Soviet Academy of Sciences and one of the first physicists to publish on the concept of gravitational lensing. For Tertium Organum, Ouspensky quotes heavily from the writing of Professor Nicholas Oumoff of Moscow University, a genuinely influential scientist who had successfully recommended the physicist Sir J. J. Thomson for a Nobel prize in recognition of his discovery of the electron; but the quotes that Ouspensky selects are mostly taken from the Professor's dabblings in philosophy. Indeed, given Ouspensky's treatment of Einstein, I'd be very surprised if Oumoff actually supported any of the theories which Tertium Organum espouses. Ouspensky was also very taken with the work of Charles Hinton, a British mathematician who wrote science fiction stories in his spare time which mused about life in higher dimensions. Hinton is best known today for having coined the word "tesseract", which I guess makes him an authentic part of the Marvel Cinematic Universe.

But Ouspensky got lucky. By focusing on the idea of higher dimensions, he captured the zeitgeist for many people who were seeking a deeper meaning to life, the Universe, and everything from the cosy cottages of the Home Counties of the 1920s. Cosy they might have been, but the horrors of the First World War were still fresh in people's memories, and there was comfort in the idea that science might be on the verge of breaking through to the hereafter. Additional, higher dimensions were a plausible location for an afterlife with which the bereaved hoped they might soon be able to communicate directly. People were looking for answers, and Ouspensky had ones that sounded like they ought to be correct, even if they weren't fully understood by his readers.

Or perhaps Ouspensky had twigged that trying to think in more than three dimensions made people's brains hurt, so claiming that he was (very occasionally) able to do so himself was a subtle way of asserting authority and superiority over his audiences. We'll never know whether he could actually do so or not, but I'm profoundly sceptical.

Humans evolved in (and still live in) a world of three dimensions of space and one of time. As such, it's very difficult for our primate brains to visualise things in more than three dimensions. Edwin Abbott's novella Flatland became immensely popular in the 19th Century as it turned the problem on its head, getting us to imagine what life would be like in just two dimensions. Hinton's scientific romances suggested that four-dimensional beings—if they existed—ought to be capable of manifesting in our world, and therefore provided a seemingly-reasonable explanation for psychic phenomena. Beings who existed in four dimensions would, of course, be far more advanced than humble primates such as ourselves.

Ouspensky picks up this idea and runs off with it. He begins by claiming that the lower animals can only think two-dimensionally, but then remembers birds, which clearly have no problems negotiating their way in a three-dimensional world at all, and changes his mind (later on, he's clearly forgotten about them again and repeats his original claim). Interestingly, he comes up with a concept now known as a worldline, which is formed by the persistence of a three-dimensional object over the fourth dimension (which is, according to Einstein, that of time). But Ouspensky doesn't appear to have read Professor Khvolson's work quite as diligently as he did Hinton's, so we are assured that the fundamental duality in the Universe is not that of matter and energy, but of matter and motion. Things rapidly stray further and further off the beaten track, and this is not helped by Ouspensky's preference for the writing of Immanuel Kant over that of dear old Albert and his pals in explaining the nature of things. And so by the time we get to Chapter 13, we're being assured that because we cannot know the Ding an sich but only our perceptions of it, our humble three-dimensional reality cannot exist; its appearance as mere phenomena masks a deeper form of existence which science is unable to reveal, and we're back to Plato's cave and Arthur Schopenhauer's Veil of Maya. Ouspensky's conclusion is not that we think in three dimensions because that's the form of the world we live in, but the reverse: that the world exists in three dimensions because that's how we think. Sensibly, when it comes to providing information on how one might go about cultivating the ability to think in more than three dimensions, he hands the problem straight back to Hinton. Smart move there, Pyotr.

Ouspensky evidently had a real problem with the scientific approach, and some chapters are little more than extended diatribes against positivism and its shortcomings. He, of course, knows better than all those crazy scientists, and despite admitting that "The whole of our positive science—physics, chemistry and biology—is based on hypotheses contradictory to Kant's propositions" he takes Immanuel's side in every discussion. As with A New Model of the Universe, his authorial voice is that of a wise headmaster. As an ascended sage, he wants you to know that he has it all figured out. Of course he has; he is privy to arcane knowledge! But rather than share that knowledge, he uses it primarily to justify smug judgements such as "Kant's idea is quite correct" and "(visualization) creates the possibility of the same mistake which has stopped Hinton in many things." Ego has taken over from ambition and choked the desire to inform, and it's not a pretty sight.

A strong smell of hubris pervades the book. After all, if you've given your work a title implying that it contains the third great revision in scientific thought after those of Aristotle and Francis Bacon, you'd better have some truly Earth-shattering stuff to back it up. Given Ouspensky's relative obscurity these days, you'll already have figured out that Tertium Organum falls well short of its ambition. Ouspensky's thesis relies heavily on taking the adage that "everything that exists for long enough becomes its opposite" as literal truth, and he turns this into a bizarre system of pseudo-logic that allows him to contradict Aristotle and "prove" that A is equal to not-A, black is white, everything in motion must possess life, and reality doesn't exist. Explication is for wimps; whatever he intuits to be true must therefore be so, because he says it is. Yes, really; or as he puts it, "Consciousness, therefore, is the sole basis of certainty." As the book progresses his assertions get bolder and weirder, and eventually we find him throwing statements at us such as "A religion contradicting science and a science contradicting religion are equally false" and my favourite, because it is presented with a wild blend of italics and upper case: "Mystical states give knowledge WHICH NOTHING ELSE CAN GIVE."

I don't buy any of this for one moment. Ouspensky clearly had an interesting mind, and if he'd been a bit more open to the scientific method (and, perhaps, a bit less of a chancer) he might well have made a significant impression on Western thought. But instead, he wrote a string of smug, disappointing works that use mysticism to hide the gaps in his arguments and misrepresent other people's ideas when his own begin to fail.

This was yet another cheap e-book edition I picked up for my Kindle that suffers badly from a complete lack of understanding of how the original work was formatted. The chapter summaries are not presented at the beginning of the book, but as the opening paragraph to the chapter they describe. Even worse, every single page in the e-book is numbered as page 2. There are better versions of the book kicking around on the web that you can download for free, and you'd be better off doing that, should you decide to read the book at all. Don't feel that you need to on my account.

Published by: Bloomsbury, 2000 (ebook)

Much as I love food, and cooking, I never wanted to be a chef. I was never sure why. Or at least, I wasn't until I read this extraordinary autobiography.

When I was a child, my parents would always watch cookery programmes on television (I'm old, so back then these were in black and white, and were usually presented by Fanny Cradock and her long-suffering fourth husband, Johnny, whom she bossed around at every opportunity). When I was small, British cooking had a reputation to match that lack of colour, coming as it did from a country that had only recently been freed from the food rationing that had been imposed during the Second World War in order to conserve dwindling food supplies. Recipes involved disgusting substances like powdered egg; taste was calibrated in shades of grey and food was something which you consumed to fend off hunger rather than something that could bring joy.

That only began to improve at the end of the 1960s, and an early sign of change was the advent of the celebrity chef. My first experience of this was watching "The Galloping Gourmet", a Canadian television series broadcast by the BBC and presented by Graham Kerr, who introduced his personal take on New Zealand and Australian cuisine which he would prepare live in front of a studio audience. At the end of each show, one member of the audience was picked to try whatever had been made. The antipodean character of it all made for exotic television at the time and the show was wildly popular. I was surprised to discover very recently that Kerr is actually from Brondesbury, in London. His fondness for (as he put it) a "quick slurp" was the first occasion when I noticed just how frequently alcohol was involved in the process of preparing food; that impression was subsequently confirmed once Keith Floyd got his first series on the BBC. By the start of the 1980s, I had discovered the delights of food from off the beaten track in both domestic and foreign cuisines and my lack of culinary adventurousness (learned from my parents) had disappeared forever. And I've never, ever looked back.

I still have absolutely no inclination to be a chef, though.

After reading Anthony Bourdain's memoir about his professional career, I'm very glad that I didn't choose that path. I wouldn't have survived it. Bourdain's experiences with hard drugs (which appear to have been very much a part of New York's culinary scene in the early 1980s) make Kerr and Floyd's fondness for an occasional snifter seem positively quaint. Much of the kitchen life he describes reads like a novel about the Vietnam War. Life as a top chef is clearly not for the faint of heart. Gonzo sensibilities are much more the order of the day, and I was not surprised at all to discover that Bourdain was once described as a worthy successor to Hunter S. Thompson. He was clearly very well read, and I smiled to see him name-check William Gibson and Philip K. Dick in the space of a single sentence when he describes visiting Tokyo for the first time. But the man had a poet's gift when it came to describing food. You'll find your mouth watering time and again as you read about the food (and some very special meals) that shaped his talents as a master chef.

Perhaps knowing of his death from suicide in 2018 has coloured my reading of the book, but there is a darkness to his writing which often throws the rapturous descriptions of good food into stark relief. There is a crushing sadness lurking amongst the joy; there's a profound sense of resignation and occasional despair at the physical toll he inflicted on himself, and there are glimpses of a deep weariness that he was holding at bay with the constant consumption of aspirin and cigarettes. He wrote Kitchen Confidential when he was forty-three years old but it reads like he was at least twenty years older than that; a chef's life is undoubtedly a hard life and Bourdain wore his battle scars proudly. He was a warrior, and he doesn't hide that fact; there is considerable violence in the stories he tells, enough that more than one genuinely shocked me. But at heart he was an exceptionally gifted communicator and a truly great writer, and this book really demonstrates those qualities of his at their best.

Published by: Black Swan, 2013

I've just finished reading another book from Professor Jim Al-Khalili, and in this one he has a lot of fun explaining some of the world's most famous paradoxes, those ideas that have you scratching your head because from their descriptions it sounds like science (usually physics, but occasionally mathematics gets a look in as well) has got itself into a muddle along the lines of the philosopher Epimenides, who once declared "Cretans are always liars," thus creating a sentence that cannot be true, but cannot be false either—provided that it is uttered by someone who really is from Crete, as Epimenides was.

Jim presents the grandfather paradox: if time machines are ever invented, could you travel back in time and kill your grandfather before your mother or father was born? If so, you would never exist, and so could never have made the journey to stop yourself from existing, allowing your parents to meet after all, in which case you would be born and would go back in time... This conundrum becomes a stepping-off point for discussing current thinking in quantum physics that arose from the group of brilliant minds that Niels Bohr gathered together in Copenhagen in the 1930s. I'm sure that the people at Marvel will be delighted, because it appears that the existence of a multiverse provides a plausible way of resolving the paradox.

Olbers' Paradox (if the Universe is infinitely old and infinite in size, the night sky should shine as brightly as it does during the day, so why doesn't it?) provides an opportunity for a whistle-stop tour of modern cosmology. Zeno's paradox is used to show how badly the ancient Greeks struggled with the concept of infinity and infinite series, and its resolution is explained as clearly as I've ever seen it. The Fermi Paradox (if aliens exist, where are they?) allows Jim to discuss the Drake equation and recent discoveries of exoplanets, leading him to explain how it would only take a spacefaring civilisation ten million years or so to colonise an entire galaxy, with no magic faster-than-light flying saucers required at all, just lots of patience.
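To give a flavour of how that resolution goes (the notation here is mine, not the book's): Zeno argued that a runner can never finish a race, because they must first cover half the distance, then half of what remains, and so on forever. But those infinitely many ever-shrinking steps have a perfectly finite sum, because the geometric series converges:

\[ \sum_{n=1}^{\infty} \left(\frac{1}{2}\right)^{n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1 \]

An infinite number of terms doesn't imply an infinite total; at constant speed the runner covers the whole distance in finite time, which is the step the Greeks lacked the mathematics to see.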

And yes, game show host Monty Hall makes an appearance, and now I finally understand why I should change my choice about which door to open if I want to end up with a luxury car instead of a goat.
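If, like me, you find the algebra slippery, a quick brute-force simulation settles the argument. This little Python sketch is mine rather than anything from the book; it plays the game many times over and shows that sticking wins about one game in three, while switching wins about two in three:

import random

def play(switch: bool) -> bool:
    """Simulate one round of the Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that hides a goat and isn't the player's pick.
    monty = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Move to the one remaining door that is neither the original pick nor Monty's.
        pick = next(d for d in doors if d != pick and d != monty)
    return pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(play(switch) for _ in range(trials))
    print(f"switching={switch}: won {wins / trials:.3f} of games")
    # Typically prints ~0.333 for sticking and ~0.667 for switching.

The trick is that Monty's choice is constrained: he must avoid both your pick and the car, so the door he opens leaks information about where the car is.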

For a book that was published ten years ago, Paradox remains remarkably up to date, although the suggestion that the then recently discovered bacterium GFAJ-1 might be an example of life evolving on Earth for a second time (because it seemed to show an ability to incorporate arsenic in its DNA instead of the phosphorus that all other terrestrial life uses) seems to have fallen by the wayside. But the mention of a new particle discovered by scientists working at CERN in Switzerland in the book's final chapter was right on the money; it was the long-anticipated Higgs boson.

Published by: Penguin, 2022

Whenever I hear someone start talking about the exponential curve of modern technological growth I wince, and prepare for another sermon on the Singularity, a concept popularised for the most part by Ray Kurzweil: the future moment where the curve of a graph plotting mankind's progress in pretty much anything over time becomes vertical, and—to borrow a phrase from Arthur C. Clarke—we will find ourselves moving suddenly and jarringly from "sufficiently advanced technology" into "magic". At that point, Kurzweil and his acolytes argue, all humanity's problems will be instantly solved by superintelligent AIs, molecular-scale 3D printing, nanotechnology robots swimming through our bloodstream, and any one of a thousand other things that currently only exist in science fiction novels. The downside is that humanity will no longer exist; we will have been transformed into something new and strange. And Kurzweil tends to be somewhat hazy over whether or not you or I will have any say in the matter if we're still around when (or more accurately, if) the Singularity finally happens.

I was delighted to discover that Azeem Azhar's book does not go down that road. In fact, Azhar explicitly rejects Kurzweil's vision in favour of a much more sober examination of what it means for people who live in a time of change that is unprecedentedly rapid on many fronts. This has resulted in instability across the board, from the employment market and economics to how food gets on our tables, from manufacturing and supply chains to politics at all levels from local to global. We're already living in the exponential age, and the future we seem to be travelling towards is nothing like the rosy, have-it-all utopia that is envisioned by Kurzweil. For a start, the important decisions on the way the world should be run these days are much more the purview of giant multinational corporations than of inefficient, outdated practices such as democracy.

So what, Azhar asks, can we do about it? In his opinion, the answer is "quite a lot, actually." But the optimal response involves a rebalancing of capital on more equitable terms; it requires the immense business behemoths that companies like Amazon have become over the last decade to limit their excesses (or—more effectively—have limits imposed upon them by external arbitrators), and most importantly it means shifting the balance of power away from obscenely rich CEOs back towards their impoverished employees (the figures Azhar gives on how much employee productivity and GDP have grown over the past forty years while wages have barely shifted since Thatcherism and Reaganomics kicked in at the start of the 1980s should leave you outraged). To his credit, Azhar points out that the way forwards needs to be different for different cultures and nations; the gig economy might suck here in the UK, but it is a way out of poverty for many people in African states. To borrow a concept from William Gibson, change, like the future, will not be evenly distributed. It can't be.

At the book's heart is a fundamentally humanist rallying cry: everyone should have sufficient agency over their own destiny. We're clearly not there at the moment. But as technology makes its way up the exponential curve, it provides the means for this to happen, and to happen for everyone, rather than just for the chosen few.

But—as this book makes abundantly clear—late-stage capitalism can no longer be even remotely considered as a candidate for the means by which we can bring such a utopia about.

Published by: Headline Book Publishing, 2000

Kitty Ferguson has had two careers: as a professional musician and as a science writer. She brings both to bear on this book, which is a history of the science of cosmology (which studies the Universe in an attempt to determine how it came to be the size and shape that it is today, and how it might develop in the future). Both Pythagoras and Johannes Kepler wove ideas from music into their thoughts on why the stars and planets are arranged as they are; even today, the resonances (a musical term) between the orbits of the planets play a significant part in our understanding of how planetary systems develop over time—not just in our solar system, but also in the many planetary systems that we now know exist around other stars.

This is one of the best popular science books I've ever read. Ferguson took a subject that I thought I knew well and made me realise that a lot of what I thought I knew wasn't actually correct. Galileo's relationship with the Catholic Church is perhaps the most obvious example, but there are many others; many of cosmology's developments did not originate with the people who are most closely associated with them. There are plenty of fascinating insights into many of the field's most famous players, and the fundamental concepts are explained clearly and engagingly; I now know, for example, why Cepheid Variables behave in the way that they do, which makes them so useful as "standard candles" for astronomers.

The book was published at a moment when cosmology was undergoing its greatest upheaval since the Copernican revolution that placed the Sun at the centre of the solar system back in the 16th century. The "funny energy" that was hinted at following the examination of type Ia supernovae in distant galaxies by Saul Perlmutter and his team (which not only confirmed that the universe is expanding but also revealed that the expansion is speeding up) is discussed in the final chapter; today, it is formally referred to as dark energy, but we have still to discover what it is, or how it works. For this book to remain relevant a quarter of a century later is a testament to how well it is written. Highly recommended.

Published by: Heritage Illustrated Publishing, ebook

It took Laurence Sterne eight years to write this extraordinary book. It didn't take me as long to read it, but it's a work I've been dipping into over the course of several months rather than something that I could read over the course of a day or two. The text is intense and demanding, because it's very difficult—if not impossible—to concentrate on the story thanks to its many asides, diversions and abrupt changes of narrative. Although its title suggests that this is the author's autobiography, it's nothing of the sort. That's the basic joke of the book. The narrator continually changes his mind about what he's telling us, going off at tangents that become increasingly wild and implausible as the book progresses. They come so thick and fast that trying to keep track of where you are in the narrative becomes impossible. Some threads are dropped, only to be picked up dozens of chapters later. Not one of them ever seems to be drawn to a satisfactory conclusion. This self-sabotage is cleverly worked into the text as a family trait; Tristram's father sets out to educate his son by obsessively writing an encyclopedia for him, and in becoming more and more distracted by its content he fails to complete it, thus failing to provide his son with any useful education at all.

At the time it was written, this must have been a radical departure from how tales were normally told in print, and Sterne seems to have invented a new style of writing in order to pull it off. The punctuation which he employed throughout the book to underline its changes of direction is still strikingly eccentric and surprisingly modern. Sterne liked his em- and en-dashes even more than I do, and travelled from Yorkshire down to the printers in London to make sure the text was rendered just how he wanted it; some dashes in the original edition were outrageously long. They give the text a breathless quality that makes me think that the author intended the tale to be told loudly, at full tilt, and as quickly as possible so that the reader is permanently left behind, confused, struggling to catch up.

The result draws the reader into an ADHD fever dream of distraction and information overload (although the erudite, scholarly quotations that are thrown at us are—of course—entirely made up, the sources complete works of fiction). There are whole paragraphs in French and whole chapters in Latin but when the narrator relates anything taken from (no doubt spurious) Greek text, it's simply—and in my opinion, hilariously—introduced with "(in Greek)".

No wonder the book caused such a stir when its volumes were originally published between 1759 and 1767; it was a sell-out success. It remains notable enough for Michael Winterbottom to have directed a film of it back in 2005, complete with an all-star cast.

But although the cover of the ebook proclaims that this version is the "classic illustrated edition", it is nothing of the sort. In fact, it's a poor effort. Sterne would have wept to see what a mess has been made of his work: all his wonderful dashes have been reduced to hyphens, and I'm sad to say that the edition I read omits all of his elaborate typographical jokes altogether. The famous "black page" (was Zappa a Sterne fan?) is nowhere to be seen. The jokey blank pages added for the reader to supply content from their own imagination, such as in the description of the Widow Wadman, are replaced with a pedantic "(blank page)" which makes the joke fall as flat as a lead balloon. Worse, William Hogarth's illustrations are all absent, replaced by a boring selection of irrelevant, anachronistic paintings from the following century. The result is that the playful heart, the idiosyncratic, essential stuff of the book, is largely missing.

Read the book, by all means; just don't bother with this ebook version.

Published by: Taylor and Francis, 1999

Jim is one of the best science communicators out there, and his Radio 4 series The Life Scientific is well worth a listen. In this book he sets out to explain the possible "practical" applications of Einstein's General Theory of Relativity for things like crossing interstellar distances in the blink of an eye or travelling through time. He does so in a way that is easily accessible and does not require detailed scientific knowledge. But I used those inverted commas advisedly, as travelling through wormholes requires all sorts of things that could exist, but probably don't, like exotic matter, which would have mind-bending properties. And even if we could travel to a convenient black hole, using it to travel through time would—well, let's just say it would be somewhat risky. Many physicists believe that even if Einstein's equations predict that it would be possible to jump back into the past, the Universe will somehow prevent your journey from taking place. There are too many potential paradoxes lying in wait for the unwary chrononaut to stumble upon otherwise, and these are explained in a clear and entertaining fashion.

This is a fine read, and having listened to Jim on the radio a lot, it's unmistakably written in his voice. Great fun.

Published by: Constable, 2022

This book was recommended to me by Leah, a dear friend I've known for forty years who is based in California. We first met back in the 1980s as a result of our connections with the UK and US metal scene and Leah remains an enthusiastic metalhead to this day. "You need to read this," she told me when she came to stay recently, and I ordered a copy on the spot. She was absolutely right, too. I thoroughly enjoyed it.

This is an oral history of how, at the beginning of the 1980s, the British heavy metal scene somehow mutated from a small, largely underground collection of bands into a global phenomenon that transformed rock music and inspired a whole generation of musicians to embark on their own musical journeys. It created a movement which is supported enthusiastically by its loyal followers to this day. In the process, it spawned Kerrang!—the world's first full-colour music magazine dedicated solely to heavy metal, which is still thriving forty years later (and nowadays has its own satellite television music channel, too).

This is a book of tales of how things went down, told by people who were there at the time and whose names will be instantly familiar if you were around back then too. Not all the stories agree with each other, but that's all part of the fun. Hann's book brought back lots of vivid and happy memories. I lost count of the number of times I found myself muttering, "God, I remember that!"

At one stage or another I must have seen almost all of the bands that are mentioned in the book, and I still have 45s and LPs from the likes of the Tygers of Pan Tang, Fist, Sledgehammer, Raven, Samson, Saxon, Judas Priest, and Di'Anno-era Iron Maiden. I would religiously buy the music newspaper Sounds every week. As Hann quite rightly points out, the NWOBHM movement would never have happened without its support, and more specifically, without the enthusiastic writing of its journalist Geoff Barton. I lived in the suburbs of London back then, and once I had a car of my own, visits to the Greyhound in Fulham Palace Road, or the Soundhouse, or any one of the legion of London pubs that used to run Heavy Metal Nights in the 1980s were a part of life for much of my early twenties, because I wanted to see what the hell Barton was raving about. I would also make regular pilgrimages to the Hammersmith Odeon, to the Marquee in Wardour Street (the first venue in which I ever saw Metallica play), to the Rainbow in Finsbury Park (where I saw Graham Bonnet during his tenure as frontman for Ritchie Blackmore's Rainbow), and to the Lyceum Theatre in the Strand (where I saw the Tygers, Magnum, and the nascent Def Leppard on their first headlining tour of the UK, which is covered in detail in the book). Metallica's Lars Ulrich explains just how pivotal the NWOBHM was in the inception of the Thrash Metal scene; I also learned of an astonishing interview with Black Metal legends Venom, whose frontman Cronos christened what have since become four distinct musical genres in the space of a single sentence that had the journalist scrambling to write it all down...

Hann's central thesis is that most of the bands who were a part of the movement got as far as they did as a result of three things: word of mouth (from Barton and a few others in the music press, such as the late, great Malcolm Dome); rampant enthusiasm; and self-belief bordering on delusion, spiced with a generous helping of sheer bloody-mindedness. In the short term, attitude and energy often counted for much more than musical ability, particularly where Geoff Barton was concerned. And for a time, the bombast and hyperbole worked; many of those acts found themselves playing to venues holding several thousand people. But while many bands got into the album charts once or twice, and a few even reached the dizzy heights of an appearance on the BBC's chart show Top Of The Pops (probably the number one ambition of a large proportion of UK musicians at the time), a crushingly small number of them went on to sustain a financially viable level of success. Sadly, the bands that didn't make it weren't always to blame. Perhaps the most telling comment in the book comes near the end and is made by Hann himself, who observes, "Almost every band in this book has a complaint about either their management or their record label, and it was ever thus." Many of the musicians from those days talk ruefully about bad decisions and missed opportunities. I wasn't aware of the reasons behind Gillan splitting up, for example.

It all left me feeling rather sad. Sadder still was the way that serious interest from record companies in up-and-coming metal bands faded away after the spectacular success of Def Leppard's album Pyromania showed that commercial gain was easier to achieve with bands who were willing to make radical changes to their approach in order to break the much larger American market. Those bands who remained true to their original vision were largely left high and dry. Today, many NWOBHM bands are still playing what's left of the UK club circuit (and the lucky ones get to take in the occasional festival, too), although few can boast their original line-ups any more.

Aside from Def Leppard, Iron Maiden are pretty much the only band of that generation who went on to achieve the sort of world domination that every NWOBHM band dreamt about, and many of the people interviewed for the book clearly view Maiden (and, indeed, Motörhead) as individual and iconic rather than as part of a wider movement. Although Lemmy, Eddie, and Phil are mentioned several times in the book, they were clearly seen as something apart; Lemmy himself always said that what Motörhead played was rock and roll. But what both they and Maiden did superlatively well was this: they did the work. The secret to success is often just a matter of polishing an impressive set of musical chops by putting in lots and lots of gruelling hours on stage, combined with dogged persistence and an utterly unstinting focus on professionalism.

And that is probably the most important lesson I've learned from reading this book, although "Do not attempt to make your own pyrotechnic displays for your band's stage show" runs it a very close second and I guarantee that if you read this book, you will repeatedly find yourself asking, "How are these people still alive?"

If you like heavy metal; if you've ever been in a band; or if you've ever wondered what it's like to be in a band, you're going to love this book.

Published by: Bluebird, 2021

Given that Mo Gawdat used to be Chief Business Officer at Google X, I fell for the blurb on the back cover of this book that claims "no one is better placed than Mo Gawdat to explain how the Artificial Intelligence of the future works."

The problem is, he doesn't explain things at all. And, as he reveals early on in the book, the programmers who wrote the initial code for modern AI applications can't explain how they work either. When you dig into the field of AI in more detail, you'll see a lot of the experts start waving their hands about and walking back the more extravagant claims that have been made on AI's behalf. Chatbots like ChatGPT create the illusion of knowing what they're talking about purely because they can examine, in milliseconds, the billions of text conversations their creators scraped off the Internet and find a response that best fits the discussion you've had with them so far; they're using statistics, not consciousness. But you won't find detail like that in this book. Nor will you encounter a discussion about what a conscious machine might look like, or how we'd build one. I expected the idea of consciousness to be central to Gawdat's book, but it's barely touched on. And as for taking the step of asking permission to use all the content that's been lifted from other people's websites in order to train those AI models, well, who can be bothered to do that, right?
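
To give a feel for what "using statistics, not consciousness" means in practice, here's a toy caricature of my own (it's nothing like the scale or sophistication of a real chatbot, and it isn't anything from Gawdat's book): a model that picks each next word purely from frequency counts over the text it has been shown.

```python
# A toy caricature of statistical text generation: no understanding,
# just "which word tended to follow this one in the text I was shown?"
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the dog slept on the mat".split()

# Count which word follows which. This table is the entire "model".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word: str) -> str:
    """Sample a next word in proportion to how often it followed `word`."""
    counts = following[word]
    return random.choices(list(counts), weights=list(counts.values()))[0]

words = ["the"]
for _ in range(8):
    words.append(next_word(words[-1]))
print(" ".join(words))  # plausible-looking, utterly mindless output
```

Scale that table up to billions of parameters trained on most of the Internet and you have the gist of the statistical trick; what you don't have, anywhere in the process, is comprehension.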

Instead, the first half of the book relies mainly on tropes from science fiction films to show how bad the future could be (I suspect William Gibson would take issue with Gawdat's frequent use of the term "mild dystopia", because a dystopia is, by definition, as bad as it gets; I've seen him make exactly that point in discussion with the author Nick Harkaway). And in discussing these science fiction tropes, Gawdat left me with the distinct impression that he hadn't actually watched the source material he writes about, but had instead just skimmed the Cliff's Notes. There's an awful lot of misrepresentation (apparently the problem of interstellar space travel is solved, because we can play video games that take us there in our VR headsets; tick the box for instant teleportation replacing air travel for the same reason while you're at it) and careful omission of many significant problems (Gawdat assures us that a self-driving car has never killed a human being, which was demonstrably untrue well before the book was written). It's all thrown together and spiced up with plenty of tired clichés and cringe-inducing, patronising asides to the reader.

Gawdat explains that he dictated the book to an app called otter.ai and then checked the results using Grammarly, and oh boy, it shows (to give just one example, I'm pretty sure that when he writes "punch" in the book, what he actually said was "punish"). He frequently loses track of his argument, telling us we should refuse to work on AI in one chapter, declaring that AI will happen whether we work on it or not in the next, then recommending that we don't work on AI later still, before assuring us in the book's conclusion that it will inevitably happen. I'm pretty sure we'd have got a better and much more readable book out of him if he'd used a human editor instead.

The second half of the book is where things start to get interesting, however. Here, Gawdat lays out a rambling case for how we should behave in our daily lives in order to persuade the AIs that we will be worth keeping around when they complete their inevitable rise to power. Even if they never do, following his advice would make the world a much better place, as it involves not being a dickhead, not trolling people online, leaving positive and compassionate comments on YouTube, not using the recommendation functions on retail websites, and never sharing all those tedious clickbait articles and videos that we come across on Instagram or Facebook.

Now that's something I can get behind.

Published by: Penguin, 2020

This was a Christmas present from my brother, who knows my musical taste very well (although I have to admit that I only have Kraftwerk albums on vinyl; that will have to be remedied). It's a fairly quick read, and while it has its roots in academia—according to the bio at the front of the book, Schütte teaches German popular culture at Aston University—the book gives an interesting and accessible account of one of the most significant bands ever active in the field of electronic music and discusses their influence on popular music as a whole. Artists from Jam Master Jay to Brian Eno, from Blondie to David Bowie, make an appearance or provide pithy quotes.

I wasn't aware of Ralf and Florian's fondness for cycling, which, it is suggested, accounted for the significant deceleration of their artistic output in the late 80s. Or of Hütter's claim that he never listens to music in any form any more, even when he's relaxing at home.

And yet this book feels distant, at one remove from its subject. At first, I just put this down to the fact that Kraftwerk the band have never really been interested in discussing their philosophy or artistic approach in any terms other than self-deprecating humour or ironic mockery. There's little to go on to gain a sense of what the musicians are like as people; they were always famously reticent in interviews, and from 1978 onwards they didn't even pose for their own press photographs any more, delegating that role to their robotic mannequins. But then I realised that most of the book itself is written at one remove. Although several personnel from Kling Klang are thanked in the acknowledgements at the end of the book, there's surprisingly little original research in evidence. Most of the descriptions of significant events and critical opinions have been gleaned from interviews conducted by other people, from archived articles in the music press, and from other academic works. Although Schütte is clearly a big fan of the band, the result is a book that feels remarkably (and disappointingly) gutless. If there are insights to be found here, they are mostly recycled ones.

Shortly after the book was published, Florian Schneider died. Although he hadn't performed with the band since 2006, he still felt like an integral part of it. Reading Schütte's account of the mannequins, which the band treated as equals in the presentation of their music, I began to understand why I still regard Schneider as part of Kraftwerk's very considerable Gesamtkunstwerk. And I suspect I always will.

Published by: Quartet, 1978

I tracked down a second-hand copy of this with considerable interest, because I'd read that Frank Herbert had used it as one of his principal sources when he was writing Dune (the book was originally published in 1960). It's an account of Shamyl (now rendered as Shamil), third Imam of the Caucasian Imamate (now part of Dagestan), and the war he waged successfully for a quarter of a century in the middle of the 19th century against a Russian force that vastly outnumbered his own. Given the current situation in the region, where another state continues to hold out against seemingly inexhaustible numbers of Russian troops, The Sabres of Paradise has taken on a new significance.

If you've read Dune, much of the book will seem familiar. Frank Herbert lifted some sentences almost verbatim and used many others as inspiration. The Caucasus may not have had sandworms, but Shamyl's murids would never have willingly given up their knives. A warrior was judged by the flair with which he used his blade in combat (right down to the observation that "to kill with the point lacked artistry"). But Kanly here is not the mannered feuding between the great families of the galaxy; instead, it refers to a long tradition of blood vendettas, which Blanch describes in grisly detail. The secret hunting language of Shamyl's people, Chakobsa, gives its name to the Fremen tongue. And Blanch's quotations of Arab poems and the laments of Georgian prisoners alike made their way into Herbert's work, all of it uncredited.

It soon becomes clear that the Padishah Emperor of Herbert's work bears a strong resemblance to Tsar Alexander II. From now on, I shall always think of the planet Kaitain as St. Petersburg; the Russian houses of the Romanovs, the Lermontovs, the Gagarins, the Vorontzovs and many others make an appearance, and their vast wealth and unashamed opulence will strike chords with anyone familiar with the Atreides and the Harkonnens.

But the book refuses to be read only as a sourcebook for what is, quite frankly, a much lesser work. The depth and complexity of the tale being told here eclipse the work of fiction it inspired. As unbelievable as parts of it become, the story related here really happened. Indeed, one or two surprisingly familiar characters make their appearance, including Count Leo Tolstoy.

Blanch, who died in 2007 at the age of 102, considered The Sabres of Paradise to be her best work, and I can see why. Brian Aldiss was a fan, describing it thus: "A book as thick with flavour as roast wild boar, tusks and all." The book is a tour de force of simply astonishing writing: painstakingly researched, it tells a story of ceaselessly shifting politics, of great heroes gaining—and then losing—their grip on power and influence, of hostages taken, of tragedy and betrayal, of daring and hope, of indulgence and excess pitted against asceticism and denial.

Highly recommended.

Published by: Orbit, 2022

I've been waiting to read the final volume in the grand space opera of The Expanse since it was published in hardback last year, and when the paperback arrived this morning, I set everything else aside and dived in, because I've been a fan of the books for quite a while. I can't claim I was there at the beginning, but I'd certainly become a fan well before the TV series first appeared.

The television series has come and gone; the plot of the show diverged from that of the books, and Amazon wrapped things up at the end of its sixth season without ever getting to the events that take place in Leviathan Falls. But then again, it's Amazon, so things were always going to get messed up; that's their default operating mode.

Discussing the plot would plunge us rapidly into spoiler territory, and I'm not going to be That Guy, so all I'm going to say about it is that it draws things to a satisfyingly grand conclusion. Perhaps the best endorsement I can give the book is to say that I've already finished it.

Published by: Canongate, 2012

ZONA is a book with a considerable reputation. It has cropped up in my conversations with all sorts of people over the last few years, and to say that its readers rave about it is an understatement. I decided it was well past time I got a copy and found out exactly what all the fuss was about. I'd become one of the converted within a couple of pages.

Ostensibly, the book is Geoff Dyer's attempt to describe, scene by scene and shot by shot, what happens in Andrei Tarkovsky's legendary 1979 film Stalker.

But the book is so much more than that. When people describe a work as a "meditation on modern life", it's usually a sign that they're struggling to encapsulate writing that ranges far and wide in its subject matter, rather than it literally being the author's thoughts on what it means to be alive in the modern world; ZONA somehow manages to be both of those things at the same time. There are accounts of childhood adventures exploring the derelict railway station at Letchworth (Dyer's own childhood version of The Zone) in the 1960s, tales of the film's troubled production history, outrageous stories of the director terrorising both Russian film censors and the film's financial backers, mixed together with thoughts about identity, despair and wretchedness, love and hope (or the lack of it), and what happens when you frame the film as a bizarre Soviet take on the long-running BBC sitcom Last of the Summer Wine. The whole thing is permeated by a deep and enduring love of the magic of film, and it's a delight from start to finish.

Published by: iUniverse, 2005

Back in 1988, Dr Pauline Oliveros (1932–2016) was invited by the trombonist and didgeridoo player Stuart Dempster to visit an empty water cistern at Fort Worden in Port Townsend, WA. Built to hold two million gallons of water, the now-empty concrete space possesses a 45-second reverberation tail; once this had been discovered, word rapidly spread among local musicians, and Dempster was keen for Oliveros to hear what it sounded like for herself. Access to the cistern was by climbing down a 14-foot ladder, and Oliveros duly lowered her accordion into the dark before climbing down herself. With them was Al Swanson, who made a sound recording of their performances inside the cistern, which more than lived up to its reputation for other-worldly acoustics.

As a result of the length of the reverb, the musicians had to alter the way they approached their performances, listening keenly to the way sounds lingered in the space around them. Recognising that this focused attention was not the way most music was usually performed, Oliveros began exploring its implications, and continued to do so for the rest of her life. From that single performance, the Deep Listening CD, band, and compositional movement sprang into being, and it continues to this day. Oliveros pioneered the use of convolution reverb software in live performance, so that any venue the band played could take on the acoustic qualities of the original recording location. She was a pioneer in the field of ambient music, and her influence has only grown since her death. After listening again recently to Tom Service's fascinating examination of her career on his The Listening Service podcast, I decided I really ought to get myself a copy of her book.
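
For the curious, the trick behind convolution reverb is simple to sketch. What follows is a minimal illustration of my own (it bears no relation to Oliveros's actual performance rig): convolve a dry signal with an impulse response recorded in the target space, and the result sounds as though it had been performed there.

```python
# A minimal convolution reverb sketch: dry signal convolved with an
# impulse response takes on the acoustics of the space that produced it.
import numpy as np
from scipy.signal import fftconvolve

sr = 44100                                   # sample rate in Hz
t = np.linspace(0, 1.0, sr, endpoint=False)
dry = np.sin(2 * np.pi * 440 * t)            # one second of dry 440 Hz tone

# Stand-in impulse response: two seconds of exponentially decaying noise.
# A real system would use a recording (a balloon pop, a sine sweep) made
# in the space itself, capturing something like the cistern's 45-second tail.
t_ir = np.linspace(0, 2.0, 2 * sr, endpoint=False)
rng = np.random.default_rng(0)
ir = rng.standard_normal(2 * sr) * np.exp(-3.0 * t_ir)

wet = fftconvolve(dry, ir)                   # the "performed in the space" signal
wet /= np.abs(wet).max()                     # normalise to avoid clipping
print(wet.shape)                             # len(dry) + len(ir) - 1 samples
```

The elegant part is that a single impulse response captures everything about a room's acoustics, which is how a 45-second reverberation tail recorded in a cistern can be carried to any venue the band happened to be playing.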

So I did. The text begins with a summary of the one-day Deep Listening workshop that Oliveros taught for many years (she also taught a much more academic version of DL that took its students three years to complete) and gives brief descriptions of the various physical exercises that are intended to reintegrate physical awareness of one's body with awareness of the sounds that surround you in your environment. This part of the book can seem very "New Agey", and I say that as someone who studied Chinese exercise and qigong for nearly ten years back when I lived in Milton Keynes.

But press on, and you get to the meat of the book: a series of exercises and projects intended to help students develop their abilities to listen in new ways, to become actively aware of the sounds surrounding them. Oliveros asks some piercing and unusual questions of the prospective deep listener ("When can you feel sound in your body? How long can you listen? When are you not listening?" are just three examples) that seek to challenge the complacency with which most of us treat our ears. Following the exercises are a number of thoughtful commentaries by Dr Oliveros's students and colleagues. I finished the book in a very thoughtful frame of mind, and full of inspiration for new things to try in my own compositional work.

Published by: Penguin Books, 2013

I enjoy reading Lee Smolin's books a lot; this is the second of his that I've finished this year. He writes clearly, and comes up with very interesting ideas about reality that often challenge consensus. He does this admirably in Time Reborn, which is an argument against one of the basic conclusions of relativity: that time is an emergent property of the Universe rather than something fundamental. When I first heard about the concept of the Block Universe, which posits that all events, past, present, and future, exist simultaneously in a four-dimensional Universe which can never change (Einstein apparently found the idea comforting), I was utterly horrified. For me it's the most depressingly grim theory of existence that I've ever encountered (it's also a central plot point in Alan Moore's epic novel Jerusalem, incidentally).

Smolin argues convincingly that this view of time is mistaken, and that the future is not set. The concept of free will, Smolin says, isn't dead yet (even if it's taken a severe kicking in recent years). This is very good news for people like me; I'm all for second chances.

Harking back to Sir Martin Rees's observation that the laws of physics and physical constants in our Universe seem remarkably fine-tuned to allow for our existence (see book 4 below), Smolin offers a simple, if jaw-dropping, explanation: this particular Universe is just one of very many. He suggests that each universe produces progeny within black holes, and that the laws selected to govern each new universe at the moment of its creation (in its own individual Big Bang) are somehow heritable, through a form of cosmological genetics. Minor variations in the laws and constants selected when each new Big Bang occurs within a fresh black hole provide a process by which universes can evolve naturally over time. Darwin's "survival of the fittest" can therefore be applied not just to species of animals, but to entire universes: universes which are better at producing black holes leave more progeny of their own. And for a universe to contain large numbers of black holes, Smolin argues, it must contain stars and galaxies, which creates the potential for complex molecules to exist.

With each generation of universes that successfully manages to reproduce, the likelihood that their descendants will be governed by laws and physical constants like the ones we see in our Universe gets larger, and the likelihood that those complex molecules will aggregate together sufficiently to become conscious starts to grow. Life therefore becomes an inconsequential side-effect of Cosmological Natural Selection, in which evolutionary pressures operate on a timescale that predates the Big Bang by a mind-boggling degree (it could very well be infinite). There is no longer any need for an anthropic principle; the way things work here has simply evolved to the point where life can arise. There may well be billions of universes like ours out there, each very slightly different, each forever out of our reach. However, Smolin argues that it will soon be within our capability to test whether this theory is true with empirical observations of the Cosmic Microwave Background.

The discussion isn't limited to the evolution of the Multiverse, either. It becomes a philosophical journey around the idea of whether we can ever determine the "true" nature of reality, as well as a critical examination of what drives capitalism and whether or not any current models of economics make any sense at all. It's all heady, fascinating stuff.

Published by: Harper Collins, 2011

How many times have you looked for your car keys unsuccessfully, only to spot them sitting on the table ten minutes later? Did you know that you may have picked them up and quite literally held them in your hand while you searched the house, but not realised that you had found what you were looking for?

This is a non-fiction book about six ways in which our thinking can lead us astray (or, sadly, land other people in jail for crimes that they did not commit). It's a great precursor to Daniel Kahneman's classic work Thinking, Fast and Slow, which was published just six months later. The Invisible Gorilla demolishes the idea that going with your intuition or gut feeling (the "thinking fast" of Kahneman's book) is more reliable than sitting down and thinking things through properly. Chabris and Simons (who were both teaching psychology at Harvard University at the time) went viral at the turn of the century thanks to the ingenious experiment they devised and conducted which gives this book its name. We might think that we're aware of what's going on around us, but sometimes we fail to see something outrageously obvious, even when it's right in front of our face.

The authors examine six main aspects of how our minds work and show just how badly they can be broken as we go about our daily routine: attention, cause, confidence, knowledge, memory, and potential. You will discover that although we think that we can trust the way our brains interpret reality, this is often simply an illusion.

Some treasured myths are well and truly busted, including the idea that playing Mozart to babies makes them smarter, that subliminal advertising works, or that having access to more information will inevitably lead to better financial decisions. David Dunning and Justin Kruger's work on the level of skill that absolute beginners think they have in a subject is discussed in detail; it explains why far more than the expected half of a set of test subjects will rate their skills or abilities as above average. We also learn just how narrow expertise can be; someone who is an expert in one field can still be a complete idiot in others. If nothing else, this book should stop you from spending $500 on a spiffy gold-plated Ethernet cable that won't perform any better than the $2 one you can buy at the supermarket.

If you're someone who always trusts your intuition whenever you make a big life decision, reading this book is not only recommended, it could save you a lot of pain...

Published by: Thames and Hudson, 1996

Steven Mithen (b. 1960) is Professor of Archaeology at the University of Reading. In this complex and ambitious book, he sets out on a grand quest to determine how the human mind evolved. Along the way, he sets out plausible explanations for how art, religion, and science originated from changes in its structure.

The book sets out a narrative in four acts. Act 1 is concerned with the hominins, beginning at the point, some six million years ago, at which the lineages of man and chimpanzee diverged from our last common ancestral ape. There is very little in the fossil record from this era, and even now there's a lot of debate about whether the 5.6 million-year-old species Ardipithecus kadabba (only discovered after Mithen's book was published) is part of humanity's lineage or not, so Mithen is working almost entirely in the dark; he sets his stage accordingly. Act 2 begins 4.5 million years ago with Australopithecus, includes the discovery of "Lucy" at Hadar in Ethiopia, and concludes with the appearance of Homo habilis. Act 3 begins 1.8 million years ago with Homo erectus, moves to Europe half a million years ago with the hefty Homo heidelbergensis, and concludes 100,000 years ago with Homo neanderthalensis. Act 4 introduces modern man, Homo sapiens sapiens, onto the stage, and the play ends in the modern day.

So how do you infer the structure of a mind from the fossil record? As an archaeologist, Mithen employs some fascinating methods. Not only have casts been made of the brains of our ancestors from the empty fossil skulls that have been found, but vast quantities of behavioural clues abound in the artefacts that they left behind. The development (or rather, the absence of development) of the handaxe, minute scratches on pieces of bone, even the likely size of the social groups each species formed, inferred from analysis of dig sites, can all be employed in a forensic reconstruction of their character.

Mithen uses the idea of a cathedral as his central metaphor for how the structure of mind should be regarded. Different modes of thought about natural history, technology, or social interaction are seen as smaller chapels inside the main edifice. In our early ancestors, Mithen suggests, each metaphorical chapel was walled off from the rest of the building, and primitive minds could not use resources from one mode of thought to inform any of the others.

I noticed distinct similarities between Mithen's thesis and that of Julian Jaynes's The Origin of Consciousness in the Breakdown of the Bicameral Mind (a work that is entirely absent from the book's very extensive bibliography, possibly because it was famously described by Richard Dawkins as "one of those books that is either complete rubbish or a work of consummate genius"), although Jaynes believed that the emergence of the modern mind took place much more recently. In Jaynes's view, before the bicameral mind became whole, communication between different modules of cognition in the brain (leakage between the different "chapels of the cathedral", in Mithen's model) was interpreted as divine inspiration; this gave rise to religion, and the search for its explanation gave rise to theology.

Which is all well and good, but there are many problems with this narrative. For a start, there is a lot of confusion about what is meant when the author refers to "The Mind." The words mind, thought, consciousness, intellect, intelligence, and mentality are used willy-nilly and seemingly interchangeably, and there is no attempt to define what any of them actually refer to. This shouldn't come as a surprise, because the arguments in philosophy, psychology, and sociology about what we mean when we talk about the mind (or any of the other terms I just listed) continue to this day.

The idea that the brain imposes structure on the mind is a reasonable starting point. There are aspects of how we think, tendencies for our thinking to fall into patterns, which seem to be genetically transmissible; Chomsky argued as much. You know it yourself, if you speak one of the world's commoner languages, particularly English. I have a hunch that the mind's fondness for hierarchical order which this tendency shows is a clue to its underlying structure. The level of consistency is fascinating; why should that be? But what that underlying structure is remains unknown, and when we dig more deeply into the subject, we rapidly get into trouble. Even fMRI imaging has not enabled us to determine how consciousness works, and nearly sixty years after Stanley Kubrick and Arthur C Clarke showed us a thinking computer in 2001: A Space Odyssey, we remain unable to make one. Nearly thirty years after this book was published, we still do not have an agreed scientific theory of consciousness, and it may never be possible to construct one.

Also, the book's depiction of Homo neanderthalensis as somewhat brutish and dull is rapidly becoming outdated (in particular, the discovery of clear ritual behaviour dating back 176,000 years at Bruniquel Cave seems to blow rather a large hole in Mithen's argument that such things only began to take place when modern man started doing them).

The book's a fascinating read and an interesting exploration of the ideas around the evolution of thought, but when it comes to delivering a solid conclusion, I think we have to accept that the big questions remain unanswered.

Published by: (less than) Stellar Editions, 2014

I've lost count of the number of times I've seen Hardy's most famous book referenced in print and the last time it happened I resolved to get myself a copy (thank you, World of Books) and read it. So here we are.

Professor Godfrey Harold Hardy (1877–1947) was one of Britain's finest mathematicians. He wrote this essay (at seventy pages, it's hardly fair to call it a book) at the age of 62, exactly the age I am now, and if I had any hopes of becoming a mathematician in my dotage, Hardy succinctly trashes them in his introductory paragraphs. He tells us that he "no longer has the freshness of mind, the energy, or the patience" to continue doing his job. Mathematicians over the age of forty contribute little to the field, we are told; by the age of sixty they may as well give up.

This rather morbid dwelling on fading intellectual prowess pervades the book, which is sad. There are brief glimpses of the joy Hardy must have experienced working with John Edensor Littlewood (1885–1977) and the Indian prodigy Srinivasa Ramanujan (1887–1920) at Cambridge, and he also clearly had a great enthusiasm for the game of cricket, which provides fertile ground for many of the analogies that are used in his explanations. Then again, the book was originally published in 1940, so the bleakness is understandable. Perhaps if he'd set about writing the book a decade or two earlier, he might have had a less pessimistic outlook.

But as a manifesto for creative work, it has much to offer. Hardy splits the field of mathematics into the trivial and the real, and for those seeking to contribute to the body of mathematical knowledge, he suggests aiming to create something significant, something which possesses depth. These seem admirable goals, whatever field of work you happen to be involved in. Hardy seems fond of judging a discovery by how far down the rabbit hole of existing knowledge you have to go in order to understand its proof. For mathematicians, the reward of finally understanding something that has required them to think very hard is a source of delight, and it can prove addictive. Failing that, the trick of taking things that everyone already knows, turning them on their heads, and using them to synthesise new knowledge is likely to bring you immortality of a sort. What Hardy is apologising for, he explains, is finding satisfaction and self-esteem in working in the field, and for believing that he had made a significant contribution to it, which he undoubtedly had.

I must admit that I struggled to make it to the end, however. This was not Hardy's fault, but the publisher's. Judging by how they assembled this particular edition, Stellar Editions appear to be a shoddy, fly-by-night outfit that doesn't care at all about the products they release. The presentation of the book is inept. The blurry photographs on the front and back covers are so badly chosen and cropped that I find it hard to believe they were the best the designer could come up with. Worse, whoever set the type seems to have no more than a passing acquaintance with the English language. Most pages have at least one typo on them, and many have three or four. I kept having to re-read sentences and then try to figure out what Hardy's original text would have been. A quick online search of reviews of other books they've published tells the same story, so this clearly wasn't an aberration. Stellar Editions clearly don't believe in spending money on designers, proofreaders, or anyone else. If you're going to get a copy of this book for yourself, I strongly recommend that you make sure to get an edition published by somebody else.

Published by: Weidenfeld & Nicolson, 2015

Originally published in 1999, this overview of the then-current state of cosmology by the Astronomer Royal is very much a science book for non-scientists, sometimes infuriatingly so. I'm pretty sure that when he explains that "the size of the observable universe is, roughly, the distance travelled by light since the Big Bang, and so the present visible universe must be around ten billion light years across" his editor simplified the language a little bit too much; if the Big Bang happened ten billion years ago (it's now thought to be more like 13.8 billion years ago) then the radius of the universe would indeed be ten billion light years, but that would make it twenty billion light years across. And thanks to the accelerating expansion of spacetime and progress in modern physics, the universe is now thought to be closer to ninety billion light years across, but let's not quibble over a mere eighty billion light-years.
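
The arithmetic is easy enough to sanity-check. Here's a back-of-the-envelope sketch of my own, using today's consensus figures rather than anything from the book:

```python
# Naive light-travel estimate: radius = age of the universe x speed of light.
age_gyr = 13.8                     # current best estimate, in billions of years
naive_radius_gly = age_gyr * 1.0   # light covers 1 billion ly per billion years

print(f"Naive radius:   {naive_radius_gly:.1f} billion light years")
print(f"Naive diameter: {2 * naive_radius_gly:.1f} billion light years")

# Because space has been expanding while the light was in transit, the matter
# that emitted the oldest light we can see is now much further away:
comoving_radius_gly = 46.5         # the widely quoted comoving radius
print(f"Comoving diameter: {2 * comoving_radius_gly:.0f} billion light years")
```

Which is where the "around ninety billion light years across" figure comes from.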

When we do get to the science, though, it's breathtaking. Sir Martin sets out just how remarkably fine-tuned the observable universe seems to be by discussing six dimensionless numerical constants which measure the way everything works. The first, the ratio of the electromagnetic and gravitational forces acting on a pair of protons, doesn't have a name, so he calls it "N". The second, Epsilon, measures the efficiency of nuclear fusion in creating helium atoms from hydrogen (the fact that the symbol used in the book is the Greek letter of that name is never explained, which is a shame). The third, Omega (this Greek letter is explained, presumably because it doesn't look like any letter that English readers will be used to), measures the density of matter in the universe relative to the critical density, the balance between gravity and the energy of expansion; it must have started off extraordinarily close to 1, even if that is no longer the case. The fourth, the cosmological constant Lambda, describes the strength of the cosmic "antigravity" that is accelerating the universe's expansion; it's so small that its effects only become noticeable over distances of billions of light years. The fifth number, Q, is the ratio of the energy it would take to pull a galaxy apart to its total "rest mass energy" (which is calculated using Einstein's most famous equation, E=mc²). And finally, we're asked why it is that we live in three dimensions of space plus one of time, rather than two, four, or six dimensions of space and two or more of time.
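
The first of these numbers is concrete enough to compute for yourself, because both forces fall off as the square of the distance, so the separation between the two protons cancels out of the ratio. A quick sketch of my own:

```python
# N = (electric force) / (gravitational force) between two protons.
# Coulomb: F_e = e^2 / (4*pi*eps0*r^2); Newton: F_g = G * m_p^2 / r^2.
# The r^2 cancels, leaving a pure, dimensionless number.
from scipy.constants import e, epsilon_0, G, m_p, pi

N = e**2 / (4 * pi * epsilon_0 * G * m_p**2)
print(f"N = {N:.2e}")  # about 1.2e36: gravity is absurdly feeble
```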

In each case, if the value of any of these numbers were even slightly different to what it actually is, we would not be here discussing how extraordinary those values are.

What does science make of this apparent fine tuning? That's where Sir Martin reveals the most amazing part of the whole deal: the universe in which we live might be nothing special, and just one of a quite possibly infinite series of other universes where the six constants he lists have quite different values. In some, there might not be any stars; indeed, they might not even contain any atoms at all. Other universes might consist in their entirety of a single, stupefyingly massive black hole.

Or perhaps the six numbers are connected in some way that we're not aware of, Sir Martin muses. Is there some Grand, Unified Theory of everything that would explain this fine tuning? When he updated the book in 2000, he clearly had high hopes that superstring theory would provide the answer, and do so within the next decade. However, as we've seen in the first two books I read this year, things really haven't turned out that well for superstrings, or even string theory in general. This doesn't hurt the book's main thesis; it's just rather sad to see that those grand expectations have yet to come to pass.

Just six numbers, and yet Sir Martin weaves an astonishingly profound tale out of them. Giordano Bruno got burnt at the stake for undermining the Catholic church's view that the Earth was the centre of the universe and Man was God's greatest creation by suggesting that there might be other worlds out there and that some of them might harbour intelligent life. Goodness knows what the Roman Inquisition would have made of current thinking about cosmology, or how astonishing the scale of everything actually is compared to our humble, pale blue dot.

Published by: Fourth Estate, 2003

Professor du Sautoy needs to write more books, because this is one of the best-written popular science books I've read in a very long time. It's the story of the Riemann Hypothesis, a nineteenth-century conjecture about the behaviour of the prime numbers (specifically, how often they crop up among the natural numbers) which ties their distribution to the properties of a function called the Riemann zeta function. Riemann died young and never got around to proving his hypothesis. Instead, the problem found fame thanks to the way it has resisted all attempts at a decisive solution. Since Fermat's Last Theorem was finally proved by Sir Andrew Wiles in the mid-1990s, the Riemann Hypothesis has become the Holy Grail of mathematics and is quite possibly the most important unanswered question in science. It's proved intractable enough to have become widely known outside the mathematics community, and it's even played a central part in at least one best-selling novel.
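
To get a feel for the question at the heart of the book, here's a small sketch of my own (it isn't taken from du Sautoy's text): count the primes up to x and compare the tally with Gauss's estimate of x/ln(x). How far that estimate (and its refinements) can drift from the truth is precisely what the Riemann Hypothesis is about.

```python
# Compare the true count of primes up to x with Gauss's x / ln(x) estimate.
from math import log

def count_primes(n: int) -> int:
    """Count primes <= n using a simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

for x in (10**3, 10**4, 10**5, 10**6):
    print(f"x = {x:>9,}  primes: {count_primes(x):>7,}  x/ln(x): {x / log(x):>9,.0f}")
```

Even at a million, the estimate undershoots by several thousand; taming that error is the whole game.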

It's also intimately linked with the cryptographic algorithms that are used to protect the details of your credit card when you shop on the Internet, and Professor du Sautoy explains why some areas of mathematical research aimed at proving the hypothesis have ended up being closely monitored by organisations such as the United States' National Security Agency.

Saying more about the history of the hypothesis would spoil things; the book is written like a whodunnit and the cast of characters includes many extraordinary figures from the field of maths. There are some familiar faces from physics, too. I was most surprised when Freeman Dyson made an appearance in the closing chapters.

I don't think it counts as a spoiler to reveal that as of today, the hypothesis remains unproven. But I was astonished by the directions the tale takes and the connections with seemingly unrelated areas of physics that have been discovered as a result of the heroic efforts to find a solution over the last century and a half. Even without a triumphant grand reveal at the end, this book is a delight from start to finish.

Published by: Houghton Mifflin, 2006

Another book from my charity shop science book haul. While this book is also about string theory, it examines the science from a markedly different perspective. Although the first few chapters are dedicated to where string theory came from and why it's so important, Smolin's main focus is on advancing a pretty solid argument that the theory has taken over theoretical physics to the exclusion of pretty much all other approaches. Smolin sets out why this is not a good thing, and examines the sociological behaviours that are driving it. He's an excellent communicator: his writing is clear and easy to follow, and he conveys the roller-coaster ride of excitement and disillusionment that he personally experienced as a working theoretical physicist when the string theory revolution happened and the way people did physics started to change.

Although he assures the reader that this isn't meant as an attack on the science behind string theory or on the people working in the field, it's hard not to come away from the book feeling that an awful lot of very clever people have spent the last four decades barking up what might very well turn out to be the wrong tree. And they have done so for so long, Smolin argues, because it has become politically unacceptable for anyone to challenge consensus thinking on the subject. He sets out some pretty damning evidence for this extraordinary claim. String theory may well turn out to be one of the most expensive dead ends in the history of science.

Or it might not. The "Trouble with Physics" of the book's title is not just related to the politics of modern science; it's also the problem with the theory itself, which appears to be impossible to test experimentally. It can't even be used to make useful predictions, because the landscape of the underlying aspects of the theory is so wide-ranging, the properties of reality it can define are so varied, that nobody can identify which one we've ended up with, let alone explain why. One problem associated with the theory is thought to have around 10^500 different solutions, but it could be as many as 10^272,000, or even higher. In trying to narrow things down, string theorists seem to have done the opposite. That's not exactly what you'd expect (or want) from a single, elegant theory of how the Universe works. The Emperor might not be wearing any clothes, but nobody has yet managed to get close enough to him to find out for certain.

What struck me most about the tale being told here is that the moment at which the field of theoretical physics stalled corresponds to a quite striking degree with the point at which the centre of activity in the field shifted from Europe to the Americas. Smolin classes the shift as being away from seers (disruptive visionaries like Einstein who thought deeply about the philosophical aspects of their work) towards craftspeople (who tend to be more attached to the status quo). Ironically, the lack of seers has turned string theory into something that looks remarkably like a cult, suffering from groupthink and exclusion of the out-group almost as badly as the Republican Party.

When this book was first published, Smolin got a lot of flak for calling out what he saw as an endemic problem in modern physics. However, nearly two decades on from the book's debut, no progress seems to have been made in proving he was wrong.

Published by: Abacus, 1992

I started this year with something from a recent charity shop haul of science books. It's an overview of string theory by the physicist and writer Dr. F. David Peat (1938–2017), and while it's somewhat out of date, it covers a lot of the history of the theory and its principal ideas. Roger Penrose's Twistor Theory and spin networks (which subsequently led to the development of Loop Quantum Gravity) feature prominently.

It's all a very dense read, and while I got the general gist of things, a lot of the maths (and there's a lot of maths) went completely over my head. Indeed, Peat bemoans the fact that as the 1980s progressed, physics as he knew it was becoming increasingly abstract; he more or less gives up on the idea of depicting what the fundamental concepts would look like, as many of the theory's fundamental entities don't exist solely in the dimensions of space and time that we're familiar with in everyday life. However, we are treated to an introduction to one of modern physics's most eccentric inventions, the trouser diagram.

And no, I am not making this up.