Who are the real friends of science?

Science is indeed wondrous but it has limitations – which, as Sir Paul Nurse demonstrated in a popular and doubtless influential article published in 2021, are not always recognized by some of its most adept practitioners.


British Science Week: 2

There is much to applaud in Sir Paul Nurse’s paean to science in the Radio Times (“Time to challenge anti-vaxxers”, RT October 8 2021). He is surely right to point out that without good science, “many more of us would have died” in the covid pandemic; and that “the way science works and develops needs to be taught at our schools, so citizens and political leaders are better prepared to be engaged”. It is true, too, and very sad, that “there are those in our society who are anti-science and others who do not fully understand it but are very happy to expound incorrect information, particularly on social media”. In particular, says Sir Paul, those who lack understanding and/or are positively anti-science include – 

“…climate-change deniers, anti-vaxxers, and those who attack gene editing as being unsafe – despite it being a highly precise way to improve agricultural crops. These people tend to cherry-pick data, are inattentive to reproducibility, lack scepticism about their own ideas, and are often discourteous and strident in their opinions.” 

BUT – and it’s a huge “BUT”: not all the critics of all that’s going on, and of the directions that science has taken, and of the powers-that-be that finance the research (or not!), are ill-informed — and certainly not all are “anti-science”. In this, as in all things, the harshest critics are often the greatest friends. Many of the most impassioned are professional scientists or at least are science graduates who, quite simply, hate the direction that science often takes and much of what is done in its name. They hate the frankly crude and outdated assumption that the ideas of science are cut from superior cloth and should be given priority over all others. They hate the present role of science – as the handmaiden of big business, abetted by governments like ours, who between them hold the purse-strings and determine the direction of science and of the high technologies that science gives rise to. Many non-scientists – including many from what is now called “the global South” who may have no formal education at all — also raise challenges that need to be taken seriously. It is arrogant in the extreme, just plain wrong, to assume that those who have not been through the western educational mill have nothing worthwhile to say. In short, although science is indeed a huge cultural asset, and has become a key component of modern civilization, and the high technologies to which it gives rise can indeed be life-saving and life-enhancing – truly enabling human beings to be fully human – there is much to question and regret, and much that needs to be reversed. 

More specifically, there are significant differences between Sir Paul’s three examples of supposed backsliders. There is a clear gulf between the climate change deniers and the other two. But there are also key differences between the anti-vaxxers and those who question the need for and the value of gene editing in plant breeding; differences that go to the heart of all the world’s problems. Thus: 

Who are the climate-change deniers?

Who, for starters, are these “climate change deniers”? They were common enough a decade ago, and some of them, alarmingly, were in positions of influence – politicians, writers, and editors of national newspapers — but few nowadays (surely?) deny that global warming is a fact. The reality is all too obvious. Some do continue to doubt whether or to what extent human activity is to blame – and some claim to do so on scientific grounds. Some still like to point out that no controlled trial is possible to test the supposed role and impact of industrially generated GHGs — for that we would need to compare planet Earth with similar planets that have not had an industrial past, and there aren’t any. Indeed, if you try hard enough, it is possible at every turn to find fault with the idea that human industry, industrial farming, and deforestation are prime causes of global warming. It is possible to demonstrate points of fact beyond all reasonable doubt, as demanded in English courts of law. But, as discussed later, nothing can ever be shown to be true beyond all possible doubt – a point that some industrialists and politicians, with the help of some scientists, still seize upon in their attempts to defend the status quo. 

Indeed, Richard Doll faced the same kind of criticism in the 1950s when he first told the world that smoking predisposes to lung cancer. The tobacco industry found scientists to point out that his evidence was far from cast iron. “I know it isn’t,” said Sir Richard. “But it’s true anyway”. And now we can see that he was very obviously right. In science as in all things, as pointed out not least by the philosopher of science Karl Popper, we need always to use our common sense; and common sense tells us that although the evidence does not show beyond all possible doubt that human beings are largely or even exclusively responsible for global warming, it is nonetheless persuasive — and the stakes are so high that no-one but a fool or a mountebank would now be an out-and-out climate-change denier. It does seem, however, that some of the world’s most powerful governments and industrialists are reluctant to change their policies and their ways of life to anything like the extent that’s obviously needed, and the foot-draggers will surely be prominent at COP26. At least, precedent suggests that any agreement that requires radical action will subsequently be shelved or postponed or otherwise watered down. The fact remains, however, that most people now seem to agree that out-and-out climate-change deniers are at least perverse and sometimes wicked. 

There may be some who deny global warming and refuse vaccination and question the worth of gene editing in crop breeding, but they, surely, are a tiny minority, sometimes representing some quasi-religious cult. For the most part, the “climate-change deniers, anti-vaxxers, and [the critics of] gene editing” are quite different people who come at the issues from quite different standpoints. To judge not least from the popular demonstrations worldwide, most people now agree that the out-and-out climate change deniers are nutters. In my view (which I am sure many share) the anti-vaxxers do make some plausible arguments but on the whole are at least misguided. But those who question whether gene editing or genetic engineering in general really have a serious role to play in plant and animal breeding, and who question the usefulness of the GMOs (“genetically modified organisms”) to which these technologies give rise, are often among the best informed. They include scientists, a great many farmers, and a lot of academic and other close observers. 

To understand the difference between the anti-vaxxers and those who question the value of gene editing in crop breeding we need to step back a bit, and explore the difference between agriculture in general and medicine in general, and the different roles of science and high tech in each. Exhaustive analysis could take a lifetime but here are a few salient points:  

Farming vs medicine  

Agriculture and medicine are in many ways comparable. Both are essential: agriculture every day and medicine at least as a back-up in times of stress. Both are ancient. Agriculture, at least in preliminary forms, is at least 40,000 years old, with significant stirrings well before that; and bona fide farming, clearly recognizable as such, has been creeping up on us at least since the last Ice Age, around 10,000 years ago. All the basics were well in place by the time of Cain and Abel. Medicine is as old as humankind – or even older: for many animals are surprisingly good at self-dosing when they are feeling off colour, or in need of some nutritional supplement (many ungulates go to great lengths to find salt-licks), or of some prophylactic (macaws eat clay as a way of sequestering toxins in their food plants). 

Both disciplines are rooted in ancient craft and folk knowledge – with significant infusions of bona fide science over the past few centuries. In all societies the ancient and modern co-exist – sometimes harmoniously (up to a point), though often not. For huge numbers of people worldwide – perhaps the majority? – the local apothecary (or shaman or wise woman) is at least the first port of call, and is often very effective. After all, many if not most western drugs have their origins in wild plants, as deployed by herbalists the world over. Traditional agriculture that often owes little or nothing to formal science is still the main source of food for about 70 per cent of humanity, or so it has been calculated – even though traditional farmers commonly receive little or no help from governments, which western, industrial farmers very obviously do. 

For such reasons, many a modern scholar, including many a scientist, now acknowledges that traditional knowledge both in medicine and in farming contains much wisdom and must be treated with far more respect than has often been the case. To assume that modern science-based methods must be superior simply because they are science-based, and have big money behind them, and are perceived to be “modern”, and to sweep tradition aside without a second thought, is always high-handed and sometimes foolish to the point of criminality. But that is what is now done routinely, and is called “progress”. At CAWR, at Coventry University, Prof Michel Pimbert has been arguing for some years that outsiders who presume to change traditional practices for whatever reason must plan their strategies in partnership with the locals, who know their own problems best and often have at least the seeds of a solution. Interlopers these days do commonly claim to consult the locals but “consultation” all too often is a perfunctory invitation to rubber-stamp some fait accompli. This applies the world over, from Africa to Oxfordshire.  

Yet all human activity can be improved with a little bona fide science, if sensitively applied. Thus, crucially, in medicine, good science helps us to see which protective or curative measures work best and which don’t work at all or make things worse. Central to this endeavour is the controlled trial, preferably conducted “blind”. In its simplest form, one group of subjects may be given some modern drug or vaccine; another group of similar people is given some ancient remedy; a third is given a placebo; and a fourth group gets nothing at all. The results are judged by doctors who do not know which group has received which treatment – and so they are “blind”: unswayed by preconceptions. If the people receiving the modern treatment fare significantly better than the others, without too many side effects, then it may be admitted into the fold. If not, not. Often things are more complicated. Sometimes – often – such straightforward comparisons are impossible because there are too many “confounding variables”. Sometimes the results are fiddled or otherwise slip through the net. But taken all in all, comparative trials of this kind have stood us in good stead. The treatments that are widely available, whether drugs or vaccines or whatever, are nearly always tried and tested as far as is possible, and chicanery is mercifully rare, or so it seems.  
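
For readers who like to see the logic in miniature, here is a toy sketch of the four-arm comparison just described. It is mine, not Sir Paul’s or anyone else’s: the group size, outcome scale and effect sizes are invented for illustration, and a real trial would involve randomization, pre-registration and proper statistics rather than a simple comparison of means.

```python
# A toy, purely illustrative sketch of the four-arm "blind" comparison described above.
# All numbers (group size, outcome scale, effect sizes) are invented assumptions.
import random
import statistics

random.seed(1)
GROUP_SIZE = 100

# Assumed mean outcome scores (higher = better) for each arm.
assumed_effects = {
    "modern treatment": 6.0,
    "ancient remedy": 5.2,
    "placebo": 5.0,
    "no treatment": 4.5,
}

# The assessors are "blind": they see only coded labels, not the arm behind each code.
coded = {f"group {i}": arm for i, arm in enumerate(assumed_effects, start=1)}

# Simulate one noisy outcome per subject in each coded group.
results = {
    code: [random.gauss(assumed_effects[arm], 1.5) for _ in range(GROUP_SIZE)]
    for code, arm in coded.items()
}

# Blind assessment: rank the coded groups by mean outcome...
for code, scores in sorted(results.items(), key=lambda kv: -statistics.mean(kv[1])):
    print(f"{code}: mean outcome {statistics.mean(scores):.2f}")

# ...and only then "unblind" to reveal which code was which arm.
print(coded)
```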

Comparable trials are the norm in modern agriculture too. Seeds are compared by growing the different types in the same conditions and seeing which give the best yields. Veterinary medicines are generally tested with the same rigour as in human medicine. And so on. This is all very useful in assessing the value both of modern innovations and of ancient practices. 

In medicine, it is easy to rattle off a shortlist of innovations that have been transformative: vaccines, analgesics, anaesthetics, antiseptics, antibiotics, regulators of metabolism, anti-depressants, modern surgery, modern methods of diagnosis, and the huge insights of molecular biology (DNA and all it does), which seem to be taking understanding and medical practice into a quite new era. Each of these advances has been a watershed. In all of these cases there was a time before and a time after. To be sure, most of the above have ancient roots – medicinal drugs and tonics are surely as old as humankind (or older); vaccination has been with us for more than 200 years; and its predecessor, variolation, may be at least 1000 years old. And it is reasonable to suggest, although some religious fundamentalists and ascetics would disagree, that each and all of these innovations, despite some caveats, is a net force for good, sometimes overwhelmingly so. Smallpox is estimated to have killed 200 million people in the 20th century alone before it was wiped out in the wild by mass vaccination in the late 1970s – a triumph both of technology and of cooperativeness.  

We can rattle off a comparable shortlist of transformative, science-based innovations in farming too: the plough; wind, water and animal power – and then steam power and the internal combustion engine; various forms of rotation; Mendelian genetics; artificial fertilizers, thanks initially to Justus von Liebig and John Bennet Lawes in the mid-19th century; industrial methods of nitrogen fixation in the early 20th century; organochlorine and organophosphorus pesticides before World War II, and new generations of pesticides since – notably the neonicotinoids; the herbicide glyphosate, marketed in particular as Roundup; and, now, smart robots and genetic engineering and indeed gene editing; all abetted by vaccines, antibiotics, anaesthetics and the rest adapted from human medicine. 

However – a serious difference! – whereas it is reasonable to argue that each and all of the high-tech innovations in medicine have been more or less unequivocally to the good, at least in net — if any of them went missing, life would be far harder and more precarious — the same cannot be said for most of the agricultural technologies listed above. Artificial fertilizers do increase yields, but in practice they are highly polluting and their manufacture depends on fossil fuels; and too much agrochemistry destroys the soil microbiota and mesofauna, and hence the soil structure. Such is the collateral damage from the organochlorine and organophosphorus pesticides that both have been withdrawn from general use. The same now seems to be true of the neonics. 

Indeed, perhaps the most important innovation of the past 100 years is Albert Howard’s introduction of composting from India into the western world in the 1930s, which led on to modern organic farming in all its forms. But organic farming in high-tech commercial circles is still seen to be somewhat eccentric, not to say backward-looking – and our high-tech, commercial government spends very little indeed on organic research. 

Both in medicine and in agriculture the value of any particular intervention depends very much, or perhaps absolutely, on context. So, as is widely agreed, the best way to deal with infection is to avoid contact with the causative parasites in the first place. Nowadays there are effective vaccines against cholera but the best way to deal with it is not to drink polluted water, as John Snow famously demonstrated in 1854 when he had the handle removed from the Broad Street pump in Soho. Next best is to build up general resistance by staying healthy. Thus, it seems, the best protection against TB is to be well-fed, even though many a high-profile celebrity died of it in the 19th century (in some artistic circles it was almost fashionable). Measles when I was a lad was still regarded as a routine childhood infection (even though it was a prime cause of deafness and blindness in western countries). But among malnourished children in West Africa and elsewhere it was a major killer. 

But infection will always be with us whatever else we may do – even if we did all that is humanly possible to provide everyone with good food and houses and all the rest. In Britain, poorer communities have suffered from covid more than richer people, but Britain’s overall death rate per 100,000 head of population has been among the very highest in the world, together with Belgium and Italy – all among the richest countries in the world, per capita. Some infections, even the nastiest, like smallpox and polio, are very effectively dealt with by vaccination. But others, like covid and flu, present a moving target because they mutate so readily. Malaria is difficult because the parasite itself is so complex – not a simple virus but a protozoan – and takes several forms within its host, and may take refuge in the liver where the immune system cannot get to it. So without some as yet unforeseeable “breakthrough” (horrible word), vaccines (and other protective agents) will always be necessary and will always need updating. Besides, infections are not the only cause of disease. Some cancers have a viral origin but most do not, as far as is known. The same is true of heart disease. Many people too are killed or seriously held back by genetic disorders, especially sickle cell disease and thalassaemia. The decrepitudes of old age lie in wait for all of us. High tech, at least in its present forms, cannot always do what’s needed but it very often can and does, at least enough for practical purposes. 

Beyond doubt, too, the various technologies that are emerging from molecular biology, including genetic engineering and its refinement, gene editing, will play a bigger and bigger part in preventing and curing disease – both in human and in veterinary medicine. It’s the insights of molecular biology and its emergent technologies that have enabled medical scientists to keep track of covid’s many mutations, and to customize vaccines to cope with them. 

In short, however we may strive to improve the human condition, a future without high-tech medicine, rooted in excellent and ever-advancing science, would be very precarious indeed. Sir Paul is surely right: to be an anti-vaxxer is positively perverse and of course unsocial, since the unvaccinated are far more likely to become infected and so keep the various diseases going – whether covid or measles or mumps or whooping cough. It is peremptory indeed, too, to reject gene editing out of hand, or genetic engineering in general. Particularly in the context of medicine, these technologies are very important indeed. Indeed, in this, to quote Mrs Thatcher (I never thought I’d quote Mrs Thatcher), “There is no alternative!” – not if we want the majority of people to live their allotted span. 

But there is a realistic alternative to the ultra-high tech, commerce driven industrial agriculture that big governments like ours see as the norm and as the necessary future for all humankind. 

The alternatives 

In 2005 various agencies of the UN, including the World Bank, FAO, UNEP, and others, assembled the International Assessment of Agricultural Knowledge, Science and Technology for Development (IAASTD), with more than 900 experts from 110 countries, to evaluate “the relevance, quality and effectiveness of agricultural knowledge, science, and technology, and the effectiveness of public and private sector policies and institutional arrangement”. The IAASTD’s report, Agriculture at the Crossroads, published in 2009, pointed out that agriculture worldwide could take off in any one (or perhaps several) quite different directions, and we needed to ask which was best. We are still at the crossroads. The wheels turn slowly. 

The main choice is between what can properly be called Industrial Farming and what I (and a few others) have been calling Enlightened Agriculture, otherwise more widely known as Real Farming. 

Industrial farming is just that. Broadly speaking, the world’s farmland is conceived as a giant factory. The emphasis is on production: high input and high output. High output is achieved with the aid of high technology: breeds of livestock and varieties of crops precisely and maximally nourished and protected with whatever is deemed necessary (prophylactic medicines and pesticides and every other -cide). The aim, though, is not simply to produce whatever is needed but to be profitable – so costs must be kept as low as possible. In an oil-rich world it may be cheaper to mechanise than to employ human beings, who in any case are harder to control. Machines are most cost-effective when they are big.  But machines, even smart machines, do not cope well with complexity so industrialized farms, however big, are best kept simple. The crops are grown as monocultures. Furthermore, in the prevailing, neoliberal economy, farmers like everybody else are required or indeed obliged not simply to pay their way with some to spare but to compete for profit with all other farmers and indeed with all other industries in the global market. So it’s not enough merely to turn a profit. Farming enterprises must contrive to be as profitable as possible, or risk being overtaken or bought out by someone bigger and smarter. So modern, industrial farms grow bigger and bigger, more and more high-tech, in practice entirely oil-dependent, producing commodity crops in vast, simplified monocultures for sale to whoever will pay most; and, more and more, controlled by a steadily diminishing shortlist of global corporates who may be more powerful than almost any government and often prove more powerful than the law.  

Behind this modus operandi and its accompanying philosophy lie governments like ours that are technophilic and neoliberal to the core. Like all other serious human pursuits, agriculture is conceived – in a grim phrase I first heard in the 1970s – as “a business like any other”. This is a matter of dogma. Any other approach is deemed to be “unrealistic”. Farmers are obliged to do what the government says because the government – in partnership with the corporates who control production of pesticides and the rest – holds the purse strings. Those who go with the political-economic flow – “tick the right boxes” as the expression has it – are subsidized. Otherwise not. So when Liz Truss as Secretary of State at Defra decreed that farmers should raise more pigs to sell to the Chinese, that’s what farmers did — irrespective of real need or of animal welfare or ways of life or the state of the natural world or the human rights record of the Chinese. She did not foresee that the Brits and the Chinese would fall out big time over various British trade deals and the defence deal with the US and Australia, or that Brexit would exacerbate the shortage of truck drivers, or that there’d be too few abattoirs to cope with all the extra pigs. So now a lot of people are in trouble and a lot of pigs face an even more untimely end than they did before.  But this is the point. The world is a very uncertain place and getting more so by the day but the modern economy and the dogmatic politics that lie behind it leave no margin for error.  Agriculture and indeed the whole world are run on a wing and a prayer. 

Even governments like ours now acknowledge that present-day industrial agriculture is overcooked — that it is not sustainable. But they have worked their way into a corner. They really don’t know what else to do and if they did, they would not have the wherewithal to do it. The prevailing propaganda from governments and commerce would have us believe that industrial farming, with all its high-tech ingredients, and despite its obvious shortcomings, is necessary. And the necessary ingredients, Sir Paul implies in his Radio Times article, include gene editing – “a highly precise way to improve agricultural crops.” More generally, the shapers of strategy imply that without high-tech farming we will all starve – although of course, since present-day farming isn’t sustainable, we’re doomed to starve anyway. The British farmers’ chief representative body, the NFU, has gone along with the act. Biologists of the hard-nosed kind have been said to suffer from “physics envy”. Agriculturalists of the official kind suffer from big-biz envy.  

But, contra Mrs Thatcher and her followers, Enlightened Agriculture (aka Real Farming) does provide an alternative to the industrial kind. It is not designed simply to maximize material wealth in the form of money, and to concentrate that wealth in fewer and fewer hands, and to help boost GDP. The aim instead is –

“To provide everyone who is ever liable to be born with food of the highest quality, without injustice, without cruelty, and without wrecking the natural world”

The term Enlightened Agriculture was first conceived in the early 2000s (by me) but its origins lie with two movements that are well established: Agroecology and Food Sovereignty.  The point of agroecology at least as I see things is, as far as is possible —  

“To treat all farms as ecosystems and to try to ensure that agriculture as a whole is a positive contributor to the biosphere” 

The details of Food Sovereignty are complicated but the essence is that 

“Every society should have control of its own food supply.” 

I have discussed the details of Enlightened Agriculture and the reasoning behind it at length elsewhere (see refs) but the upshot is that enlightened farms that respect the natural world and leave societies in charge of their own provender should in general be as diverse as possible — which means “polycultural”, ie mixed, with maximum tolerable genetic diversity within each crop and class of livestock. They should also be low input – which in general means organic: little or no artificial fertilizer or pesticide and no GM crops (which among other things tend to be highly uniform). But a mere variety of plants and animals is not enough. The livestock and crops on any one farm should be interdependent, and complementary, as in a natural ecosystem. 

Farms that meet these criteria perforce are complex – for the complexity itself is both synergistic and protective. Complexity in turn means that such farms are skills-intensive, needing plenty of skilled farmers and growers. Britain could do with ten times as many professional farmers and growers as we have now. In general, therefore, farms that are ecologically sound and socially benign should be small to medium-sized. So although they may grow some commodities for sale in the national or international market – things they are particularly good at – enlightened farms in the main are geared to local markets. 

In other words, structurally, technically, and in intent, Enlightened Agriculture is more or less the diametric opposite of the industrial kind that the world’s most influential people advocate and put their weight and intellect (and public money) behind. 

But is this really “realistic”? And – to come back to Paul Nurse’s article – do we not need ultra-high tech including, say, gene editing? 

What should be meant by “crop improvement”? And what price gene editing? 

Gene editing is the latest refinement of “genetic engineering”. As science it is wonderful. I was first introduced to formal biology in the 1950s (school) and early ‘60s (university) and when “recombinant DNA” technology (genetic engineering) first came on board in the early 1970s I was shocked. It cannot be so, I told the friend who first told me about it (he was somewhat more scholarly than I was and kept up with the literature). It simply is not possible to take a gene (a piece of DNA) from one organism and stick it into another and expect it to be functional, I said. End of. 

But of course it was eminently possible and is now commonplace. The genome (the sum total of an organism’s DNA) has proved remarkably accommodating — often too hospitable for its own good. Indeed there has long been talk of village-scale genetic engineering units and perhaps they already exist. But genetic engineering in its traditional forms means adding DNA from some other organism (of the same or a different species) and/or eliminating unwanted DNA from the genome – and although some argued from the outset that nothing could possibly go wrong, it was always obvious that a great deal could go wrong, and many or most of the gloomier predictions have already come to pass. The literature on unwanted and unforeseeable side-effects is huge. Regrettably, many with a stake in GM technology ignore the negative reports or go to enormous and convoluted lengths to discredit them. Critics of science don’t have a monopoly on cherry-picking. 

Many scientists and agriculturalists now claim, as Paul Nurse does, that gene editing gets round many or most of the problems of genetic engineering. After all, it is at least intended to be, in Sir Paul’s words, “highly precise”. Particular genes that are known to be important are tweaked with as little disruption as possible to the rest of the genome to produce, sometimes, new crop plants that do whatever is required more efficiently than the originals without compromising other desirable qualities. Any untoward side effects can be monitored in the field and the offenders eliminated. Thus, say the champions of gene editing, in good hands the whole procedure is as safe as any biological intervention is ever likely to be. Or, as the albeit fictional airline pilot assured his passengers just before they all plunged into the sea, “Nothing can possibly go wrong!”  

Yet, as always, all is not so simple. A growing body of research shows that gene editing is not, in practice, always as specific as the editors intend. Among other things, alteration of any one gene can and does affect the function of other genes in the genome and thus can affect the organism as a whole in ways that are not predictable and may well be detrimental. References to the relevant research (kindly supplied by Claire Robinson of GMWatch) include:

https://gmwatch.org/en/news/latest-news/19499
https://www.greens-efa.eu/files/assets/docs/chapter_2_gene_editing_is_not_precise_and_causes_unpredictable_genetic_errors.pdf
https://www.mdpi.com/2673-6284/10/3/10

The key issue, though — which Sir Paul does not address; and neither in any convincing fashion does the Royal Society of which he was president from 2010 to 2015 – is whether the whole attempt to apply genetic engineering techniques to crop breeding is really worthwhile, or perhaps, rather, is seriously misguided. Sure, the GMOs (“genetically modified organisms”) provided by genetic engineering can be spectacular. They can certainly be lucrative – great contributors to GDP, which agriculture as a whole mostly is not. GM crops have created a great many millionaires and the odd billionaire, and very rich people tend to assume that they and their wealth are self-evidently a good thing – that they are role models for the rest of us to aspire to. In the modern world they tend to call the shots. The idea that we could all be millionaires one day if only we knuckled down and competed all out for material wealth is what gives neoliberal economics its popular appeal (even though it’s an obvious lie).  In any case, wealth must imply merit, must it not? What other measures of worth could there possibly be? 

But are GMOs (however subtly refined) what the world really needs? Do people go hungry (a billion of them, according to the UN) because the world lacks super-crops? Will GM super-crops cure that hunger, or help us to take better care of the biosphere (peremptorily known as “the environment”)?  In truth there is very little reason to think so, except in the euphoric dreams of technophiles (Bill Gates comes to mind), and a great many reasons for doubt. In fact, more broadly, there is good reason to question the role of high-tech in general in “feeding the world”. We need not doubt the absolute value of science in general – especially, in the context of agriculture, the science of ecology – but lack of high tech does not seem to be a prime cause of present agricultural shortcomings, and more high tech (including genetic engineering and therefore including gene editing) is surely not a priority. Thus:  

What are the real reasons for world hunger? What’s really needed to put things right? 

This issue is of course vast, but some of the main points are: 

First, in many books and articles including Poverty and Famines: An Essay on Entitlement and Deprivation (Oxford University Press, 1981), the Indian economist and philosopher Amartya Sen has argued that famines never occur in true democracies. That is, whatever the technology, the underlying cause of famine is always political and/or economic. The principle is demonstrated even in the present-day UK, in which a million people rely on food banks to keep them going even though the country is rich and we are, as we are constantly reminded, world leaders in agricultural science and technology. But then, although Britain’s leaders trumpet our democracy and send young men and women to war ostensibly to fight for it, our society is not nearly as democratic as our leaders like to pretend. Notably, there’s a 1000-fold difference between the richest and the poorest (a few million p.a. vs a few thousand), which seems to be against the spirit of democracy – unless we define democracy as carte blanche (as many unfortunately do). On a practical note, the huge discrepancy in wealth and income makes it impossible to decide on a sensible price for food. What’s out of range for some is too little for others to register.

Then again, the president of the Millennium Institute in Washington DC, Prof Hans Herren (who co-chaired the IAASTD) has pointed out many a time and oft that the world already produces twice as much food as we really need – as can be demonstrated by very simple statistics. Thus the world produces about 2.5 billion tonnes of cereals per year, which is enough to provide enough macronutrient (food energy and protein) for 7.5 billion people – roughly the present world population. Yet cereals provide only half our food. The rest (pulses, other non-cereal grains, tubers, coconuts, fruit and vegetables, meat, dairy, fish, crustaceans, molluscs, and fungi) provide as much again, plus the bulk of the micronutrients. So the total is enough for at least 14 billion people. Global warming will surely require us to adjust these calculations but still the point remains: mere productivity is not the prime issue. 
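
Herren’s conclusion can be checked with very rough, back-of-envelope arithmetic. The sketch below is mine, not his: the energy content of grain, the allowance for milling losses and waste, and the daily requirement are all assumed round numbers, chosen only to show that the orders of magnitude come out roughly as the paragraph above says.

```python
# Back-of-envelope check of the "enough for at least 14 billion" arithmetic.
# All figures below are illustrative assumptions, not Herren's own numbers.

CEREAL_HARVEST_TONNES = 2.5e9      # ~2.5 billion tonnes of cereals per year, as cited above
KCAL_PER_KG_GRAIN = 3_400          # assumed food energy of dry grain
EDIBLE_FRACTION = 0.7              # assumed allowance for milling losses and waste
KCAL_PER_PERSON_PER_DAY = 2_400    # assumed average daily requirement

kcal_from_cereals = CEREAL_HARVEST_TONNES * 1_000 * KCAL_PER_KG_GRAIN * EDIBLE_FRACTION
kcal_per_person_per_year = KCAL_PER_PERSON_PER_DAY * 365

people_fed_by_cereals = kcal_from_cereals / kcal_per_person_per_year
print(f"Cereals alone: roughly {people_fed_by_cereals / 1e9:.1f} billion people")

# If, as argued above, cereals supply only about half of all food,
# the total food supply is of the order of twice that figure.
print(f"All foods together: roughly {2 * people_fed_by_cereals / 1e9:.1f} billion people")
```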

At present, of course, more than half the world’s grain is used to feed livestock, and more than 90% of the soya. Yet we need not become vegetarians, or indeed vegans. Many may choose to do so for moral reasons but all we really need to do is to re-learn how to cook, and emulate the world’s greatest cuisines on an axis from Italy to China, all of which use meat sparingly – primarily as garnish. We can produce enough meat and dairy for this without feeding grain to animals in vast amounts. It’s enough to raise ruminants on pasture and pigs and poultry on leftovers and surpluses.  

In short, there is a lot wrong with present-day agriculture and sometimes it is far less productive than would be desirable but, overall, productivity is not the prime issue. What matters most is intent – or, more broadly, mindset. 

Indeed, the present focus on high tech farming is like the present obsession with space research and the particular desire to send men or women to Mars. Like high-tech farming, space research comes with huge rhetoric: “The last frontier”; “A giant leap for mankind”. To be sure, space research has brought some huge practical benefits – primarily from communication satellites. Instant, long-distance telephone calls can be life-changing and the internet is wonderful — and the world’s poorest villagers may benefit from it just as much or even more than the western middle class. High tech can be highly “appropriate”, as E F Schumacher put the matter. But Mars is not a priority. It will still be there in 1000 years’ time. The urgency to get there is nothing to do with “Man’s” restless quest for knowledge but as always is driven by what President Eisenhower in his farewell address called “the military-industrial complex”. There’s money in it, and Ronald Reagan’s ludicrous “Star Wars” scenario has not gone away. 

And what’s true of space research alas is true of science as a whole – including agricultural research. The Green Revolution, based on “semi-dwarf” varieties of wheat and rice that were first grown in Mexico and India in the 1960s, is a prime example. The science and skill behind the new crops are truly wondrous – as I wrote, somewhat euphorically, way back in 1988 in Food Crops for the Future (Basil Blackwell, Oxford). The semi-dwarf varieties are sometimes presented to us, not least by members of recent British governments, as early triumphs of genetic engineering – but this is not accurate. The new varieties of the original Green Revolution are not GM crops since they were not produced by transferring individual genes (“recombinant DNA”) but by very sophisticated breeding combined with manipulation of whole chromosomes. Yet they are very high-tech nonetheless. Under the right conditions, too, they do indeed deliver what was promised. The main point is that short-stemmed wheat can be heavily fertilized without “lodging” – growing too tall and then falling over – and when suitably nourished and otherwise cosseted it can and does give very high yields. The chief driver of the Green Revolution, Norman Borlaug, was awarded the Nobel Peace Prize. Today, arch-technophile and entrepreneur Bill Gates is advocating a new Green Revolution in Africa, this time truly based on modern genetic engineering techniques. 

But the Green Revolution of the ‘60s et seq has serious downsides – as recorded not least by Vandana Shiva, author among much else of The Violence of the Green Revolution (Blackwell’s, 2015). She is very much a scientist – her PhD is in quantum physics – but for the past several decades she has dedicated her energies to agriculture and particularly to the problems of farmers in her native India. Short-stemmed wheat, as noted, gives very high yields when heavily fertilized and otherwise protected. But the seed is produced by high-tech methods protected by patents and so perforce is expensive, and so are the necessary inputs – artificial fertilizers, pesticides and herbicides, and water, which often must be supplied by irrigation. Thus only the rich farmers can afford to grow the new varieties and the poor – which is to say most of them – fall by the wayside. This leads to mass despair and, says Dr Shiva, to mass suicide. 

Some would excuse even this – write off the misery and the deaths as “collateral damage”: highly regrettable but unavoidable if we are to feed the ever-increasing population. But the high-tech crops that have brought so many to bankruptcy and despair – and that now, not least in the form of GM cotton and soy, are proving so damaging to the natural world – are not in truth necessary at all, whatever their advocates may claim. I know a great many agriculturalists, farmers and scientists of various kinds, who have spent their lives among traditional farmers of the global south and agree with Amartya Sen that the problems in general spring not from lack of technique but from inappropriate infrastructure. In particular, the world-renowned animal nutritionist Prof E R (“Bob”) Orskov, late of the Rowett Institute, the James Hutton Institute, and the University of Aberdeen, who spent most of his working life in Asia, Africa, and the Caribbean, used to say that the farmers he knew in those countries – a great many of them – could all increase their output two or three times, with their existing techniques, if only they had the financial support enjoyed by farmers in the supposedly “free market” West, and if only there were suitable markets for their produce. As things are, poor farmers need to be ultra-cautious – meaning zero investment and commensurately little in return. Constant deficiency is not desirable but an expensive glut could be terminal (as it will doubtless prove to be for many over-extended pig farmers even in affluent Britain).  

Of course, traditional methods can be enhanced by the added insights of science, and sometimes also by high-tech. But in agriculture, the science that is most needed right now is that of ecology – which has been sadly under-supported, not to say sidelined, in favour in particular of industrial chemistry and genetic engineering, which are flashier and more immediately lucrative. More generally, what’s needed in all human affairs and perhaps especially in agriculture, is “science-assisted craft”: begin with traditional knowledge and methods and then ask what science could reasonably offer. To do as imperialist nations have done these past few hundred years – simply impose what seems to be profitable in the west on to the world in general – is a recipe for disaster, as now is all too obvious. 

It surely would be peremptory to write off all high-tech approaches out of hand, as the internet and modern vaccines demonstrate. But in all contexts, including agriculture, what really counts is intent. If we do have good intentions – good food for everyone, social justice, and a flourishing biosphere – then we need an appropriate economy and governance to translate those good intentions into action. With truly appropriate politics, economics, and mindset most of our current problems would surely disappear. Appropriate strategies may require high tech but we certainly should not assume that a priori. 

Specifically, the technologies that have emerged from molecular biology have made it possible to keep track of all the new strains of covid as they come on stream, and the techniques of genetic engineering, including gene editing, have helped scientists to provide new vaccines for each new strain as it comes on line.  Wonderful.  But GM crops are a different kettle of fish altogether. The value or otherwise of all technologies must be judged in context. 

In short: we certainly do need to look again at the role of science and high tech in human affairs, and we certainly do need a new approach to science education. But the kind of education we need, I suggest, is not what Sir Paul seems to have in mind. 

A new kind of education 

Sir Paul is surely right to tell us that “the way science works and develops needs to be taught in our schools”. More broadly, I suggest, children and, ideally, the world at large should be introduced not simply to the content of science – Boyle’s Law and so on, Newton’s mechanics, the rudiments of quantum theory, the periodic table, and, in biology, the modern synthesis of Darwin-Mendel-DNA and the rudiments of ecology — but also to the politics of science, internal and external, and the economics – including who holds the purse-strings and steers the ship. But most fundamentally, everyone in a properly functional democracy should at least be aware of the philosophy of science: what science is and – at least as important – what it is not. 

Emphatically, for starters, science is not the royal road to truth. Sir Paul tells us that “the bedrock of science is reproducible observation and experiment that takes account of all relevant data”. Indeed – though with three large caveats. First, the data can never be complete. We can never know all there is to know – and we cannot even begin to know what it is we don’t know. As Donald Rumsfeld famously if somewhat startlingly observed, we are beset in all fields by “unknown unknowns”: whole areas of thought that may be of key significance but of which we have no inkling. Second, it is logically impossible to gauge the extent of our own ignorance – to know how much we don’t know; for we could not know what we don’t know unless we were already omniscient, and could compare what we think we know with what there is to know. Third, we cannot pre-judge what is relevant. Sometimes what seems at first sight or even at second and third sight to be of no importance, and is written off as “anomalous” or “experimental error”, turns out to be crucial. Beyond doubt, “reproducible observation and experiment” is a huge advance on first impressions and guesswork but it is not and never can be the thing that lawyers somewhat absurdly demand in courts of law: “The truth, the whole truth, and nothing but the truth”. Omniscience is not in our gift. We may feel certain that we are right but we can never be certain that our certainty is justified. 

To be sure, in the early 20th century a group of philosophers based in Vienna who became known as the “logical positivists” declared that the only ideas that should truly be classed as serious science – or indeed the only ideas of any kind that were worth taking seriously – were those that could be verified. In effect, this meant “proved”; and the proofs that were taken most seriously were mathematical, for maths can’t be wrong (can it?). Since science was and is the only method of inquiry that could lend itself to verification, underpinned by the sure-fire algorithms of maths, it seemed that the ideas of science were the only ones that should be taken seriously – or at least should be given priority. 

Yet there are flaws in the logical positivists’ apparently seamless argument. For one thing, as the Austrian-born logician Kurt Gödel showed in 1931, all mathematical proofs that are not mere tautologies include assumptions that are not themselves provable. So maths too is a human pursuit, dependent on human decision. Then Karl Popper in the 1930s showed that although hypotheses can sometimes be disproved – that is, can be shown unequivocally to be wrong – they cannot be shown unequivocally to be correct. Certainly they could not be shown to encompass “the truth, the whole truth, and nothing but the truth”. What makes an idea “scientific” is not that it can be “verified” but that it could, at least in theory, be shown to be false. Thus, to take a simple but cogent example (though it’s not Popper’s), the idea that God exists cannot be said to be “scientific” — not because it cannot be proved but because it cannot be disproved. But it is possible (pace the logical positivists and Richard Dawkins et al) for an idea to be true and important even if it cannot be verified or indeed disproved. 

We can never prove that any idea we may have about the universe is true – or at least is “the whole truth” — because there might always be something we’ve overlooked – something that could upset the whole applecart. The best we can hope for (although I don’t think Popper used quite these words) is to show “beyond reasonable doubt” that a particular idea is – well, not true, necessarily, but at least “robust”. “Beyond reasonable doubt” is again an expression of English law – which this time is eminently sensible. Under any circumstances and no matter how many experiments we do, “beyond reasonable doubt” is the best we can hope for (bearing in mind that what seems reasonable today may seem most unreasonable in a few years’ time, and vice versa). For such reasons logical positivism was effectively defunct by the 1970s – at least in philosophical circles. Unfortunately, many modern scientists still cling to its ideas, even if they don’t call themselves logical positivists, or don’t realize that this is what they are. 

Two more insights, both dating from the 1960s, also seem very pertinent. First, the American philosopher Thomas Kuhn invoked the idea of the “paradigm”, and of the “paradigm shift”. Over time, he said, the combined efforts of scientists in any one field produce a general worldview, which Kuhn called a “paradigm” (although that word is more narrowly taken to mean “example”). In effect, a paradigm (worldview) is a story that we tell ourselves about the way things are — “things” meaning life, the world, the universe; all that is. 

The traditional view was, and in some people’s minds still is, that science is a great edifice of unequivocal truth, built over time stone by inexorable stone by the great global dynasty of scientists. The idea now, post-Kuhn, is that our scientific understanding is a succession of stories.  Sometimes the new story simply supersedes the one before, which was the case with the idea of “continental drift”: the idea that over time the continents shift position, break apart, and may re-combine elsewhere. This idea was first formally mooted in the early 20th century and then in effect was dismissed, since no-one could think of a convincing mechanism whereby such a thing could happen. Then in the 1950s and ’60s Earth scientists of various kinds developed the ideas of plate tectonics: continental islands floating on a sea of hot magma that is constantly stirred by convection. Now plate tectonics, including continental drift, is the orthodoxy. It’s hard to explain the world without it. The apparently commonsensical idea that the continents are nailed in place once and for all is dead and buried. 

Often, though – usually? – old paradigms are not simply wiped out. Often they are subsumed by whatever comes after. Thus for 300 years or so after Newton set out his ideas, scientists took it as read, done-and-dusted, that the universe runs according to the laws of Newtonian mechanics. Only a few developments, like the work of James Clerk Maxwell in the mid-19th century, threw doubt on this. Then at the turn of the 20th century Einstein picked up on Maxwell’s thoughts and developed the ideas of relativity. Then, even more radically, Max Planck and then Niels Bohr and his disciples and many more besides, including Einstein, began to reveal the wonders of modern quantum physics. This was a paradigm shift indeed, and a shift too from the complacency of the late 19th century, when many physicists felt that they really did understand the material universe as well as it could be understood and that there was nothing left to do but dot i’s and cross t’s. (Thank goodness they were wrong.) 

Yet – quite rightly – Newtonian physics lives on. The modern, albeit uneasy alliance of relativity with quantum mechanics seems to describe the universe as a whole more completely and accurately than Newton’s mechanics, but Newton’s mechanics remains as a special case within the greater whole. It is the physics that applies to the everyday world of middle-sized objects like apples and planets moving at middling speeds. Modern physics is needed to describe the very large, the very small, and things moving at speeds close to the speed of light. 

Sometimes – often – the leading paradigm of the day still leaves room for doubt. Thus not every physicist by any means is content with the idea that the universe began with the Big Bang 13.8 or so billion years ago. Many still prefer some variation on a theme of steady state or of a pulsating universe, repeatedly collapsing and then expanding again; a series of Big Bangs. Always, these days, it seems, the ideas of physics in particular become so refined and “sophisticated” that only those versed in the most arcane reaches of mathematics can get any real handle on them, while the rest of us (including a great many professional scientists) just nod (or else give up thinking and just do the maths). In the end science always runs up against mystery — unknowns and unknowables; and unknowables take us into the realms of metaphysics. In truth, in the end, what people believe or don’t believe depends not simply on the evidence, or on reason, and certainly not on anything as grand as verification. In the end it’s intuition that leads us to believe or reject any particular idea – and that goes for physicists as well as for theologians and poets and, indeed, all of us. In the end, when a scientist has done the maths and looked at all the evidence this way and that, what he or she chooses to believe is what they feel in their bones to be true. In the end the bones have it. Science is not rational all the way through and if it was it would not work. 

The second of the great insights of the 1960s (at least as I see things) came from the great zoologist-immunologist Sir Peter Medawar. He simply pointed out that science in the end, in practice, is and can only ever be “the art of the soluble” – no more, no less. He borrowed the expression from Bismarck’s “politics is the art of the possible”, for scientists, like politicians, can do only what they can do with the ideas and resources that are available to them at any one time. Thus, said Medawar, psychologists may yearn to understand the deep stirrings of the human mind; why we think and feel and act as we do; and what thinking and feeling actually are. Some psychologists like Freud and Jung were content to throw the net wide to embrace clinical experience of mental disorder, plus anthropology, and all literature – and thus ventured perforce into the realms of non-science: ideas that might seem to have enormous explanatory value and feel very satisfying and could indeed be true, yet are not theoretically disprovable and so fail the Popper test. But psychologists qua scientists sensu stricto felt obliged to devise experiments that they could control and which gave reproducible and quantifiable results. So while Freud and Jung and others offered exciting flights of intellect and imagination the experimental psychologists working within the confines of hard-nosed science had to content themselves with rats in mazes, and the strait-laced theories of behaviourism. Fortunately, things have moved on since then. Grander and grander ideas are now being formally put to the test (including the “anthropomorphic” idea that other animals too can think and feel and suffer from depression and feel socially excluded and all the rest. Humans don’t have a monopoly on sense or sensibility. David Hume and Alexander Pope said much the same thing in the 18th century, which wasn’t as hard-nosed as conventionally supposed). 

Thus science does indeed advance. The stories that successive generations of scientists offer by way of explanations become richer and richer. Folk tales give way to descriptions as complex and multi-layered as War and Peace – though never as complete and rounded as a work of fiction may be. The loose ends are the exciting bits that lead us into new vistas – ever onwards. Even so, in psychology and indeed in science as a whole we can never do more than titivate our paradigms until they can be titivated no more, whereupon we must move into a new paradigm. We can never provide the complete and unequivocal narrative. “Judgement is mine”, sayeth the Lord. And so, too, ultimately, is understanding. In the end we are obliged to acknowledge that the universe is beyond our ken. We can never be certain that our explanations have taken everything into account; and if we do feel certain, then we can never be sure that our certainty is justified, for it’s always possible that some new finding will precipitate yet another paradigm shift. Over time, the ideas of science become ever more intricate, and esoteric. But our evolved faculties must always leave us far short of omniscience. If ever we feel that we have reached the end of the trail it’s only because we have run out of imagination, or indeed of intellect; unable to understand the implications of our own findings. And in the end all of us rely on that mysterious property known as intuition to tell us what to take seriously; the feeling in the bones. 

The curse of scientism 

Yet the conceit persists: that comprehensive understanding – omniscience – is within our gift; that science is the route to omniscience; that the insights of science must trump all others; and that we are well on the way to achieving something very like omniscience. This view of science is known as scientism, the essence of which has been captured by an Oxford scientist, author of a definitive text on physical chemistry, Peter Atkins. The key passage reads:  

“Science, the system of belief founded securely on publicly shared reproducible knowledge, emerged from religion. As science discarded its chrysalis to become its present butterfly, it took over the heath. There is no reason to suppose that science cannot deal with every aspect of existence. Only the religious – among whom I include not only the prejudiced but also the underinformed – hope there is a dark corner to the universe, or of the universe of experience, that science can never hope to illuminate. But science has never encountered a barrier, and the only grounds for supposing that reductionism will fail are pessimism on the part of scientists and fear in the minds of the religious”. 

From “The Limitless Power of Science”, in Nature’s Imagination: The Frontiers of Scientific Vision, ed. John Cornwell (Oxford: Oxford University Press, 1995), p. 125.

The ideas of science in turn are translated into “high” technologies — the kind of technologies that can arise only from high-flown scientific theory, like atomic energy and the laser and genetically engineered vaccines; as opposed to the traditional, craft-based technologies that gave us sailing ships and ceramics and sewing machines. Just as science has spawned scientism, so high tech has given rise in some circles – including very influential circles – to what might properly be called uncritical technophilia: the belief that high tech (albeit at some unspecified time in the future) will be able to dig us out of whatever hole we dig ourselves into, and that only high tech can do this. In the immediate term, uncritical technophilia has led many people – including, it seems, governments like ours – to equate high tech with progress; and indeed to see high tech as the first port of call, the prime desideratum, in planning the future. In the longer term, in grander vein, uncritical technophilia has led many to suppose that high tech will eventually make us omnipotent, complementing the omniscience achieved through science. In our omniscient and omnipotent form we, humanity, will “conquer” space and disease and control all nature. We will bestride the universe like gods – which indeed, we are sometimes told by gung-ho scientists and entrepreneurs and politicians in Parnassian vein, is “Man’s destiny”. Sometimes, to be fair, the finger is wagged. Some warn us very properly to heed the chivalric principle of noblesse oblige: that as power increases, so must responsibility. But the neoliberal economists who are now shaping the world are apt to waive the moralizing. Que sera sera. What will be will be. 

Religion properly construed should provide an antidote both to scientism and to uncritical technophilia. We can be gods (always with a little “g”) only in the sense that Nero or Caligula thought they were gods. Clearly they were deluded – and so too are all those scientistic and technophilic zealots who seek to control nature and “conquer” space. Religion  too – or at least, more broadly, some inkling of spirituality — should surely put an end to neoliberal economics, the all-out scrabble for material wealth and dominance. 

Yet life is never so simple. In practice, science and religion both are misconstrued. Science all too often degenerates into scientism and religion all too often is bogged down in fundamentalism – absolute faith in some ancient text or guru. Thus there are plenty of scientism-ists and uncritical technophiles in the American Midwest, which is a world centre both of neoliberalism and of fundamentalist Christianity. 

Contrariwise, as the modern Catholic Church emphasises, when science and religion are properly construed, and understood as far as is possible, there should be no serious conflict between them. Both actually tell us that in the end the universe is beyond our ken. Scientists don’t need to be atheists: many are devout and draw inspiration from their religion – and vice versa. Many a man, woman and child has been drawn to religion by contemplation of the stars or of nature. The pioneer scientists of the 17th century commonly saw their researches as a means to understand the mind of God (with a capital G). 

I do not for one moment want to suggest that Sir Paul is guilty of scientism or of uncritical technophilia, and I have no knowledge whatever of his religious or political views. I do suggest, however, that his recent article in the Radio Times, and many others like it by many other scientists, provide the scientism-ists and uncritical technophiles and shoot-from-the-hip atheists with intellectual nourishment, and help to make their ideas seem respectable. We should indeed extol the virtues of science and be properly grateful for appropriate high tech – but we must remain critical.  Sir Paul, I suggest, at least in this “popular” article in a high-circulation magazine, is far less critical than is desirable. 

Decrees or dialogue? 

Education should never be conceived, as scientists so often seem to conceive it, as one-way traffic: wisdom dispensed de haut en bas by the full-time professionals and their paid advocates. Sir Paul in truth calls for a “proper debate”, which sounds eminently acceptable – except that such debates, at least as conducted on television as mass entertainment, tend to pit scientists, who do know something about science and generally are nice people, against various kinds of religious and political zealots. That may be considered good telly but it’s an obvious mismatch. What’s needed is “proper” discussion between professional scientists and representatives of the many millions of people – including a great many scientists, often very distinguished – who feel that scientists themselves have often mistaken the nature of their own metier and exaggerated its power; who nonetheless believe that science really is one of the triumphs of humankind, and really is vital if we and our fellow creatures are to survive in tolerable forms for more than a few more decades; and who hate to see science and its high technologies reduced to the role of handmaidens to big business and political expediency.  

In truth we need more than a debate, however “proper”. We need to re-think the basis of science education: to re-think what science really is, what we really want it to do for us, and why it is that in the end science is rooted in the unknowns and unknowables of metaphysics – as indeed is true of all really big ideas. In a truly rounded education we should explore the elusive concepts of “reality” and “truth”. We also need to look at the idea that behind the “laws” of physics and the patterns of biology lie influences that science cannot analyse, of the kind that is often called “transcendent”. For it is a mistake to suppose that because the ideas of science are so wondrous, and often seem so rounded and complete, science has therefore told us all there is to know. A feeling for transcendence is, I suggest, what ought to be meant by “spirituality”. 

We also need to look at the economics and politics of science, both internal and external: why the best ideas may often be sidelined while others that may in the end prove deeply pernicious rise to prominence simply because they appeal for whatever reason to the rich and powerful. 

Overall, science should be presented not as a collection of algorithms to lead us to an illusory god-like status but as a truly human pursuit: at its best scaling the heights of imagination and stretching even beyond them, yet also subject to the full gamut of human weakness. 

Above all we need to change the current attitude to science. Attitude is all. Right now the rhetoric presents science and the emergent high tech as means to “conquer” space and disease and in general to “control” nature – all for our own convenience and enrichment, or at least for the further enrichment of those who are already rich. Yet science should primarily be seen as an exercise not in control but in appreciation, as it was in the pioneer days of the 17th century: not to conquer and control nature for purely material ends but to help us to appreciate more fully the universe in which we are privileged to find ourselves. 

Science thus presented would surely awaken public interest and sympathy far more effectively than out-and-out advocacy or indiscriminate attacks on those who dare to take issue with the status quo. 

Footnote: All the above arguments are discussed in my latest book, The Great Re-Think (Pari Publishing, 2021). 

But I have been thinking about all this for quite a long time, and earlier relevant books include Good Food for Everyone Forever: A People’s Takeover of the World’s Food Supply (2011); The Secret Life of Trees (2006); So Shall We Reap: The Concept of Enlightened Agriculture (2004); In Mendel’s Footnotes: Genes and Genetics from the 19th Century to the 22nd (2001); The Second Creation: Dolly and the Age of Biological Control (co-authored with Ian Wilmut and Keith Campbell, 2000); Neanderthals, Bandits, and Farmers (1998); The Engineer in the Garden: Genes and Genetics from the Idea of Heredity to the Creation of Life (1993); Food Crops for the Future (1988); Future Cook (1980); and The Famine Business (1977). 


