Food and cities: Food self-sufficiency defies the laws of nature
I’ve written before about how self-sufficient Birmingham could be, and why we should bother with local food growing. But when it comes to feeding a densely populated area, local food growing doesn’t come close.
Local food growing has great civic, social and individual benefits — it is truly wonderful, as Caroline Hutton of Martineau Gardens so eloquently says!
But, unless we develop an at-present unimagined technology whereby we can, for example, manufacture new materials as foodstuffs, food self-sufficiency for a city defies the laws of nature.
The 3.6M of us here in the West Midlands live in a 60K-hectare conurbation. We’re in competition for food with the other 60M+ in the UK, shortly to be 70M+ . . . And there are over 6 billion elsewhere, rising to 9 billion in 2050.
At Monday’s Forum meeting I gave the figure that 10 people can live off a hectare of highly fertile intensively farmed land. It’s a crude figure, but accurate enough for this discussion, and it doesn’t look likely to change radically anytime soon.
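To see how far that crude figure gets us, here is a quick back-of-envelope check using only the numbers quoted above (the 10-people-per-hectare rate, the 60K-hectare conurbation and its 3.6M population):

```python
# Back-of-envelope check of the feeding-capacity figures quoted above.
PEOPLE_PER_HECTARE = 10          # crude figure: people fed per hectare of prime farmland
CONURBATION_HECTARES = 60_000    # West Midlands conurbation area
POPULATION = 3_600_000           # people living in it

# Even if every hectare were highly fertile, intensively farmed land
# (which it plainly is not, being mostly buildings and roads):
max_fed = PEOPLE_PER_HECTARE * CONURBATION_HECTARES
share = max_fed / POPULATION

print(f"Maximum fed: {max_fed:,} people ({share:.0%} of the population)")
# → Maximum fed: 600,000 people (17% of the population)
```

In other words, on the most generous possible assumption the conurbation could feed barely a sixth of its own people.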
At a different level, self-sufficiency at a local or national scale may well not be desirable either. It is not something recommended by either the GO-Science report The Future of Food & Farming or the House of Commons report Securing Food Supplies up to 2050, since therein lie threats to our own and other peoples’ food supplies.
The recent CPRE report From field to fork: The value of England’s local food webs gives examples of great local food webs and their value, including monetary value, to communities. But as a contribution to the nation’s food requirement, their own sums suggest a potential of only some 2% of what we require. (To reach that percentage figure, I’ve taken their figure of £2.7bn for potential future local food supplies [see p5] and the figure of £156.8bn cited by the IGD as the size of the UK grocery market in 2011.)
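The arithmetic behind that percentage, using the two figures cited above (the CPRE’s £2.7bn potential for local food supplies and the IGD’s £156.8bn for the 2011 UK grocery market):

```python
# Reproducing the "some 2%" figure from the CPRE and IGD numbers cited above.
local_potential_bn = 2.7     # CPRE: potential future local food supplies, £bn (p5)
uk_grocery_bn = 156.8        # IGD: size of the UK grocery market in 2011, £bn

share = local_potential_bn / uk_grocery_bn
print(f"{share:.1%}")
# → 1.7%  (i.e. "some 2%" of what we require)
```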
It would be irresponsible for city leaders and socio-political decision-makers to base their plans or action on the assumption that their city or conurbation could be self-sufficient in food.
Cities don’t have the land. And vertical farming is a non-starter for now because of capacity, materials structure and light . . . there may be technologies soon that can meet the last two of these issues but not the first of them.
Although when I was a little babyboomer kid we did know how self-sufficient the UK was, these figures aren’t easily accessible today — I’m conferring with experts on just this matter. Sure, we now have intensive farming, so I’m supposing we do better now than we once did. But it no longer concerns the population. Yet, given the global challenges we’re facing this century, it should!
For those who wax lyrical about what happened during World War II, the dig for victory, the pig in the back garden and all that . . . here are some factoids: the UK produced a mere 30% of the food for its below-50M population at the outbreak of World War II. Despite local initiatives, during the War and for some time afterwards people had a meagre diet, going without certain key nutrients*, and there was rationing, which lasted until July 1954. Indeed, potato rationing began in 1947, in response to the impact of the bitter winter of 1946-7 on the potato crop.
As a 1950s child, I remember pretty dire food, and shortages. We were given cod liver oil, virol (high calorie malt) and milk as the ‘normal’ child’s diet was deficient even with the most knowledgeable, affluent and conscientious of parents.
As for now and the future: there will soon be nine billion of us, requiring twice the food production we have now. This needs radical thinking and, crucially, realistic thinking too.
* The example that the link provides is an article published in the British Journal of Nutrition (2000), 84, 247-251. Its title is Nutritional Research in World War II: The Oxford Nutrition Survey and its research potential 50 years later by Huxley, Lloyd, Goldacre and Neil. Its abstract reads thus:
To investigate the nutritional status of the population of the UK during the Second World War, nutritional surveys were commissioned in 1941. These included surveys of two groups of pregnant women: the first comprised 120 working-class women who were studied in the spring of 1942, and a second group of 253 women in 1944. Both groups were followed up until after delivery. Detailed biochemical assessments were performed on each subject. Our statistical analysis of the haematological data showed that nearly 25 % of women from the 1942 group were deficient in protein, over 60 % were deficient in Fe and vitamin A, and over 70 % had severe vitamin C deficiency. The findings were reported to the Ministries of Health and Food who instigated a food supplementation policy at the end of 1942 that entitled pregnant women in the UK to extra rations of fruit, dairy produce and to a supply of cod-liver-oil tablets. A second group of 253 pregnant women were studied 15 months later which enabled the effects of this programme to be investigated. Supplementation reduced the proportion of women with vitamin A concentrations below the normal range from 63 % to 38%, and vitamin C from 78 % to 20%, but protein and Fe concentrations were not increased but actually declined. These findings continued to exert an influence over government food policy for pregnant women until the abolition of rationing in 1954.