Recently I had the pleasure of working with Asterisk as part of their second issue, covering food (here's a link, and there were many great articles). In the end I gave an interview with their team on my own work and research on Abrupt Sunlight Reduction Scenarios (here), discussing the challenges of building resilience to these events, and some of the broader risks and shocks to food security the world has seen recently.
Hopefully the interview was of interest, especially for those who have not considered nuclear or volcanic threats to food before. However, the initial plan was for me to write an article on the topics above. That draft ended up being far too long for their needs and covered many different food security topics, so we pivoted to an interview to present a cleaner narrative.
While the full article was not used, it seemed a shame to waste it, so following some edits and a clean-up I present it below. Ongoing food security threats are both serious and neglected, as well as perhaps being less well known to readers compared to other EA cause areas. I also find the topic very interesting in and of itself: we all interact with food daily, it shaped our global history, and the elements of the cause straddle short-termism and long-termism. Here I highlight two areas of concern: the challenges of managing the surging complexity of the global food system, and preparing for events that abruptly reduce sunlight. I hope you enjoy.
As a disclaimer, I work at a food security non-profit, the Alliance to Feed the Earth in Disasters (ALLFED). While the organization studies the threats listed below, alongside other neglected risks, I should stress that the words in this article are my own, as are any mistakes.
***
We have never produced more food per capita. This food is of course deeply unequally distributed and there are still far too many who go hungry, but the change in living memory has been profound. As recently as 1970 over 34% of people in developing nations were undernourished by the United Nations Food and Agriculture Organization's (FAO) definitions, while in 2019 it was around 8.9%[1]. Even with population growth that is hundreds of millions fewer people in hunger, and it should also be compared to a world prior to 1900 where the vast majority worldwide (80-90% at least) were undernourished by the same standards.
This is incredible progress, and while eliminating hunger altogether has proven extremely difficult even in high income nations, there is no fundamental reason why we cannot see further improvements, potentially one day eradicating it altogether.
However, there are still serious threats to food security that receive little awareness worldwide and little to no preparation for their occurrence. In this article I wish to highlight two of these with potentially catastrophic consequences: our reliance on complexity, and abrupt sunlight reduction scenarios. I also want to discuss how they interact with the rest of the food system, and some ideas for how we could prepare and respond, which also has relevance for other threats such as climate change.
***
The modern world has brought incredible changes to the way we live and work, and agriculture has been no exception. In particular, the way we produce food has become steadily more complex over time, making use of a revolution in seeds, inputs and trade that is still continuing today. This has been very good for human flourishing, and is a true success story of our times. Complexity lets us produce far more food with the labor of far fewer people, trade away production shocks, and frees those of us who do not cultivate food to pursue other endeavors. I greatly respect farmers, but it's not for everyone, and in the past the vast majority of labor was by definition tied up in rural activities. However, our new food system also comes with new risks, and opens up the threat of a truly catastrophic shock should it unravel.
Complexity is quite rightly a complex thing to define, but by complexity here I mean the number of steps that have to function in a chain before food arrives in your mouth. In a simple society you might rely on just your family and neighbors for all your needs, while in the complex modern world a single meal may link you to a tiny fraction of the work of literally tens of thousands of separate people.
The simplest and earliest farming was almost entirely self-sufficient, based around a family unit that supplied pretty much all that they needed by themselves, alongside some barter with their neighbors. This simple farming by its nature has to be risk averse: it's hard to reliably bank surpluses between years at low levels of complexity, and producing a higher yield in 5 out of 6 years does not help if your crop gets wiped out and you starve in the final one. In addition, farmers at this level lack access to capital to save between years or to finance inputs to raise output, and often also lack a market in which to sell a surplus or purchase food to cover a shortfall. This is very different to higher complexity farming, where crop insurance, trade and effective storage allow surpluses and deficits to be balanced against each other.
As a result, the lowest level of complexity comes with very low yields, and very little surplus per farmer. This forces most of the population to farm or gather food, and while it is perhaps not the worst life it means people cannot do much else, with famines still all too common.
So what can change this? Historically it can start small: for example, civil works that bring irrigation water to fields, and the construction of roads and ports. Trade links can then develop, so that food can move to newly forming cities and between areas of surplus and deficit. Now money has more of a use to farmers, and can be used to hedge against a poor year, allowing specialization into higher yielding crops. This monetization allows tool manufacturing to specialize, and more field capital to be introduced, such as mills and horses/other prime movers so you aren't working by hand, plus all the equipment needed to harness them (literally).
Then came the industrial revolution, which completely blew everything apart. Cloth no longer needed to be spun by hand (a massive drain on labor, often of rural women), and mechanical muscles in the form of trains and tractors could replace humans and horses in plowing fields and moving harvests. The benefits of rail and steam ship logistics were also huge: farmers in the vast plains of America could now start reliably feeding the growing industrial cities of Europe. Combined with tractors, this freed up labor from the fields, and meant that far fewer people could produce far more food, in much less time.
Conditions were - and are - tough in many factories, but the precarious life of a smallholder farmer is often worse. According to the United States Department of Agriculture, the average field in America's heartlands earned revenue of around US$670 per hectare per year cultivating corn (maize) in 2020, and that is before input costs and the value of your time are subtracted. Those are good figures worldwide, meaning any farm where a family cultivates one or two hectares will never be rich, be it in America, Europe, Africa or Asia, unless it is really farming government subsidies. This was the case for all of history, and will remain so into the foreseeable future.
Instead, rising rural wealth has come from consolidation and rural to urban migration, with average farm sizes rising and efficiency improving as a result. It may seem like progress in agriculture has been slower than in other areas, but it has still been astounding. According to Vaclav Smil's estimates[2], it took around ten minutes of labor in 1801 to cultivate and harvest a typical kilo of wheat in the USA. By 1901 mechanization had reduced that to 1.5 minutes, and now the average with modern equipment and higher yields is just two seconds per kilo, and we are still improving our techniques.
Another driver of the agricultural revolution has been fertilizers, and it is hard to oversell how important the Haber-Bosch process for securing nitrogen has been in expanding the carrying capacity of the earth. Liebig's law of the minimum states that plant growth is limited by the input in shortest supply. For example, if you have an ideal climate and nutrients but only half the water you need, you still only get half of your optimal growth.
Typically the minimum experienced by farmers was nitrogen, especially in environments with reliable rain and sunlight. Nitrogen itself is incredibly abundant and forms the majority of the air we breathe, but its strong chemical bonds make it useless to plants in that form. Instead, nitrogen needs to arrive in a more usable form, typically as ammonia, nitrate compounds, or urea. This transformation happens naturally through lightning strikes and bacterial action, but those routes fix only around 10-15 kg of nitrogen per hectare per year, versus an optimal requirement of around 110-140 kg for a harvest of typical grains. Meanwhile, some plants do fix part of their own nitrogen by hosting bacteria in their roots, but this is typically not enough to meet even their own optimal needs, let alone the needs of following harvests.
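To make that gap concrete, a quick back-of-envelope comparison using the figures above:

```latex
% Natural nitrogen fixation vs. optimal grain requirement (figures as quoted in the text)
\underbrace{10\text{--}15\ \mathrm{kg\,N\,ha^{-1}\,yr^{-1}}}_{\text{lightning and bacterial action}}
\;\ll\;
\underbrace{110\text{--}140\ \mathrm{kg\,N\,ha^{-1}\,yr^{-1}}}_{\text{optimum for typical grains}},
\qquad
\frac{10\text{--}15}{110\text{--}140} \approx 7\text{--}14\%
```

In other words, natural fixation alone supplies perhaps a tenth of the nitrogen a typical grain harvest can productively use.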
The need to secure nitrogen rich fertilizer has been known for thousands of years, well before the element was identified or people knew exactly how it worked. The most readily available form was animal and human manure, which has a low but still usable concentration of nitrogen (typically just 1-3%). However, many countries had already reached the limit of nitrogen from humans and animals by the time of the industrial revolution, with the supply from herds and cities fully utilized. The USA and a few others even started mining guano from remote islands, digging up and shipping the accumulated droppings of hundreds of years, but this could never be a long term solution.
Then came Fritz Haber, who in the early 1900s invented a process whereby nitrogen and hydrogen (the latter typically derived from natural gas) are cycled at very high pressure over a catalyst to produce ammonia. Together with Carl Bosch (who designed a way for this to be done practically at industrial scales) this managed to produce usable ammonia in 1910, for which they both eventually secured Nobel prizes. Initially the process was intended to produce explosives, but it could also be used for fertilizer production, with massive consequences for agriculture once the process became widely affordable in the mid 20th century. Urea, which can be produced from ammonia, is around 46% nitrogen, easy to apply, and has lower rates of volatilization than the critically limited supply of manures, making it ideal for farming.
In total, the FAO estimates that we now apply around 110 million tonnes of synthetic nitrogen each year in the form of ammonia and urea fertilizers produced via the Haber-Bosch process. This compares to 17 million tonnes of nitrogen from atmospheric deposition, 27 million from manures, and 34 million tonnes fixed by crops, meaning that over half of the crops we produce rely on hydrocarbons in the form of natural gas for their growth. In reality this figure is higher: much of that 27 million tonnes of manure implicitly contains the synthetic fertilizers used to grow animal feed, and if we ceased the Haber-Bosch process entirely we would also need to radically reduce herd sizes, further reducing the nitrogen available to us via a decline in manure. Remember that 18th and 19th century societies were already maximizing their usage of manure and still fell short of their needs; there simply are not enough natural fertilizers to make organic farming viable in a world of 8 billion people, unless we massively expand our croplands at great ecological cost.
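Summing the FAO figures above shows where "over half" comes from, before even accounting for the synthetic nitrogen embedded in manure:

```latex
\frac{\text{synthetic N}}{\text{total N applied or fixed}}
= \frac{110}{110 + 17 + 27 + 34}
= \frac{110}{188}
\approx 59\%
```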
At last, with synthetic fertilizers the final minimum of Liebig's law could be lifted, and combined with plant breeding this meant yields could surge. In the United States maize yields have risen from under two tonnes per hectare in 1940 to over ten tonnes today, and it is a similar story for wheat in Europe (from under one tonne per hectare in 1900 to three to four tonnes by 1940 and around eight tonnes today in Northwest Europe), with yields still rising year on year. This progress is not uniform, and there are significant yield gaps globally; however, that means there is still huge potential to raise output by deploying technologies we already possess on land we already cultivate.
You may have heard facts which seem to directly contradict the above, for example that the majority of food comes from smallholders rather than high complexity farming operations. Even the FAO has published that 70% of our food comes from smallholder farms; however, this seems to have been the result of confusion with their own sources. It is not 70% of food that comes from smallholders, but rather from family owned farms, which can extend into the thousands of hectares in places like the United States and Australia. Smallholders are over 80% of farmers by some estimates but supply just a third of our food, making them among the poorest people globally, and the hungriest.
So now we arrive in the present day, at a level of output and complexity across our food system that past generations would marvel at. It's worth stressing how useful this is: complexity in the food system is a good thing, and has gone a long way in allowing us to build the modern world. It also underpins the reductions in famine and improvements to diets worldwide. I have great respect for those who farm in smallholder conditions, but it is not a life that I would choose, and many of those who are smallholder farmers today would and do leave when they have other options.
However, high complexity also opens up new risks: it lets us trade away the small shocks but opens the door to much larger ones. In addition, it would be incredibly difficult or maybe impossible to go back a step in complexity, meaning if we fall we fall hard. Firstly, we have often lost the capital and skills required; for example, draft animals and farmers with the experience to use them are pretty much gone across our breadbasket regions. Secondly, the previous methods would be radically insufficient to meet today's needs, with a world of 8 billion and counting versus the roughly 1.6 billion of 1900.
There have been instances of food systems losing complexity in the past - with catastrophic outcomes. Mostly this happened when the empires that maintained the required complexity disintegrated, leading in turn to a collapse in farming output and trade, with ruinous consequences for both rural and urban populations. For example, the fall of the Han in China in the late 2nd century led to well over half of the Chinese population perishing from a pre-disaster level of over 56 million, and even 125 years later the recorded population was just 35% of this pre-collapse total. Meanwhile, the fall of the Western Roman Empire from the 4th century onwards was also spectacularly bad for Europe, with tens of millions of estimated deaths and many areas only seeing their populations return to pre-collapse levels in the 13th or even the 19th centuries[3].
Both falls occurred for complex political and social reasons that are still disputed today, and both involved serious conflict. However, conflict was not the leading driver of mortality, accounting for less than 10% of deaths by most modern estimates. Instead, it was hunger, and the diseases that followed it, that killed most people. In particular, the collapse of trade links in both cases isolated entire regions from the foods they relied upon, as well as restricting access to farming equipment. These were societies where the majority of the population was rural and knew how to farm, but they still saw devastating famines after losing access to trade, capital and inputs - abrupt degrowth in agricultural complexity is catastrophically bad.
So what might threaten our level of complexity? It may seem like nothing could. Unlike the Western Roman Empire or Han China, the current food system is not contained within one country or empire. The sources of key inputs and crop production are concentrated - the top 5 breadbaskets (China, the United States, Brazil, India and Russia) account for over half of global cereal output - but the concentration is well below other strategic sectors such as microchips, personal protective equipment and pharmaceuticals. We also produce a significant surplus over human needs (over half of human edible crops are fed to animals or go to biofuels) and there are stored crops, meaning we can potentially respond to shocks without cutting human consumption. Maybe we are fairly safe.
However, there are also reasons for serious concern. The Russian invasion of Ukraine was one warning: both countries are key exporters of staple crops as well as inputs, and at a stroke conflict disrupted over 10% of the global trade in wheat, maize and barley. Meanwhile, sanctions and later direct damage also disrupted the flow of natural gas from Russia, as well as Russian and Belarusian fertilizer shipments, raising prices worldwide. This has caused real hardship, especially for low income citizens in food importing countries across the Middle East, and a larger conflict directly involving great powers would be on another scale altogether.
Conflict has often been a key driver of food system disruptions; however, it is not the only credible future cause. It was forgotten in the chaos of early 2020, but the COVID-19 pandemic also put pressure on the global food trade, as poorly targeted lockdowns meant workers could not reach ships and ports, and countries scrambled to secure their own supplies by restricting exports[4]. For a time this was of real concern: past agricultural shocks have been made far worse by countries suddenly raising their imports or cutting exports (as was the case in 2007/08), and in the early months it was possible that parts of the world would see a food crisis despite a global surplus.
A loss of complexity need not come from any single massive event, although a serious nuclear conflict, or possibly a geomagnetic storm, could cause enough damage to trigger one by itself. Instead, it could be caused by a shock or disaster whose consequences cascade outwards as trade or production breaks down. Another global pandemic causing a breakdown in our transportation infrastructure is certainly one candidate, and a serious pest outbreak or crop failure, a loss of critical infrastructure for a sustained period, or a significant non nuclear conflict could also potentially lead to a cascade.
How resilient or exposed to such shocks we are is a very difficult question to answer, and every year we add more complexity in order to feed more people at lower cost. It is therefore possible that novel and poorly considered threats to our food systems are emerging, and that tipping points are shifting in unpredictable ways. The nightmare scenario is a gridlock: a disaster that pushes a system capable of producing great plenty into locking up, with inputs unable to reach farms and outputs unable to reach consumers. Potentially such a gridlock could be cleared after a year of normalcy, but a bad enough crisis could make that impossible, leading to starvation and a loss of civilization until the greatly reduced food and mouths are back in balance. Given how many metaphorical rungs may have been lost on the technology ladder below us, this could be a long way to fall, and avoiding such an outcome must be a priority if the possibility exists.
***
The second threat I wish to highlight is that crop growth relies on the climate, which varies. This is a rather obvious point, but it makes agriculture fundamentally different in its dynamics to most other sectors. In addition, it's almost impossible to replace our reliance on the climate entirely in the foreseeable future. It is possible to grow crops under artificial light, but at a wildly high cost: at US 9 cents/kWh it would cost just under US$350[5] to grow a kilo of wheat, versus a typical wholesale price of around US$0.15-0.3 for the same kilo. Even if we devoted all of our electricity output to artificial light under our most efficient bulbs, we could only feed around 10% of the world's population.
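The cost figure follows directly from the energy requirement given in the footnote:

```latex
3{,}850\ \mathrm{kWh/kg} \times \mathrm{US}\$0.09/\mathrm{kWh} \approx \mathrm{US}\$346.50\ \text{per kilo of wheat}
```

That is over a thousand times the wholesale price of the grain itself.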
Meanwhile, rain is also absolutely fundamental to agriculture, with the majority of the world's croplands and the vast majority of grazing areas being rainfed. Of course irrigation also exists, but it is not uniformly better for farmers: it comes at a cost, and is ultimately still reliant on rains to fill reservoirs and replenish aquifers. As a result, when rains are disrupted, as with the El Nino of 2015 across Southern Africa, it is common for rainfed cropland to be hit the year of the shock and irrigated areas the following year. Irrigation can buffer farmers, but it does not disconnect them from the need for reliable rains.
For most of history climate variability was a significant driver of famine, and within a region yields can swing significantly. More recently, however, the impacts of weather on food security have lessened, as yield shocks tend not to be correlated at the global scale. This means that if we can trade effectively between areas of surplus and deficit, the flows can typically offset the impact of bad weather, at least for those with the income and infrastructure to access the global market. For example, since 1961 (when the FAO's detailed data begins) average global cereal yields have only differed from their steadily upward trend by low single digit percentages, even while some regions saw their output collapse in a given year.
This is very good in typical years. However, what happens when a weather shock is severe, lasts multiple harvests, and is global?
This is a concern with a changing climate: farmers can partially adapt to changing average conditions by shifting the crops they plant or their growing seasons, but rising volatility is far harder to deal with, especially if the climate shocks are regional. As a result, the potential for significant food shocks is growing over this century, with very serious implications. Within a crop year it is very hard to squeeze more production out of an already planted harvest at short notice; instead, stocks must be drawn down or consumption reduced to balance supply and demand. Given how fundamental the need for food is, small movements in output can push prices very high indeed. For example, in 1974 a 4% decline in global corn harvests pushed up prices by 20%. Coming on the back of the energy crisis that had already raised prices in 1973, corn exceeded US$100/tonne wholesale (US$600/tonne in today's money) in the United States, the highest level in real terms since the damage caused by WW2. This compares to prices of around US$210-230/tonne in 2022, which have caused real hardship.
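One rough way to read those 1974 numbers is as an implied price elasticity of demand; this is my back-of-envelope inference rather than a formal estimate:

```latex
\varepsilon \;\approx\; \frac{\Delta Q / Q}{\Delta P / P} \;=\; \frac{-4\%}{+20\%} \;=\; -0.2
```

Demand for staple grains is highly inelastic, so each percentage point of lost supply moves prices by roughly five percentage points.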
However, there is another category of shock which could be even more severe, and which I want to focus on, given that it is my field of study and how critically neglected it is. Abrupt sunlight reduction scenarios (ASRSs) are events that eject material high into the stratosphere, where it reflects and absorbs sunlight before it can reach the earth's surface. This reduction in sunlight ripples throughout the climate, with lower temperatures leading to lower evaporation and therefore rainfall, disrupted monsoons and high humidity. The impacts of such an event can persist for years (depending on the type of material), and would be global in scope as particulate matter diffuses throughout the atmosphere.
Such events may seem unlikely or implausible, but they have occurred several times over our history, with devastating effects. The most recent was the 1815 Tambora eruption, which led to around a 3-5°F (2-3°C) decline in average global temperatures in "The Year Without a Summer" of 1816, and 5-20% crop losses across much of the world. Coming on the back of the Napoleonic wars it has perhaps been partially overlooked by history, much as the Spanish Flu left less of a mark on collective memories than WW1, but the eruption led to excess mortality across Central Europe and China of at least one hundred thousand people, and probably far more who went uncounted.
Future volcanic eruptions may cause even worse hardship in our modern interconnected world, and eruptions of a similar magnitude to Tambora, of around VEI (Volcanic Explosivity Index) 7 or above, have a 15-20% probability per century based on the recent estimates of Cassidy and Mani[6]. Meanwhile, VEI 8 plus eruptions are also possible, which could lead to catastrophic levels of cooling in the order of 10-35°F (5-20°C) after 1 to 2 years. Eruptions in this category have not occurred since we adopted agriculture, and had one occurred it would have caused harvests to almost completely collapse, throwing us back centuries.
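For intuition, the per-century figure can be annualized if we assume such eruptions arrive as a Poisson process (my simplifying assumption, not Cassidy and Mani's):

```latex
\lambda = -\frac{\ln(1 - p_{100})}{100} \approx -\frac{\ln(0.80\text{--}0.85)}{100} \approx 0.16\text{--}0.22\%\ \text{per year}
\quad\Rightarrow\quad
P(\text{next 50 years}) = 1 - e^{-50\lambda} \approx 8\text{--}11\%
```

So over a 50-year horizon the chance of another Tambora-class eruption is very roughly one in ten.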
In addition, splitting the atom has opened up a new ASRS threat: nuclear winter. Nuclear strikes on cities or zones with high levels of combustible materials can generate firestorms (as occurred at Hiroshima, and after some conventional strikes on cities during WW2), which loft material high into the stratosphere. Once there, it can persist for 10-12 years, and the impact of this soot on the climate has the potential to be even more catastrophic than the war that caused it.
Nuclear winter is typically modeled in terms of teragrams (Tg, equivalent to a million tonnes) of soot lofted into the stratosphere. The projected severity ranges from around 5-16 Tg for a regional nuclear war involving 100-250 nuclear weapons of 15 thousand tons of TNT equivalent (kt) used over cities[7], up to 47-150 Tg for a significant nuclear exchange of around 500-4,400 weapons of 100 kt each striking cities[8]. The science and estimates here have evolved, both as new climate models have become available and as arsenals were reduced post cold war. While unknowns remain (for example how weapons would be targeted, and the combustion dynamics of modern cities) there remains a significant risk that any nuclear targeting of cities could lead to a nuclear winter.
These nuclear exchanges could kill tens to hundreds of millions immediately, but a nuclear winter of the magnitudes above could lower post-disaster food output by 7-25% in the 5-16 Tg scenario, and by 50-90% in the 47-150 Tg scenario[9], potentially starving billions. We are rightly concerned about climate change in the range of 3-4°F (1.5-2°C), but a 150 Tg nuclear winter could cause temperatures to drop by as much as 40°F (20°C) across the United States within just two years of the conflict, leading to the collapse of conventional agriculture.
The default assumption may be that there is nothing that could be done in such a scenario; however, this is arguably not the case. Responding to even the most severe volcanic and nuclear winters may be possible, and while the challenges here are huge there is nothing fundamental preventing enough nutritious food being produced. For perhaps the first time in our history we have the technology, skills and resources to actually prepare and respond, for example through the use of resilient foods to raise output.
Resilience is of course a fuzzy concept, but in the case of sunlight blocking catastrophes resilient foods are those whose output is less correlated with the climate. For example, switching planted crops to cold and drought tolerant varieties provides a vital degree of resilience, and human inedible feeds could be prioritized for more feed efficient systems like dairy where possible to offset even more of the impact. Meanwhile, at the more extreme end of resilience there are foods which are almost entirely disconnected from the climate altogether. Microbes can convert feedstocks such as natural gas or hydrogen into protein, which can then be used as feed for animals, eaten directly, or, hopefully one day, turned into analogues of meat and dairy products. This fermentation technology is currently being explored for commercial applications in several countries, and its potential for designer foods based on gene editing is fascinating, but it could also be used to keep producing food reliably if the climate is disrupted. Finally, in severe scenarios cellulose can be processed into sugars (the basis of cellulosic ethanol fuels), and seaweed cultivation/aquaculture is resilient to climate swings due to the moderating effects of the ocean.
***
So what happens if something of Tambora's order of magnitude happens again, tomorrow? What would we need to do?
Right away, governments are going to need to realize how serious the situation is, and take charge. There will be time to act, but not much. We need to learn the lessons of COVID: the chaos and denial of the early days, which lurched into panic once the pandemic was undeniable and the early window had been squandered.
Firstly, people would need to be told about the situation, and it is likely that there would be serious doubt and fear. A high magnitude eruption would have clearly occurred, but the climatic consequences would not be immediately obvious, as the full shock would not arrive for several months to a year. The default assumption may also be that there is nothing we can do, and that apathy must be challenged: there would be time to respond, but not much. However, within a year the sky will likely be visibly darker (something widely commented on and painted[10] during the year without a summer), and the reality will be undeniable once crops fail.
Secondly, the knee jerk reaction of governments may well be to hoard stocks, with key producers banning exports and large scale importers massively raising orders, and this absolutely needs to be avoided. There can be enough food if we raise output and cut back non food uses, but this will require functioning global trade. Panic again will be the enemy here, and commitments to maintain exports from key nations would do a lot to steady prices (such as the commitment Japan made to calm rice markets in 2008, which ended the panic with very few actual exports needed).
Globally, we have enough stocks for around 4-6 months of consumption at current levels[11], depending upon the time of year and the performance of recent harvests. This could be stretched to around 12 months if we cut back on non food uses such as feed and biofuels, which could buy vital time if part of a crop is lost, but it cannot cover the full shock of an ASRS: we need to find ways to increase output.
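The arithmetic behind that stretch is simple: as noted earlier, over half of human edible crops go to feed and biofuels, so halting those uses roughly halves the rate at which stocks are drawn down, doubling their coverage. A sketch, treating non food uses as half of the draw:

```latex
\text{coverage}_{\text{food only}} \approx \frac{\text{stocks}}{0.5 \times \text{current draw rate}} \approx 2 \times (4\text{--}6\ \text{months}) \approx 8\text{--}12\ \text{months}
```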
Right away, farmers would need to start preparing to shift their cultivation, with all the support they can get. The crop in the ground would likely survive, but the next planting would have to adapt to the new reality of the coming shock. In particular, crops that like the heat (such as maize) are likely to struggle in many parts of the world, and farmers' hard-earned experience of their local growing conditions will be of little use when their typical climate is so disturbed. As a result, accurate and ongoing climate modeling will be vital, with the results translated into crop yield forecasts and communicated to farmers so they can make accurate planting choices. There is room for government support here, in the form of information, loans, crop insurance, and price guarantees, so that all farmers have the knowledge, resources, and confidence to plant the crop most likely to succeed, and planning for this ideally needs to happen well in advance of the disaster itself.
In addition, resilient foods would need to be quickly expanded where possible to meet the shortfall in the coming harvests. Single cell proteins look particularly promising given how crop-intensive animal products are, but cellulosic sugars and seaweeds also have potential. It is unlikely that people would live on these products alone, but they can form part of diets, and could also replace human edible feeds that would otherwise go to animals. These foods are not expensive to produce, and in many cases make use of currently available capital, but having pilots completed ahead of time, or a small viable industry already established, could massively increase the speed at which output scales.
And these are just the foods and methods already identified as viable; there will be many others. One of the most impressive things about the COVID-19 pandemic and past crises such as WW2 was how fast innovation can occur when there is a burning challenge for the world to work on. There is room for governments to help here: for example, fast tracking regulatory approvals, guaranteeing prices for nutritious foods that meet certain criteria, or grant funding promising solutions. There are also ways government intervention can go very wrong, which again we saw in the pandemic: slow approvals, political wrangling over resources, and refusals to cover the costs of businesses were all too normal. Lessons could be learnt from our past challenges, and ideally best practice would be shared and copied globally as the crisis progresses.
On top of this, producing enough food is a necessary but not sufficient condition for everyone being fed; we would need to take steps to ensure foods are equitably released and available to all. This wouldn't necessarily require global rationing of all foods in the case of a shock like Tambora, and a few policy changes could have an outsized impact. For example, vast amounts of edible cereals and oilseeds are processed into biofuels worldwide (over a third of maize in the United States, for example), as well as used to feed animals. Currently, legal mandates in key markets force fuel blenders to include a certain amount of biofuels in the gasoline and diesel they sell, independent of the situation in food markets. If mandates were relaxed during the disaster, this could free up crops that would otherwise be turned into fuels - the equivalent of 600-700 million people's worth of food in the United States alone.
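As a sanity check on that 600-700 million figure, here is a back-of-envelope version; the harvest size and caloric values are my own illustrative round numbers, not from the sources above:

```latex
% Assumed: ~380 Mt US maize harvest, ~1/3 to ethanol, ~3,650 kcal/kg maize, ~2,100 kcal/person/day
380\ \mathrm{Mt} \times \tfrac{1}{3} \approx 1.27 \times 10^{11}\ \mathrm{kg},
\qquad
\frac{1.27 \times 10^{11}\ \mathrm{kg} \times 3{,}650\ \mathrm{kcal/kg}}{2{,}100\ \mathrm{kcal/day} \times 365\ \mathrm{days}} \approx 6 \times 10^{8}\ \text{people}
```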
On top of this, animal systems by their nature lose a large part of the calories in human edible feeds, and any measures to cut back on meat consumption and divert inedible feeds to dairy herds (where conversion ratios are most efficient) would free up food for human consumption. These livestock adaptations were adopted quickly across much of the world following WW2[12], and could be again. Some of this would happen naturally through market forces, but farmers may need loans and support, especially those currently reliant on feedlots.
All of these measures would require preparation ahead of time, as well as a global response post-disaster and a degree of international cooperation. There are some examples from our past that could help, such as the experiences of WW2, and we would need to determine how the public and private sectors could work together and what policy changes and investments should occur in advance. The challenges are huge; however, if we plan effectively there is the potential for humanity to be insulated against ASRSs for the first time in our history.
In addition, many of the measures to prepare for a volcanic or nuclear winter could also have benefits for building resilience to climate change, and would create a more robust food system in general. Planning for a significant crop shock simply makes sense, and measures to reduce our exposure to variance and to prioritize food for direct human consumption where necessary will certainly not be wasted.
***
However, what if instead of a volcanic winter on the scale of Tambora and a loss of 7-25% of crops, the precipitating event was a massive nuclear exchange leading to a 47-150 Tg nuclear winter, with 50-80% of food output lost without a response? The disaster here would not come from a force of nature, but from the careful and diligent efforts of many thousands of men and women to inflict damage upon each other's nations. For a nuclear winter to have occurred, strikes would have targeted cities, critical infrastructure and industrial capacity across multiple countries, greatly complicating any response.
As I mentioned earlier, resilience is a fuzzy concept: the resilient foods discussed above are selected to be resilient to disruptions to the climate, not necessarily to losing complexity. Most countries would not be directly targeted, which would support a response, but even their supply chains and communications would be heavily disrupted, and with little warning.
Intelligent preparation for disruptions to supply chains in both nuclear and non nuclear nations could save millions of lives, as well as having benefits across a number of other potential disasters, but the challenges here are huge. Trade must continue to flow, and farmers must have access to the fuel, seeds, fertilizers and machinery they need to produce the nutrition the world requires. There is likely no alternative if we are going to produce and distribute the required food. While some early work has begun[13], very little serious planning has occurred to date, and almost all of that was during the cold war. Much has changed since then: we have a radically more connected world with new technologies available, opening up new challenges but also new opportunities.
Based on early estimates[14], if we can maintain at least partial global trade and agricultural complexity it could still be possible to produce enough food for human needs even in a severe nuclear winter. This would need to combine all the methods previously listed, alongside rationing, the deployment of greenhouses and the relocation of cold tolerant crops internationally. In particular, cooperation and trade between high latitude countries and countries in the tropics that could still effectively cultivate crops would be highly valuable, as we would need every kilo of food we could possibly produce in such a scenario. For example, northern countries could send seeds, inputs (e.g. fertilizers) and stored foods in the early period to support the tropics, and would then benefit from the surplus produced in subsequent years. However, this would require cooperation and trade sustained across multiple years in which trust would be hard to secure: financial markets may not function following a nuclear war, and contracts may be impossible to enforce through the usual channels. In addition, what happens in a year when the harvest comes in smaller than expected?
Finally, there is the question of whether individuals within countries would pull together or apart. For example, will people hoard, riot, or flee en masse in search of food? In addition, the wealthy (who may be very different from the pre-disaster wealthy) may demand meat and other high input foods, even as others go hungry. What happens early in the disaster may be vital here, and the degree of confidence the public has in a response plan may be just as important as the plan's technical effectiveness.
For disasters of this scale, there are more questions than answers, and the vast majority of work still remains. It may take years or decades of serious preparations across multiple countries before we can have confidence that the globe is prepared for any ASRS, even more so in the case of the potential disruption of a serious nuclear winter and conflict. These events are incredibly complex challenges that touch on far more than just food, and would test us all in a way few have ever experienced.
There is a degree of cynicism about cooperation in disasters, and the default impression in parts of academia seems to be that it is improbable or impossible. I am more hopeful: it is not always the case that humanity falls short, and there are several historical examples of countries pulling together and cooperating under incredibly tough conditions, even when the predictions were for the opposite.
It’s certainly not impossible that we will fall short of those ideals, but not planning is also not a reasonable option. An ASRS or similar food shock will happen at some point if we achieve a future that stretches into the next centuries, and we must be prepared.
- ^
More recently, the impacts of COVID-19 and the Russian invasion of Ukraine have reduced incomes and disrupted food markets, pushing hunger up once again.
- ^
Smil, V. How the World Really Works (2022), p. 49.
- ^
There are a few very interesting write ups of what the fall of the Western Roman Empire was like for the average man or woman, for example Bret Devereaux’s blog has a great summary of the debate.
- ^
IFPRI has an article on the situation early in the pandemic and the restrictions in place here: https://www.ifpri.org/project/covid-19-food-trade-policy-tracker.
- ^
It takes around 3,850 kWh to grow a kilo of wheat under artificial light, for example.
- ^
There were around 97 such eruptions recorded over the last 60,000 years, and there is some evidence we are in a period of comparatively higher activity at the moment.
- ^
Toon, O. B. et al. Rapid expansion of nuclear arsenals by Pakistan and India portends regional and global catastrophe. Sci. Adv. 5, eaay5478 (2019).
- ^
Toon, O. B., Robock, A. & Turco, R. P. Environmental consequences of nuclear war. Phys. Today 61, 37–42 (2008).
- ^
Xia, L., Robock, A., Scherrer, K. et al. Global food insecurity and famine from reduced crop, marine fishery and livestock production due to climate disruption from nuclear war soot injection. Nat Food 3, 586–596 (2022). https://doi.org/10.1038/s43016-022-00573-0
- ^
For example, the contrast in Caspar David Friedrich’s paintings before and during the shock. It’s hard to know how accurate an oil painting is of course, but there were many others, perhaps most famously the work of Turner in the United Kingdom.
- ^
Based upon USDA PSD data, adjusted for crop years (stocks rise and fall over the year as harvests arrive and are then drawn down).
- ^
Collingham, L. Taste of War: World War II and the Battle for Food. Penguin Publishing Group, 2013. https://books.google.co.uk/books?id=NrOKDQAAQBAJ.
- ^
For example this report for New Zealand. ALLFED has also started country by country ASRS planning, starting with a report for the USA.
- ^