Monday 23 August 2021

BOOK REPORT: THE PLUTONIUM FILES by EILEEN WELSOME

 

 

I’M NOT SURE WHERE I FIRST RAN ACROSS A REFERENCE to Eileen Welsome’s book, The Plutonium Files, but locating a copy took a while. It was out of print and there were only a few (expensive) copies online. The inter-library loan service in my province had one, but it was “unavailable” (on loan? misfiled?) for the longest time; it was months before it finally arrived at my local library for me to pick up. The book grew out of Welsome’s exposé of a disturbing Cold War story, first published as a three-part series in the Albuquerque Tribune entitled “The Plutonium Experiment”, which won her the 1994 Pulitzer Prize. In it, she describes her research into a decades-long project conducted by various branches of the US government and the medical and scientific establishments around radiation experiments, in which human test subjects were injected with, or exposed to, various radioactive substances, including Plutonium, in the 1940s and 50s—often without their knowledge.

 

She began her search by accident. In 1987, while looking through recently declassified reports about how military bases around the country were cleaning up their dump sites, she noticed that one report from the nearby Kirtland Air Force Base stated that debris from its dump contained radioactive animal carcasses. This struck her as a bit strange, and she wondered why the animals had been radioactive. She requested further information from the base and was given access to its archives. While going through a stack of dusty files in the base’s walk-in vault, she read a report containing a footnote* referencing human radiation experiments. Surprised, Welsome began digging and ran across a 1976 Science Trends magazine article detailing how the remains of three “plutonium injectees” (210) had been exhumed and analyzed for their radioactive content. At the time, the story was given little coverage by the mainstream press. Later, following 1979’s Three Mile Island nuclear accident and heightened public awareness of things nuclear, Mother Jones published a story, in 1981, about a series of “total body irradiation” (TBI) experiments conducted at Oak Ridge, Tennessee during the late 1940s. The article prompted a congressional investigation, whose findings, published in the 1986 “Markey Report”, provided information on a range of human irradiation experiments sponsored by military and government agencies in the 1940s and 50s (including such actors as the nascent NASA organization and, bizarrely, the Quaker Oats company!)

 

These stories came to light thanks to the release of tranches of declassified government documents in the 1970s and 80s, as well as FOI (Freedom of Information) requests from reporters interested in post-war nuclear experiments, tests, and studies (for military and civilian purposes) proposed or conducted by government labs, universities, and medical centres—activities that had hitherto been hidden from public scrutiny and oversight by “national security” censorship. In other words, an increasing amount of information was coming to light about what civilian and military authorities had been doing with nuclear materials in the decades following WWII, but there was no overarching narrative to capture the public’s imagination (and perhaps act as a catalyst for greater transparency and disclosure in this vitally important area).

 

Welsome’s exposé would prove to be one such (albeit temporary) catalyst. In that memorable fall of 1987, Welsome began her initial investigation with the idea that the “human” dimension of any nuclear story should be her focus. The 1986 congressional report had indeed recorded eighteen unnamed test subjects who had been injected with Plutonium between 1945 and 1947, but they had been identified only by code names (“Cal-3”, “HP-9”, etc.). Welsome was determined to find their identities and tell their stories, and after many months her research led her to a small Texas town where the family of one injectee, Cal-3 (Elmer Allen), lived. In her 1993 Albuquerque Tribune series she gave names to five Plutonium test subjects, including Allen. The story became a national and international sensation, prompting the Clinton administration’s new Department of Energy+ (DOE) secretary, Hazel O’Leary, to order that sealed government documents from a variety of agencies (including the CIA) be opened. These actions were followed soon after by a Presidential Commission tasked with investigating the Cold War experiments and making recommendations. At the time, Welsome was interviewed and said she didn’t think the full story would come out: “I think people are shredding [documents] over at DOE now.” While the commission’s report subsequently acknowledged that 32 human radiation experiments had been carried out since 1945 on as many as 700 test subjects (including poor, pregnant women and “mentally-retarded” boys), it blamed everyone and no one at the same time. The experiments were framed as a Cold War “necessity”, and scientists, experimenters, and other government officials were deemed to have been carrying out legitimate scientific and medical research, sanctioned by boards of governors and oversight committees (some were) that ensured the ethical treatment of test subjects.
And while there may have been instances where experimental standards missed the mark, overall, the report concludes, the experimenters strove to work under guidelines and norms commensurate with the times, to the best of their abilities.


Welsome found the commission’s 1995 report “disappointing and timid” (487), and a chapter in her book, The Plutonium Files, discussing the commission’s recommendations, entitled “Whitewashes, Red Herrings, and Cold Cash”, makes clear her thoughts on the matter. However, she does give DOE head Hazel O’Leary full credit for “making public such a controversial chapter in Cold War History. Through her efforts, a massive and secretive bureaucracy was nudged farther into the bright light of truth.” (487) When O’Leary was later interviewed about her thoughts as she read Welsome’s 1993 story about average people being exposed to radioactive substances and used as unwitting test subjects, she made the startling comment: “Who were these people [conducting experiments] and why did this happen? The only thing I could think of was Nazi Germany.”

The publicity following the commission’s publication of its findings and President Clinton’s 1995 apology to “those of our citizens who were subjected to these experiments, to their families and to their communities” (470) sparked several lawsuits filed by former test subjects, ranging from soldiers sent into “ground zero” blast sites,1 to people involved in TBI experiments, to adolescent boys given “radioactive” oatmeal for breakfast. Families of several of the eighteen Plutonium test subjects also filed lawsuits against the government and won compensation settlements. And nuclear workers exposed to radioactive substances and suffering ill health as a result, all those miners, scientists, and technicians who built America’s nuclear power and weapons infrastructure, were granted long-overdue compensation in 2000.

 

After Bill Clinton’s usual “I feel your pain” speech, which said much but did little, Welsome continued her research, uncovering the names and histories of the remaining thirteen people who had been injected with Plutonium all those years ago. In addition, her 1999 book gives an overview of other radiation experiments and atomic tests in the decades following WWII. It is an engrossing and disturbing read about an all-too-quickly-forgotten past, a past that will, in all likelihood, come back to haunt us.

 

I hadn’t meant to write a book report, but I guess this is mostly that. At the same time as The Plutonium Files, I’d been reading John Hersey’s “Hiroshima”, a 30,000-word essay published in The New Yorker magazine in 1946. Hersey had visited Hiroshima several months after the bombing and interviewed six of the city’s residents, giving the reader a riveting and horrifying minute-by-minute account of their experience as the world’s first atomic bomb exploded above their city. He created a portrait of everyday people who survived and lived in the aftermath of that terrible August morning seventy-six years ago. The New Yorker article garnered national and international attention for weeks as readers gained insight into the true costs of an atomic war. Hersey provided an account not only of the immediate destruction of the city, the horrific effects of the firestorm, and the silent but deadly fallout, whose radioactivity would claim as many lives in the months and years to come as the initial explosion, but also a human portrait of courage and survival.

 

In yet another book I was reading at the same time, Fallout, by Lesley Blume, she examines how the military and government attempted to cover up the effects of the bomb; how President Truman, in the days following the bombings of Hiroshima and Nagasaki, said the weapon “was just the same as getting a bigger gun than the other fellow had to win a war and that’s what it was used for. [It was] nothing else but an artillery weapon.” He and the occupation forces in Japan downplayed the issue of lingering radioactivity in the cities. Reports of a “mysterious illness” and deaths among people exposed to the bombs and, in the weeks and months following, of birth defects and miscarriages were vigorously censored by the Allied Occupation forces in Japan and by authorities in America. General Leslie Groves, who directed the Manhattan Project during the war, also played down the effects of radiation poisoning, saying, absurdly, it could be a “very pleasant way to die.” (4) Blume’s book examines the attempts by American authorities to block access to the bombed cities, and to censor domestic Japanese media from any mention of what happened at Hiroshima and Nagasaki. She also gives a fascinating, behind-the-scenes look at how The New Yorker came to publish this ground-breaking essay during a time of national emergency. At one point, she says, editors at the magazine feared they might be charged under the Espionage Act for revealing classified information if Hersey’s story were published. But they went ahead anyway.


It must be said that the atomic bomb went from theoretical construct to finished product in a little over three years. Much was unknown about how the device would work in practice and what its effects2 would be. That said, there were many scientists and physicists, including Albert Einstein, urging caution and calling for a “demonstration” of the new weapon instead of an attack on a civilian target. By this point in the war, General LeMay had been firebombing Japanese cities; in March 1945 Tokyo was firebombed, killing an estimated 90,000 civilians in a single night raid. So the next step, the atomic bombing of a city, was easier to contemplate and no longer deemed beyond the pale by some military and political leaders. But many senior commanders, including General MacArthur, the Allied commander in the Pacific, were ambivalent about its use. The war was nearly won, they said; Japan had already made peace overtures to the Russians. Why further antagonize the population and make the job of occupying the Japanese home islands that much more difficult? But President Truman felt that the bomb would compel Japan to surrender sooner.3 And the rest is history.

 

I'LL END HERE WITH THIS: People died from exposure to radioactive substances soon after they were first discovered in the late nineteenth century; Marie Curie is perhaps the most famous victim of radiation poisoning. In the 1920s and 30s, workers at the United States Radium Corporation in New Jersey suffered from a variety of cancers after using radium-laced paint. At a Los Alamos laboratory in 1945, an experiment scientists dubbed “tickling the dragon’s tail” resulted in massive radiation exposure and subsequent death for one researcher. Since then, as noted above, there have been a variety of accidental and deliberate exposures of humans (and animals) to radiation, often with horrific consequences.

At the same time, radiation today is used in a variety of fields, particularly medicine. Fusion power is called the “holy grail” by nuclear enthusiasts and deemed just around the corner. Small, conventional nuclear reactors are being proposed for niche markets. And smaller “tactical” nuclear weapons are currently being designed for battlefield use.4 Nuclear energy, whether as a source of power or of weapons, seems destined to be with us for some time. But aren’t a lot of people (like me) just a little worried that we’ve let the mad genie out of the bottle, and that while he’s willing to grant us our wishes, this time he’ll not so easily be persuaded to get back inside? Instead, he’ll stay with us, for minutes or ages, until we learn that our wishes come with costs and our actions with consequences.

 

Cheers, Jake.  

 ________________________________________________________

 

* I read somewhere that if you want to hide important information so it won’t be read, while still claiming “full disclosure”, put it in the footnotes! Welsome’s first clue about human radiation experiments came from a footnote she read.

 

+ The Department of Energy (1977-present) grew out of the original Manhattan Project (1942-47) and the subsequent Atomic Energy Commission (1947-77). The Manhattan Project, of course, was the secret WWII effort that produced the world’s first atomic bomb. The AEC was formed to bring the various military-run nuclear projects under direct civilian control. Today’s DOE is responsible for all things nuclear, including the manufacture of radioactive material for bombs and reactors.

 

1 Literally thousands of soldiers were deployed at the Nevada Test Site, tasked with performing manoeuvres across the irradiated grounds of recently exploded atomic bombs to observe how troops operated in a nuclear landscape. In the South Pacific, naval servicemen boarded obsolete ships that had been stationed near ground zero and irradiated by underwater nuclear explosions. The servicemen underwent Geiger-counter checks while cleaning radioactive dust and debris from the ships, often without any protective gear.

 

2 “What the American public did not see: photos of the Hiroshima and Nagasaki hospitals ringed by the corpses of blast survivors who had staggered there seeking medical help and died in agony on the front steps…nor did they see images of the crematoriums burning the remains of thousands of anonymous victims, or pictures of scorched women and children, their hair falling out in fistfuls.” (Fallout, 5) For scientists, though, this was a real-world experiment, in which an incredible amount of data could be gathered by studying the impacts of atomic explosions on infrastructure, water and soil, vegetation, and animal life, and by studying the civilian population to better understand how radioactivity and the human body interacted.

And in the following decades, this need to understand the effects of radioactivity led to human radiation experiments conducted by various branches of the government and military. Interestingly, Dr. Joseph Hamilton, one of the lead doctors responsible for injecting Plutonium into some of the eighteen human test subjects whom Welsome highlighted in her 1993 exposé, wrote a confidential memo to the head of the AEC in 1950 recommending that monkeys be used instead of humans for any future radiation experiments because such experiments, he said, "might have a little of the Buchenwald touch." Perhaps Hamilton had a change of heart, or perhaps he knew he had leukemia (possibly from his careless handling of radioactive materials earlier in his career) and was more attuned to the risks such experiments posed to his fellow humans. He died in 1957 at the age of 49.

 

3 There are probably additional reasons for Truman’s fateful use of the weapon. One may have been to send a message to his new enemy and former ally, Russia, that Washington was in charge now. Regardless of why, I feel there was a large streak of nihilism in the room when that decision was made.

 

4 “The Trump Administration addressed these questions in the Nuclear Posture Review released in February 2018, and determined that the United States should acquire two new types of nuclear weapons: a new low-yield warhead for submarine-launched ballistic missiles and a new sea-launched cruise missile. The Biden Administration may reconsider these weapons when it conducts its Nuclear Review in the latter half of 2021.” (Congressional Research Service: Non-strategic Nuclear Weapons. Updated July 15, 2021.)

 

 

 

 

[Factoid: Marie Curie, Nobel laureate and discoverer of Radium, became ill with aplastic anemia from exposure to the radioactive material and died in 1934. In New Jersey, female workers at the nation’s largest dial-painting company, the United States Radium Corporation, began to fall sick from radiation poisoning in the 1920s and 30s. Small amounts of Radium were used in the paint that the women painstakingly applied to watch and clock dials (so they glowed in the dark). Many of the women habitually wetted their brushes with their tongues to get a fine tip with which to paint the numbers. Many developed oral cancers and other diseases caused by the radiation.]

 

[Factoid: German scientists in 1938 discovered nuclear “fission”. “Fission is a nuclear reaction or radioactive decay process in which the nucleus of an atom splits into two or more smaller, lighter nuclei and often other particles.” (Wikipedia) However, the Germans never succeeded in producing a controlled chain reaction, a critical next step in developing an atomic bomb. That was first accomplished by scientists at the University of Chicago in 1942.]

 

[Factoid: Polonium, a radioactive metal first isolated from “pitchblende” by Marie and Pierre Curie in 1898, is named after Poland, her home country. In 2006, the London-based former Russian intelligence officer Alexander Litvinenko is believed to have been poisoned with this highly dangerous substance in liquid form. Polonium was first used in 1945 as an “initiator”, or trigger, to start the chain reaction in the uranium core of the Hiroshima bomb nicknamed “Little Boy”.]

 

[And here is a report published in Los Alamos Science, Number 23, 1995. It goes into great technical detail around the Plutonium experiments and tries to rebut Welsome's 1993 Tribune articles. While heavy on the science, it's light on the humanities, and right and wrong and culpability are words lost in the desert wind.]

 

Welsome, Eileen. The Plutonium Files: America’s Secret Medical Experiments in the Cold War. New York: The Dial Press / Random House, 1999.

 

Hersey, John. Hiroshima. New York: Alfred A. Knopf, 1946; reprinted New York: BN Publishing, 2012.

 

Blume, Lesley M.M. Fallout: The Hiroshima Cover-Up and the Reporter Who Brought It to the World. New York: Simon and Schuster, 2021.

 

Monday 9 August 2021

BOOK REPORT: ANIMAL, VEGETABLE, JUNK: A HISTORY OF FOOD FROM SUSTAINABLE TO SUICIDAL by MARK BITTMAN

 

I WAS DOING SOME READING ABOUT FOOD the other day and a set of interesting facts caught my attention: around seventy-two percent of the world’s farms are less than 2.5 acres in size, but they operate on only 8% of the available agricultural land, while farms over 123 acres—just 1% of the world’s farms—operate on 65% of the land. Okay, I thought, modern farms are on the larger side. Some are ginormous! Canadian farms average about 800 acres, while some on the prairies are thousands of acres in size! (Surprisingly, American farms are smaller, on average, at about half the size of their Canadian counterparts. Though I’m sure farm acreage in the American mid-west easily matches ours.) After all, we have a big population to feed and need big farms, right? So, move on—there’s nothing to see here. Or is there?

 

Another set of statistics caused my brows to wrinkle a tad: more than 500 million peasant farms operating on 25% of the land feed 70% of the world’s population. Modern, “industrial” farming (also known as “factory farming”), on the other hand, operates on 75% of the arable land but feeds only 30% of humanity.

Even if you’re like me (someone statistically challenged), these figures don’t add up. Something isn’t right. Modern, technological farming feeds only a third of the planet while using three-quarters of the cultivable land to do it? I thought the whole purpose of the Green Revolution, GMOs, and the entire armature of the food system, from farm to table, was geared to feeding the world’s population now and in the future. What’s going on? Where does all the food from our modern farms go?
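Being statistically challenged myself, I found a quick back-of-envelope check helpful. Treating “feeds X% of the population” as a rough proxy for share of food output (a simplification; the quoted statistics don’t define their terms precisely), the relative productivity per unit of land works out like this:

```python
# Relative food output per unit of land, from the figures quoted above.
# Assumption: "feeds X% of the population" stands in for share of food output.
peasant_food_share, peasant_land_share = 0.70, 0.25
industrial_food_share, industrial_land_share = 0.30, 0.75

peasant_yield = peasant_food_share / peasant_land_share          # 2.8
industrial_yield = industrial_food_share / industrial_land_share  # 0.4

ratio = peasant_yield / industrial_yield
print(f"Peasant farms feed roughly {ratio:.0f}x more people per unit of land")
```

By this crude measure, smallholder farms feed roughly seven times more people per unit of land than industrial ones, which is exactly why the figures feel like they “don’t add up.”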

 

Reading an article in PNAS (Proceedings of the National Academy of Sciences of the United States of America) discussing the Green Revolution (GR) and the need for a second one, a “GR 2.0”, I was surprised (perhaps I really wasn’t) at the emphasis placed on technological development and increasing the yields of “public goods”, i.e., commercial crops worldwide, on strengthening seed-patent law and lowering trade barriers, and, finally, on how the global food system needs to incorporate the remaining smallholder farmers into itself. It states the need to “enhance competitiveness of modernizing agricultural systems” and increase “smallholder productivity growth.”

 

But wait, aren’t small farms productive? Aren’t they the backbone of global agriculture? Didn’t the statistics I quoted at the beginning of this post suggest they, in fact, produce most of the world’s food on a fraction (25%) of its arable land? “Industrial farms” are farms that use heavy machinery, GMOs, patented seed technologies, artificial pesticides, fertilizers, and other “inputs” to grow mostly “cash crops” that are often exported or sold in upscale markets. As mentioned, they produce only about one-third of the world’s food on 75% of the land. This raises the question: why do we stress the need for another Green Revolution and for further “modernization” of our agriculture system*? Why do we consider today’s agriculture a success?

 

Fortunately, Mark Bittman’s new book, Animal, Vegetable, Junk, goes a long way to answer these and other questions about how we grow our food, and he has strong criticisms about our current way of doing things. In the chapter entitled “Where We’re At”, Bittman makes a compelling argument for radical change in the methods of food production that today have become global in scope. He says:

     “You will hear, ‘The food system is broken.’ But the truth is that it works almost perfectly for Big Food. It also works well enough for around a third of the world’s people, who have the money to demand and have at a moment’s notice virtually any food in the world.

     But it doesn’t work well enough to nourish most of humanity, and it doesn’t work well enough to husband our resources so that it can endure. Indeed, the system had created a public health crisis [and] it’s a chief contributor to the foremost threat to our species: the climate crisis. The way we produce food threatens everyone, even the wealthiest and cleverest.” (242)

 

He begins his study with an overview of agriculture’s rise after millennia of experimentation, trial and error, and just plain luck. Following the last Ice Age, around 12,000 BC, as the world’s climate gradually warmed, steppe and boreal forest gave way to temperate forest and grasslands, and to ecologies whose plants, soils, and animals would prove helpful for the human populations adapting and expanding into these increasingly habitable and fertile regions of the earth. He reviews the “Paleo” diet humans ate for millennia as hunter-gatherers and how such a lifestyle would preclude things like “surpluses” of food. Described as “optimal foraging” (9), hunter-gatherer group dynamics, like those found in today’s remnant hunter-gatherer societies, suggest pre-agrarian cultures were more egalitarian and non-hierarchically structured, and thus better suited to small groups of individual families or clans migrating across the landscape, eating what was available from the plant and animal world.

Over time, by chance or circumstance, certain plants, and later animals, served to “anchor” humans to a more sedentary lifestyle that, with further domestication of plants and animals, gave rise to settlements and eventually the societies and civilizations of the early Holocene epoch.**

 

IN GENERAL, most of us know the “big picture” concerning the growth (some might say metastasizing) of our modern food system. But the devil is in the details, and Bittman provides many details for us to understand where our current approach to feeding ourselves has gone off the rails.

Interestingly, he cites the nineteenth century’s discovery of bird droppings, or rather of the copious deposits of bat and bird “guano” laid down over millennia on offshore South American islands, as a key moment in launching humankind along a dangerous and ultimately unsustainable path.

 

I hope the reader will forgive my giving short shrift to the several millennia between the time when hunter-gatherers first discovered the food value of various perennial grasses and animals, and jumping ahead to the New World of the Americas. In short: by the mid-nineteenth century, the dominant societies of Europe and its colonies were running out of productive farmland, even as the wealth and bounty of an entire hemisphere was being used to fuel and feed their various empires and enterprises. The agrarian strategies that had mostly kept pace with humankind’s growth around the globe—techniques such as “slash and burn” agriculture, fallowing, “inter-cropping”, composting, “green manures”, and so on—had fallen behind the rapid population growth of the nineteenth century, growth that European technologies, trade, and conquests had until then sustained. The eighteenth-century harnessing of coal and its powerful combustive properties had, of course, begun what we call the Industrial Revolution, whose processes, manufactures, and wealth generation added fuel to the population fire.

But something else was needed to increase farm yields to feed people.

And that something was fertilizer. Or, more specifically, the wealth of nitrogen, potassium, and phosphate found in the layers of seabird excreta deposited on offshore islands along the Pacific coast of South America, which the Europeans and other rich nations mined to exhaustion in short order. Bittman states:

 

     “Thousands upon millions of years of fertilizer was being carted across the globe, only to be exhausted in decades. Europeans would realize the folly of this approach over the following half century—and especially after the development of chemical fertilizer—as it became clear that flouting the natural laws that prevent infinite growth was not a system built to last.”  (74)

 

He raises concerns and criticisms around an economic system that promotes such rapacious and ultimately unsustainable activities, and shows just how unfair and anti-egalitarian capitalist economies can be. He cites one example from nineteenth-century India, a country with a long tradition of diverse and sustainable agriculture that was by then burdened by huge swaths of arable land given over to cotton, wheat, and corn production (and opium, but that’s another story), mostly destined for export. The dislocation of traditional croplands and markets imposed by the British overlords directly resulted in two devastating famines, killing millions across the subcontinent.

Capitalism, by its nature, promotes unsustainable and inequitable growth, a theme Bittman returns to throughout his book. In particular, he cites American innovation in the food “industry” during the late nineteenth century and especially following WWI, when advancements in farm machinery, petrochemical sciences, and the manufacture of artificial fertilizers, pesticides, and herbicides allowed America's version of agriculture to become the major driver of globalized, industrial farming, as well as a chief emitter of greenhouse gases. One of the main “themes” of this new way of farming, one that fits nicely with the critique of capitalism in general, is the supposed need for agriculture systems to become more “efficient”. As in modern manufacturing and marketing enterprises, efficiency is the key component of how corporations (and now farms) are to be run: maximizing profits while minimizing costs. It sounds reasonable enough, until you examine the actual costs of such enterprises and how efficiency limits farming to an ever-narrowing suite of unsustainable and increasingly costly practices. When pundits call for increased efficiency in the farming sector, what do they mean? Generally, they mean larger farms, more mechanization and technology (GPS, drones, etc.), and of course greater use of patented seeds, artificial fertilizers, and herbicides, manufactured by giant petrochemical and pharmaceutical companies and foisted off onto the world’s farmers. As well, multi-national agricultural conglomerates use their market dominance and favourable trade agreements to dictate, in many cases, the types of crops grown in different countries.

 

FOR EXAMPLE, great swaths of North American farmland are given over to the production of corn, wheat, and soybeans, as are countries like Brazil and Argentina and large tracts of Europe. In the United States, about 40% of the corn crop is “destined to make inefficient fuel” [ethanol] (186). About 36% goes for animal feed. Most of the rest is exported or else processed into high-fructose corn syrup (HFCS), which is used as a “feedstock” to produce just about every type of junk food you can imagine. Bittman points out that relatively little of the corn harvest in the US goes to feeding humans directly. (He does not consider salt-laden, HFCS-infused corn chips a food.) And this goes a long way toward answering the question I asked earlier about where all the food from our modern agricultural system goes. It goes disproportionately toward bio-fuels, animal feed, and junk foods.
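To make the corn numbers concrete (and assuming, as a simplification, that Bittman’s two quoted percentages plus “the rest” account for the whole crop), the arithmetic is simple:

```python
# Rough accounting of the US corn crop, using the shares quoted above.
ethanol = 0.40      # "destined to make inefficient fuel"
animal_feed = 0.36  # fed to livestock
remainder = 1.0 - ethanol - animal_feed  # exports + HFCS and other processing

print(f"Share left for export and processing: {remainder:.0%}")  # 24%
```

So at most a quarter of the crop even has a chance of reaching a human plate directly, and much of that arrives as HFCS.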

 

Thus, markets are distorted. Unsuitable crops are planted, animals are treated in the most appalling manner, lands are despoiled, forests destroyed, fish stocks depleted. And that’s just a Monday for "Big Food"! With his strong convictions and clear prose, Bittman takes the industry to task, showing how it has become a system of food production that is killing us—literally: through our excessive consumption of highly processed foods and of diets heavy in meat+ and animal products, available mostly to that well-fed third of humanity mentioned earlier. Our food system is killing us with short-sighted, extractive methods of soil management, with toxic chemicals and drugs and other additives that end up on our plates, and through resource depletion, dangerous genetic experiments, pollution, and environmental degradation.

 

AND SO, we are left with a system that fails to support (most of) us, and one that will not safely, sustainably, or nutritiously feed us as our population continues to rise in the coming decades. We are at the cliff’s edge, yet believe we can fly, even as we fall. Bittman says he is not hopeful that we will be able to change our food system in time to avert catastrophe, but he does note that while we “do not see many examples of better food systems in [our] daily lives, they do exist.” (265) Much of Africa and Asia still has highly decentralized farming systems that can serve as models for sustainable agricultural practices. Agricultural movements like “La Via Campesina” (“the Peasant Way”), organic farming, the holistic farming methods of Rudolf Steiner (called “bio-dynamics”), permaculture, and “restorative agriculture” are some of the approaches that give us hope. All of them fall under the rubric of “agroecology”, a holistic way of growing food.

 So, here’s hoping! (I think I’ll go out back and pull a carrot or two from my patch, and sit and chew a while.)

 

Cheers, Jake. 

 

________________________________________________________

 

*Not forgetting, of course, animal husbandry and fishing, two vital links in humanity’s food chain. Each faces similar challenges as modern production methods compete with traditional ones.

 

**For example: barley and “aurochs” from Europe; wild wheat (“emmer”), goats, and sheep from the Fertile Crescent; later, millet grasses, pigs, and chickens from India and SE Asia, rice from China, potatoes and llamas from the Andes, maize from Central America, turkeys and bison from North America, etc.

 

+ David Pimentel, professor of ecology at Cornell University, states: "If all the grain currently fed to livestock in the United States were consumed directly by people, the number of people who could be fed would be nearly 800 million. If only grass-fed livestock were raised in the United States, individual Americans would still get more than the recommended daily allowance (RDA) of meat and dairy protein."

 

[Factoid: Zyklon B, the gas used in Nazi extermination camps to kill millions of Jews in Europe, was originally invented by German scientists in the 1920s as a pesticide.]

 

 

 

Bittman, Mark. Animal, Vegetable, Junk: A History of Food from Sustainable to Suicidal. Boston/New York: Houghton Mifflin Harcourt, 2021.