The spectre of nuclear annihilation continues to stalk us these seventy-two years (to the week) after the mass murder and irradiation of 200,000+ Japanese civilians at Hiroshima & Nagasaki, a figure that UCLA’s “Children of the Atomic Bomb” project suggests is overly conservative. The President of the United States is threatening the Korean peninsula with “fire and fury like the world has never seen,” while the Supreme Leader of the Democratic People’s Republic of Korea is openly fantasizing about destroying Guam in “an enveloping fire.” The Islamic Republic of Iran has so far complied with its promise to halt the development of its nuclear weapons program, but a fresh round of unilateral sanctions threatens to undo the most promising nuclear non-proliferation agreement since the Cold War. And all this as the man imposing the sanctions brags—dishonestly, as it turns out—about strengthening his country’s nuclear arsenal.
All of this has me thinking of the 1965 film The War Game, a mock-documentary that dramatizes a thermonuclear attack on a small British city. The film, directed by Peter Watkins, won the Academy Award for Best Documentary Feature, despite being a work of fiction that was denied airtime by the BBC because of its graphically violent content. The experience of watching the film is disturbing, not only because it survives to remind us of the military-industrial insanity that gripped global culture during the Cold War, but because the threat of nuclear warfare has been a sometimes understated but often overt reality of 21st-century international relations. The United States’ dubious justification for its war on Iraq illustrates how even imagined nuclear proliferation can end in decades of violence and misery. The 2003 invasion of Iraq and its aftermath render the current administration’s saber rattling—including its gross braggadocio regarding American nuclear stockpiles—all the more terrifying. This is one area where watching The War Game today has something to teach us, for it not only illustrates the idiocy of developing new nuclear weapons, but it also serves as a reminder of the global community’s failure to fully dismantle existing nuclear arsenals.
There are other aspects of the film that I find relevant as well. First, the nuclear war is provoked by a conflict between the United States and China over the United States’ military involvement in South Vietnam. The United States threatens to strike the Chinese military with tactical nuclear missiles, which in turn provokes the Soviet Union to assist the East Germans in unifying Berlin under the Communist regime. In an effort to defend Berlin from Communist aggression, the United States and Britain strike Soviet forces with a nuclear missile, thus initiating a full-scale nuclear conflict. This obviously never happened, but it was a likely enough scenario at the time to lend the film a sense of urgency, and it should give us pause about how easily the threat of a nuclear confrontation between the United States and North Korea can explode into a larger conflict between any number of nuclear powers.
Second, as the threat of nuclear warfare becomes increasingly imminent early in the film, the British government suspends its democratic system and institutes an authoritarian government headed by a special council of fifteen officials. These bureaucrats proceed to force evacuations and civilian billeting (this provokes a racist backlash, as the white citizens openly resist the directive to house and feed their black neighbors), institute food rationing, and establish explicit classes of people that will receive varying types of treatment in case of a nuclear attack (those of a certain class will receive no treatment, but will be “put out of their misery” by the police). The suspension of democratic law leads to military officers shooting enlisted men for refusing orders, and domestic police summarily execute citizens for civic unrest. The War Game is onto something here. It does not seem naïve to me to think that the world’s democracies would suspend their constitutions under the pressures of nuclear war.
Third, The War Game makes regular and convincing comparisons between the atrocity suffered by the fictitious British city and the actual atrocities that occurred at Dresden and Hiroshima & Nagasaki. The comparisons are convincing because the filmmakers place the British citizens in situations modeled on actual reactions to these WWII-era bombings. So when the film’s British police collect, loot, and burn the bodies of thousands of dead citizens in the streets, they mimic the actions of German police in the wake of the Dresden bombings. Similarly, when the film’s British survivors seem to regress into a state of apathy, filth, and disease, they repeat patterns observed among the Japanese survivors of America’s atom bombs.
Finally, the film smartly transitions from the physical effects of nuclear annihilation to the psychological effects survivors would suffer for the rest of their lives. It is important to recognize that the human condition would never be the same after a nuclear war. The human psyche would suffer such a horrendous blow that many would no doubt envy the dead. Just imagine the consequences of an entire nation suffering from severe Post-Traumatic Stress Disorder (post-war North Korea is a good example). The future would be grim indeed.
But perhaps we’ve already been acclimating ourselves to precisely such a future. It does seem that many people have already accepted the threat of nuclear war as an unlikely but ultimately justifiable reaction to hostile nations. I’m profoundly uncomfortable with this blasé attitude toward such a future. A nuclear strike by any nation under any circumstance would be an outright assault on the most basic elements of human dignity. We’re fortunate to have organizations like the Ploughshares Fund that have remained diligent in resisting the irrationality of nuclear weapons. We need them to remind us of the very real dangers these weapons pose to the human species. But we also need old films like The War Game to remind us of how close we’ve actually come to destroying ourselves, for that is exactly what we threaten to do every time a nation produces or enhances a nuclear weapon, and every time a national leader shoots off at the mouth about “fire and fury.”
I encourage you to watch The War Game, which I’ve embedded below.
The War Game, directed by Peter Watkins, performances by Michael Aspel, Peter Graham, and Kathy Staff, British Broadcasting Corporation (BBC), 1965.
Untitled film still, The War Game, directed by Peter Watkins, British Broadcasting Corporation (BBC), 1965 (featured image)
Promotional poster, The War Game, British Broadcasting Corporation (BBC), 1965
I spent the morning reading Luke Morgan’s The Monster in the Garden: The Grotesque and the Gigantic in Renaissance Landscape Design, which is a fascinating book. I’m particularly interested in what he has to say about Italian garden statuary, a topic that is much more exciting than it sounds. This is especially true when he focuses on the weird hybrid creatures and colossal monsters that populate Renaissance gardens, fountains, and grottos. Good stuff!
At any rate, because I’m unfamiliar with virtually all of the examples Morgan cites in his book, I spent some time searching the web for photos of the various artifacts he discusses. One of the images I found is this shot of Antonio Novelli’s colossal Polyphemus, which stood in the Orti Oricellari, a sixteenth-century Florentine garden that is now largely lost. Today the site of this once ornate garden is occupied by a modern urban park, which includes the cheap basketball court you see pictured in the foreground.
There is something eerie about this image. The clash of historical times, the discrepancy of scale. It reminds me a bit of Percy Shelley’s “Ozymandias,” which is to say that it makes me anxious about the future. It makes me think that, though we believe we are grand, we are actually shrinking.
A few weeks ago, the website Literary Hub published a brief overview of Chinese crime writing* under the title “Shanghai Noir: How to Write Crime Fiction in a City with a 100% Conviction Rate.” Written by British journalist and true crime author Paul French, the survey touches on how difficult it can be to write about crime in a society that denies crime’s existence, or otherwise cultivates the myth of a flawless judicial system. French notes that in nations such as China, where the conviction rate for murder stands at 99.9%, and where maintaining such a rate is crucial to the state’s political project, one’s ability to write about crime critically and honestly is fundamentally compromised. He writes: “The truth is crime in China is a problematic genre — it all too often raises tricky political issues, when it appears the censors [sic] axe falls swiftly; local politicians are powerful and prickly. Crime shows on TV are no better — showing valiant and incorruptible policemen and women in a cardboard cut-out way that would have been laughed at in America in the 1950s!”
I haven’t been able to shake this statistic — a 99.9% conviction rate. It seems to me to cut two ways. First, it contributes to the appearance of social harmony underwritten by a diligent and expert police state. The appearance of peace and security is key here, for it offers the peace of mind that things are exactly as they should be. Everything is under control. This is one reason why authoritarian regimes suppress crime statistics while so radically inflating conviction rates. But this peace of mind is only available to those who are unlikely to be accused. This leads us to the second way in which the statistic cuts: For those who belong to one of the groups that find themselves subject to routine scapegoating — one group French mentions that falls within this category is Shanghai’s “population of migrant workers” — a 99.9% conviction rate no doubt compounds a difficult and pervasive sense of insecurity. When no statistical difference exists between being accused and convicted, the only statistic that matters is the rate of accusation.
Authoritarian societies are not the only places where crime statistics are skewed by outside social and political forces. One need look no further than America’s failed “War on Drugs,” which has led to wildly disproportionate numbers of African American citizens being convicted of drug-related charges, even as drug use among white citizens continues unabated. But perhaps the most striking example of politically skewed crime statistics in a major democracy can be found in the United States’ near-perfect conviction rate of those who stand accused of terrorism-related offenses. According to a very informative database published earlier this week, the U.S. Department of Justice has charged 802 people with terrorism-related offenses since the 9/11 terrorist attacks. Of the 802 people charged, only two have been acquitted, with three having had their charges dropped. In other words, when it comes to terrorism prosecutions, the United States convicts 99.4% of defendants — just shy of China’s clearly skewed (and terrifying) conviction rate for murder.
It seems to me that much of what French says about crime in China can be applied to terrorism in the United States. As with crime in China, terrorism in America is politically sensitive, and there are powerful interests invested in shaping — often through overt scapegoating — how Americans view both terrorism and terrorists. Unfortunately, those interests have been largely successful. Perhaps 1950s America would have laughed at contemporary Chinese television depictions of “valiant and incorruptible policemen and women,” as French claims, but 21st-century America isn’t laughing at the absurdity of valiant and incorruptible federal prosecutors who always get their man.
A 99.4% terrorism conviction rate lays bare the political dimension of American counter-terrorism efforts, and the message is clear: The counter-terrorism police state exists to protect you. It is doing its job. You have nothing to fear.
How can American writers write critically, or even interestingly, about terrorism under such conditions? The closest anyone has come, to my knowledge at least, is Ben Fountain’s outstanding novel Billy Lynn’s Long Halftime Walk, which presents a scathing portrayal of America’s post-9/11 mentality. Mohsin Hamid’s The Reluctant Fundamentalist is also quite good,** and Kent Johnson’s Doggerel for the Masses comes to mind, but I can’t think of many other literary or pop-cultural examples that succeed in cutting through the absurdity of America’s response to 9/11.*** (If there are examples I’m missing, let me know; I want to read them.) This is a failure not only of imagination, but also of the social and political courage needed to grapple with the complexities of current affairs. We need writers, filmmakers, artists, and critics to do this work, and we need them to do it sooner rather than later. Their success may well prove crucial to a larger project of honest reckoning with the contemporary world.
*Many of the examples aren’t Chinese, though they are set in China.
**I should mention that Hamid is not American, though he does write in English and is widely read in the United States.
***As I write this, I’m reminded of Gavin Hood’s 2007 film Rendition, which I seem to recall presenting a more complex story than the typical good guys vs. bad guys scenario that dominates popular terrorism narratives, but I can’t remember the film well enough to comment on it here.
When I think of the word “ecology,” images of rainforests leap immediately to mind. The dense canopy, the intense diversity of flora and fauna, the screeching monkeys and brilliantly colored birds. If I dwell on the word a bit longer, my imagination expands to include rivers, mountains, deserts, coral reefs, and even the frozen expanses of the Arctic. These are the sorts of settings that make nature documentaries such as BBC One’s Planet Earth so compelling to watch. But world ecology encompasses so much more, including human beings (people are notably absent from Planet Earth). In his essay “The Three Ecologies,” Félix Guattari identifies three “ecological registers”: “the environment, social relations and human subjectivity,” all of which are intimately interconnected and mutually contingent (18). Where there are rainforests, rivers, mountains, and deserts, there are also social relations and the complexities of human subjectivity. To suggest that humanity and nature exist in separate spheres is to engage in a fallacy, just as it’s naïve to neglect the extent to which natural ecology penetrates the human species.
Food is one of the most important means by which natural ecology enters human experience. We eat, and in so doing we incorporate nature into our bodies. It enriches our bodies as it passes through them. As philosopher and literary critic Timothy Morton argues, “All life-forms, along with the environments they compose and inhabit, defy boundaries between inside and outside at every level” (274). The most obvious way this is true is that we consume aspects of the biological world when we eat, and in turn we produce organic matter (including our own bodies) that feeds back into the biosphere. But eating is also a key aspect of human sociability. What occurs at mealtime is responsible, in significant and far-reaching ways, for human culture, and even for civilization itself. The fact that natural ecology is reflected in every plate of food puts nature at the center of culture. And this, it seems to me, opens up possibilities for shared recognition between distant and sometimes unfamiliar cultures, as well as opportunities to exchange knowledge and experiences that may prove crucial to our survival in this age of ecological crisis.
I began thinking about this after reading Karen L. Kilcup’s recent article on the popular nineteenth-century children’s periodical Juvenile Miscellany. In that article, Kilcup touches on how famed abolitionist and women’s rights advocate Lydia Maria Child, who served as the Miscellany’s editor from 1826 to 1834, used natural history to connect her New England readership with the diversity of cultures around the world. One of the ways Child accomplished this was by drawing her readers’ attention to the relative continuity of human attitudes towards, and interactions with, natural ecology, even when specific cultural practices diverge. For example, in her article surveying the various ways people use insects, Child refuses to “ignore traditional practices, even if they make readers uncomfortable, including descriptions of how various cultures consume insects as food — a practice that, she underscores, the Bible references” (Kilcup 268). By drawing a parallel between modern entomophagy (i.e., the practice of eating insects) and the biblical tradition, Child suggests a point of commonality between her predominantly Christian audience and the many people around the world who eat insects.
There are indeed biblical examples of people practicing entomophagy, the most famous of which is John the Baptist surviving on “locusts and wild honey” as he wandered the desert (Matthew 3:4). Somewhat less famous is the dietary code outlined in the Torah, which condones eating “the locust after its kind, the destroying locust after its kind, the cricket after its kind, and the grasshopper after its kind” (Leviticus 11:22). By emphasizing the biblical precedent for entomophagy, Child was clearly attempting to cultivate within her predominantly Christian audience some measure of tolerance for insects as a viable food source, while at the same time advocating sympathy for those cultures that practice dietary customs unfamiliar to the West. If locusts fed the prophets, why should modern Christians be so repulsed by those who eat insects today? Perhaps locusts, crickets, and grasshoppers should be a part of every omnivore’s diet.
Child was working against the grain of deep-seated cultural assumptions. As important as a nutritious diet may be, the fact remains that people make food choices based on a spectrum of concerns, many of which have little to do with sustenance. Prominent cultural anthropologists such as Claude Lévi-Strauss, Roland Barthes, and Mary Douglas have long understood that food has symbolic value. What we eat, and the manner in which we eat it, helps shape our social and individual identities. Lévi-Strauss went so far as to contend that careful attention to eating habits can yield “a significant knowledge of the unconscious attitudes of the society or societies under consideration” (qtd. in Caplan 1–2). Such cultural attitudes, including those expressed in the Western taboo against entomophagy, can be difficult to shake, which is why Child’s biblical appeal did little to persuade her young readers and their parents to incorporate insects into the American diet.
It’s been nearly 200 years since Child made her point regarding entomophagy, and people in the United States — and the West more generally — still object to insects as a viable food source. The degree to which Americans are repulsed by the practice of eating insects is reflected in how entomophagy is represented in pop culture. Consider, for example, American television programs such as Fear Factor, The Amazing Race, Survivor, Man vs. Wild, and Bizarre Foods, all of which feature Americans (or a Briton, in the case of Man vs. Wild) struggling to eat foods that are commonly consumed by people around the world. The insect-eating segments of these programs participate in the pervasive sadomasochism that characterizes reality television; viewers enjoy watching people choke down bugs precisely because entomophagy is considered to be outrageous and vile, if not downright degrading. This is true of even the most sympathetic of these programs. For example, when Andrew Zimmern, celebrity chef and host of Bizarre Foods, consumes insect-based dishes, he often seems to enjoy what he’s eating, and yet the appeal of his show is undoubtedly the abnormal spectacle of someone eating food that Americans find disgusting.
But why is eating an insect any more disgusting than, say, eating a pig — an animal that is reviled by many cultures, including the culture that produced the Bible? The answer to this question leads away from food and toward Western notions of ethnocultural supremacy. In its 2013 report Edible Insects, the Food and Agriculture Organization of the United Nations (FAO) repeatedly notes that people in most Western countries view the eating of insects with disgust, and that this feeling of disgust “forms the basis of moral judgement” (Van Huis et al. 35). Related to this is the report’s conclusion that people in the West “perceive the practice [of eating insects] to be associated with primitive behavior” (Van Huis et al. 35). Joseph Bequaert makes a similar case in his 1921 article “Insects as Food,” arguing that it “can be attributed only to prejudice, that civilized man of today shows such a decided aversion to including any six-legged creature in his diet.”* In other words, one of the unconscious attitudes reflected in the taboo against entomophagy is the belief that Western culture has advanced beyond the so-called “primitive” stage of human development, relegating to a distant — and shameful — past such backwards practices as eating insects.
Unfortunately, this disparaging attitude toward entomophagy negatively influences the eating habits of people who have maintained the tradition of consuming insects, arachnids, mealworms, and other creatures that disgust the Western palate. For example, the FAO makes the case that people in Southeast Asia and sub-Saharan Africa have reduced their consumption of insects in an effort to emulate Western standards and norms (Van Huis et al. 39; Halloran and Vantomme). This is especially true of converts to Christianity. Indeed, there is evidence of Christian missionaries explicitly discouraging people from eating insects on the basis that doing so is “a heathen custom.” One Malawi convert is on record as saying that “he would never taste such things [i.e., winged termites], valuing them as highly non-Christian” (Carl-Axel Silow qtd. in Van Huis et al. 39). This is an old story, and it fits within a larger history of Western ethnocentrism:
In 25–50 percent of Native American tribes, … there existed a long history of insect eating; yet because Western cultures lacked strong cultural experience with the practice and considered it primitive, they discouraged and suppressed it among Native American tribes when these two cultural groups began to interact in the eighteenth and nineteenth centuries. Western cultures inflicted similar damage on other indigenous groups, including many in sub-Saharan Africa, with the goal of modernizing or westernizing them. This cultural suppression was still prevalant [sic] at the end of the twentieth century. As a result, entomophagy has almost disappeared from Canada and the United States and is showing signs of abating in West Africa. (Van Huis et al. 39)
The abhorrence of insects as a food source should be challenged, and not just for the sake of more balanced cultural relations between the West and those societies that practice entomophagy. At a time when human population growth poses a serious threat to global ecology, people everywhere need to rethink how their diets affect the environment. In his National Geographic article “A Five-Step Plan to Feed the World,” Jonathan Foley, director of the University of Minnesota’s Institute on the Environment, makes the case that dietary changes are imperative if we intend to feed the Earth’s growing human population without doing irreparable damage to the planet. He argues that “finding more efficient ways to grow meat and shifting to less meat-intensive diets — even just switching from grain-fed beef to meats like chicken, pork, or pasture-raised beef — could free up substantial amounts of food across the world.” It would also do a great deal to mitigate animal agriculture’s devastating environmental impacts.**
I would push Foley’s point much further, urging the widespread adoption of plant-based diets, and especially veganism. But in the context of omnivorous food culture, the West has much to learn from those societies that practice entomophagy. Not only are insects a protein-rich food suitable for human consumption, but they can also be used for animal feed, and they are significantly less land- and water-intensive than traditional livestock. The environmental, health, and social benefits are many. Here are just a few:
Insects have a high feed conversion efficiency because they are cold-blooded. Feed-to-meat conversion rates (how much feed is needed to produce a 1 kg increase in weight) vary widely depending on the class of the animal and the production practices used, but nonetheless insects are extremely efficient. On average, insects can convert 2 kg of feed into 1 kg of insect mass, whereas cattle require 8 kg of feed to produce 1 kg of body weight gain.
The production of greenhouse gases by most insects is likely to be lower than that of conventional livestock. For example, pigs produce 10–100 times more greenhouse gases per kg of weight than mealworms.
Insects can feed on bio-waste, such as food and human waste, compost and animal slurry, and can transform this into high-quality protein that can be used for animal feed.
Insects use significantly less water than conventional livestock. Mealworms, for example, are more drought-resistant than cattle.
Insect farming is less land-dependent than conventional livestock farming.
Insects provide high-quality protein and nutrients comparable with meat and fish. Insects are particularly important as a food supplement for undernourished children because most insect species are high in fatty acids (comparable with fish). They are also rich in fibre and micronutrients such as copper, iron, magnesium, manganese, phosphorous, selenium and zinc.
Insects pose a low risk of transmitting zoonotic diseases (diseases transmitted from animals to humans) such as H1N1 (bird flu) and BSE (mad cow disease).
Livelihood and Social Benefits
Insect gathering and rearing can offer important livelihood diversification strategies. Insects can be directly and easily collected in the wild. Minimal technical or capital expenditure is required for basic harvesting and rearing equipment.
Insects can be gathered in the wild, cultivated, processed and sold by the poorest members of society, such as women and landless people in urban and rural areas. These activities can directly improve diets and provide cash income through the selling of excess production as street food.
Insect harvesting and farming can provide entrepreneurship opportunities in developed, transitional and developing economies.
Insects can be processed for food and feed relatively easily. Some species can be consumed whole. Insects can also be processed into pastes or ground into meal, and their proteins can be extracted. (Halloran and Vantomme)
There has been some modest movement toward entomophagy in the United States. For example, there is a growing demand for cricket flour, which is used in everything from cookies to protein bars, and educator-friendly information about the dietary benefits of insects is readily available. Just type “entomophagy infographic” into Google, and you will find dozens of examples. Two of my favorites can be found here and here. There are also organizations that advocate for insects as a sustainable food source. Little Herds is a good example. And yet a typical American market is unlikely to stock a single item that makes use of grasshoppers, crickets, termites, or other insects that the FAO recommends as nutritious and sustainable food sources. The disgust toward entomophagy — and the unconscious attitudes it reflects — effectively deprives a sizable portion of the world’s population of a perfectly sensible source of nutrition.
The commitment to “progress” and “modernity” has led us to the brink of environmental catastrophe. Large-scale industrialization, a voracious fossil fuel industry, a blind faith in free markets, and rampant consumerism are a few of the forces that have contributed to the problem. But there are deeper forces at work as well. One such force is the idea that the West — its culture, its religion, its politics, its technology — represents “progress,” and that those cultures that embrace different values and customs are backwards, primitive, and morally deficient. This ethnocentrism has deep roots and manifests itself in many ways, and it has proven remarkably adept at expanding its sphere of influence. Indeed, one of recent history’s great tragedies is how so many of the world’s cultures have accepted this ethnocentric narrative. The widespread enthusiasm for Western food norms — including the disgust toward entomophagy — is but one example.
There are those who believe technological innovation will save us from the worst of our accelerating environmental degradation, allowing us to progress out of the crisis into which “progress” has delivered us. But perhaps the most progressive thing we can do is to listen to those whose customs are the objects of Western disgust. There are communities of people in the world who hold a wealth of traditional knowledge, yet the practices derived from that knowledge are too often dismissed as “primitive,” or as belonging to “a heathen custom.” Western ethnocentrism is, in this regard, maladaptive. We need to learn from each other. The future of our species may depend on it. But to do so we must first become aware of how our unconscious attitudes make us averse to cultural practices that can benefit us and our shared environment. Entomophagy is one such practice that the West would be wise to reconsider.
*In his work on entomophagy, Joseph Bequaert, like Lydia Maria Child, draws attention to the fact that the Bible permits the eating of insects.
**Kip Andersen and Keegan Khun make this point in convincing fashion in their 2014 documentary film Cowspiracy: The Sustainability Secret.
Christian theology is at odds with itself when it comes to the natural world. On the one hand, Christianity promotes a deep-seated aversion to nature, which is said to be corrupted by sin. Joseph Campbell touches on this aversion in The Power of Myth, his famed series of interviews with Bill Moyers, noting that “it’s in the biblical tradition, all the way, in Christianity and Islam as well. This business of not being with nature, and we speak with a sort of derogation of the ‘nature religions.’ You see, with that fall in the garden, nature was regarded as corrupt. There’s a myth for you that corrupts the whole world for us. And every spontaneous act is sinful, because nature is corrupt and has to be corrected, must not be yielded to.” This contempt for nature, and in particular for the natural functions of the human body, permeates the canon of Judeo-Christian myth and has done much to degrade Western attitudes toward the environment.
On the other hand, Christianity teaches that human beings have a responsibility to care for the earth. The bounty of nature is imagined as a trust, with humanity acting as both trustee and beneficiary. The theology of environmental stewardship has received renewed attention following Pope Francis’s second encyclical, Laudato si’, a document that urges environmental protection as a Christian duty. Francis summarized the Christian position vis-à-vis environmental stewardship in his 2014 address to the European Parliament: “Each of us has a personal responsibility to care for creation, this precious gift which God has entrusted to us. This means, on the one hand, that nature is at our disposal, to enjoy and use properly. Yet it also means that we are not its masters. Stewards, but not masters. We need to love and respect nature, but instead we are often guided by the pride of dominating, possessing, manipulating, exploiting.”
The contradictions between these two positions seem intractable. How are we meant to reconcile the idea of nature as a gift with the belief that nature is “fallen” and corrupt? It’s worth remembering that the myth of the fall imagines Eden as containing within itself the source of sin (and thus also our own deaths), just as it shames the natural condition of the human body. Much loathing of ourselves and our environment grows from this root. And yet there is indeed a Christian imperative to care for what the Pope’s namesake, Saint Francis of Assisi, called “our Sister, Mother Earth, who sustains and governs us, and who produces various fruit with coloured flowers and herbs” (qtd. in Francis, Laudato si’).
I was reminded of this imperative by two short articles I recently came across in The Spectator, an underground newspaper published by students at Indiana University from 1966 to 1970. Both articles draw on biblical language to make the point that we have abdicated our responsibilities toward the environment. The lead article uses familiar phrases from Genesis to argue that both environmental degradation and human want are the consequences of our irresponsibility and ignorance: “We are fruitful and multiply so that overpopulation and starvation are commonplace, subdue our planet by destroying it, exercise dominion with poison and killing” (Williamson). By echoing language taken directly from the first book of Genesis (see 1:28*), this sentence makes the point that modern humanity’s mistreatment of the earth stands in opposition to the doctrine of environmental stewardship.
More striking still is the second article, which recasts the seven days of creation as a perverse undoing of ecological balance and planetary health. Ironically titled “Last Chapter of Genesis,” the article reads:
In the end, there was earth, and it was with form and beauty; and man dwelt upon the lands of the earth, and meadows, and trees — and said, “Let us build our dwellings in this place of beauty.” And he built cities and covered the earth with concrete and steel. And the meadows were gone, and man said, “It is good.”
On the 2nd day, man looked upon the waters of the earth. And man said, “Let us put our wastes in the waters that the dirt will be washed away.” And man did and the waters became polluted and foul in their smell. And man said, “It is good.”
On the 3rd day, man looked upon the forests of the earth and saw they were beautiful. And man said, “Let us cut the timber and grind the wood for our use.” And man did and the lands became barren and the trees were gone. And man said, “It is good.”
On the 4th day, man saw that animals were in abundance and ran in the fields and played in the sun. And man said, “Let us cage these animals for our amusement and kill them for our sport.” And man did. And there were no more animals on the face of the earth. And man said, “It is good.”
On the 5th day, man breathed the air of the earth. And man said, “Let us dispose of our wastes in the air for the winds shall blow them away.” And man did. And the air became heavy with dust and choked and burned. And man said, “It is good.”
On the 6th day, man saw himself and, seeing the many languages and tongues, he feared and hated. And man said, “Let us build great machines and destroy, lest others destroy us.” And man built great machines and the earth was fired with rage. And man said, “It is good.”
On the 7th day, man rested from his labors and the earth was still, for man no longer dwelt upon the earth. And it was good…
It’s difficult to read “Last Chapter of Genesis” without sharing in its misanthropic attitude, especially when the current ecological crisis is considered alongside the myth of Eden. Not that its misanthropy is out of step with mainstream environmental consciousness. Even Laudato si’ is misanthropic, especially in its salutation, which bemoans the fact that the earth “now cries out to us because of the harm we have inflicted on her by our irresponsible use and abuse of the goods with which God has endowed her. We have come to see ourselves as her lords and masters, entitled to plunder her at will. The violence present in our hearts, wounded by sin, is also reflected in the symptoms of sickness evident in the soil, in the water, in the air and in all forms of life.” Francis leaves little doubt that the responsibility for environmental degradation rests squarely with humanity.
Perhaps Kurt Vonnegut (one of America’s great misanthropes) was correct when he suggested that the only hope for a peaceful, verdant future is the possibility that human beings may still evolve out of our highly destructive hyperintelligence. In his view, this means regressing (or is it progressing?) into a species of comparatively unintelligent seal-like creatures.** But even this participates in the Christian aversion to nature that Campbell identifies, for Vonnegut’s story exploits an antagonism between humanity and nature that can only be resolved when one or the other is purged from existence. Rather than deepening this antagonism, we need to develop a synthesis between culture and nature. Only when we move beyond the sort of dualistic thinking that Zen philosopher Daisetz Suzuki characterized as “God against man, man against God, man against nature, nature against man, nature against God, God against nature” (qtd. in Campbell) will we begin to come to terms with how thoroughly we are invested in the natural world.
It’s unfortunate that western thinking continues to be shaped so powerfully by Judeo-Christian myth, especially when it comes to our attitude toward nature. So long as people think of nature as a “gift” from God that is now “at our disposal” (and here Francis seems to fall into the ideological trap of “ownership” that he criticizes in so many of his writings), we are unlikely to experience the behavioral revolution that our ecological crisis demands. In the meantime, we should seize ground wherever we can. If Christian thinking insists upon dominion over nature, let’s commit to a wise and noble dominion. We can then hope that the sort of responsible stewardship urged by The Spectator so many decades ago will serve as a catalyst toward the deeper work that remains to be done.
*Genesis 1:28 — Then God blessed them, and God said to them, “Be fruitful and multiply; fill the earth and subdue it; have dominion over the fish of the sea, over the birds of the air, and over every living thing that moves on the earth.”
The first time I entered a desert was fourteen years ago, in the winter of 2002. I drove west from New Jersey to Indiana, and then southwest to Gallup, New Mexico, where I rested for the first time since leaving home. From Gallup I continued west into the Painted and Sonoran Deserts, before cutting back through the Chihuahuan Desert on my way east through Texas to Alabama, and then northeast to New Jersey.
Looking back on that journey, I’m certain I entered the desert without realizing I had arrived. My concept of the desert at that time owed too much to illustrated stories of Moses wandering through the Sinai, or The Road Runner leaving Wile E. Coyote in a cloud of dust. These were barren landscapes full of danger and desolation, the palettes bleak, and death abounding. But what I found in the American Southwest was vibrant and full of life.
America’s deserts are rugged, but they also capture color. The reds and browns of the soil. The blues and purples of the distant mountains. The sky often rich with clouds, fast moving and prone to sudden showers you can see as swathes of gray against the horizon. But most surprising, for me at least, are the varied greens and yellows: a landscape alive with flora fed by those intermittent rains.
The same can be said for deserts around the world. They too are alive. Earlier this year I drove through a portion of the Arabian Desert on my way from Dubai, where I live, to Muscat, the capital city of Oman. The landscape between these cities is marked by shifting dunes and dark-rocked mountains that cut through the sand, yet even here the earth shows signs of life. My two young sons, looking in confusion from the backseat, didn’t believe me when I told them we were in the desert. They couldn’t see it any better than I could when I first encountered it back in 2002.
This failure to recognize the desert for what it is betrays a kind of ecological know-nothingness — excusable in a child, yes, but hardly so in an adult. I have lived my life content in ecological ignorance, and while I’ve learned to see the desert through the flora, I can’t name a single one of those plants, let alone explain their relationship to each other and the land from which they grow. They are a blur, an impression.
An acquaintance of mine suggested that rather than make a New Year’s resolution, we would be better served by committing to a “theme” for the year, the idea being that a theme offers a more nuanced, expansive way to effect change than a resolution. A theme. Something around which to organize our thoughts and actions. I like this idea, so I choose for myself the theme of “ecology.”
I want to think of ecology broadly, as encompassing what Félix Guattari calls the “three ecologies”: natural ecology, social ecology, and mental ecology. He shows how these ecological registers come to bear on each other; when one of them falls out of balance, the others follow. In other words, the three ecologies are integral parts of a larger ecosystem, which speaks to my earlier point about ecological know-nothingness. Being blind to the desert is to be blind to much more, including ourselves.
Throughout his ecological writing, Gary Snyder stresses the importance of developing an intimacy with our surroundings, including learning the names of the plants that grow around us. This means looking closely and accounting for what’s there, noting the details and the subtle variations that occur over space and time. This means staying put and watching where we step.
Knowing our neighbors’ names may be the beginning of basic civility, but it’s also — at a deeper level — the beginning of responsible coexistence, which is necessarily bound up in some degree of self-knowledge. To watch where we step is to look at our own feet. I am writing this from my balcony in Dubai, approximately one kilometer from the Persian Gulf. It is December 27, 2016. The weather is mild. I am looking at a tree. It’s a mature date palm, its fronds erect and in excellent health.
There is too much at stake at this critical juncture to continue in ecological ignorance. This is true for the deserts, the mountains, the forests, the rivers, the oceans. But it’s also true for human sociability, for the stability of our communities, for our bodies and our minds. If there is any hope of avoiding the worst of our accelerating ecological crisis, it very well may depend upon our learning to see clearly what surrounds us every day.
Guattari, Félix. The Three Ecologies. Translated by Ian Pindar and Paul Sutton. Bloomsbury, 2014.