The Myth Of The Noble Savage


Tristan Søbye Rapp is co-founder of The Extinctions, a site that delves into the mass vanishing of species over the last 50,000 years.

“I am as free as Nature first made man,

Ere the base Laws of Servitude began,

When wild in woods the noble Savage ran.”

Dryden, “The Conquest of Granada by the Spaniards,” 1672

The State Of Nature

Before the engine, before the city, before the plow and the chain and the steel axe, there was a time when Mankind lived in peace and simplicity. It was a hard life, certainly, often brutish and short, as the phrase goes, but it was free, egalitarian and spent in constant communion with nature. From this primordial state, some peoples rose, or fell, depending on one's view, but a few persevered. They remained unto the very threshold of modernity — some remain even unto today — the last of the "noble savages."

Or so the story goes. Reports of the unique insight and value of Indigenous land stewardship are frequent, and it is often claimed of so-called primitive tribes — from Australia’s Aboriginals to the Amazonian Indians — that they possess a cultural concern for nature not matched in the industrially minded West. So widespread is this conception that it is scarcely interrogated, even within intellectual circles, even if some token swipe may be made at the phrase “noble savage” itself.

The origins of the noble savage as an archetype are often traced back to 18th-century philosopher Jean-Jacques Rousseau, specifically his "Discourse on Inequality," where he is said to have lavished praise upon the virtues of the primitive savage as far superior to his own decadent civilization. To Rousseau, the noble savage finds his wellspring in a form of moralizing primitivism, arising as a reaction against the arrogance of Enlightenment rationalism.

The principal issue with this narrative is that it is wrong. As anthropologist Ter Ellingson details in "The Myth of the Noble Savage," Rousseau was in fact not nearly as effusive about "savagery" as is often supposed, and indeed never uses the term "noble savage" at all in his writing. As indicated by the title of Ellingson's book, there is a degree to which the entire concept of the "noble savage" is a myth — not merely in the sense that it is not true anthropology, but that it is not even a true, historical belief about anthropology.

Ellingson traces the "true" genealogy of the noble savage idea not to Rousseau but to one Marc Lescarbot. Lescarbot was a French lawyer, traveler, poet and writer of the late 16th and early 17th centuries, who maintained an amicable relationship with the Canadian First Nations people, the Mi'kmaq, whom he held in generally favorable regard. It is in his writings that we first encounter the phrase "Les Sauvages sont vraiment Nobles," rendered in English literature as "The Savages are truly Noble."

Lescarbot meant what he wrote quite literally, associating his admiration for these "savages" — their pursuits of hunting, hospitality and warrior culture — with the blue-blooded nobility of European aristocracy. This is far from the term's meaning as we understand it today. We do, however, see the kernels from which later conceptions might spring.

Lescarbot speaks of the “sauvages” as being beyond (or, in a developmental sense, prior to) the “mine and thine” culture of civilized peoples. It is their generosity and reciprocity that he is principally interested in, paralleling it as he does to the conventions of conduct attached to European nobles. Nevertheless, the echoes of later ideas are also present — implications of non-ownership and a perhaps lesser materialism.

From these investigations into the development of one idea, we must now temporarily turn to that of another: the concept of prehistory. It is a strange thing to consider that until only the last two centuries or so there was really no conception of the "prehistoric" in the Western imagination. The record of the world was traced in Scripture to its utmost origins, and what blanks remained — the unrecorded intervals in the heathen wilds between Babel and Christianization — were merely local gaps in an otherwise comprehensive history.

By the 16th century, broad scholarly agreement had arisen that the world was created circa 4000 BC, a date arrived at through literal calculations of the biblical genealogies. The gradual collapse of this narrative is sufficient fodder for a whole essay of its own, but in short, it spanned some 100 years, beginning with the works of Buffon and later Cuvier and culminating with the publication of Darwin’s “On The Origin of Species” in 1859.

The sheer scale of the philosophical and imaginative transformations entailed by these discoveries — of our world's true scope and antiquity — is difficult to summarize pithily. It was the opinion of American polymath Guy Davenport that the discovery of the Archaic, of "deep time," was among the great inventions of our age. Crucially, from this new conception of the Earth's antiquity sprang a new conception of Man's.

The traditional view of human history had always been static, if not marked by regression, with Adam and Eve created in a state of perfection from which our ancestors subsequently Fell. With the advent of the prehistoric, however, arose something new — a developmental view of history, wherein the higher arises from the lower, and Man began not, as it were, through descent, but ascent.

From the very beginning of colonial contact, Europeans associated the tribal peoples they encountered with primitivity — not merely at the individual level, but also at the cultural. Writers like Lescarbot sought explanations for the nature of Amerindian society in Europe’s ancient past, drawing comparisons between their seemingly simple, naked existence and the fabled Golden Age of Greek and Roman legend. When the advent of paleontology plunged the scope of human history drastically backward into an unremembered past, the state of modern-day “savages” naturally presented itself as an analogy for Prehistoric Man.

The result was the indelible association, particularly in the public mind, of what might be termed in Victorian fashion “Prehistoric” and “Savage Man” into one unified conception of the “Primitive Man.” At its simplest, the enduring sense — amongst writers both progressive and conservative, lay and academic — has been that the two are in a sense “of a piece.” When the image of the one changed, invariably this would spill over into the perception of the other. With this linkage established, the scene was set for the last developments: The concurrent rises of Indigenous and ecological awareness.

The environmental or conservation movement, now a universal facet of thinking across the globe, had its origins in 1800s North America. Its precedents could be found in Europe in the form of the Romanticist movement, typified by such poets as Keats and Shelley, yet the Romantic attitude to nature remained essentially aesthetic. In Europe, nature was overwhelmingly a domesticated affair, conceived of primarily in the form of the aged and little-changing agrarian landscape.

Things were different in North America. Not only did the sheer scale and grandeur of its uncultivated wildernesses captivate the minds of settlers and explorers from the earliest days, but they also provided a frame of contrast. Where change was slow in the European countryside, it was both rapid and seismic in America.

Against the backdrop of the swift destruction of the forests of Ohio and Pennsylvania, the extermination of the bison and the extinction of the passenger pigeon, the nascent environmental movement was born. The displacement and destruction of America’s natural heritage was almost invariably correlated with that of its aboriginal population, and a conceptual link was therefore soon established between the fate of the Amerindian and of the natural environment.

Entering the 20th century, the rise of Indigenous awareness occurred even as the conservation movement was swiftly gaining steam. The two were interlinked from the very beginning. Historically, the association of native peoples with "primitivity" had been derisive or patronizing at best, but now it was reframed: American Indians, New Zealand Māori, Amazonian natives and more were seen as stewards of their traditional landscapes, over and against the implicitly rootless, exploitative and commercially obsessed Western culture. The culmination of this trajectory was such clichés as the (in)famous "Crying Indian" advertisement. (Its star was later revealed to not even be a Native American.)

We arrive, then, at the meeting point of all our threads. The total association in popular culture of Indigenous peoples and environmental stewardship has birthed what academics have termed the “ecologically noble savage” — humanity in the “state of nature,” perfectly aligned with its environment, conscious of its limitations and compassionate toward its surroundings.

Furthermore, through the deeply ingrained associations between modern tribal societies and the prehistoric past, this image, once formed, was subsequently projected backward through time: We were all ecologically noble savages, once. Some stayed true to the path, remaining faithful to nature and the natural world, whilst others strayed, deviating into the Malthusian traps of hierarchy, intensification and exploitation. It is an effective narrative, an inviting one. Only one uncertainty remains, lurking in the back of all conversations on the topic: whether any of it is true.

The Hunting Ape

A great deal of the long and often labored debate about the role and impact our prehistoric ancestors had on nature may be summarized with a single question: What happened to the woolly mammoth? Like a crime scene begging for a smoking gun, the vanishing of the mammoth, alongside the vast cadre of large game that accompanied it — the woolly rhinoceros, the cave lion, American horses and more — has been the topic of much curiosity for over two centuries.

The very concept of extinction, that a species might altogether vanish from the face of the Earth, is a recent one, and was startling at first. The scientific world became aware of the mammoth and its kind well before it accepted its disappearance: When Thomas Jefferson (whose polymathy extended to dabbling in then-nascent paleontology) first described the bones of giant sloths and mastodons from the U.S., he still insisted on their existence somewhere in the wilds beyond the frontier. In time, however, it became clear that these animals no longer existed anywhere save for their remains.

Yet unlike the dinosaur, also around then being described by science for the first time, the mastodon and the mammoth were of a different age, less thoroughly ancient. They might no longer coexist with modern Man, but they had once; associated with their remains were stone spear tips and other artifacts of unmistakably human origin. Evidently, then, these great beasts had endured all the way down unto the very threshold of human history, a fact that raised two great and obvious questions — where did they go, and why?

A general skepticism has long dogged the very notion that early humans might have played a key role in the prehistoric extinctions. Our ancestors were too few, too primitive, their impact simply could not have been that great — so the objections often run. The cost-benefit analysis of heavily hunting large game simply does not add up. The fault, then, must rest on the climate and the ending of the Ice Age, which transformed habitats and made it impossible for the prehistoric megafauna, the era’s large beasts, to keep up.

It is a compelling narrative — some 15,000 years ago the Ice Age terminated and with it the Ice Age fauna; their time ended, and so did they. Unfortunately, it does not hold up. For one thing, the chronology is hopelessly confused: Most extinctions occurred either well before or substantially after the actual end of the glacial period, with only a comparatively small subset clustering around this date. Furthermore, they occurred at wildly different times depending on the region, barely corresponding to any major climatic shifts. What the extinctions do consistently correlate with, however, is the date of human arrival at a given location.

Furthermore, increasing evidence has pointed toward the impact of our ancestors well predating the emergence even of our species in its modern form. The early evolution of complex, intelligent hominins has been connected to the decline of saber-toothed cats, with one study implicating them in a startling decline of nearly 99% in the functional richness of all large carnivores in eastern Africa between 3.5 and 1.5 million years ago.

Here, in the cradle region of humankind, our ancestors first began intruding into predatory niches, abandoning the near-herbivory of our simian relations — and the world gave way before us. Nor was this early impact necessarily restricted to competing large predators: A Swedish study published in June links the spread of advanced hominins over the same period to the collapse of terrestrial turtle species throughout Africa and Eurasia — a pattern of extinction affecting neither their aquatic relatives nor their terrestrial counterparts in regions our ancestors had not yet reached.

When, eventually, modern humans crossed the Beringian wastes to North America and the seaways into Oceania, they encountered giants such as the American giant tortoises of the genus Hesperotestudo, some rivaling the size of their surviving Galápagos cousins, or the bizarre horned Meiolaniidae. All were swift to join their Afro-Eurasian relatives in extinction, their thick shells useless defenses against the spear and axe.

So much for the paleontological and archaeological record. What insights can be gleaned from turning instead to the modern and historic anthropological record? If such was the conduct or, at any rate, the impact of “Prehistoric Man,” what of modern peoples existing in more primitive conditions?

Indigenous peoples such as the Tukano of the northwestern Amazon are often held up as exemplars of environmentally conscious and sustainable conduct. The Tukano inhabit a region dominated by the so-called blackwater rivers, highly acidic, tannin-loaded waterways low in nutrients that cross a country of poor jungle soils.

A host of ingenious conservation practices are ascribed to them, including the deliberate spacing of settlements far apart to conserve resources, a cultural prohibition against the deforestation of the riversides and a carefully managed system of fisheries. But as anthropologist Allyn Maclean Stearman points out, all these practices most likely constitute adaptive strategies to a harsh environment without which life would simply be impossible. They are not deliberate attempts at conservation; indeed, the Tukano appear to harvest their waterways at very close to the maximum possible level.

Many Indigenous populations undoubtedly have had only small to negligible impacts on their local environments, but further investigation tends consistently to show that this is more due to their low numbers and limited means than any deliberate expression of consideration. Cultures have often been labeled as utilizing “conservation practices” due to a focus on the effects of their behaviors rather than whether or not they exhibit conscious concern for the environment.

The danger here is a conflation of action and intention, such as presuming that small, marginal populations are deliberately conserving resources, when in fact they simply lack the means to substantially exploit them. The historical disbelief in extinction as a concept was not a specific artifact of Western culture, but merely a reflection of broader human tendencies — many aboriginal cultures do not practice any resource management strategies because they do not actually believe resources are finite.

A population living sustainably purely for circumstantial reasons will most likely not remain sustainable once said circumstances change. The truth and consequences of this can be observed in the many island colonizations in historical or late prehistoric times. The settlement of Oceania by Austronesian peoples — the Polynesians and Micronesians — and their Melanesian followers saw a wave of extinctions sweep across the island world, with flightless and ground-nesting birds left particularly decimated.

Where Indigenous peoples have gained access to modern technologies and economic pursuits, including oil extraction, mining and real estate development, they have consistently chosen to embrace these new ways of life, seeing them as means of attaining greater prosperity. “Conservation” as a concept entails not merely an inability to exert change but also a conscious application of restraint. True conservation awareness is evinced only when the possibility of unsustainable exploitation is both present and also actively rejected.

Though the image of the “ecologically noble savage” has been spread with the intention of benefitting and praising Indigenous peoples, this has frequently proved a dubious help at best. The issue with being ascribed values and beliefs you do not actually hold, nor have ever claimed to, is that you will often be held accountable for perceived “violations” of them.

Even aside from the purely intellectual offense of their untruth, narratives of perceived Indigenous “naturalness” and ecological harmony have served to burden tribal peoples across the world with unreasonable and often impossible expectations, often accompanied by serious repercussions when they inevitably fall short.

The term "myth" has two primary definitions that are often conflated: The first and more trivial is simply that of a false story or notion. The second and more profound is of a cycle of beliefs or traditions, an idea or ideas around which we structure and organize our worldview. Not all myths in this second sense are wrong, but if our investigation so far has been to see into which of these categories the "ecologically noble savage" fits, the only honest conclusion must be both.

Interrogated under the hard light of evidence, the primordial harmony of humankind and nature appears to dissipate, vanishing like a pleasant mirage. What we are left with in its stead is a profound ambiguity, one that impinges uncomfortably on the major issues of our time — of humanity's place in nature, of the planet's future and of our role and position within it. The noble savage is gone, and man is in consequence forlorn. Yet where one myth dies, another may be born. Honesty about our past may yet prove the first prerequisite for an open and fruitful consideration of our future.

The Abolition Of Eden

At the root of our concerns with noble savagery, primitive conservation and ancient extinctions lies a question of human ecology and where humanity fits in the grand scheme of nature. Part of the difficulty is the very ambiguous character of both these terms — “human” and “nature.” Once we include pre-Homo sapiens ancestors and relatives like the Neanderthals and Homo erectus, our very concept of what it means to be human becomes stretched and the matter takes on an existential character. Similarly, the “nature” we speak of today is not an obvious, universal concept, shared across cultures, but one with a particular origin traceable back to the modern West.

Even today, much debate rages about the concept, with the topic of invasive species a particular site of disagreements about what constitutes “nativity” and what role human agency should play. For some, the very suspicion of human manipulation is close to disqualifying for any claim of “naturalness,” our species being seemingly possessed of a pernicious Midas touch, anything we contact rendered false and artificial by the association. Others allow a more expansive role for human influences in nature.

But all modern conceptions wrestle with the same fundamental tension: That at its very core, what we envision as natural seems to be about juxtaposition, defined by contrast against the cultural and manmade. If a landscape is "natural," we mean that it is not cultivated, not sculpted, not the product of human hands. If a wilderness is "wild," it is so because humanity seems absent in it.

It is questionable, however, whether it is even meaningful today to speak of such a notion of the wild and natural as isolated from mankind's touch. In a world where atmospheric carbon reaches every corner of the globe, where fertilizing nitrogen rains from the sky hundreds of miles from its source and microplastics are found even in the glaciers of Antarctica, the "natural" seems nowhere to be found.

Nor is it obvious that it was ever a truly meaningful distinction: Even the “wilds” of Yellowstone were populated by Amerindian tribes for millennia prior to their removal, while hunter-gatherers have been shaping landscapes across the world with controlled fires for more than 10,000 years. When the great glaciers pulled back from Europe at the end of the last Ice Age, the warming world they unveiled was not truly “virgin soil,” unmarred by Man, but had in fact been populated for some 30,000 years already.

The ecosystems that developed in the millennia that followed were defined from the outset by the absence of species, many of which had already been hunted to extinction long before the end of the great chill. They were shaped at a fundamental level by the seismic yet indirect consequences of our ancestors. As noted earlier, our predecessors may have begun altering the environments around them even before they fit our modern conceptions of what it means to be truly human.

So much, then, for Man against Nature. For better or worse, the great hunting ape is an ape, and all attempts to delineate neatly between the domains of the cultural and natural seem in the end to prove essentially aesthetic — not so much the absence of humanity’s touch as the mere appearance of that absence. This can surely tell us little. Perhaps what is needed is a fundamental reframing of our perspective. If there was never an age in which our ancestors did not shape the world around them, it is because there was never an age in which they were apart from it.

Our species is not an ordinary one — we are, to borrow a phrase, a rational animal; our impacts are measured on the scale of asteroids and volcanoes. Whilst this reality is undoubtedly distressing due to the destruction we cause, we also possess a unique ability to think, consider and act beyond mere necessity. As a species, we can choose whether to repeat the actions and conduct of our ancestors. A shadow of determinism lies across all discussions of the noble savage and the lifestyles of prehistory, an unstated implication that the nature of our predecessors indelibly shapes the opportunities of our future. But the reality is that while we, our species — Homo sapiens — are no less the sapient apes our Ice Age forebears were, that sapience is precisely what imbues us with autonomy and an ability to both learn and decide for ourselves.

Cultural landscapes, from the traditional wood pastures of Europe to even suburban yards, are often havens of substantial biodiversity. If the arrival of the Māori to New Zealand was heralded by a wave of extinctions, then by the modern age they had developed the concept of “kaitiakitanga” — of traditional land stewardship and conservation. Even the Amazonian Tukano, if only out of necessity, truly have found a way to live sustainably with their surroundings.

Familiarity with one’s land breeds knowledge, which may indeed be of vital import to its conservation and sustainable use, but it does not necessarily breed the disposition to apply said knowledge in this manner. That, all evidence suggests, demands a conscious and rare commitment. That it has not often been made in the past does not mean it cannot be made in the future; lessons learned even through utilitarian pursuits may yet be applied to higher ends.