We Ran Out of Mummies Because Europeans Ate Them
You'd think there'd be tons of mummies in the world, given how routine a practice it was in ancient Egypt. We should be up to our ears in friggin' mummies. The only reason we aren't is, well ... we ate them.
"We," of course, refers to the Europeans who "discovered" them and then turned around and derided the cultures they stole from as cannibal "savages" in a stunning display of the pot calling the kettle a pot. Due to either an unfortunate mistranslation or just plain pseudoscience, white people in the Middle Ages believed mummies had healing properties, so they crushed them up and used them in medicine. These medicines were usually administered orally, probably because mainlining mummies is a great way to get a literal blood curse.
It was such a popular treatment that eventually the mummy supply started running low, and just any dead body would do. King Charles II was known to drink a potion of alcohol and crushed human skull, henceforth reigning as the most goth king, and poor people who couldn't afford to buy bodies to eat would show up to public executions and haggle for a cup of the condemned's blood. Even Da Vinci, normally a distinctly un–Gwyneth Paltrowish man of science, was into it. The practice only started falling out of favor in the 18th century, when people figured out how illness works (turns out it was germs all along), but you could find mummy powder in German medical catalogs right through the early 20th century. That's right: We're only about as far removed from Charlie Chaplin as we are from prescription mummy.
The story begins in a kitchen. Two young women in bonnets sit by shelves of crockery and an open window. Warm, yellow light streams in, and we see just a hint of blue sky. Inside, the colors are browns, sepias, and ochres. The first woman’s dress: brown. The shadows on her dress: darker brown. The table by the wall and the copper pots above: light brown, medium brown, and dark brown.
If art historians and conservators are right about Interior of a Kitchen, Martin Drölling’s painting from 1815, the artist had help from a surprising source—the grave. Scholars believe he relied heavily on a popular pigment of his time—mummy brown—a concoction made from ground-up Egyptian mummies. From the 16th to the 19th century many painters favored the pigment, and it remained available into the 20th century, even as supplies dwindled. In 1915 a London pigment dealer commented that one mummy would produce enough pigment to last him and his customers 20 years.
Nineteenth-century painters Eugène Delacroix, Sir Lawrence Alma-Tadema, and Edward Burne-Jones were just a few of the artists who found the pigment useful for shading, shadows, and, ironically, flesh tones. (On discovering the source of the pigment, Burne-Jones is said to have been horrified and felt compelled to bury his reserves of mummy brown.)
But it wasn’t just artists who were using ground-up bodies. Since the 12th century, Europeans had been eating Egyptian mummies as medicine. In later centuries unmummified corpses were passed off as mummy medicine, and eventually some Europeans no longer cared whether the bodies they were ingesting had been mummified or not. These practices, however strange, are just some of the many ways people have made something useful out of death.
A Gross Misunderstanding
The eating of Egyptian mummies reached its peak in Europe by the 16th century. Mummies could be found on apothecary shelves in the form of bodies broken into pieces or ground into powder. Why did Europeans believe in the medicinal value of the mummy? The answer probably comes down to a string of misunderstandings.
Today we think of bitumen as asphalt, the black, sticky substance that coats our roads. It’s a naturally occurring hydrocarbon that has been used in construction in the Middle East since ancient times. (The book of Genesis lists it as one of the materials used in the Tower of Babel.) The ancients also used bitumen to protect tree trunks and roots from insects and to treat an array of human ailments. It is viscous when heated but hardens when dried, making it useful for stabilizing broken bones and creating poultices for rashes. In his 1st-century text Natural History, Roman naturalist Pliny the Elder recommends ingesting bitumen with wine to cure chronic coughs and dysentery or to combine it with vinegar to dissolve and remove clotted blood. Other uses included the treatment of cataracts, toothaches, and skin diseases.
Natural bitumen was abundant in the ancient Middle East, where it formed in geological basins from the remains of tiny plants and animals. It had a variety of consistencies, from semiliquid (known today as pissasphalt) to semisolid (bitumen). In his 1st-century pharmacopoeia, Materia Medica, the Greek physician Dioscorides wrote that bitumen from the Dead Sea was the best for medicine. Later scientists would learn that bitumen also has antimicrobial and biocidal properties and that the bitumen from the Dead Sea contains sulfur, also a biocidal agent.
While different cultures had their own names for bitumen—it was esir in Sumeria and sayali in Iraq—the 10th-century Persian physician Rhazes made the earliest known use of the word mumia for the substance, after mum, which means wax, referring to its stickiness. By the 11th century the Persian physician Avicenna used the word mumia to refer specifically to medicinal bitumen. We now call the embalmed ancient Egyptian dead “mummies” because when Europeans first saw the black stuff coating these ancient remains, they assumed it to be this valuable bitumen, or mumia. The word mumia became double in meaning, referring both to the bitumen that flowed from nature and to the dark substance found on these ancient Egyptians (which may or may not have actually been bitumen).
As supplies of bitumen became increasingly scarce, perhaps partially because of its wonder-drug reputation, these embalmed cadavers presented a potential new source. So what if it had to be scraped from the surface of ancient bodies?
The meaning of mumia shifted in a big way in the 12th century when Gerard of Cremona, a translator of Arabic-language manuscripts, defined the word as “the substance found in the land where bodies are buried with aloes by which the liquid of the dead, mixed with the aloes, is transformed and is similar to marine pitch.” After this point the meaning of mumia expanded to include not just asphalt and other hardened, resinous material from an embalmed body but the flesh of that embalmed body as well.
Take Two Drops of Mummy and Call Me in the Morning
Eating mummies for their reserves of medicinal bitumen may seem extreme, but this behavior still has a hint of rationality. As with a game of telephone, where meaning changes with each transference, people eventually came to believe that the mummies themselves (not the sticky stuff used to embalm them) possessed the power to heal. Scholars long debated whether bitumen was an actual ingredient in the Egyptian embalming process. For a long time they believed that what looked like bitumen slathered on mummies was actually resin, moistened and blackened with age. More recent studies have shown that bitumen was used at some point but not on the royal mummies many early modern Europeans might have thought they were ingesting. Ironically, Westerners may have believed themselves to be reaping medicinal benefits by eating Egyptian royalty, but any such healing power came from the remains of commoners, not long-dead pharaohs.
Even today the word mummy conjures images of King Tut and other carefully prepared pharaohs. But Egypt’s first mummies weren’t necessarily royalty, and they were preserved by accident by the dry sands in which they were buried more than 5,000 years ago. Egyptians then spent thousands of years trying to replicate nature’s work. By the early Fourth Dynasty, around 2600 BCE, Egyptians began experimenting with embalming techniques, and the process continued to evolve over the centuries. The earliest detailed accounts of embalming materials didn’t appear until Herodotus listed myrrh, cassia, cedar oil, gum, aromatic spices, and natron in the 5th century BCE. By the 1st century BCE, Diodorus Siculus had added cinnamon and Dead Sea bitumen to the list.
Research published in 2012 by British chemical archaeologist Stephen Buckley shows that bitumen didn’t appear as an embalming ingredient until after 1000 BCE, when it was used as a cheaper substitute for more expensive resins. This is the period when mummification hit the mainstream.
Bitumen was useful for embalming for the same reasons it was valuable for medicine. It protected a cadaver’s flesh from moisture, insects, bacteria, and fungi, and its antimicrobial properties helped prevent decay. Some scholars have suggested that there was also a symbolic use for bitumen in mummification: its black color was associated with the Egyptian god Osiris, a symbol of fertility and rebirth.
Although in the early days intentional mummification was reserved for pharaohs, the process gradually became democratized, first for nobility and other people of means. By the Ptolemaic (332 to 30 BCE) and Roman (30 BCE to 390 CE) periods, mummification had become affordable for even the budget-minded populace. Many of the Romans and Greeks living in Egypt at this time also underwent mummification, though they were motivated more by prestige than by a rewarding afterlife. Mummification became a status symbol: a jewel-encrusted mummiform was the ancient version of driving a new Audi (although much less fun for the user). Because these expats weren’t necessarily concerned with preserving their bodies for the long run, many of the Roman mummies discovered from this period are elaborate on the outside—some have been found covered in gold and diamonds—but collapsing within the wrappings. These mummies also made use of bitumen, resin’s affordable substitute.
These more recent, cheaper-made mummies were the ones that found their way to Europe; they were not the pharaohs some presumed them to be. Ambroise Paré, a 16th-century barber surgeon who opposed the use of mummy as a drug, claimed, “This wicked kind of drugge doth nothing help the diseased.” Paré noted that Europeans were most likely getting “the basest people of Egypt.” The results, though, would have looked the same; not until the 20th century could scientists tell which mummies were embalmed with bitumen.
What was the attraction of mummy medicine in early modern Europe? Likely the exoticism of mummies, at least in part. Europeans began exploring Egypt in the 13th century, and interest continued for hundreds of years (and still continues). In 1586 English merchant John Sanderson smuggled 600 pounds of mummy parts from an Egyptian tomb:
We were let down by ropes, as into a well, with wax candles burning in our hands, and so walked upon the bodies of all sorts and sizes, some great and small. They have no noisome smell at all, but are like pitch, being broken. For I broke off all the parts of the bodies to see how the flesh was turned to drug, and brought home divers heads, hands, arms, and feet for a show. We brought also six hundred pounds . . . together with a whole body. They are lapped in above a hundred double of cloth, which rotting and peeling off, you may see the skin, flesh, fingers, and nails firm, only altered black.
But as mummy medicine became popular in the West, merchants found new ways to satisfy demand. Tomé Pires, a 16th-century Portuguese apothecary traveling in Egypt, wrote that merchants “sometimes pass off toasted camel flesh for human flesh.” Guy de la Fontaine, a doctor to the king of Navarre, on a visit to Egypt in 1564 asked an Alexandrian merchant about ancient embalming and burial practices. The merchant laughed and said he had made the mummies he was selling. Karl H. Dannenfeldt, author of an influential 1985 article on the subject, “Egyptian Mumia: The Sixteenth Century Experience and Debate,” describes Fontaine’s scene: “The bodies, now mumia, had been those of slaves and other dead persons, young and old, male and female, which he had indiscriminately collected. The merchant cared not what diseases had caused the deaths since when embalmed no one could tell the difference.”
A Local Remedy
As authentic Egyptian mummies begat fraudulent ones, a medicinal shift occurred in Europe that made this trickery less necessary: there was a growing fondness for medicines made of often local, recently dead human flesh, bone, secretions, and even excretions. Scholars now refer to human-derived drugs generally as corpse medicine, which in early modern Europe included blood, powdered skull, fat, menstrual blood, placenta, earwax, brain, urine, and even feces. Richard Sugg, author of Mummies, Cannibals and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians, writes, “For certain practitioners and patients there was almost nothing between the head and feet which could not be used in some way.”
An epileptic (or more likely his doctor) seeking a remedy for seizures in 1643 might turn to Oswald Croll’s Basilica chymica, where, on page 101, can be found a recipe for an “Antepileptick Confection of Paracelsus.” The recipe’s central ingredient consists of the unburied skulls of three men who died a violent death. Another recipe for “Treacle of Mumy” has been quoted often, and for good reason. Croll writes, “Of Mumy only is made the most excellent Remedy against all kinds of Venomes.” He explains:
First . . . Chuse the Carcase of a red Man whole clear without blemish, of the age of twenty four years, that hath been Hanged, Broke upon a Wheel, or Thrust-through, having been for one day and night exposed to the open Air, in a serene time.
A footnote explains that Croll’s mumy is “not that Liquid matter which is found in the Egyptian Sepulchers. . . . But according to Paracelsus it is the flesh of a Man, that perishes by a violent death, and kept for some time in the Air.” That is, Croll defines mummy as the flesh of almost any corpse one could find.
What compelled people to believe that this form of (medicinal) cannibalism was both useful and acceptable at a time when reports of cannibalism in the New World shocked and horrified Europeans?
Medical historian Mary Fissell reminds us that common understandings of medicinal usefulness were once quite different. Medicine that produced a physiological effect—whether purging or excreting—was considered successful. It certainly makes sense that eating human remains might induce vomiting. Fissell also points out that many of the hormonal treatments developed in the 20th century were made from animals or their by-products. “They had to boil down a hell of a lot of mare’s urine to get that early hormone,” she says, “so we’re not even as far away [in terms of what we deem acceptable or gross today] as we might think.” Premarin, an estrogen-replacement medication derived from mares’ urine, is still widely used today. Bert Hansen, also a medical historian, points out that many medicines were selected through a process of trial and error. “A lot of medical treatment was only one step away from cooking.” He adds that people were “willing to taste and eat things that we now find disgusting” and that for a “middle-class household with no running water and no refrigeration . . . hands, bodies, everything is somewhat smelly and icky all the time. That’s life.”
Not all early doctors or apothecaries advocated the use of corpse medicine. Aloysius Mundella, a 16th-century philosopher and physician, derided it as “abominable and detestable.” Leonhard Fuchs, a 16th-century herbalist, accepted foreign mummy medicine but rejected the local substitution. He asked, “Who, unless he approves of cannibalism, would not loathe this remedy?” Mummy medicine devotees, including English King Charles II, seemed able to work around this uneasiness by differentiating between food and medicine. (Charles reputedly carried around his own homemade tincture of human skull, which was given the nickname the King’s Drops). In her book Medicinal Cannibalism in Early Modern English Literature and Culture, Louise Noble suggests that people were able to distance the final medicinal product from its original source—the human body—by convincing themselves it had somehow transformed into something new. For his part Sugg describes a perceived division between food and medicine when he quotes the religious writer and historian Thomas Fuller, who described mummy medicine as “good Physic [medicine], but bad food.”
Another powerful factor in people’s thinking about corpse medicine was just that—its perceived power. Egyptian mummies were linked to ancient knowledge and wisdom. Many scholars have noted parallels to the Catholic Eucharist, wherein Christ’s own flesh, through the belief in transubstantiation, is ingested for spiritual well-being. With the flesh comes what Noble describes as “the awesome potency of God,” perhaps the strongest corpse medicine of all. Noble suggests that the Catholic belief in the Eucharist led to an acceptance of human corpse medicine. She writes, “One is administered to treat the disease of the body and the other the disease of the soul. . . . Both reflect the belief that the essence of a past life has pharmacological power when absorbed into a life in the present.” As for Protestant mummy-medicine partakers, Noble proposes that eating mummy was a way of getting some power from flesh without having to participate in the Catholic Eucharist.
Noble compares the Christian beliefs in the sacred power of human flesh with the beliefs of Paracelsus, the Renaissance physician, botanist, and alchemist who saw an innate power in the human body. Behind this corpse pharmacology, Noble writes, “is the perception that the human body contains a mysterious healing power that is transmitted in ingested matter such as mummy.” Of this power Paracelsus wrote, “What upon the earth is there of which its nature and power are not found within the human being? . . . For all these great and wondrous things are in the human being: all the powers of the herbs [and] trees are found in the mumia.” Paracelsus’s mumia was not the foreign kind but the healing power of the human body, which he believed could be transferred from person to person.
Though these ideas may seem strange to 21st-century readers, Noble theorizes that the idea of transferring life force is not so different from how some perceive organ donation. Actor Liam Neeson spoke proudly about his deceased wife, Natasha Richardson, who had donated her organs. He told CNN’s Anderson Cooper, “We donated three of her organs, so she’s keeping three people alive at the moment . . . her heart, her kidneys, and her liver. It’s terrific and I think she would be very thrilled and pleased by that.” This somewhat spiritual and partly practical idea that a dead person can give life to another also hints at a sort of immortality for the dead, who get to live on in someone else.
The Father of Utility Gives Dissection a Good Name
Where did all of these early modern, corpse-medicine cadavers come from? Some resulted from accidental deaths, but many others were executed criminals. Great Britain’s Murder Act of 1752 allowed executed murderers to be dissected. The reasoning was twofold: first, it denied murderers a proper burial, thereby inflicting extra punishment beyond death (the language of the act actually claimed it as “further terror and a peculiar mark of infamy”); and second, it provided bodies for anatomical research and medical education. After dissection the bodies often went to apothecaries to be made into medicine.
As much as the sick may have appreciated using other people’s bodies as medicine, surely no one would actually want to end up being one of these bodies. Why would anyone volunteer to undergo an act reserved for the worst criminals? Jeremy Bentham, philosopher and father of Utilitarianism, took steps to change the stigma and fear that most people reserved for dissection when, in the spirit of usefulness, he became the first person to donate his body to science in 1832.
In 1824 Bentham’s friend and colleague Thomas Southwood Smith published “Use of the Dead to the Living.” The pamphlet detailed the difficulties medical schools faced in obtaining bodies for their students to dissect. Without a reliable supply of cadavers, Smith wrote, public hospitals would be made into “so many schools where the surgeon by practicing on the poor would learn to operate on the rich.” Bentham had already decided to donate his body, but Smith’s pamphlet also compelled him to draft legislation that inspired the Anatomy Act of 1832, which permitted anyone with legal custody of a dead body to donate it for dissection. Bentham turned his body over to the custody of Smith and instructed him, on Bentham’s death, to dissect it publicly, and then to preserve the head and stuff the body in as lifelike a manner as possible.
Smith failed in his mummification of Bentham’s head: the result was unrecognizable. He wrote, “I endeavored to preserve the head untouched, merely drawing away the fluids by placing it under an air pump over sulphuric acid. By this means the head was rendered as hard as the skulls of the New Zealanders; but all expression was of course gone.” To carry out Bentham’s wishes Smith asked anatomical modeler Jacques Talrich to create a lifelike wax model. The body was stuffed, the wax head attached, and Bentham’s own clothing and hat placed on the body. He now sits in a glass and mahogany case on public view at University College London, where the dissection took place. The mummified head is stored nearby in a climate-controlled room in the college’s archaeology division.
While many today donate their bodies to science out of a sense of utility or goodwill, just as Jeremy Bentham had hoped, not everyone thought so highly of the Anatomy Act. Drafted to encourage body donations, the bill became deeply feared and despised, gaining nicknames like the “Dead Body Bill” and the “Blood-Stained Anatomy Act.” The act forced people without the means for burials into a fate formerly reserved for murderers: the anatomy classroom. Originally a supporter of the bill, Smith had already begun to worry about shifting the burden of body donation from convicted criminals to the dying poor before the act had even passed. In 1829 he said the poor
suppose that they must still serve their masters even after death has set them free from toil, and that when the early dawn can no longer rouse them from the pallet of straw to work, they must be dragged from what should be their last bed, to show in common with the murderer, how the knife of the surgeon may best avoid the rich man’s artery, and least afflict the rich man’s nerve.
While no one today is scouting poorhouses looking for dissection targets, money is still a factor in most life decisions, including those surrounding death.
What Was Old Is New
Death in the Western world has changed. An array of posthumous options exists like never before, and the choices beyond coffins and urns are staggering in scope. You or your loved one could become a diamond, a painting, a stained-glass window, a vinyl record, an hourglass, an (artificial) coral reef, or a tattoo. And at least two companies will take your cremated remains and launch them into space; while expensive, this option is still $2,000 cheaper than the average cost of a funeral.
As the median funeral price in the United States has increased (it grew 36% between 2000 and 2012, going from $5,180 to $7,045), more Americans have opted for cremation (43% in 2012 compared with about 26% in 2000), which cost about $1,650 in 2012. The Cremation Association of North America lists cost as the number-one reason people opt for cremation over a typical burial. The organization also cites the range of creative possibilities cremation provides (such as turning ashes and bones into diamonds or tattoos), concerns over environmental impact, increasing geographic mobility (as people move around more, they are less inclined to visit a cemetery), and a decrease in religiousness as contributing factors.
Cost is just one reason; some people are also looking to make something of death, whether it’s becoming a unique piece of jewelry or contributing their bodies to science. Body donation has grown in popularity and gained an altruistic status that Bentham would have appreciated. A 2010 study conducted at Radboud University in Nijmegen, Netherlands, on why people donate their bodies to science concluded that the primary reason was a desire to be useful. Even during the most recent economic recession only 8% of participants were motivated by money. Still, the Pennsylvania Humanity Gifts Registry—the office that oversees all body donations to the state’s medical schools—reported that donations increased in 2009 (about 700 compared with 600 the year before) after the economy nosedived. Funerals cost money; body donation is essentially free.
Most body donors have no way of knowing how their bodies will be used. When Alan Billis signed up to donate his body to science just weeks after being diagnosed with terminal lung cancer in 2008, he was in the rare position of knowing exactly what would happen. The 61-year-old taxi driver from Torquay, England, was the only person to respond to an advertisement seeking a volunteer to be mummified using ancient Egyptian techniques. He became the star of the 2011 Channel 4 documentary Mummifying Alan: Egypt’s Last Secret.
The project was born after a team of British scientists realized that only by actually embalming a human body could they know if their theories about mummification techniques during Egypt’s Eighteenth Dynasty were correct. To prepare for the project archaeological chemist Stephen Buckley, mentioned earlier in the story, practiced on pigs. “I took it very seriously,” Buckley says. “If I was going to do this experiment on humans, I needed to be as sure as I could be what was going to happen.”
Billis died in January 2011; it took Buckley and his team until August of that year to complete the mummification process. Buckley thinks mummification came closest to perfection during the Eighteenth Dynasty, which lasted from about 1550 to 1292 BCE. This was the period when Tutankhamun briefly reigned. In fact, during the two years that Billis awaited his fate, he cheerfully took to calling himself “Tutan-Alan.” There was a range of embalming techniques and pricing scales available during the Eighteenth Dynasty, but Buckley’s team mummified Billis as the pharaohs of that time would have been embalmed. “The best of the best,” Buckley says.
Buckley and his team first removed Billis’s organs through the abdomen, leaving his heart and brain. The ancient Egyptians thought the heart was what provided intelligence—and was therefore necessary for the afterlife. As for the brain, research has shown that some of Egypt’s best-preserved mummies still held onto theirs, providing a good case to leave Billis’s in place. The team then sterilized the body cavity, substituting an alcohol mixture for palm wine, the original ingredient. Then they lined the cavity with linen bags filled with spices, myrrh, and sawdust and sewed the abdomen closed, sealing it with beeswax. The first four steps were commonly accepted as part of the embalming process, but the fifth step was controversial.
Buckley wanted to prove, however counterintuitive it might seem, that Eighteenth Dynasty embalmers soaked the body in a solution made up of natron, a naturally occurring mixture of soda ash and baking soda, and a lot of water. Natron was harvested from ancient Egypt’s dry saline lake beds and was used as both a personal and household cleaning agent. It also was employed as a meat preservative, which was essentially its role in mummification. For Billis, Buckley combined sodium carbonate, sodium bicarbonate, sodium chloride, and sodium sulfate. The natron solution stopped bacterial growth by raising the pH and inhibiting the enzymes bacteria need to function. The water helped retain the body’s form, preserving a lifelike appearance.
Before Buckley’s experiment the accepted theory was that mummification employed dry natron, resins, oils, and herbs. Remember, Egypt’s original mummies were preserved just by the desert’s dry, salty sands. If intentional mummification was desperately trying to keep moisture out, why would embalmers ever want to let water in? In the early 20th century Alfred Lucas, a forensic chemist and Egyptologist, surmised that mummies might have been soaked in a natron solution, but he lacked the technology to prove it. By the beginning of the 21st century Buckley had what he called “a lightbulb moment” when he examined X-rays of mummies from the Eighteenth Dynasty and discovered salt crystals in the soft tissue. He deduced that water must have helped the crystals permeate the body so deeply. Buckley also thinks the water was symbolic. “The natron solution was about being reborn,” he says. During the early Eighteenth Dynasty, Egypt itself was being reborn as the pharaohs attempted to reassert Egyptian identity after the expulsion of foreign rulers. “Egyptologists come in two schools,” says Buckley. “They either see [mummification] as practical only, or they study rituals through texts, and the two groups never get together. But Egyptian mummification was both practical and symbolic.” Buckley kept this in mind while mummifying Billis, allowing the body time to “lie in state” before beginning the mummification process, giving family time to grieve.
Mummifying Billis helped prove Buckley’s theory but also produced an unexpected result. Since the documentary aired, more than a dozen people have inquired about donating their bodies for further mummification study.
Buckley and his team were the first to carry out ancient Egyptian embalming techniques on a 21st-century human, but others have offered a contemporary version of mummification for years. Salt Lake City, Utah, is home to Summum, a religious nonprofit organization and licensed funeral home. Claude Nowell, who later would go by the name Corky Ra, founded the religion in 1975 after he claimed to be visited by what he described as advanced beings. The encounter helped form the tenets of Summum, which draws some inspiration from ancient Egyptian beliefs. Summum promotes mummification and offers its services to believers and nonbelievers for around $67,000.
The group has gotten its fair share of media attention over the years. They are, after all, the only modern mummification facility on the planet, according to Ron Temu, Summum’s mummification expert. The pyramid-shaped headquarters on the edge of Salt Lake City only adds to the intrigue. Summum’s biggest moment of unsought fame came shortly after the death of Michael Jackson in 2009, when a private unmarked helicopter landed near the group’s home. “We did not mummify Michael Jackson,” Temu says, in a tone that suggests he has uttered those words before. Perhaps several times.
Temu was a conventional funeral director for many years before he joined the group. The first human he mummified was his friend and colleague Corky Ra, who died in 2008. Like Jeremy Bentham in London, Corky Ra remains a presence at Summum in mummified form, “joining in” on Summum activities and standing by as Temu works on other people and animals. Corky Ra’s body now rests inside a golden mummiform decorated with a very realistic depiction of his face.
Summum’s process is loosely based on Egyptian techniques, though details remain secret. What Temu would reveal is that like the technique used by Buckley’s team, the cadaver is immersed in a solution. But Summum soaks the bodies for up to six months and uses an embalming solution it claims better preserves DNA. Temu says that where the Egyptians were dehydrating the body, Summum aims to seal moisture in. “We want to keep the body as natural as possible, do as little damage as possible,” he says, “so people’s souls can go to the next world.” One of Temu’s proudest achievements is the ability to perfectly preserve people’s eyeballs. “That’s a big tell-tale in mortuary chemistry; after you die your eyes go soft very quickly because your eyeball is close to 100% water.” Summum’s eyeball technique is a closely kept secret.
Many inquire about mummification at Summum despite the steep price. Several have asked if Summum can help them preserve genetic material for future reanimation. But that, Temu says, is not what Summum is about. “Genetic material is preserved, so you could clone the person, but it’s certainly not the goal.”
Many of those interested in Summum’s mummification process are not Summum followers, nor are they—like Alan Billis—donating themselves in a Benthamesque endeavor to be useful. Rather, they seem to find relief in knowing that they, family members, and pets (Summum mummifies many pets) will remain on Earth, in body if not in soul, for the foreseeable future. Are they striving for immortality? Or are they providing some comfort to the still living? Maybe deciding what happens to your body after death is a way of asserting one last bit of control and perhaps putting it to one final bit of use.
Mummies are a star attraction at many of the world’s great museums. Their temperature-controlled glass cabinets protect and preserve these bodies, which are thousands of years old. Locked within them is the history of how people lived along the Nile many millennia ago. Modern scholars treat them with reverence and great care, but that was not always the case.
Until very recently, Egyptian mummies were used by Europeans for practical rather than academic purposes. Their bodies were treated as a commodity because of the medical, supernatural, and physical characteristics they were believed to possess. Starting in the 15th century, merchants sought to profit from trafficking mummies out of Egypt and into Europe, and a robust “mummy trade” grew around them.
Mummification was a complex, lengthy process that helped preserve the body for its journey in the afterlife. Although the process changed over time, many of its core practices remained the same. After removing the body’s internal organs, priests would use natron, a naturally occurring salt, to dry it out. Sometimes fragrant substances, like myrrh, were used to anoint the body. Oils and resins would be applied to the body, which would then be stuffed with linen rags or sawdust before being sealed and wrapped in bandages.
Scholars have had difficulty pinning down exactly how mummies came to be used for medicine. There is evidence that Europeans believed that embalmed bodies contained otherworldly healing powers. Other scholars trace the relationship’s origin to the misconception that mummies contained bitumen, a substance long associated with healing in the ancient world.
Black, sticky, and viscous, bitumen is a form of petroleum found in areas around the Dead Sea. First-century A.D. writers Pliny the Elder and Dioscorides, as well as the second-century A.D. Galen, wrote about its healing properties. Dioscorides described one form as a liquid from Apollonia (modern Albania) known, in Persian, as mumiya. According to Pliny, it could heal wounds and a range of maladies.
European scholars in the Middle Ages associated bitumen with a blackish substance found in the tombs of Egypt. An 11th-century physician, Constantinus Africanus, wrote that mumiya “is a spice found in the sepulchers of the dead . . . That is best which is black, ill-smelling, shiny, and massive.”
Europe began to link mummies with medicine in the 15th century, in response to a robust demand for medical mumiya. Naturally occurring bitumen was rare, so enterprising merchants went hunting in Egyptian tombs for alternative supplies. When ground to a powder, those preserved bodies and their resins, oils, and aromatic substances not only had the same consistency and color as original Persian mumiya but also smelled better.
It was not always easy to acquire a mummy, so less scrupulous Eastern merchants decided to make their own. Apothecaries noticed a difference. As Guy de La Fontaine complained in 1564, after his journey to Alexandria to acquire the drug, the problem was that in many instances the mummies were modern corpses treated to resemble ancient mummies. A distinction was then drawn between primary or true mumiya and secondary or false mumiya.
The process of turning a recently deceased human being into a persuasive facsimile of an ancient Egyptian mummy was an unpleasant one. Luis de Urreta, a Spanish monk in the Dominican Order, gives a detailed account of the murderous and grim method used in his 1610 work Historia de los reynos de la Etiopía (History of the Kingdoms of Ethiopia). The procedure consisted of repeatedly starving a captive and giving him special “medications” before cutting off his head as he slept. The body was then drained of blood, filled with spices, wrapped in hay, and buried for 15 days. After exhumation, it dried in the sun for 24 hours. By the end of this gruesome process, the flesh had darkened and transformed. The monk described it as being not only cleaner and finer than that of ancient mummies but also more effective.
The 17th-century naturalist Benoît de Maillet gave an extremely accurate description of the geography and wildlife of Egypt in his Description de l’Égypte. Portrait of Maillet by Étienne Jeaurat, Versailles
Benoît de Maillet, the French consul in Egypt between 1692 and 1708, was the first European to stage a mummy unwrapping in front of an audience. The event took place in Cairo in September 1698. Maillet did not take notes of his process or methods, but he did detail some of the amulets and objects found among the wrappings.
Not everyone sang the praises of mumiya as a drug, regardless of whether it was “true” or “false.” As early as 1582, the Frenchman Ambroise Paré wrote in his Discours de la mumie, “the effect of this malevolent drug is such that not only does it do nothing whatsoever to improve patients, as I have seen for myself on numerous occasions among those forced to take it, but it also causes them terrible stomach pains, a foul smell in the mouth, and great vomiting, which are the origin of disorders in the blood and even make it flow from the vessels that contain it.”
Europeans used ground-up mummies as medicine, but they also used them in art. From at least the 16th century, a pigment called “mummy brown” was made from mummified human remains and appeared on the palettes of European artists. To make the pigment, ground-up ancient bodies were mixed with pitch and myrrh. At the time, apothecaries, who were responsible for producing medicines made from mummies, often doubled as mixers of the pigment, making the leap from medicine cabinet to artist’s palette an easy one.
Historical records date mummy brown’s early use to the Renaissance. Painters were said to prize mummy brown for its richness and versatility; they often used it for shading, chiaroscuro, and, appropriately, flesh tones.
The Lost Pharaohs
When discovered in 1817, the tomb of Seti I contained no royal mummy. His body was later found among the Royal Cache at Deir el Bahri. French Egyptologist Gaston Maspero unwrapped Seti I’s mummy in June 1886 and found the body in remarkable condition.
In 1881 a hidden tomb of royal mummies was discovered at Deir el Bahri in the Theban Necropolis. The site known as the Royal Cache (or tomb DB320) was found to contain the remains of many powerful 18th- and 19th-dynasty pharaohs, including Thutmose III, Seti I, and Ramses II. Egyptologists believed they were transferred to this cache for safekeeping sometime during the 21st dynasty (11th and 10th centuries B.C.) in a successful attempt to elude looters.
How often it was used and in what specific paintings has been difficult for art historians to ascertain, but the color remained in use until the Romantic painters of the late 19th century. Many artists did keep the color in stock, such as Pre-Raphaelite painters Edward Burne-Jones and Lawrence Alma-Tadema. Eugène Delacroix, one of the greatest painters of France’s 19th-century Romantic school, is known for the large areas of shadow and gloom on his canvases, which have struck scholars as likely candidates for the use of mummy brown.
The economic demand for mumiya worked in parallel with the equally powerful forces of fear and superstition. Even though, from the classical period onward, the occasional Greek or Roman traveler returned home from a trip to Egypt with a mummified animal, it seems that prior to the 15th century, there was little interest in transporting mummies to Europe as mementos or collector’s objects.
Mummies were perceived as powerful spiritual objects. This lingering superstition survived well into the 20th century. Howard Carter’s 1922 discovery of the tomb of Tutankhamun inspired tales of the “mummy’s curse” that protected the pharaoh’s tomb and killed several members of Carter’s team.
This fear has deep roots in the European imagination, and it preyed on the guilty minds of tomb robbers. Renaissance-era chronicles tell of Octavius Fagnola, a 16th-century Christian convert to Islam who had worked as a tomb robber in Egypt. While at work among the graves of Giza, he came across a corpse, its internal organs removed, wrapped in an ox skin and containing a scarab, a kind of amulet thought to protect the heart.
Dodging the customs men and loading the mummy onto a ship bound for Italy proved to be the easy part. Halfway through the voyage, a violent storm rose up; it seemed the ship would be lost. “The corpses of Egyptians always stir up storms,” reflected Fagnola, and he consigned his mummy to the waters that night.
Such stories were commonplace in 16th-century Europe, when the Christian world and the Ottoman Empire were vying for control of the Mediterranean. At the Battle of Lepanto in 1571, the Holy League defeated the Turkish fleet. Following this decisive victory, the news raced around the Mediterranean’s bustling ports, which were fertile ground for gossip. A rumor circulated that the Turks were doomed by having a mummy aboard one of their ships. The defeat that followed only served to reinforce the idea that mummies exercised a power to inflict maritime disaster on the unwary.
Fear of such objects was not enough to cause a drop in demand in Europe for medicine derived from mummies. The 16th-century Ottoman authorities who ruled Egypt enacted laws to control the trade in mummies. This measure backfired, creating a lucrative black market.
Parties and performance
By the 18th century, using mummies as medicine had fallen from favor. European attitudes toward mummies were shifting, and scholars began to be more interested in what lay under the winding sheets of a mummy’s wrappings. Unwrapping a mummy would become an event, one that could be hosted in a private home or, later, in a public theater. The first recorded account of a mummy unwrapping occurred in 1698. Benoît de Maillet, the French consul in Cairo, was the first European to delve beneath the bindings and take extensive notes. In the early 1700s, Christian Hertzog, apothecary to the Duke of Saxe-Coburg, unwrapped a mummy in front of an audience. He published his findings in the book Mumiographia, a detailed account of the artifacts found inside.
The public study of mummies continued and reached a new peak in the early 19th century, as the Napoleonic Wars and English colonialism stirred up new interest in ancient Egypt. Throughout the 19th century, public mummy unwrappings were highly popular events in England. The man who pioneered them was Thomas Pettigrew, a 19th-century English surgeon, who became known later in life as “Mummy Pettigrew.” He began his Egyptology career as assistant to Giovanni Battista Belzoni, the Italian explorer who discovered the tomb of Seti I in 1817. An astonishing find, the tomb was nevertheless missing its mummy.
As part of an exhibition of reliefs from Seti’s tomb, Belzoni, aided by Pettigrew, unwrapped a mummy before a group of physicians in 1821. Pettigrew became fascinated himself and began a lifelong career in the study of Egypt. In 1834 he published a treatise on mummies that included descriptions of the objects found inside.
Pettigrew’s public dissections of mummies were wildly popular in the 1830s. Spectators were left spellbound or nauseated as the face—gaunt and desiccated but nevertheless that of a recognizable human being, dead for many thousands of years—was gradually revealed from beneath its protective garments.
After noting that one individual had a large bone tumor, Pettigrew began to see how a mummy was a record of a real person. He understood that his investigations could reconstruct the details of an individual life. Pettigrew’s insight moved the study of mummies away from pure public spectacle (though it partly remained that) and into the realm of scientific analysis. His A History of Egyptian Mummies is considered one of the founding texts of Egyptology.
In the late 19th and early 20th centuries, a series of important archaeological discoveries provided new insights as Egyptology was developing into a more formal discipline. In 1881 a huge cache of royal mummies from the New Kingdom—including Seti I’s missing body—was discovered in the Theban Necropolis, followed in 1898 by the tomb of Amenhotep II in the Valley of the Kings. Many of these mummies were unwrapped, but their physical appearances and any artifacts were carefully documented according to the academic practices of the time.
In the early 1900s new methods for studying mummies came into practice. Grafton Elliot Smith, an anatomist at the Cairo School of Medicine, photographed the royal mummies. His 1912 book, Catalogue of the Royal Mummies in the Museum of Cairo, is still used as a reference. Smith was the first to use x-rays on mummies.
Mummies were beginning to be seen as precious repositories of knowledge in addition to being human remains that demanded respect. Some old habits died hard, however. As late as 1900, a tomb believed to hold Pharaoh Djer, who died circa 3055 B.C., was excavated. Djer is thought to be the third king from the 1st dynasty, one of the first rulers to preside over a unified Egypt. Yet when a mummified arm, complete with bracelets, was found, the jewelry was carefully removed and preserved. As for the arm, it was noted, photographed, and thrown in the trash—an act that would fill modern scholars with horror and outrage.
Author of several books on pyramids and daily life in ancient Egypt, José Miguel Parra has participated in recent excavations at Luxor.
Medical cannibalism is the consumption or use of the human body, dead or alive, to treat diseases. The medical trade and pharmacological use of human body parts and fluids arose from the belief that because the human body is able to heal itself, it can also help heal another human body. This belief was shared among different groups, including ancient Mesopotamian, Egyptian, Greek, Chinese, and Judaic cultures. Many medical-cannibalism recipes applied the principle of sympathetic magic to disease, the belief that like affects like: powdered blood helps bleeding, human fat helps bruising, and powdered skull helps with migraines or dizziness.
Medical cannibalism may have begun in ancient Egypt with the exploitation of mummies. There are no primary sources for the practice of medical cannibalism in ancient Egypt; what we now know about it comes mostly through secondary sources that rely on accounts written sometimes thousands of years after the fact. Worse still, many pieces of so-called information are taken from 21st-century sensationalist accounts. One such example is the claim that Egyptian rulers cured their parasitic infections by bathing in human blood.
Medical cannibalism in Europe can be traced back to the Roman Empire in the second century A.D. According to the fifteenth-century philosopher Marsilio Ficino, Romans drank the blood of slain gladiators to absorb the vitality of strong young men, and Ficino suggested reviving the practice by drinking blood from the arm of a young person. Medical cannibalism in Europe reached its peak in the sixteenth century, with the practice becoming widespread in Germany, France, Italy, and England.
Most "raw materials" for the practice came from mummies stolen from Egyptian tombs, skulls taken from Irish burial sites, and body parts robbed and sold by gravediggers. Medicines were created from human bones, blood, and fat and were believed to treat many types of illnesses. Tinctures to treat internal bleeding were made by soaking mummified bodies in alcohol or vinegar. Powdered skull was used to treat ailments of the head and was even sometimes mixed with chocolate to treat apoplexy. In the 1800s, Englishmen treated epilepsy by mixing skull with molasses. In addition, human fat was used to treat problems of the outer body, either by rubbing it directly on the skin or by soaking a bandage in fat before applying it to the wound.
As the practice became more and more common, the adoption of "like cures like" guided the choice of treatment for various ailments. For example, parts of the head were used to treat issues relating to the head, and the eyes of dead people were collected and used to treat ophthalmological issues.
Blood, specifically, soon came to be seen as a potent elixir, especially fresh, warm human blood, because it was believed to still possess the soul of the deceased. It was believed, for example, that drinking the blood of a strong or a wise person would increase one's strength or wisdom, respectively, because once ingested, the spirit of the deceased connects with that of the consumer and lends it its power. This belief was especially common in Germany, as well as in ancient Rome, where gladiators' blood was drunk to gain their strength. Even the poor, who could not afford other remedies, took part in the practice by bringing their own cups to executions, paying the executioner a small fee, and filling their cups with the fresh blood. The execution of criminals was seen as killing two birds with one stone: it reduced the criminal burden and served the public good.
Europeans also adopted what they thought was the ancient Egyptian belief that the most valuable corpses were fresh, young bodies, especially those that had died a brutal, sudden death, for it was believed that the spirit would remain trapped in such a body for a longer period of time and would thus have greater healing powers.
Although blood was normally drunk warm and fresh for increased effectiveness, some people preferred to have it cooked, so a recipe for turning blood into marmalade was invented. In 1679, a Franciscan apothecary suggested letting the blood partially dry, chopping it into small pieces to allow the remaining water to seep out, then cooking it into a batter before sifting it into a jar.
In the seventeenth and eighteenth centuries, "man’s grease" was in high demand. Executioners would sell the fat of the people they executed, which would then be melted and poured into vessels. Apothecaries sold it as a remedy for pain, inflammation, rabies, joint problems, and scars. The skin of the executed was also used for medical purposes: pregnant women wrapped it around their bellies during childbirth because it was thought to reduce birth pains, while others placed it around their necks to prevent thyroid problems.
Mummia, a medicine that started out in Egypt and came into high demand throughout Europe in the sixteenth century, was thought to cure any ailment there was. The black remnants in the skull and abdominal cavities were scraped out of mummies and placed in a large vase. Apothecaries mixed this mummia with herbs and wine, then prescribed it as medicine for their patients.
In Germany, around the early 1600s, a recipe for wine made from flesh was invented. It called for the body of a young, flawless redhead, whose flesh was chopped up, mixed with aloe and myrrh, then mashed and cured into a "wine."
Modern medical cannibalism
Medical cannibalism is not simply the eating, drinking, and use of human body parts and fluids but also the exploitation of the human body for medical purposes, especially without the prior consent of the person. This practice persisted into the early twentieth century, when mummies were still being sold in German medical catalogs.
Although most people today do not drink blood or use oils derived from mummies, one way that medical cannibalism continues is in the form of organ transplants. Because of the high demand for organs and the shortage in supply, the practice of organ trafficking has become widespread. Organ trafficking involves the purchase and sale of transplant organs from live donors. Although the practice is prohibited in most countries, it still occurs frequently; it is estimated that approximately 10,000 transplanted organs are obtained through some form of organ trafficking annually.
One extreme case occurred in 2001 and 2002, when more than 100 kidney transplants were performed illegally at St. Augustine Hospital in South Africa. Another occurred in 2005, when kidneys of Chinese prisoners were sold to British patients. Yet another example is the large-scale organ-trafficking system uncovered in India in 2002, valued at about $32.4 million. It involved about 1,972 cases of illegal organ transplants, mostly from poor and migrant workers who did not receive proper care and were threatened with imprisonment if they did not comply.
The Gruesome History of Eating Corpses as Medicine
The last line of a 17th-century poem by John Donne prompted Louise Noble’s quest. “Women,” the line read, are not only “Sweetness and wit,” but “mummy, possessed.”
Sweetness and wit, sure. But mummy? In her search for an explanation, Noble, a lecturer of English at the University of New England in Australia, made a surprising discovery: That word recurs throughout the literature of early modern Europe, from Donne’s “Love’s Alchemy” to Shakespeare’s “Othello” and Edmund Spenser’s “The Faerie Queene,” because mummies and other preserved and fresh human remains were a common ingredient in the medicine of that time. In short: Not long ago, Europeans were cannibals.
Noble’s new book, Medicinal Cannibalism in Early Modern English Literature and Culture, and another by Richard Sugg of England’s University of Durham, Mummies, Cannibals and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians, reveal that for several hundred years, peaking in the 16th and 17th centuries, many Europeans, including royalty, priests and scientists, routinely ingested remedies containing human bones, blood and fat as medicine for everything from headaches to epilepsy. There were few vocal opponents of the practice, even though cannibalism in the newly explored Americas was reviled as a mark of savagery. Mummies were stolen from Egyptian tombs, and skulls were taken from Irish burial sites. Gravediggers robbed and sold body parts.
“The question was not, ‘Should you eat human flesh?’ but, ‘What sort of flesh should you eat?’ ” says Sugg. The answer, at first, was Egyptian mummy, which was crumbled into tinctures to stanch internal bleeding. But other parts of the body soon followed. Skull was one common ingredient, taken in powdered form to cure head ailments. Thomas Willis, a 17th-century pioneer of brain science, brewed a drink for apoplexy, or bleeding, that mingled powdered human skull and chocolate. And King Charles II of England sipped “The King’s Drops,” his personal tincture, containing human skull in alcohol. Even the toupee of moss that grew over a buried skull, called Usnea, became a prized additive, its powder believed to cure nosebleeds and possibly epilepsy. Human fat was used to treat the outside of the body. German doctors, for instance, prescribed bandages soaked in it for wounds, and rubbing fat into the skin was considered a remedy for gout.
Blood was procured as fresh as possible, while it was still thought to contain the vitality of the body. This requirement made it challenging to acquire. The 16th-century German-Swiss physician Paracelsus believed blood was good for drinking, and one of his followers even suggested taking blood from a living body. While that doesn’t seem to have been common practice, the poor, who couldn’t always afford the processed compounds sold in apothecaries, could gain the benefits of cannibal medicine by standing by at executions, paying a small amount for a cup of the still-warm blood of the condemned. “The executioner was considered a big healer in Germanic countries,” says Sugg. “He was a social leper with almost magical powers.” For those who preferred their blood cooked, a 1679 recipe from a Franciscan apothecary describes how to make it into marmalade.
Rub fat on an ache, and it might ease your pain. Push powdered moss up your nose, and your nosebleed will stop. If you can afford the King’s Drops, the float of alcohol probably helps you forget you’re depressed—at least temporarily. In other words, these medicines may have been incidentally helpful—even though they worked by magical thinking, one more clumsy search for answers to the question of how to treat ailments at a time when even the circulation of blood was not yet understood.
However, consuming human remains fit with the leading medical theories of the day. “It emerged from homeopathic ideas,” says Noble. “It’s 'like cures like.' So you eat ground-up skull for pains in the head.” Or drink blood for diseases of the blood.
Another reason human remains were considered potent was that they were thought to contain the spirit of the body from which they were taken. “Spirit” was considered a very real part of physiology, linking the body and the soul. In this context, blood was especially powerful. “They thought the blood carried the soul, and did so in the form of vaporous spirits,” says Sugg. The freshest blood was considered the most robust. Sometimes the blood of young men was preferred, sometimes that of virginal young women. By ingesting corpse materials, one was thought to gain the strength of the person consumed. Noble quotes Leonardo da Vinci on the matter: “We preserve our life with the death of others. In a dead thing insensate life remains which, when it is reunited with the stomachs of the living, regains sensitive and intellectual life.”
The idea also wasn’t new to the Renaissance, just newly popular. Romans drank the blood of slain gladiators to absorb the vitality of strong young men. Fifteenth-century philosopher Marsilio Ficino suggested drinking blood from the arm of a young person for similar reasons. Many healers in other cultures, including in ancient Mesopotamia and India, believed in the usefulness of human body parts, Noble writes.
Even at corpse medicine’s peak, two groups were demonized for related behaviors that were considered savage and cannibalistic. One was Catholics, whom Protestants condemned for their belief in transubstantiation, that is, that the bread and wine taken during Holy Communion were, through God’s power, changed into the body and blood of Christ. The other group was Native Americans; negative stereotypes about them were justified by the suggestion that these groups practiced cannibalism. “It looks like sheer hypocrisy,” says Beth A. Conklin, a cultural and medical anthropologist at Vanderbilt University who has studied and written about cannibalism in the Americas. People of the time knew that corpse medicine was made from human remains, but through some mental transubstantiation of their own, those consumers refused to see the cannibalistic implications of their own practices.
Conklin finds a distinct difference between European corpse medicine and the New World cannibalism she has studied. “The one thing that we know is that almost all non-Western cannibal practice is deeply social in the sense that the relationship between the eater and the one who is eaten matters,” says Conklin. “In the European process, this was largely erased and made irrelevant. Human beings were reduced to simple biological matter equivalent to any other kind of commodity medicine.”
The hypocrisy was not entirely missed. In Michel de Montaigne’s 16th-century essay “On the Cannibals,” for instance, he writes of cannibalism in Brazil as no worse than Europe’s medicinal version and compares both favorably to the savage massacres of religious wars.
As science strode forward, however, cannibal remedies died out. The practice dwindled in the 18th century, around the time Europeans began regularly using forks for eating and soap for bathing. But Sugg found some late examples of corpse medicine: In 1847, an Englishman was advised to mix the skull of a young woman with treacle (molasses) and feed it to his daughter to cure her epilepsy. (He obtained the compound and administered it, as Sugg writes, but “allegedly without effect.”) A belief that a magical candle made from human fat, called a “thieves candle,” could stupefy and paralyze a person lasted into the 1880s. Mummy was sold as medicine in a German medical catalog at the beginning of the 20th century. And in 1908, a last known attempt was made in Germany to swallow blood at the scaffold.
This is not to say that we have moved on from using one human body to heal another. Blood transfusions, organ transplants and skin grafts are all examples of a modern form of medicine from the body. At their best, these practices are just as rich in poetic possibility as the mummies found in Donne and Shakespeare, as blood and body parts are given freely from one human to another. But Noble points to their darker incarnation, the global black market trade in body parts for transplants. Her book cites news reports on the theft of organs of prisoners executed in China, and, closer to home, of a body-snatching ring in New York City that stole and sold body parts from the dead to medical companies. It’s a disturbing echo of the past. Says Noble, “It’s that idea that once a body is dead you can do what you want with it.”
Maria Dolan is a writer based in Seattle. Her story about Vaux's swifts and their disappearing chimney habitat appeared on SmithsonianMag.com in November 2011.