Thinking about death

When was the last time you thought about your life ending? Did the thought cause distress, or did it cheer you up? Either way, you were thinking about your own mortality, and that can change the way you think and act on other issues. More on this later.

            When I was a teenager, I thought a fair bit about the end of my consciousness, and started reading about death. I don’t remember the names of any of the authors except for the Spanish writer Miguel de Unamuno, and don’t remember what any of them said!

            At Rice University in 1968, I had the opportunity to take a class titled “The meaning of death in Western culture.” I remember writing my final essay arguing that religious arguments for immortality were inadequate, that the best scientific evidence for life after death came from psychic phenomena, and that this evidence was inconclusive. I was majoring in physics, and I guess my scientific mindset was on display.

            In the following years, the issue of my own mortality became less salient. I still read and thought about death-related issues, for example through my studies of nuclear extinction, the euthanasia debate and Death imagined as a powerful perpetrator. Even so, I became far more accepting of the end of consciousness.

            What would it be like to go to sleep and not wake up? The best answer I discovered was in the book The Mind Club by Daniel M. Wegner and Kurt Gray. The authors say it’s intrinsically impossible to understand non-existence because there would be no “I” to think about it. As they put it, “Trying to perceive your dead mind is paradoxical, because you have to perceive a state that is incapable of perception — which is impossible while you are currently perceiving.”

Mortals

Recently, while in the Sydney bookshop Dymocks hunting for something to read, I saw Mortals: How the Fear of Death Shaped Human Society. Written by two psychologists based in Sydney, Rachel Menzies and Ross Menzies, it seemed a perfect opportunity to refresh and update my understanding of death issues.

            Menzies and Menzies begin by tackling a big issue: religion. They go through several of the world’s major religions, including Christianity, Islam, Hinduism and Buddhism — the four largest in terms of adherents — arguing that much of their appeal comes from their promise of immortality, in one form or another. And why should anyone seek immortality? The fear of death, of course. If major religions are successful in recruiting and retaining followers due to their role in reducing the fear of death, this is indeed a powerful influence on human society. In some religions, your immortality comes via your mind, but in Christianity your body is part of the package, which presumably is more appealing. Menzies and Menzies say less about a negative side of immortality: the possibility of everlasting damnation. Why would religion be attractive if it comes along with the risk of going to hell? Maybe this uses the fear of death in an even more potent combination: frighten people with visions of hell and then promise everlasting life in heaven if only they believe.

            But is fear of death the key driver of religious belief? Research shows that people who are religious are happier, on average, than those who are not. Religious belief plays a role in this, but so do the social relationships among believers, which are known to promote happiness. Some religions have rituals involving expressing gratitude, something that reliably improves happiness. So there might be more to the attractions of religion than just warding off the fear of death.

TMT

In the 1980s, three researchers — Jeff Greenberg, Sheldon Solomon and Tom Pyszczynski — developed what is called Terror Management Theory or TMT. Despite the name, this has nothing to do with terrorism. It is about people’s fear of death, a fear so great as to warrant the word “terror,” and posits that this terror, even when not recognised consciously, has major influences on thought and behaviour. When I first read about TMT, years ago, it sounded a bit crazy, but there’s lots of research showing the impact of being reminded about death.

            In a typical experiment, the participants — most commonly undergraduate psychology students — are brought into the lab and asked to undertake a task, like solving anagrams. The task is seldom the real purpose of the study but is designed to distract their attention, so they don’t realise what the experimenter is trying to find out. Along the way, some participants are exposed to an article or video with images of death, whereas others, the controls, are exposed to a “neutral” equivalent like a cat video. Then there is a further task, or something happens, and the participants are watched to find out what they do. In one study, they needed to wash their hands, and the experimenters cleverly weighed soap dispensers and counted paper hand towels before and after.

            With ingenious experiments, researchers have discovered all sorts of fascinating things about how people react to being reminded of death. One finding is that some people become more willing to punish those not in their own group, such as foreigners. But only some people are affected this way, mainly those with certain personality traits or political orientations. Still, the overall picture is worrying. According to Menzies and Menzies, “Hundreds of studies show that nearly any reminder of death makes people more aggressive, more racist and more willing to inflict harsh punishments.” (pp. 82-83)

Living forever in the flesh

What are your prospects for immortality in your own body, here on Earth? Back in the 1970s, one of my colleagues, Tom, planned to have his body frozen when he died so that, with future recovery techniques, he could later be restored to life. This process is called cryonics. Tom wasn’t alone. Thousands of people have signed up to have their bodies, or just their heads, frozen at ultra-low temperatures in the hope of being brought back to life when the technology is available.

            Tom was a peculiar guy, lacking typical social skills. This was not uncommon for pure mathematicians, but Tom was at an extreme end. I used to imagine some future group of scientists restoring frozen people from a previous century and saying, “This is amazing. Everyone back in the 1970s was a lunatic!” If Tom was an emissary from our time to the future, he was hardly typical. But at least he was a gentle, introspective soul, not a megalomaniac.

            Menzies and Menzies use cryonics as one of many examples of the human quest for immortality. Incidentally, they give many reasons why the prospects for resuscitating a frozen brain are minimal: those relying on cryonics to have their minds restored have let hope triumph over the evidence.

Living forever symbolically

Menzies and Menzies offer a new perspective on Michelangelo’s painting of the Sistine Chapel ceiling. Michelangelo insisted on making this artwork a fresco, so the pigment is bound into the plaster itself, making it far more lasting than paint applied to a dry wall. In their telling, Michelangelo was willing to spend years of agonising effort so his art would be long remembered. In this, he was successful. However, most artists are not. Before long, they are forgotten.

            Menzies and Menzies argue that striving for symbolic immortality is important in driving cultural production. I thought this could apply to me because I’ve written lots of articles and books. The quest for a type of immortality may play a role, but there are other factors. Artistic production is one way to enter into a state called flow in which one’s focus is entirely on what’s happening and the sense of self recedes from consciousness. The pioneer researcher on flow, Mihaly Csikszentmihalyi, found that this is a highly desirable mental state that can be entered through all sorts of means, typically exercising a skill at a level challenging enough to avoid boredom but not so challenging as to induce anxiety. Quite independently of the fear of death, entering flow can be a motivator for producing artistic works. On the other hand, is flow a way to avoid thinking about death?

            Menzies and Menzies discuss several other ways that people try to deal with their unconscious fear of death, such as taking vitamin supplements and exercising. In every case, there are other factors. For example, physical activity is the most reliable way for people to feel better physically and mentally, which surely is a worthy goal even for those unconcerned about dying.

            The authors make a grand claim: “We have shown that nationalistic fervour, aggression against outgroups, religious wars, popping vitamins, endless hours on treadmills, investing in cryonics and futile health interventions all arise from failing to accept one’s mortality.” (p. 181) I think they’re on solid ground with cryonics, but for the other topics more is involved, and the precise role of the fear of death remains to be determined.

            Mortals is filled with fascinating information from cultures around the world. How about this? In Alabama, you can have your ashes incorporated into a shotgun shell. In this way, you can protect your family after you’re gone! Well, it’s only a replica, but it’s a thoughtful gesture.

            Menzies and Menzies are psychologists and have treated many patients with mental problems. They argue that the fear of death is an underlying factor in many mental illnesses that seem to be about something else. An example is a spider phobia. A therapist might try to reassure a patient by saying, “Don’t worry, you’re not going to die just by looking at a spider.” The trouble is that the patient is going to die, eventually, of something. To say that a fear of death underlies many mental disorders might sound outlandish, but Menzies and Menzies cite some striking evidence in support, including that the level of people’s death fears correlates with mental health problems, medication use, hospitalisations and the recurrence of problems.


Rachel Menzies

Implications

If the fear of death has so many harmful consequences, what is to be done? The authors say, basically, accept that you will die and get on with life. They describe the Stoics, the ancient Greek philosophers who advised not to worry about things you can’t control. This is good advice generally, and it certainly applies to the fact that everyone dies.

            Menzies and Menzies also discuss funeral practices, noting that the practice of embalming — routine in the US — is environmentally damaging. They discuss the “death-positive movement” that promotes acceptance of death and has led to environmentally friendly options for burial.

            On a much bigger scale is human overpopulation, a factor in the environmental crisis. Here too the fear of death may play a role. Menzies and Menzies say having children is a way to help deal with the fear of death, because children carry on our genes and our culture. Also, in most societies, having children is looked on favourably and thus helps build self-esteem, a buffer against the fear of death. This is plausible, and there’s also research showing that when men are asked how many children they would like to have, they give a higher number after having been subliminally reminded of death.


Warding off a fear of death?

            The authors also argue that people’s belief that the human species is immune to disaster, including catastrophic global warming, derives from an inability to face death. You may not agree with all these assessments, but the stakes are potentially high. If you turn away from the evidence and arguments presented in Mortals, does that reflect an aversion to thinking about your own death?

            Reminders of death are all around us, in the news and entertainment, though this varies a lot depending on the culture. I started this post by mentioning death, and that should have influenced your thinking, at least in the short term. It’s definitely worth learning about how reminders of death affect us, so if you can stand an intense yet engaging tour of death-related topics, why not read Mortals?


Ross Menzies

Brian Martin
bmartin@uow.edu.au

Thanks to Chris Barker, Kelly Gates, Emily Herrington and Julia LeMonde for helpful comments.

Anonymous authorship

The problems with authors being anonymous may not be what you think.

My friend and collaborator, the late Steve Wright, worked to expose and challenge repression technology. For many years, he regularly visited “security fairs” where merchants tout wares for controlling populations, such as electroshock batons, guillotines, acoustic weapons and surveillance equipment. They sell technology for torture and social control to governments of all stripes, including known human rights violators.

            Steve would talk with merchants, collect sales brochures and covertly take photos. Back home in Britain, he passed information and photos to human rights groups such as Amnesty International. In addition to articles and reports using his own name, he sometimes used the pseudonym Robin Ballantyne. For Steve, a degree of anonymity was vital, especially when visiting security fairs in repressive countries such as Turkey and China.

            I thought of Steve’s experiences when, a couple of years ago, I read about the new Journal of Controversial Ideas that explicitly allows authors to use pseudonyms. This is to enable authors of contentious articles to avoid reprisals by colleagues and others. How sensible, I thought.

            Then I read comments hostile to the journal’s policy on anonymity. Helen Trinca, associate editor of The Australian and long-time editor of its higher education supplement, penned an article titled “As ideas go, hiding behind an alias is as false as they come.” She lauded Peter Singer, co-editor of the new journal, for bravely proposing his own challenging ideas. She said, though, that he wouldn’t have had such an impact if he had used a pseudonym: “the likelihood that a fresh and different idea will actually spark a conversation is reduced when it’s put forward by someone who cannot be seen, who is not known, and who has no profile to Google or CV to check.”

            Philosopher Patrick Stokes, in an article in The Conversation, presented the pros and cons of anonymous authorship. In conclusion, he asked,

“Are you, in the end, making life better for other people, or worse? In light of that standard, a pseudonymous journal devoted entirely to ‘controversial’ ideas starts to look less like a way to protect researchers from cancel culture, and more like a safe-house for ideas that couldn’t withstand moral scrutiny the first time around.”

I’m not so sure about this.

Anonymous whistleblowing

Over the past several decades, I’ve spoken to hundreds of whistleblowers. They come from all walks of life, including the public service, private companies, schools, the police, the military and churches. They report a potential problem, usually to their superiors, and frequently end up suffering reprisals. In the worst cases, their careers are destroyed.

            What happens, time and again, is that managers and bosses don’t like the message and target the messenger. Therefore, for many years, I have recommended blowing the whistle anonymously whenever possible. The value of anonymity is that the focus stays more on the disclosure than on the person who made it. In the huge volume of commentary about whistleblowers like Chelsea Manning and Edward Snowden, there is often more attention to them as individuals than to what they spoke out about.

            The same considerations apply to scholars. They can be subject to adverse actions due to speaking out on sensitive issues. I’ve talked to several Australian academics who raised concerns about “soft marking,” in particular the lowering of standards when grading international students. This is a touchy topic because it smacks of racism and because it is threatening to universities’ income. I don’t know whether any of the claims about soft marking could be substantiated, but every one of these academics encountered problems in their careers as a result of raising concerns.

Pascal

In 1990 I began corresponding with Louis Pascal, a writer based in New York City. He had published a couple of articles in well-respected philosophy journals. He had come up with an idea: that AIDS may have entered humans via contaminated polio vaccines given in the late 1950s to hundreds of thousands of people in central Africa. This idea was highly threatening to the medical research mainstream. Who would want to acknowledge that a vaccination campaign might have inadvertently led to a new disease in humans costing tens of millions of lives? Pascal met great resistance in getting his papers about AIDS published. That is another story.

            The key point here is that “Louis Pascal” was, almost certainly, a pseudonym. I never met him nor spoke to him. He used a private address that may have been a mail drop. After a huge flurry of correspondence with me and others, by the mid 1990s he vanished, at least so far as his Pascal identity was concerned. Many have speculated that “Louis Pascal” was someone who, in public, went by a different name and wanted to keep his writings about population and AIDS separate from his public identity.

Nicolas Bourbaki

There can be other reasons for anonymity. Bourbaki is the collective pseudonym of a group of mathematicians, mostly French. By publishing under the group’s name rather than their own, they renounced individual acknowledgement for their contributions.


Bourbaki Congress of 1938

            This can be for an altruistic reason. Normally, researchers build their reputations and careers through being known, especially through publications. The mixing of two motivations — contributing to knowledge and advancing in a career — leads to a number of dysfunctions such as sloppy and premature publication. The members of Bourbaki, by remaining anonymous, more purely adhered to the scholarly ideal of seeking knowledge, without the contamination of career motives.

Toxic anonymity

There are much bigger problems with anonymous authorship than a few scholars writing articles under pseudonyms, and these problems deserve far greater attention.

            Many contributors to social media are anonymous. Many are polite and constructive, but quite a few are nasty and threatening. Individuals who are prominent or outspoken are vulnerable to abuse online, and women and minorities are prime targets. Researcher Emma Jane, at the University of NSW, has documented the horrific abuse to which women are subjected.

            Closer to the academic scene, reviewers of scholarly papers are commonly anonymous. The rationale is that reviewers, if they could be identified, might be less than candid. But there’s a negative consequence: some reviewers sabotage submissions by rivals or authors whose opinions they dislike. By remaining anonymous, they aren’t accountable. This is a longstanding problem that has received little attention. If it is important that authors take responsibility for their contributions, why should the authors of reviews of scholarly manuscripts not have to take responsibility for their reports?

            In many fields, especially scientific ones, supervisors and senior figures add their names to publications to which they made little or no intellectual contribution. PhD students, postdocs and junior scientists in large labs are especially vulnerable to this type of exploitation. It should be called plagiarism: credit is inappropriately claimed for the work of others. This practice of unwarranted authorship is widespread, yet it is often considered just the way things are done, and there has been remarkably little public concern expressed about it.

            This form of misrepresentation reaches greater heights in medical research. Pharmaceutical companies carry out research and write papers and then, to give the findings greater credibility, identify university professors who agree to be the nominal authors of the papers, even though they were not involved in the research, have no access to the primary data and did not write the papers to which they append their names. Meanwhile, the actual researchers may or may not be listed as co-authors. Some of them remain anonymous. Many papers produced in this fraudulent fashion are published in the most prestigious medical journals. The sponsoring companies then print thousands of copies and use the publication to tout their drugs.

            A ghostwriter, sometimes called a ghost, does some or all of the writing while someone else is listed as the author. Ghostwriting is common in autobiographies of prominent individuals such as politicians, sports stars and celebrities. Sometimes the ghost is listed as a co-author; other times the ghost remains entirely anonymous. Ghostwriting is also standard for the speeches and articles of politicians. Anonymous authors contributed to many famous speeches, for example President Dwight D. Eisenhower’s famous warning about the military-industrial complex.

Conclusion

It is reasonable to have concerns about authors being anonymous, but whether anonymity is beneficial or damaging depends quite a bit on the circumstances. I am sympathetic to the view that an author should reveal their identity when possible. However, the biggest abuses and misrepresentations associated with anonymity — social media harassment, exploitation of subordinates and ghostwriting — seem to receive the least attention.

Postscript

I submitted a paper to the Journal of Controversial Ideas. It received two rounds of rigorous refereeing before publication. I didn’t choose to be anonymous but, if my experience is typical, the journal seems far from being, in the words of Patrick Stokes, “a safe-house for ideas that couldn’t withstand moral scrutiny the first time around.”

Brian Martin
bmartin@uow.edu.au

Is age just a number?

Thinking positively about being old has surprisingly powerful effects.

In my years of teaching undergraduates, there were many instances in which students seemed clueless — and had poor memories. A student would come by my office asking how to get to their classroom. I’d say, “What’s the name of your subject?” “I can’t remember.” Then there were students about to hand in their assignments who couldn’t remember the name of their tutor.

            If these students had been 60 years old, we might have said they were having a “senior moment.” But they were 20. Were they having a “junior moment”?

            During a class, students would sometimes forget the names of their classmates — if they ever learned them — or get the day of the week wrong, among other simple mistakes.

            Then there was the challenge of finding their way around the building where I work, the notorious building 19. Many students needed directions. We used to say that once they could find their way around building 19, we’d give them a degree.

            The idea of a “senior moment” reflects a cultural assumption that older people’s memories fail. This same cultural expectation is apparent in all sorts of areas, from physical activity to job opportunities.

Breaking the age code

What are your age beliefs? Here’s a simple test. Imagine an old person and write down the first five words or phrases that come to mind, anything from “my grannie” to “absent-minded” or “helpful.” If you come up with words like “doddery” and think getting old means going downhill, losing your memory, becoming incapacitated and senile, then you have “negative age beliefs.” On the other hand, if you come up with words like “graceful” and think of old age as a time of wisdom, maturity and emotional stability, you have positive age beliefs. Does it matter what sort of beliefs you have? For the answer, get Becca Levy’s powerful new book Breaking the Age Code.

            Her answer is a resounding yes. Levy is a Yale University researcher who has been studying many aspects of age beliefs for decades. What she and co-authors discovered is striking: individuals with positive age beliefs do better in all sorts of ways. They do better both physically and mentally. Is this just a placebo effect? If so, it’s a powerful one that can improve your biomarkers and your performance on mental acuity tests.

“In study after study I conducted, I found that older people with more-positive perceptions of aging performed better physically and cognitively than those with more-negative perceptions; they were more likely to recover from severe disability, they remembered better, they walked faster, and they even lived longer. I was also able to show that many of the cognitive and physiological challenges we think of as linked to growing old — things like hearing loss and cardiovascular disease — are also the products of age beliefs absorbed from our social surroundings.” (p. 5)

            Levy had been doing research for years when one study suddenly made her a media star. She looked at the difference in life span between individuals with the most positive and the most negative age beliefs. “What I found was startling. Participants with the most-positive views of aging were living, on average, seven and a half years longer than those with the most-negative views.” (p. 93) That made people sit up and listen.

            However, changing your beliefs is not all that easy. If you imagine that you can say, “I’ll just start thinking positively about being older, and reap all those benefits,” think again. Individual beliefs can make a difference, but it’s hard to go against the surrounding culture. If nearly everyone around you has negative age beliefs, and speaks and acts accordingly, you’re almost bound to be influenced — negatively.

            When co-workers, faced with a challenging task, turn to younger colleagues and ignore you, you may feel unneeded, and furthermore you miss out on the intellectual and social stimulation that can help you maintain and develop your capacities. When doctors treat your ailments as “just getting old” and hence as less urgent than the same ailments in younger patients, you miss out on the help you need.

            Levy studied cultures where elders are respected. In such cultures, older people thrive. It’s as if they live up to expectations. Others’ beliefs affect what opportunities you have. If you’re continually challenged, mentally and physically, you are more likely to maintain your capacities.

Ageism

Watching academic appointments over decades, I’ve seen a preference for promise over performance: a younger applicant with “promise” is favoured over an older one with a solid record. Sometimes it seems to me that some of those on appointment committees don’t want to hire someone for a junior position who has achievements comparable to their own. This is just my impression but it accords with everything Levy says. She says ageism in employment, in the US anyway, is standard practice despite evidence that older workers can be creative, are more reliable, have fewer accidents and have more life wisdom. Discrimination in the workplace on the basis of gender or ethnicity is treated as a serious matter, even a legal matter, but there is no similar taboo against ageism.

            I talked with colleagues who, like me, are unpaid but still researching. Many of us have extensive experience and would be pleased to be more involved in giving guest lectures, assessing theses, mentoring and helping in other ways. But it seems no one in authority is interested. If you’re retired or otherwise unpaid, you’re just about invisible.

            Another arena where ageism has major impacts is health care. Levy says that negative age stereotypes inform western medicine, and also notes that there’s more money in medicating disease than in preventing it through exercise and other means.

            I’ve often read about the impending demographic crisis of ageing: as a country’s population gets older, there will be fewer people of working age to support the greater numbers of the elderly with their greater demands on health services. One of the aspects of this “crisis” is self-inflicted: the requirement or pressure for people to retire, and the difficulty older workers have in finding a new job. The so-called demographic crisis would not be a problem if older people had greater opportunities to continue working. There’s another aspect: Levy cites a study showing that countries with older populations do not have higher public health expenditures. This goes against the usual assumption, and undermines the rationale for government policies to boost the birth rate or encourage immigration of young people.

            Over the years, I’ve occasionally run into someone I hadn’t seen in many years who says, “You haven’t changed a bit.” This sort of comment annoyed me in some way I couldn’t articulate, because of course I look considerably older than ten or twenty years ago. Levy explains that telling someone they haven’t aged is intended as a compliment but implies that ageing should be denied or is bad. Perhaps I should respond, “Actually, I’d really like to look older and more distinguished!”

Making a difference

Even if you live in a society with negative age beliefs, you can resist the messages around you and help to change attitudes. Levy offers a variety of practical exercises to change negative age beliefs to positive ones, based on raising people’s awareness and understanding, and on confronting ageism. In one of the appendices, she provides information to challenge false age stereotypes. For example, you can counter the view that “Older workers aren’t effective in the workplace” by citing information that “Older workers take fewer days off for sickness, benefit from experience, have strong work ethics, and are often innovative.” (p. 212)

            In Australia, there is no mandatory retirement age, but the way pension systems are set up discourages working past the 60s, and added to this are strong pressures to retire and give opportunities to younger workers. Levy tells of Jonas, a paediatrician who retired from clinical practice while continuing to teach. Jonas had much to offer, and told Levy, “I realized at the very end of my clinical career that most people retire as soon as they get good at something.” (p. 68) To cover his accumulated knowledge and abilities, the university had to hire two younger doctors.

            It seems the economic system is set up to throw away vast amounts of accumulated wisdom, yet people don’t recognise what’s happening because of the prevalence of negative age beliefs. Read Breaking the Age Code and help bring about change.


Becca Levy

Brian Martin
bmartin@uow.edu.au