
In praise of scholarly values

Even for a critic of academia, scholarly values are worth defending.

In the half century of my academic career, I’ve repeatedly studied and exposed shortcomings in academic systems and behaviour. Problems include bias, misrepresentation, suppression of dissent, and unquestioning service to vested interests such as the military. This is not to mention bitter and destructive interpersonal and organisational politics. It is safe to say there are lots of negatives in academic life.

            In recent years, though, I’ve come to a greater appreciation of scholarly values. These values include respect for evidence and arguments, willingness to address the views of others, and the freedom to investigate and speak out about sensitive topics.

            Much of my research has been about public scientific controversies such as over nuclear power, pesticides and fluoridation. These provide a window into some of the most extreme behaviour by researchers, administrators and outside groups. However, it was only when I started studying the Australian vaccination debate that the importance of scholarly values really hit home. (This was years before Covid came on the scene.)

Ad hominem unlimited

Let’s start with respect for evidence and arguments. Scholars, ideally, engage with each other’s work by addressing, contesting and debating facts, methods, theories and perspectives. It is widely considered improper to openly criticise researchers as individuals. Behind the scenes, in private conversations, many researchers, including top ones, can be harshly critical of their opponents. Ian Mitroff in his classic book The Subjective Side of Science found that, in private, leading moon scientists would make derogatory comments about researchers with contrary views. However, personal attacks in the open literature are rare. Most scientific disputes are carried out in a seemingly respectful fashion.

            Outside scholarly forums, things can be much rougher. In the Australian vaccination debate, personal slurs are commonplace on blogs, Facebook pages and in some mass media outlets. Enough people in the debate have been sufficiently nasty to degrade the tone and deter others from participating.

            Although I engaged in the debate as a sociologist and defender of free speech, I became a regular target of ad hominem comments. Here’s a typical one: “I’d be embarrassed for a schoolkid that lazy or stupid. For a professional scholar, it’s gobsmacking. What a moron.”

            I get a laugh out of comments like these but abuse is not always funny. Many prominent women commentators regularly receive threats of rape or murder. This is remote from scholarly decorum.

Point-scoring

Many partisans in scientific controversies identify some mistake or shortcoming in the opponent’s case and seize on it, as if a single mistake or logical flaw invalidates the entire case. For example, pro-vaccination campaigners regularly refer to alleged fraud by British gastroenterologist Andrew Wakefield, implying that this discredits all criticisms of vaccination. For serious scholars, using this sort of point-scoring technique should be an embarrassment. It would be like discrediting a social theory because a leading theorist allegedly plagiarised.

Presenting contrary arguments

In philosophy, it is common to carefully present the arguments supporting a contrary view, critically examine them and, if possible, demolish them in terms of logic and evidence. This approach can be applied in examining controversial and provocative topics, as in Aaron James’ book Assholes.

            In many public controversies, this willingness to engage with the opponent’s arguments is sadly lacking. In the vaccination debate, each side presents evidence and arguments supporting its own position and attacking the opponent’s position. I have yet to discover a partisan on either side who presents the opponent’s argument in a fair fashion. The usual approach is to say, “Here are my strong points and there are your weak points,” with no acknowledgement of one’s own weak points or the opponent’s strong points.

            There is a reason for this. In polarised controversies, making any admission of weakness may be seized upon by opponents and used relentlessly. Many campaigners never admit a weakness or a source of bias, instead focusing exclusively on the weaknesses and biases of the opponent. In contrast, a good scholar should be willing to acknowledge weaknesses and to be open about possible sources of bias, in what is called reflexivity.

Academic freedom

Scholars like to imagine they can undertake investigations into controversial areas and be protected from adverse consequences. The reality is that few scholars ever tackle really sensitive topics, knowing it may be career suicide to challenge orthodoxy. Nevertheless, despite shortcomings in practice, freedom of inquiry remains a crucial academic value.

            Threats to academic freedom have come both from outside vested interests, such as big business, and from university administrators. With the advent of social media, it is now easier to express displeasure with researchers and their work, and easier to mount campaigns against academics whose views are unwelcome.

            After Hurricane Katrina in 2005, hurricane researcher Ivor van Heerden criticised the Army Corps of Engineers. He ended up losing his job at Louisiana State University.


Ivor van Heerden

            Canadian political scientist Tom Flanagan, identified with the conservative side of politics, was attacked online by the circulation of an extract from a talk he gave, surreptitiously recorded and misleadingly labelled, to discredit him personally. As recounted in his lucid book Persona Non Grata, the campaign had a damaging effect on his work, while his university did little to defend him.

            University administrations depend on their public reputations for recruiting high-quality staff and obtaining income via donations and student enrolments. As a result, they take a risk in standing up to campaigns against stigmatised scholars.

Scholarly values, another look

Having observed up close some online campaigns against dissident scholars, it seems to me that the rejection of scholarly values is less a betrayal than a disregard. In much political campaigning, as well as in public scientific controversies, many members of the public are more concerned with winning by discrediting opponents than with having a rational conversation about an issue of social importance. Scholarly values can be boiled down to encouraging engagement on the basis of respectful interactions that address the issues. This means avoiding, when possible, making abusive comments, manipulating evidence and arguments, or trying to silence opponents.

            Though the academic system has many shortcomings, I realise now that many of them are due to a failure to live up to the values of respectful engagement and freedom of expression that are widely given lip service. The degraded commentary in many online confrontations should serve as a reminder of the positive aspects of academic discourse.

Brian Martin
bmartin@uow.edu.au

Anonymous authorship

The problems with authors being anonymous may not be what you think.

My friend and collaborator, the late Steve Wright, worked to expose and challenge repression technology. For many years, he regularly visited “security fairs” where merchants tout wares for controlling populations: electroshock batons, guillotines, acoustic weapons and surveillance equipment. They sell technology for torture and social control to governments of all stripes, including known human rights violators.

            Steve would talk with merchants, collect sales brochures and covertly take photos. Back home in Britain, he passed information and photos to human rights groups such as Amnesty International. In addition to articles and reports using his own name, he sometimes used the pseudonym Robin Ballantyne. For Steve, a degree of anonymity was vital, especially when visiting security fairs in repressive countries such as Turkey and China.

            I thought of Steve’s experiences when, a couple of years ago, I read about the new Journal of Controversial Ideas that explicitly allows authors to use pseudonyms. This is to enable authors of contentious articles to avoid reprisals by colleagues and others. How sensible, I thought.

            Then I read comments hostile to the journal’s policy on anonymity. Helen Trinca, associate editor of The Australian and long-time editor of its higher education supplement, penned an article titled “As ideas go, hiding behind an alias is as false as they come.” She lauded Peter Singer, co-editor of the new journal, for bravely proposing his own challenging ideas. She said, though, that he wouldn’t have had such an impact if he had used a pseudonym: “the likelihood that a fresh and different idea will actually spark a conversation is reduced when it’s put forward by someone who cannot be seen, who is not known, and who has no profile to Google or CV to check.”

            Philosopher Patrick Stokes, in an article in The Conversation, presented the pros and cons of anonymous authorship. In conclusion, he asked,

“Are you, in the end, making life better for other people, or worse? In light of that standard, a pseudonymous journal devoted entirely to ‘controversial’ ideas starts to look less like a way to protect researchers from cancel culture, and more like a safe-house for ideas that couldn’t withstand moral scrutiny the first time around.”

I’m not so sure about this.

Anonymous whistleblowing

Over the past several decades, I’ve spoken to hundreds of whistleblowers. They come from all walks of life, including the public service, private companies, schools, the police, the military and churches. They report a potential problem, usually to their superiors, and frequently end up suffering reprisals. In the worst cases, their careers are destroyed.

            What happens, time and again, is that managers and bosses don’t like the message and target the messenger. Therefore, for many years, I have recommended blowing the whistle anonymously whenever possible. The value of anonymity is that the focus is more on the disclosure than on the person who made it. In the huge volume of commentary about whistleblowers like Chelsea Manning and Edward Snowden, there is often more attention to them as individuals than to what they spoke out about.

            The same considerations apply to scholars. They can be subject to adverse actions due to speaking out on sensitive issues. I’ve talked to several Australian academics who raised concerns about “soft marking,” in particular the lowering of standards when grading international students. This is a touchy topic because it smacks of racism and because it is threatening to universities’ income. I don’t know whether any of the claims about soft marking could be substantiated, but every one of these academics encountered problems in their careers as a result of raising concerns.

Pascal

In 1990 I began corresponding with Louis Pascal, a writer based in New York City. He had published a couple of articles in well-respected philosophy journals. He had come up with an idea: that AIDS may have entered humans via contaminated polio vaccines given in the late 1950s to hundreds of thousands of people in central Africa. This idea was highly threatening to the medical research mainstream. Who would want to acknowledge that a vaccination campaign might have inadvertently led to a new disease in humans costing tens of millions of lives? Pascal met great resistance in getting his papers about AIDS published. That is another story.

            The key point here is that “Louis Pascal” was, almost certainly, a pseudonym. I never met him nor spoke to him. He used a private address that may have been a mail drop. After a huge flurry of correspondence with me and others, by the mid-1990s he had vanished, at least so far as his Pascal identity was concerned. Many have speculated that the person behind “Louis Pascal” had a separate public identity and wanted to keep his writings about population and AIDS apart from it.

Nicolas Bourbaki

There can be other reasons for anonymity. Bourbaki is the name of a group of mathematicians. By using a pseudonym for the group, they renounced acknowledgement for their contributions.


Bourbaki Congress of 1938

            This can be for an altruistic reason. Normally, researchers build their reputations and careers through being known, especially through publications. The mixing of two motivations — contributing to knowledge and advancing in a career — leads to a number of dysfunctions such as sloppy and premature publication. The members of Bourbaki, by remaining anonymous, more purely adhered to the scholarly ideal of seeking knowledge, without the contamination of career motives.

Toxic anonymity

There are much bigger problems with anonymous authorship than a few scholars writing articles under pseudonyms, and these problems deserve far greater attention.

            Many contributors to social media are anonymous. Many are polite and constructive, but quite a few are nasty and threatening. Individuals who are prominent or outspoken are vulnerable to abuse online, and women and minorities are prime targets. Researcher Emma Jane, at the University of NSW, has documented the horrific abuse to which women are subjected.

            Closer to the academic scene, reviewers of scholarly papers are commonly anonymous. The rationale is that reviewers, if they could be identified, might be less than candid. But there’s a negative consequence: some reviewers sabotage submissions by rivals or authors whose opinions they dislike. By remaining anonymous, they aren’t accountable. This is a longstanding problem that has received little attention. If it is important that authors take responsibility for their contributions, why should the authors of reviews of scholarly manuscripts not have to take responsibility for their reports?

            In many fields, especially scientific ones, supervisors and senior figures add their names to publications to which they made little or no intellectual contribution. PhD students, postdocs and junior scientists in large labs are especially vulnerable to this type of exploitation. It should be called plagiarism: credit is inappropriately claimed for the work of others. This practice of unwarranted authorship is widespread, yet it is often considered just the way things are done, and there has been remarkably little public concern expressed about it.

            This form of misrepresentation reaches greater heights in medical research. Pharmaceutical companies carry out research and write papers and then, to give the findings greater credibility, identify university professors who agree to be the nominal authors of the papers, even though they were not involved in the research, have no access to the primary data and did not write the papers to which they append their names. Meanwhile, the actual researchers may or may not be listed as co-authors. Some of them remain anonymous. Many papers produced in this fraudulent fashion are published in the most prestigious medical journals. The sponsoring companies then print thousands of copies and use the publication to tout their drugs.

            A ghostwriter, sometimes called a ghost, does some or all of the writing while someone else is listed as the author. Ghostwriting is common in autobiographies of prominent individuals such as politicians, sports stars and celebrities. Sometimes the ghost is listed as a co-author; other times the ghost remains entirely anonymous. Ghostwriting is also standard for the speeches and articles of politicians. Anonymous authors contributed to many famous speeches, for example President Dwight D. Eisenhower’s famous warning about the military-industrial complex.

Conclusion

It is reasonable to have concerns about authors being anonymous, but whether anonymity is beneficial or damaging depends quite a bit on the circumstances. I am sympathetic to the view that an author should reveal their identity when possible. However, the biggest abuses and misrepresentations associated with anonymity — social media harassment, exploitation of subordinates and ghostwriting — seem to receive the least attention.

Postscript

I submitted a paper to the Journal of Controversial Ideas. It received two rounds of rigorous refereeing before publication. I didn’t choose to be anonymous but, if my experience is typical, the journal seems far from being, in the words of Patrick Stokes, “a safe-house for ideas that couldn’t withstand moral scrutiny the first time around.”

Brian Martin
bmartin@uow.edu.au

What is university research for?

Every year, Australian academics spend long hours preparing applications to the Australian Research Council, which awards grants to the most highly ranked projects. Each application is scrutinised by experts in the field and judged by panels of leading scholars. Before the awards are made, they have to be signed off by the Minister of Education, usually a routine bureaucratic step. However, for the ARC round for 2022 funding, the Minister rejected six projects selected by the ARC, causing howls of outrage from the university sector. The projects were selected on academic merit. The Minister was jeopardising the reputation of Australian scholarship by injecting a political assessment into the process.

(Incidentally, if the Minister is going to veto projects, why not do it at the beginning, based on titles and abstracts, thereby saving researchers the effort of preparing their applications?)

There have been ministerial vetoes in several rounds of ARC applications in recent decades, nearly all of them being projects in the humanities and social sciences. One interpretation is that the Minister is appealing to voters who think academics are self-interested and privileged.

The vetoes can also be seen as part of a wider process of channelling university research in the direction of the “national interest,” usually interpreted as serving commercial or government interests. For years, all ARC applications have had to include a justification for how the proposed project serves the national interest. Apparently the “national interest” means commercial interests: the government has been pushing for more commercially oriented research.

These pressures raise the question of the purpose of university research. Just because there are profits to be made does not necessarily make something worthwhile. The classic example is the tobacco industry, which sponsored lots of research, but only continued supporting researchers who gave results serving the industry. Association with the tobacco industry is now a source of stigma, but this was not always true.

Today, some of the most corrupt research practices thrive in biomedicine. The pharmaceutical industry carries out its own research and sponsors research by academics. Research favouring industry products is far more likely to be published. In some cases, academics are listed as authors of papers ghostwritten by industry scientists. Dissidents may be subject to discrimination and reprisals.

Quite a few scholars have pointed to the corruption of academic research by commercial interests. Findings about drugs and environmental impacts, among other topics, are skewed towards the interests of companies, harming the public interest. The classical ideal of independent, disinterested research was never achieved, but with commercial inroads into universities, the reality is further than ever from the ideal.

It is quite common for scientists’ research to be sponsored by a company or government with a vested interest in the outcome. For the scientists, this represents a conflict of interest and should make the results suspect. Greater commercialisation accentuates this problem, indeed makes it a goal. It is a perversion of the ideals of independence to encourage and reward sponsorship of research by vested interests.


Philip Mirowski writes about commercialisation of US scientific research

What’s off the agenda?

The emphasis on commercialisation leads to neglect of research that serves human needs but has little or no profit-making potential. There are numerous areas where research is vitally needed but where results are likely to be contrary to commercial or government interests.

Industrial democracy involves workers participating in decisions about how to carry out jobs and sometimes even what products to produce. Some managers encourage limited forms of worker participation, but deeper forms are usually discouraged because they cut into managerial prerogatives. This is despite research, going back many decades, suggesting that greater worker participation can improve productivity. Decades ago, several Australian social scientists, including Fred Emery and Trevor Williams, were leaders in research on industrial democracy, but this pioneering work has largely been forgotten.

Even ignoring the benefits of greater productivity, greater worker participation has been shown to improve the quality of working life, more than improvements in salaries and conditions. Research into industrial democracy is a social good, but don’t expect companies or governments to sponsor much of it.

One of the most exciting areas today is the production of goods and services through the cooperative efforts of unpaid contributors. The most well-known example is free software. The computer operating system Linux is superior to proprietary alternatives, and it was produced through non-commercial means. Open-source approaches are now found in many areas, including colas, drug development and solar technology. Combined with 3D printing, open-source software opens the possibility of a jump in productivity using an entirely different model: distributed production with free sharing of ideas.

In the face of such emerging initiatives, pushing universities in traditional commercial directions is retrograde.

Research into peace and human rights is vital for dealing with the problems of war, genocide, torture and exploitation. Governments spend an enormous amount of money funding militaries, including military research, collectively feeding the war machine and human rights abuses. Scientists continue research into weapons, and there is a long history of militaries drawing on university research. By comparison, research into nonviolent methods of struggle is extremely limited, despite pathbreaking findings that challenging repressive governments through nonviolent means is more likely to be effective than armed struggle.

A different agenda

Pushing university researchers to serve government and corporate interests accentuates a decades-long trend away from independent research into areas of human need. However, it should not be assumed that priorities for university research set by scholars are necessarily worthwhile.

It has long been the case that researchers seek money, preferably with no strings attached, to carry out their pet projects. They want support without accountability except to their scholarly peers. This can lead to research that serves the researchers, with publications, status, promotions and prestige, but has little wider benefit.

Within research fields, jargon and esoteric theory can proliferate, so that outsiders cannot easily understand studies, and topics can be pursued that have little potential social relevance. In some instances, so-called pure or blue-sky research turns out to have immense practical spin-offs. Are these the exceptions?

Nicholas Maxwell, a philosopher of science, argued that research agendas should be changed from a search for knowledge to a search for wisdom. Knowledge is not necessarily beneficial, such as knowledge about how to kill or exploit people. Maxwell’s “philosophy of wisdom” involves research to serve human needs, for example addressing issues of poverty, inequality and environmental destruction.

Following Maxwell’s analysis, the goal for university research should not be ivory-tower investigations, simply following the agendas of academics, but a greater orientation to pressing social issues. This means not separation from society, but wider community participation in setting research agendas. Corporations and governments should have a say, but so should farmers, builders, nurses, teachers, parents, people with disabilities and a host of others.

Rather than posing a dichotomy between ivory-tower research and research driven by government and commercial priorities, there is another option: research agendas shaped by input from members of the wider community – the ones who should be benefiting in the long run.

Brian Martin
bmartin@uow.edu.au

Thanks to Paula Arvela, Clark Chilson, Jungmin Choi, Caroline Colton and Olga Kuchinskaya for useful comments.

Be confident — but not too confident

Do you lack confidence? Are you afraid to set up a new business, embark on a new career, commit to a relationship or take up hang gliding?

Don’t worry too much about it. You might be making the right decisions. Being too confident can be worse than not being confident enough.

But how can you tell? Turn to Don A. Moore’s new book Perfectly Confident. It’s about making the best decisions.

            Moore says most popular treatments assume that more confidence is better. People just need to overcome their fears and jump in. This is true for some people and some decisions. But it can also be disastrous.

When you see a sports star making a seemingly brash prediction of winning, you might imagine that being really confident is necessary for success. After all, if you’re not confident, how can you do your best? Not so fast, says Moore. There’s actually little evidence that super-confidence improves performance. Those sports stars have worked hard and long, and may be making reasonable judgements about their chances of victory.

Overconfidence is potentially dangerous and can lead you to take unwarranted risks. If you’ve never tried base jumping, it’s better to be very cautious and prepare carefully before your first jump. A large proportion of new small businesses fail within their first few years. Perhaps their owners were overconfident.

            There is evidence that most people overestimate how good they are at things. In a classic survey, 93% of US drivers said they ranked in the top half. Most young people think they are more honest than average and better than average at relationships. The reason is that people think, “I’m honest most of the time, so I’m better than average” but don’t stop to think that most other people may think the same way. Moore says the way to fix your perception of superiority is to be more specific. For example, if being a good driver is specified as never having had an accident or a ticket, then fewer people will overestimate their abilities.

There’s another side to people’s thinking about their own capabilities. When it comes to an uncommon skill, like riding a unicycle or subtracting large numbers in your head, most people underestimate their abilities. You might think, “I wouldn’t last three seconds on a unicycle” and forget to think that most other people might have the same difficulty.


Could you unicycle across China?

            One of the methods Moore recommends is to think probabilistically. Consider all possible outcomes of your decision. Take the new business as an example. You might guess that there’s a 10% chance of making a lot of money, 40% of making a little, 30% of losing a little and 20% of losing a lot. Just writing down the possibilities can be sobering. Overconfident people never stop to think of failure and hence can make unwise decisions. Assigning probabilities also helps in overcoming the tendency to think in yes-or-no terms: success or failure.
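This outcome-listing exercise can be pushed one step further by attaching a payoff to each outcome and computing an expected value. Here is a minimal sketch in Python; the probabilities follow the example above, while the dollar figures are invented purely for illustration.

```python
# Expected value of the hypothetical new business. The probabilities
# (10/40/30/20%) come from the example above; the payoffs are made up.
outcomes = [
    ("make a lot of money", 0.10,  500_000),
    ("make a little",       0.40,   50_000),
    ("lose a little",       0.30,  -30_000),
    ("lose a lot",          0.20, -200_000),
]

# Sanity check: the listed outcomes must cover all possibilities.
assert abs(sum(p for _, p, _ in outcomes) - 1.0) < 1e-9

expected_value = sum(p * payoff for _, p, payoff in outcomes)
print(f"Expected value: ${expected_value:,.0f}")  # → Expected value: $21,000
```

Writing the outcomes down like this makes the 50% chance of losing money impossible to ignore, which is exactly the sobering effect Moore is after.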

You also need to weigh up the benefits against the costs. In setting up the business, you might be working 90-hour weeks. This can be exhilarating but it might also be exhausting. You should factor these possibilities into your decision. Vital here is the idea of opportunity cost. All that money and those hours of effort might be invested in some other activity. Thinking in terms of different possible outcomes and opportunity costs can help counter overconfidence.

A confident scholar?

Many times in my career as an academic I’ve had to make decisions about whether to write an article or a book and then, after writing it, where to submit it. When I was first starting out, I’d write an article and then try to figure out where to submit it. Before long, I learned this was not a good strategy, because sometimes there was no suitable outlet. Moore would say I was overconfident and needed to consider the possibility of wasting effort, at least for the purpose of publication, which is crucial for aspiring academics.

These days, before writing an article, I think about where I plan to send it, and the likelihood of it being accepted. Sometimes there is a high-prestige journal that I think could be worth trying. I might estimate the chance of acceptance as 5 percent, one out of twenty. I have to weigh up the effort of tailoring the article to this journal and going through the submission process, along with associated delays, against the 95% chance of rejection.

In many cases, I decide not to bother with the high-status journal and go straight to one where the odds are better. This points to another factor to consider when writing an article: are there fall-back options should my first-choice outlet reject my submission?

Another decision is whether to undertake a PhD. When I did my own PhD, aeons ago, I didn’t think about failure. I took a risk without considering the full range of outcomes. Now, as a potential PhD supervisor, I regularly talk to prospective students. They need to make several decisions: whether to pursue a PhD, what university to attend, what topic and what supervisor. It’s a big decision because writing a PhD thesis requires years of effort. Although about three quarters of students who’ve started with me as their supervisor have graduated, the cost for those who don’t finish can be large: they could have been doing something else with their time and energy. On the other hand, a student can acquire skills and obtain satisfactions along the way, a sort of consolation prize for non-finishers.

            Therefore, in advising prospective students, I point to the large and sustained commitment required and note that most PhD graduates do not obtain academic posts. After reading Moore’s book, in future I’ll recommend that prospective students assign probabilities to different outcomes. That will help counter overconfidence.

For students who are part way through their theses, a more common problem is underconfidence. The challenge seems enormous. It can be helpful to have the courage to continue, knowing that most students, including most of those who finish, go through periods of self-doubt.

A confident whistleblower?

Another area where Moore’s recommendations are relevant is whistleblowing. Thinking from the point of view of managers in organisations, he says that being results-oriented is not necessarily a good thing. Being results-oriented often means rewarding employees for success and penalising them for failure.

This sounds logical but it misses an important consideration: sometimes it is wise to take risks even though some of them don’t pan out. If developing a new app costs $1 million and has only a 10% chance of success, it’s still a good bet if success means a return of $100 million. But when employees are penalised for failure, they won’t take risks like this. Apple never would have developed spectacularly profitable devices if it hadn’t supported risk-taking with positive expected returns.
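The arithmetic behind this example is worth making explicit. Under the stated numbers (a $1 million cost, a 10% chance of success, a $100 million return on success), the expected profit is strongly positive:

```python
# Expected profit of the risky app described above.
cost = 1_000_000             # spent whether or not the app succeeds
p_success = 0.10
return_on_success = 100_000_000

expected_profit = p_success * return_on_success - cost
print(f"Expected profit: ${expected_profit:,.0f}")  # → Expected profit: $9,000,000
```

So an employer who penalises every individual failure is, in effect, discouraging bets that are worth $9 million on average.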

Imagine being a manager and one of your employees reports possibly fraudulent activities in the organisation. You investigate and discover your employee is wrong. Does this warrant a penalty? Moore would say that this whistleblower should be encouraged even if the report was wrong, at least if there’s a reasonable chance it might have been right.

In practice, employees who make allegations of wrongdoing are often penalised even when they’re right, especially when higher management is implicated in the wrongdoing or in tolerating it. That’s another story.

            The whistleblower, treated badly, then turns to a watchdog body such as an ombudsman, an anti-corruption agency or a court. A good idea? Moore’s advice would be to consider all possible outcomes and assign them probabilities, and also to consider other options. Few whistleblowers do this. They want vindication and assume that some higher authority will provide it. They do not investigate the success rate of previous whistleblowers, which can be abysmal. Because they know they are right, they do not consider the possibility that justice will not be done, and that many previous whistleblowers also knew they were right but failed in their efforts to be vindicated.

Moore recommends learning from experience. When you have a decision to make, assign probabilities to potential outcomes and consider alternative courses of action. When you learn the outcome, go back to your probabilities and figure out whether you may have been too confident or not confident enough. Gradually, over time, you can improve your skill in predicting outcomes.
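Moore’s learning loop can be made concrete with a scoring rule. One common way to check your own probability judgements (a standard technique from forecasting research, not a procedure quoted from Moore’s book) is the Brier score: the squared gap between the probability you assigned and what actually happened, averaged over your decisions. Lower is better, and always answering “50-50” scores 0.25.

```python
# Minimal sketch of scoring your own predictions with the Brier score.
# Each record pairs the probability you assigned to an outcome with
# whether it actually occurred (1) or not (0). The diary entries below
# are hypothetical.
def brier_score(predictions):
    """Mean squared error between stated probabilities and outcomes."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

diary = [(0.9, 1), (0.8, 0), (0.7, 1), (0.95, 1)]
print(round(brier_score(diary), 3))  # 0.186 — better than the 0.25 of always guessing 50-50
```

Revisiting a diary like this after outcomes are known shows whether your 80% predictions really come true about 80% of the time, which is exactly the check Moore recommends.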


Don’t just jump in! Learn to predict outcomes.

            This is good advice for many purposes. However, when you’re faced with a decision that is likely to be made just once in a lifetime — like doing a PhD or blowing the whistle — then it’s sensible to learn as much as possible about what others have done in the same situation. Why make your own mistakes when you can learn from others’ mistakes? By undertaking this sort of investigation, you minimise the risk of making a wrong decision. And when things don’t work out, remember that you still might have made the right decision. If you are successful in everything you try, you probably aren’t taking enough risks!

Brian Martin
bmartin@uow.edu.au


Judges and sexual harassment

Is Dyson Heydon, a former justice on the High Court, a serial sexual harasser? Maybe so, but there is more to consider: abuse of trust, outrage management techniques and official channels.


Dyson Heydon

Abuse of trust

In 1986, I joined the newly formed Sexual Harassment sub-committee at the University of Wollongong. Its aim was to oppose sexual harassment on campus. It was a sub-committee of the committee overseeing the Equal Employment Opportunity (EEO) unit. We were a small group, with members from the EEO unit, academics, research students and undergraduate students. We developed policy proposals, produced leaflets and held stalls at Orientation Week.

Some members of the committee, through their contacts, knew about harassment on campus. Hardly any students were willing to make formal complaints, and those that were made didn’t come to our group anyway. But EEO staff knew about patterns, and some other committee members did too.

For example, we heard that a particular lecturer was making unwelcome advances to undergraduate students, none of whom wanted to make a formal complaint. On the committee, we discussed options. We couldn’t approach, much less accuse, the lecturer, as that would violate the students’ confidentiality. We talked about putting graffiti in the women’s toilets. In the end, the EEO Officer decided to offer a workshop on sexual harassment to the entire faculty. In this way, we hoped, the message would get to the lecherous lecturer and his colleagues.

            In 1990, something happened that broadened our concerns. Two undergraduate students accused a man of rape. It turned out that the man, a PhD student, was their tutor in one of their classes. He later went to prison for rape. The Vice-Chancellor put out a statement raising concern about individuals who abuse their “positions of privilege” in relation to students who “may feel their academic progress depends upon compliance with the wishes of a staff member or members.”

On our committee, we took this on board and started investigating the issue of consensual sexual relationships between teachers and students. There were two main problems. One was conflict of interest. If a teacher had a close personal relationship with a student, then the teacher would likely be biased when marking the student’s work. Even if not, there might be a perception of bias.

The solution for conflict of interest is often straightforward. One of my colleagues was married to a student in her class. The relationship was known, and arrangements were made so that he was not in her tutorial group and she had nothing to do with any of his assignments.

However, we learned of cases in which such conflicts of interest were not addressed. In one instance, which I learned about years later, a senior academic was a supervisor for his wife, who was doing a PhD.

The second main problem with close relationships between staff and students was abuse of trust. One of the members of our committee knew of a male colleague who started a relationship with an undergraduate student in his class every year or two. The students who were dumped along the way were often distressed. Some dropped out of university.

Teachers are in a position of trust with students, trust that they will support and nurture their students’ knowledge, understanding and skills. Students often look up to their teachers as experienced and knowledgeable, sometimes even in awe. When a teacher uses this position of authority and status to cultivate a sexual relationship, it undermines the expected professional relationship: it abuses the trust implicit in the teacher-student relationship.

Unlike sexual harassment, abuse of trust isn’t illegal. However, it can be just as damaging.

            In learning about this sort of abuse of trust in university settings, one of our committee members came across a book by Peter Rutter titled Sex in the forbidden zone. The book’s subtitle listed several of the possibilities for abuse: When men in power — therapists, doctors, clergy, teachers and others — betray women’s trust. There is an implicit trust that a doctor, lawyer, teacher or boss will look after the interests of their patient, client, student or subordinate. In each case, there is a possibility of abuse of trust when the person with greater authority uses their position to promote a sexual or romantic relationship.

Heydon, as a judge, obviously was in a position of much greater authority than his associates. For him, or any other judge, to use their position to seek a sexual or romantic relationship is an abuse of trust.

In some cases, such relationships are consensual. A student might welcome, desire or even seek a sexual relationship with their teacher. Sometimes this works out well, leading to long-lasting relationships. However, there is still a serious risk of abuse of trust, as we learned from stories we heard on our committee. The solution for teachers is straightforward: if you want a close personal relationship with a student, wait until they’re no longer in your class or in any way subject to your authority or influence.

Imagine, for the sake of argument, that one of Heydon’s associates welcomed his advances and began a relationship with him. That would be a legal, consensual relationship, not harassment — and it would still be wrong. It would probably involve a conflict of interest and most likely an abuse of trust. In such cases, the onus is on the judge not to initiate such a relationship. Indeed, if an associate took the initiative, the judge should refuse.

Sexual harassment: outrage management

Years after being on the sexual harassment sub-committee, I started studying what happens when a powerful individual or group does something that others think is wrong. An example is the 1991 Dili massacre, when Indonesian troops shot and killed hundreds of peaceful protesters in East Timor’s capital city. Another example is the beating of Rodney King by Los Angeles police, also in 1991.


Still from George Holliday’s video of the beating of Rodney King

In these and many other instances, the perpetrator and allies use a variety of methods to reduce public outrage. They cover up the action, devalue the target, reinterpret the events by lying, blaming and reframing, use official channels to give an appearance of justice, and intimidate or reward people involved.

This dynamic applies to sexual harassment. Greg Scott and I examined the techniques used when Anita Hill alleged that Clarence Thomas, a nominee for the US Supreme Court, had harassed her years before.


Anita Hill

Greg and I found evidence of all of the usual techniques to reduce public outrage. For example, after Hill went public, she was the subject of a massive campaign of denigration, including publication of a book, The Real Anita Hill, filled with lies and derogatory material. (The author later recanted.) Thomas reframed Hill’s allegations as part of hearings that were a racial assault on him.

Paula McDonald at the Queensland University of Technology led a study of sexual harassment using this framework, examining testimony in court cases about sexual harassment. Court transcripts revealed that the same techniques were used in case after case.

For years, Heydon paid no penalty for his actions. The primary technique to reduce public outrage was cover-up. Heydon of course didn’t publicise his actions, but neither did others who knew about them. Many of them were afraid to say anything because they were worried about repercussions, for themselves rather than Heydon: their careers might be damaged. This is the technique of unspoken threats, a type of intimidation.

It is possible to counter the techniques that reduce outrage from injustice. The counter-methods are exposing the action, validating the target, interpreting events as unfair, mobilising support and resisting intimidation and rewards. These are the methods that made the Dili massacre, the beating of Rodney King and the sexual predation of Harvey Weinstein counterproductive for the attackers.

In Heydon’s case, outrage was stoked most of all by the breakthrough stories by journalists Kate McClymont and Jacqueline Maley, made possible by women willing to speak about their experiences. This was the counter-method of exposing the action.

            In the exposure, the women harassed by Heydon were given respect. In the coverage, they were presented as credible and as talented, conscientious individuals. This was the counter-method of validating the targets.

In the exposure, the events were portrayed as harassment and as wrong. This had particular resonance in the Heydon case because of his symbolic status as a high-level representative of justice and as a self-styled pillar of moral rectitude. This was the counter-method of interpreting events as unjust.

The coverage was enabled by women willing to come forward and tell their stories. The #MeToo movement was instrumental. It triggered a mobilisation of support for targets of harassment and assault.

Finally, several courageous women were willing to go public with their stories, despite the possible damage to their careers and reputations. This was the counter-method of resisting intimidation.

The exposure of Heydon’s harassment thus shows the relevance of all the counter-methods commonly involved in challenging a powerful perpetrator of something deemed wrong.

Official channels

In my just-published book titled Official Channels, I describe my experiences learning about the shortcomings of processes and agencies such as grievance procedures, regulatory bodies, ombudsmen, anti-corruption bodies and courts. Most of the workers in watchdog bodies are doing their best, but the system has inherent shortcomings.

One of the chapters in Official Channels is about sexual harassment. In Australia, as in other countries, sexual harassment was a long-standing problem that came onto the public agenda through the efforts of feminists. The main response has been the setting up of laws and procedures to deal with the problem, but often these give only an illusion of protection. Decades later, sexual harassment and sexual assault remain serious problems.

After Heydon’s harassment was revealed to the public, the first response in many cases was to say that better processes are needed to deal with it. This is nearly always the number-one response. But why would better processes work now when they haven’t before? Furthermore, many of Heydon’s actions involved an abuse of trust, and there is no rule against abuse of trust.

I’m all in favour of more effective regulations, laws and watchdog bodies, but there’s a danger in thinking that this is enough. Several other options are neglected by comparison.

One important option is improved skills. Imagine that those around Heydon had been better prepared to expose and counter his behaviour. This doesn’t just mean the women he targeted, but others too, so-called bystanders, especially those who heard about his actions. Skills against sexual harassment include putting graffiti in women’s toilets — and in men’s toilets. They include being able to use anonymous remailers and set up secure websites. They include being able to make covert recordings, and being able to document events and convey them powerfully to others.


Martha Langelan’s book offers excellent practical advice

            This might sound like putting the onus for action on the target, in effect blaming the victim, but just as much onus needs to be put on others to provide support and take action. Bystander training is valuable in skill development.

Another important option is changing the culture. The legal profession is highly hierarchical, with judges at the apex. A more egalitarian system would reduce the power of elites, empower those lower down and enable stronger challenges to abusers.

Changing the culture might also mean changing expectations so that associates are treated as professionals rather than as personal assistants. It might even mean getting rid of the role of associates altogether, providing support for judges in other ways.

The point here is not to provide a blueprint but to note that there are options besides official channels. Improving skills and changing the culture might not be easy, but they show quite a bit of promise, especially considering the failure of decades of official concern about sexual harassment. It is revealing that, had official channels been effective, there would have been no need for the #MeToo movement, or for investigative journalists to expose people like Dyson Heydon.

Brian Martin
bmartin@uow.edu.au

Thanks to Sharon Callaghan and Qinqing Xu for valuable comments on a draft and to the many individuals over the years who have helped me learn about the issue of sexual harassment and what can be done about it.

Western civilisation: what about it?


Sappho, Ancient Greek poet

“Western civilisation” can be a contentious topic, in part because people interpret it in different ways. Many achievements have been attributed to “the West,” but it has many negatives too. It is not obvious how to assign responsibility for the positives and negatives. Often left out of debates about Western civilisation are alternatives and strategies to achieve them.

In some circles, if you refer to Western civilisation, people might think you are being pretentious, or wonder what you’re talking about. For some, though, the two-word phrase “Western civilisation” can pack an emotional punch.

Western civilisation can bring to mind famous figures such as Socrates, Michelangelo and Leonardo da Vinci — maybe even some women too — and high-minded concepts such as democracy and human rights. Increasing affluence fits in somewhere. Western civilisation is also associated with a sordid history of slavery, exploitation, imperialism, colonialism and warfare.

My aim here is to outline some of the issues involved.[1] I write this not as an expert in any particular relevant area, but rather as a generalist seeking to understand the issues. Whatever “Western civilisation” refers to, it is a vast topic, and no one can be an expert in every aspect. One of the areas I’ve studied in some depth is controversies, especially scientific controversies like those over nuclear power, pesticides and fluoridation. Some insights from controversy studies are relevant to debates over Western civilisation.

After outlining problems in the expression “Western civilisation,” I give an overview of positives and negatives associated with it. This provides a background for difficult questions concerning responsibility and implications.

Western? Civilisation?

In political and cultural discussions, “the Western world” has various meanings. It is often used to refer to Europe and to other parts of the world colonised by Europeans. This is just a convention and has little connection with the directions east and west, which in any case are relative. Europe is in the western part of the large land mass called Eurasia, so “Western” makes some sense in this context. But after colonisation, some parts of the world outside Europe came to be counted as part of the “West,” including the United States, Canada, Australia and New Zealand. These are called settler colonies: places where the immigrants from Europe eventually outnumbered the native inhabitants. However, South and Central American countries are also settler colonies, yet are less often listed as part of the West. So there is a degree of arbitrariness in defining the West.

The word “civilisation” has different meanings, and sometimes multiple meanings, in different contexts. For historians, civilisation refers to a complex society with established institutions such as governments, laws, commerce and rules of behaviour. A civilisation of this sort has a certain size, cohesion and organisation. The Roman empire is called a civilisation; hunter-gatherer societies are not.[2]

“Civilisation” also refers to being civilised, as opposed to being savage.[3] Being civilised suggests being rational and controlled rather than emotional and chaotic. It also suggests civility: politeness rather than crudity. A civilised person dresses properly, speaks appropriately and knows what rules to obey.

Because the word civilisation has multiple meanings and connotations, which vary from person to person, from context to context and from one time to another, some discussions about it mix emotional and logical matters. Contrary to its positive connotations, a civilisation, in the scholarly meaning, is not necessarily a good thing: it might be a dictatorial exploitative empire. In the everyday meaning of being civilised, it sounds better than being uncivilised. Empires that have caused unspeakable suffering sound better when they are called civilisations. Some mass murderers are, in everyday interactions, polite, rational and well-dressed: being civilised in this sense is no guarantee of moral worth.


Civilised?

Positives

Many of the features of human society that today are widely lauded were first developed in the West or were developed most fully in the West. These might be called the achievements or contributions of Western civilisation.

The ancient Greeks developed a form of collective decision-making in which citizens deliberated in open forums, reaching agreements that then became policy or practice.[4] This is commonly called democracy. In ancient Greece, women, slaves and aliens were excluded from this process, but the basic idea was elaborated there.

Many centuries later, several revolutions (including those in France and the US) overthrew autocracies and introduced a form of government in which citizens voted for representatives who would make decisions for the entire community. This was quite unlike democracy’s roots in ancient Greece, but today it is also commonly called democracy, sometimes with an adjective: liberal democracy or representative democracy. Voting initially was restricted to white male landowners and gradually extended to other sectors of the population.

Commonly associated with representative government are civil liberties: freedom of speech, freedom of association, freedom from arbitrary search, arrest and detention, freedom from cruel treatment. These freedoms, or rights, resulted from popular struggles against tyranny, and are commonly seen as a special virtue of the West, a model for the rest of the world. Struggles over these sorts of freedoms continue today, for example in campaigns against discrimination, surveillance, slavery and torture.

Another contribution from the West is art and, more generally, cultural creations, including architecture, sculpture, painting, music, dance and writing. While artistic traditions are found in societies across the world, some of these, for example ballet and classical music, have been developed in the West to elaborate forms that require enormous expertise at the highest levels, accompanied by long established training techniques for acquiring this expertise.[5]

In the West, manners have evolved in particular ways. On formal occasions, and in much of everyday behaviour, people are mostly polite in speech, conventional in dress, proper in their manner of eating, and modest in their excretions.[6]

The industrial revolution had its home in the West. The development and use of machinery, motorised transport, electricity and many other technological systems have made possible incredible productivity and greatly increased living standards. This has involved inventions and their practical implementation, namely innovation. The West has contributed many inventions and excelled in the process of innovation.

Modern systems of ownership, commercial exchange and employment, commonly called capitalism, developed most rapidly and intensively in the West and were then exported to the rest of the world.

Questioning the positives

The positives of Western civilisation can be questioned in two ways: are they really Western contributions, and are they really all that good?

Representative government is commonly described as “democracy,” but some commentators argue that it is a thin form of democracy, more akin to elected tyranny. It has little resemblance to democracy’s Athenian roots. The ancient Greeks used random selection for many official positions, with a fairly quick turnover, to ensure that those selected did not acquire undue power. This was in addition to the assembly, which every citizen could attend and vote in. Arguably, the ancient Greeks had a more developed form of “direct democracy,” direct in the sense of not relying on elections and representatives.[7]


The kleroterion, used for randomly selecting officials in ancient Athens

However, if direct democracy is seen as the epitome of citizen participation, then note should be made of numerous examples from societies around the world, many of them long predating agriculture. Many nomadic and hunter-gatherer groups have been egalitarian, with no formal leaders.[8] They used forms of consensus decision-making that are now prized in many of today’s social movements. There are examples of societies with non-authoritarian forms of decision-making in Africa, Asia and the Americas.

The Iroquois Confederacy in North America had a well-developed decision-making process that predated white American settlers by hundreds of years and, via Benjamin Franklin, helped inspire US democratic principles and methods.[9] A full accounting of the contributions of non-Western societies to models of governance remains to be carried out.[10]

Modern-day civil liberties are needed to counter the repressive powers of the state. However, in egalitarian societies without states, civil liberties are implicit: members can speak and assemble without hindrance. From this perspective, “civilisation” involves citizens of a potentially repressive state congratulating themselves for managing to have a little bit of freedom.

The industrial revolution is commonly attributed to the special conditions in Europe, especially Britain. This can be questioned. It can be argued that Western industrial achievements were built on assimilating superior ideas, technologies and institutions from the East.[11]

As for the West’s cultural achievements, they need to be understood in the context of those elsewhere. Think of the pyramids in Egypt, the work of the Aztecs, the Taj Mahal. Think of highly developed artistic traditions in India, China and elsewhere.

Negatives

Western societies have been responsible for a great deal of killing, exploitation and oppression. Colonialism involved the conquest of native peoples in the Americas, Asia, Africa and Australasia. Europeans took possession of lands and expelled the people who lived there. In the course of European settlement, large numbers of indigenous people were killed or died of introduced diseases. The death toll was huge.[12]

In imperialism, which might be called non-settler colonialism, the European conquerors imposed their rule in damaging ways. They set up systems of control, including militaries, government bureaucracies and courts, that displaced traditional methods of social coordination and conflict resolution. They set up administrative boundaries that took little account of previously existing relationships between peoples. In South America, the administrative divisions established by Spanish and Portuguese conquerors became the basis for subsequent independent states.[13] In Rwanda, the Belgian conquerors implemented a formal racial distinction between Tutsis and Hutus, installing Tutsis in dominant positions, laying the basis for future enmity.[14]

Imperialism had a devastating impact on economic and social development. British rule impoverished India, drastically reducing the country’s wealth while benefiting British industry.[15] Much of what was later called “underdevelopment” can be attributed to European exploitation of colonies.[16]

Another side to imperialism was slavery. Tens of millions of Africans were captured and transported to the Americas. Many died in the process, including millions in Africa itself.[17]

Imperialism and settler colonialism were responsible for the destruction of cultures around the world. The combination of conquest, killing, disease, exploitation, dispossession, divide-and-rule tactics and imposition of Western models undermined traditional societies. Some damaging practices were imported, including alcohol, acquisitiveness and violence. When Chinese leaders made attempts to stop opium addiction, British imperialists fought wars to maintain the opium trade. Colonial powers justified their activities as being part of a “civilising mission.”

It should be noted that many traditional cultures had their bad sides too, for example ruthless oppressors and harmful practices, including slavery and female genital mutilation. In some respects, Western domination brought improvements for populations, though whether these same improvements could have been achieved without oppression is another matter.

Colonialism was made possible not by cultural superiority but by superior military power, including weapons, combined with a willingness to kill. Europeans were able to subjugate much of the world’s population by force, not by persuasion or example.

In the past couple of centuries, the West has been a prime contributor to the militarisation of the world. Nuclear weapons were first developed in the West, and hold the potential for unparalleled destruction, a threat that still looms over the world. The only government to voluntarily renounce a nuclear weapons capacity is South Africa.

The problems with capitalism have been expounded at length. They include economic inequality, unsatisfying work, unemployment, consumerism, corporate corruption, encouragement of selfishness, and the production and promotion of harmful products such as cigarettes. Capitalist systems require or encourage people to move for economic survival or advancement, thereby breaking down traditional communities and fostering mental problems.

Industrialism, developed largely in the West, has had many benefits, but it also has downsides. It has generated enormous environmental impacts, including chemical contamination, species extinction and ocean pollution. Global warming is the starkest manifestation of uncontrolled industrialism.

Responsibility

What is responsible for the special features of Western civilisation, both positive and negative? One explanation is genetics. Western civilisation is commonly identified with white populations. Do white people have genes that make them more likely to create great works of art, or to be inventors, entrepreneurs or genocidal killers?

The problem with genetic explanations is that gene distributions in populations are too diverse to provide much guidance concerning what people do, especially what they do collectively. There is no evidence that Mozart or Hitler was genetically much different from his peers. There is too much variation between the achievements of brothers and sisters to attribute very much to genetics. Likewise, the rise and fall of civilisations is far too rapid for genetics to explain very much.


Stalin: genetically different?

More promising is to point to the way societies are organised. Social evolution is far more rapid than genetic evolution. Are the social structures developed in the West responsible for its beneficial and disastrous impacts?

The modern state is commonly said to have developed in Europe in the past few hundred years, in conjunction with the rise of modern military systems. To provide income for its bureaucratic apparatus, the state taxed the public, and to enforce its taxation powers, it expanded its military and police powers.[18] A significant step in this process was the French Revolution, which led to the development of mass armies, which proved superior to mercenary forces. The state system was adopted in other parts of the world, in part via colonialism and in part by example.

The state system can claim to have overcome some of the exploitation and oppression in the previous feudal system. It has also enabled massive investments in infrastructure, including in military systems, creating the possibility of ever more destructive wars as well as extensive surveillance. The French revolution also led to the introduction of the world’s first secret police, now institutionalised in most large states.[19]

If the West was the primary contributor to the contemporary state system, this is not necessarily good or bad. It has some positives but quite a few negatives.

A role for chance?

Perhaps what Western civilisation has done, positive and negative, shows nothing special about Western civilisation itself, but is simply a reflection of the capacities and tendencies of humans. Had things been a bit different, the same patterns might have occurred elsewhere in the world. In other words, the triumphs and tragedies of Western civilisation should be treated as human triumphs and tragedies, rather than reflecting anything special about people or institutions in the West.

On the positive side, it is apparent that people from any part of the world can attain the highest levels of achievement, whether in sport, science, heroism or service to the common good. The implication is that, in different circumstances, everything accomplished by lauded figures in the West could have been done by non-Westerners. Of course, there are many examples where this is the case anyway. Major steps in human social evolution — speech, fire, tools, agriculture — are either not attributed to a particular group, or not to the West. These developments are usually said to reflect human capacities. So why not say the same about what is attributed to a “civilisation”?

The same assessment can be made of the negatives of Western civilisation, including colonialism, militarism and industrialism. They might be said to reflect human capacities. Genocides have occurred in many parts of the world, and nearly every major government has set up military forces. Throughout the world, most people have eagerly joined industrial society, at least at the level of being consumers.

When something is seen as good, responsibility for it can be assigned in various ways. Leonardo da Vinci is seen as a genius. Does this reflect his being a man, or his being a person with opportunities? Is being white important? And how should responsibility be assigned for the emergence of Hitler or Stalin?

Research on what is called “expert performance” shows that great achievements are the result of an enormous amount of a particular type of practice, and suggests that innate talent plays little role.[20] The human brain has enormous capacities, so the key is developing them in desirable ways. On the other hand, humans have a capacity for enormous cruelty and violence, and for tolerating it.[21]

Alternatives

For those critical of state systems, militarism and capitalism — or indeed anything seen as less than ideal — it is useful to point to alternatives.

One alternative is collective provision, in which communities cooperate to provide goods and services for all. This is a cooperative model, in contrast with the competitive individualistic model typical of capitalist markets.[22] In collective provision, “the commons” plays a key role: it is a facility available to all, like public libraries and parks. Online examples of commons are free software and Wikipedia, which are created by volunteers and available to all without payment or advertisements. Applied to decision-making, deliberative democracy is an alternative close to the cooperative approach.

The rise of capitalism involved the enclosure of lands that were traditionally used as commons. “Enclosure” here means takeover by private or government owners, and exclusion of traditional users. Contemporary proponents of the commons hark back to earlier times, before the enclosure process began.


Free software is a type of commons.

What is significant here is that, historically, commons, as highly cooperative spaces, developed in many places around the world. They are not a feature of a particular civilisation.

Another alternative is strategic nonviolent action, also called civil resistance.[23] Nonviolent action involves rallies, marches, strikes, boycotts, sit-ins and various other methods of social and political action. Nonviolent action is non-standard: it is defined as being different from conventional political action such as voting and electoral campaigning.

Social movements — anti-slavery, feminist, peace and environmental movements, among others — have relied heavily on nonviolent action. Studies show that nonviolent movements against repressive regimes are more likely to be effective than armed resistance.[24] Compared to the use of violence, nonviolent movements have many advantages: they enable greater participation, reduce casualties and, when successful, lead to greater freedom in the long term.

People have been using nonviolent action for centuries. However, the use of nonviolent action as a strategic approach to social change can be attributed to Mohandas Gandhi and the campaigns he led in South Africa and India.[25] Strategic nonviolent action has subsequently been taken up across the world.


Gandhi

So what?

Why should it make any difference whether Western civilisation is judged for its benefits or its harms? Why should people care about something so amorphous as “Western civilisation”?

Most people who live and work in Western countries make little significant contribution to something as massive as a civilisation. They might be likened to observers, analogous to spectators at a sporting match. Logically, there is no particular virtue in being a fan of a winning team. Similarly, there should be no particular glory in touting the achievements of the “civilisation” in which one lives. In practice, though, it seems as if some protagonists in the debate over Western civilisation do indeed identify with it.


Spectators watching gladiators

This is a psychological process called honour by association. It is apparent in all sorts of situations, for example when you tell others about the achievements of your family members or about meeting a famous person. There can be honour by association via the suburb in which you live, your occupation, your possessions, even the food you eat.

The point about honour by association is that, logically, it is not deserved. When, as a spectator, you bask in the glory of a winning team, you’ve done nothing particularly noteworthy, aside perhaps from being part of a cheering crowd. The same can apply to being associated with the greatest accomplishments in the history of Western civilisation. If you are part of a long tradition of artistic, intellectual or entrepreneurial achievement, it sounds nice but says nothing about what you’ve done yourself. It is only honour by association.

The same applies to guilt by association, which might also be called dishonour by association. If your ancestors were racists or genocidal killers, why should that reflect on you?[26]

Another way to think about this is to note that no one chooses their own parents. Growing up as part of the culture in which one was born shows no special enterprise and should warrant no particular praise. Emigrants often show more initiative. For various reasons, they are not content with their place of birth and seek out more desirable locations to spend their lives and rear their children.

Why study Western civilisation?

Why study anything? Learning, in a systematic and rigorous fashion, has impacts independent of the subject studied. On the positive side, students learn how to think. In the humanities, they learn to think critically and to communicate in writing and speech. On the negative side, or ambiguously, they learn how to play the academic game, to be willing to subordinate their interests to an imposed syllabus, and to be obedient. Formal education has been criticised as preparation for being a reliable and obedient employee.[27]

More specifically, is there any advantage in studying Western civilisation rather than some other speciality? Proponents say students, and citizens, need to know more about the ideas and achievements that underpin the society in which they live. This is plausible. Critics say it is important to learn not only about the high points of Western civilisation but also about its dark sides. Many of the critics do precisely this, teaching about the history and cultural inheritance of colonialism and capitalism. Their concern about focusing on the greatest contributions from the West is that the negative sides receive inadequate attention.

There is another possible focus of learning: alternatives, in particular alternatives to current institutions and practices that would go further in achieving the highest ideals of Western and other cultures. For example, democracy, in the form of representative government, is studied extensively, but there is little attention to participatory alternatives such as workers’ control.[28] Formal learning in classrooms is studied extensively, but there is comparatively little attention to deprofessionalised learning.[29] Examples could be given in many fields: what exists is often taken as inevitable and desirable, while what does not exist is assumed to be utopian.

The next step after studying alternatives is studying strategies to move towards them. This is rare in higher education, though it is vitally important in social movements.[30]

Why study Western civilisation? One answer is to say, sure, let’s do it, but let’s also study desirable improvements or alternatives to Western civilisation, and how to bring them about.

Controversies over Western civilisation

Some controversies seem to persist indefinitely, regardless of arguments and evidence. The debate over fluoridation of public water supplies has continued, with most of the same claims, since the 1950s. There are several reasons why resolution of debates over Western civilisation is difficult.[31]

One factor is confirmation bias: people preferentially seek out information that supports their existing views, and they find reasons to dismiss or ignore contrary information.[32]

A second factor is the burden of proof. Typically, partisans on each side in a controversy assign responsibility to the other side for proving its case.

A third factor is paradigms, which are coherent sets of assumptions, beliefs and methods. The paradigms underpinning history and sociology are quite different from those used in everyday life.

A fourth factor is group dynamics. In polarised controversies, partisans mainly interact with those with whom they agree, except in hostile forums such as public debates.

A fifth factor is interests, which refer to the stakes that partisans and others have in the issues. Interests include jobs, profits, reputation and self-esteem. Interests, especially when they are substantial or “vested,” can influence individuals’ beliefs and actions.

The sixth and final factor is that controversies are not just about facts: they are also about values, for example about ethics and decision-making. This is true of scientific controversies and even more so of other sorts of controversies.

The upshot is that in a polarised controversy, partisans remain set in their positions, not budging on the basis of the arguments and evidence presented by opponents. It is rare for a leading figure to change their views. It is fairly uncommon for a partisan to try to spell out the strongest arguments for the contrary position. Instead, partisans typically highlight their own strongest points and attack the opponent’s weakest points.

My observation is that all these factors play a role in debates over Western civilisation. It is safe to predict that disagreements are unlikely to be resolved any time soon.

Acknowledgements

Over the years, many authors and colleagues have contributed to my understanding of issues relevant to this article.

Thanks to all those who provided comments on drafts: Paula Arvela, Anu Bissoonauth-Bedford, Sharon Callaghan, Lyn Carson, Martin Davies, Don Eldridge, Susan Engel, Anders Ericsson, Theo Farrell, Zhuqin Feng, Kathy Flynn, Xiaoping Gao, John Hobson, Dan Hutto, Bruce Johansen, Dirk Moses, Rosie Riddick, Nick Riemer, Denise Russell, Jody Watts, Robert Williams, Qinqing Xu and Hsiu-Ying Yang. None of these individuals necessarily agrees with anything in the article, especially considering that many commented only on particular passages.

Further comments are welcome, including suggestions for improving the text.

Brian Martin
bmartin@uow.edu.au

Footnotes

[1] My motivation for addressing this topic is the introduction of a degree in Western civilisation at the University of Wollongong and the opposition to it. I commented on this in “What’s the story with Ramsay?”, 7 March 2019, https://comments.bmartin.cc/2019/03/07/whats-the-story-with-ramsay/

[2] Thomas C. Patterson, Inventing Western Civilization (New York: Monthly Review Press, 1997), says the concept of civilisation, from the time it was first formulated in the 1760s and 1770s, has always referred to societies having a state and hierarchies based on class, sex and ethnicity. Often there is an accompanying assumption that these hierarchies are natural.

[3] On the idea of the savage as an enduring and damaging stereotype that serves as the antithesis of Western civilisation, see Robert A. Williams, Jr., Savage Anxieties: The Invention of Western Civilization (New York: Palgrave Macmillan, 2012).

[4] Mogens Herman Hansen, The Athenian Democracy in the Age of Demosthenes: Structure, Principles and Ideology (Oxford, UK: Basil Blackwell, 1991); Bernard Manin, The Principles of Representative Government (Cambridge: Cambridge University Press, 1997).

[5] Western classical music is not inherently superior to, say, Indian, Chinese, Japanese or Indonesian music. However, musical notation and public performance led in Western Europe to distinctive methods for training elite performers.

[6] Norbert Elias, The Civilizing Process: The History of Manners, volume 1 (New York: Urizen Books, 1978; originally published in 1939).

[7] David Van Reybrouck, Against Elections: The Case for Democracy (London: Bodley Head, 2016).

[8] Harold Barclay, People without Government (London: Kahn & Averill, 1982).

[9] For an account of academic and popular resistance to the idea that the Iroquois Confederacy influenced the US system of democracy, see Bruce E. Johansen with chapters by Donald A. Grinde, Jr. and Barbara A. Mann, Debating Democracy: Native American Legacy of Freedom (Santa Fe, NM: Clear Light Publishers, 1998).

[10] See Benjamin Isakhan and Stephen Stockwell, eds., The Edinburgh Companion to the History of Democracy (Edinburgh: Edinburgh University Press, 2012) for treatments of pre-Classical democracy, and much else.

[11] John M. Hobson, The Eastern Origins of Western Civilisation (Cambridge: Cambridge University Press, 2004).

[12] John H. Bodley, Victims of Progress (Menlo Park, CA: Cummings, 1975).

[13] Benedict Anderson, Imagined Communities: Reflections on the Origin and Spread of Nationalism (London: Verso, 1991, revised edition).

[14] Mahmood Mamdani, When Victims Become Killers: Colonialism, Nativism, and the Genocide in Rwanda (Princeton, NJ: Princeton University Press, 2001).

[15] Shashi Tharoor, Inglorious Empire: What the British Did to India (London: Penguin, 2017).

[16] Walter Rodney, How Europe Underdeveloped Africa (Washington, DC: Howard University Press, 1974).

[17] For a detailed account of the horrors of colonialism in the Congo, and of the struggles to set the narrative about what was happening, see Adam Hochschild, King Leopold’s Ghost: A Story of Greed, Terror, and Heroism in Colonial Africa (Boston: Houghton Mifflin, 1998).

[18] Bruce D. Porter, War and the Rise of the State: The Military Foundations of Modern Politics (New York: Free Press, 1994); Charles Tilly, Coercion, Capital, and European States, AD 990–1992 (Cambridge MA: Blackwell, 1992).

[19] Thomas Plate and Andrea Darvi, Secret Police: The Inside Story of a Network of Terror (London: Sphere, 1983).

[20] Anders Ericsson and Robert Pool, Peak: Secrets from the New Science of Expertise (London: Bodley Head, 2016).

[21] Steven James Bartlett, The Pathology of Man: A Study of Human Evil (Springfield, IL: Charles C. Thomas, 2005).

[22] Nathan Schneider, Everything for Everyone: The Radical Tradition that Is Shaping the Next Economy (New York: Nation Books, 2018).

[23] Gene Sharp, The Politics of Nonviolent Action (Boston: Porter Sargent, 1973).

[24] Erica Chenoweth and Maria J. Stephan, Why Civil Resistance Works: The Strategic Logic of Nonviolent Conflict (New York: Columbia University Press, 2011).

[25] M. K. Gandhi, An Autobiography or the Story of My Experiments with Truth (Ahmedabad: Navajivan, 1940, second edition).

[26] This is different from institutional responsibility. When politicians give apologies for crimes committed by governments, they do so as representatives of their governments, not as personal perpetrators.

[27] Jeff Schmidt, Disciplined Minds: A Critical Look at Salaried Professionals and the Soul-battering System that Shapes their Lives (Lanham, MD: Rowman & Littlefield, 2000).

[28] Immanuel Ness and Dario Azzellini, eds., Ours to Master and to Own: Workers’ Control from the Commune to the Present (Chicago, IL: Haymarket Books, 2011).

[29] Ivan Illich, Deschooling Society (London: Calder and Boyars, 1971).

[30] For example, Chris Crass, Towards Collective Liberation: Anti-racist Organizing, Feminist Praxis, and Movement Building Strategy (Oakland, CA: PM Press, 2013).

[31] This section draws on ideas outlined in my article “Why do some scientific controversies persist despite the evidence?” The Conversation, 4 August 2014, http://theconversation.com/why-do-some-controversies-persist-despite-the-evidence-28954. For my other writings in the area, see “Publications on scientific and technological controversies,” https://www.bmartin.cc/pubs/controversy.html.

[32] Raymond S. Nickerson, “Confirmation bias: a ubiquitous phenomenon in many guises,” Review of General Psychology, 2(2), 1998, pp. 175–220.

A cult of smartness

In higher education, being smart is greatly prized. But over-valuing smartness has downsides.

Alexander Astin is a US academic with vast experience of higher education in that country. During his career, he visited hundreds of campuses and talked with thousands of students, academics and administrators. He became convinced that there is a fundamental malady in the system: an obsession with smartness.

Astin summarises his concerns in a readable book titled Are you smart enough? How colleges’ obsession with smartness shortchanges students, published in 2016. His focus is entirely on the US but many of his assessments apply to Australia too.

Is your university prestigious?

University leaders greatly prize the status of their institutions. No surprise here. There is a widely known pecking order. Astin says that if you ask people in the US to name the best universities, they regularly come up with the same ones: Harvard, Yale, Princeton, Berkeley and so forth. The exact rankings might shift a bit over time, but the same ones appear in the top group. What is remarkable is that this order has hardly varied in half a century.

Would you like to be a Harvard graduate?

            In Australia, the same thing applies: those commonly considered the best are the Australian National University, Melbourne, Sydney and so on down the list. The stability of the priority order is remarkable when you compare it to corporations. Apple, Amazon and Google are near the top of the pile but didn’t exist decades ago. Not a single new university has shot into the top group.

            Next consider students. Most of them want to go to a prestigious university. They would rather go to Harvard than Idaho State, at least if they can get into Harvard. In Australia, students are attracted by the status of a university but also by the exclusiveness of a faculty. It’s higher status to study medicine or law than nursing or chemistry. Many high school students want to undertake the most exclusive degree they can. Why “waste” an ATAR (Australian Tertiary Admission Rank) of 99.9 on studying visual arts when you can do medicine?

            Student preferences are driven mostly by status, with very little attention to the quality of the education provided. The student quest for status is misguided in several ways. One misapprehension is that a high-status university provides a better education. Because status is built mostly on research performance, it does not necessarily correlate with the quality of teaching and the richness of the university experience.

            A second misapprehension is that getting a degree from a high-status university is a worthwhile investment. Universities regularly tout figures showing that graduates earn more over their lifetime than non-graduates. However, this is not a valid comparison, because if those who graduated had chosen a different path, they might have been just as successful. The point is that the qualities of the student may do more to determine their career success than the status of the university they attended or the advantages of the learning that it provided. The pay-off for attending a more selective university or undertaking a more exclusive degree may not be much at all.

            The message for students is straightforward: instead of pursuing status, develop your skills and productive habits.

Are you attracting the best students?

Every university seeks to recruit the best students it can. At the University of Wollongong, this is obvious enough. The prestige of degrees accords with how restrictive they are. Faculties make special pitches to students with high ATARs. They can become a “Dean’s Scholar” with special advantages. Universities with more money offer undergraduate scholarships to top-performing students. Astin summarises the collective experience: “Every college and university, no matter its size or research emphasis, seeks out smart students.”

            So what? Astin has three responses. First, he notes that the mad scramble to recruit top students is silly from a system point of view. If the students are going to go somewhere, why not just allocate them randomly? The reason is that a university’s status depends on the perception that its students are smart.

A school touts having smart students

            Astin’s second response is that universities have become so obsessed with smartness that they pay more attention to recruiting top students than educating them. As he puts it, “if you look at our higher education system from an educational perspective, this preoccupation with enrolling smart students makes little sense, because the emphasis seems to be more on acquiring smart students than on educating them well …” He provides many telling examples. More on this later.

            His third response is to provide an analogy with the health system. If you are ill and go to a hospital’s emergency department, you will encounter a triage process. Your health problem will be assessed. If it is serious and urgent, you will be taken straight in for treatment. If it is not serious and not urgent, you will have to wait until the urgent cases have been dealt with. If it is nothing to worry about, you’ll be sent home. The health system puts most of its resources towards helping those with the worst health.

            This orientation can be criticised by arguing that far more should be spent on preventive health measures, for example addressing pollution and unhealthy diets. But even in preventive health areas, the emphasis is on measures that help the greatest number of people at lowest cost.

            In contrast, in higher education, most resources are directed towards those who are the highest performing, which means those who need the least support for learning. This is true in university entry, in provision of scholarships and in higher degrees. It is also true in classrooms where teachers give more attention and encouragement to the best students. 

            Astin points out that most teachers give more attention to what students know than to how much they have improved. Few teachers give tests at both the beginning and conclusion of courses in order to see what students have learned. Instead, they give tests to rank students, with the emphasis on seeing who is superior rather than focusing on improvement.

            He notes that giving grades on assignments “is of limited usefulness in helping students improve their performance.” Many of my colleagues give extensive comments on assignments, not just grades. But I’ve also noted that many students focus on the grades, not on using comments to improve. 

            Another shortcoming of most classes is that teachers do not require students to keep working on the same assignment. When students are assigned to write an essay, usually it is marked and then the student moves on to the next assignment. There is more learning when students are expected to consider feedback and work on improving the essay, submitting it again and, if needed, yet again. On the few occasions when I used this approach, I could see its great value. But alas, this requires more time and effort by the teacher and is more challenging when class sizes expand.

Are you a smart academic?

Among academics too, there is a cult of smartness. Those researchers who bring in loads of external money and build up empires of research students and postdocs are highly prized. There is no such glorification of outstanding teachers.

            The emphasis on being smart manifests itself in various ways. Astin says some academics are “maximisers” who seek to display how smart they are. Their questions at seminars are designed to show off their knowledge. Maximisers, when on committees, may become blockers. It’s easier to show your critical acumen by attacking someone else’s proposal than by presenting your own.

            Other academics, Astin’s “minimisers,” put a priority on hiding any suggestion that they lack intelligence. This is related to the “imposter syndrome,” in which individuals feel they are faking it and don’t really deserve to be among all those other brilliant colleagues. 

            How nice it would be if it were easier to acknowledge weaknesses and lack of knowledge, to say “I don’t know” and “I need to improve my skills.” 

            Astin lists a whole range of ways that the obsession with smartness affects academic work. It:

• “limits prospects for educational equity
• limits social welfare
• hinders academic governance 
• limits recognition of different forms of intelligence
• limits development of creativity, leadership, empathy, social responsibility, citizenship, self-understanding
• limits finding better methods of assessment” (pp. 100-101)

What to do?

To get away from the obsession with smartness and help the students who need it most, Astin offers four principles for assisting “underprepared students.”

            The first is to promote engagement with learning, so students are motivated to study. Second is to foster peer interaction, so students learn from each other, including from more advanced students. Third is to have more interaction with academics. The fourth is to emphasise writing skills.

            All these are worthwhile. It’s possible to imagine a university that pioneers systematic peer learning, with students in classes helping each other learn, students in upper-level classes assisting those in lower-level classes, and all spending time assisting disadvantaged students in the community. There are elements of each of these in some places, but shifting universities in this sort of direction seems a mammoth task. As Astin shows all too well, the prestige ranking of US universities is built on and helps perpetuate the obsession with smartness, an obsession that affects students, academics and administrators.

            As critics have argued for decades, the education system serves not just to promote learning but to provide a rationale for social stratification. In other words, it justifies inequality: if you don’t succeed, it’s because you’re not smart enough. The implication of this critique is that changing the role of universities has to go hand in hand with challenging economic inequality. That’s a big task!

            It is still possible to innovate in small ways within universities, and there are options for individuals. Students can choose to attend less prestigious institutions or to undertake less exclusive degrees, thereby questioning the smartness hierarchy. Academics can introduce peer learning in their classes, expand outcomes beyond cognitive tasks and measure learning before and after teaching.

            Then there is the wider issue of the role of universities in society. If learning is the goal, why are degrees needed for certification? The radical alternative of de-schooling — learning by being part of a community designed for that purpose — can be reintroduced and updated for the digital age, in which access to abundant information is possible, and sorting through it and making sense of it are the greater challenges.

            In a way, the biggest indictment of higher education is that it is so difficult to promote educational alternatives, to test out different ways of organising learning and to imagine different ways of pursuing greater knowledge for social benefit. Nevertheless, there remains hope for change when critics like Astin offer the insights of a lifetime and encourage the rest of us to see what is all too familiar with different eyes.

Alexander Astin

Brian Martin
bmartin@uow.edu.au

What’s the story with Ramsay?

In December 2018, a partnership was announced between the Ramsay Centre and the University of Wollongong. The university would establish a degree in Western Civilisation funded by the centre.

An idyllic scene at the University of Wollongong

            The new degree was immediately controversial. In the previous months, there had been considerable publicity about proposed Ramsay-funded degrees in Western civilisation at the Australian National University and the University of Sydney. At both universities, many staff were opposed to the degrees. The ANU proposal did not go ahead, while the Sydney proposal was still being debated. Given this background, opposition to the degree at Wollongong was not surprising.

My aim here is to give a perspective on the controversy over the Ramsay-funded Western civilisation degree, especially as it has been played out at the University of Wollongong (UOW). I write as an academic at the university without a strong stake in the new degree, because I am retired and the issues involved do not impinge greatly on my main research areas. However, a number of my immediate colleagues have very strong views, and I have benefited from hearing their arguments, as well as the views of proponents of the degree.

            The next section gives a brief overview of the institutional context, which is useful for understanding both incentives and concerns associated with Ramsay funding. Following this is an introduction to the Ramsay Centre. Then I outline the major issues raised at the university: decision-making, the conservative connection, Western civilisation and equality of resourcing. The conclusion offers a few thoughts on the de-facto strategies of key players.

It would be possible to go into much greater depth. Relevant are issues concerning the aims of education, the funding of higher education, the impact of private funding and agendas, the question of Western civilisation and the role of political ideology. Others have more expertise on these and other issues, and I hope some of them will contribute to the discussion.

Background: the Australian university sector

Most Australian universities are funded by the federal government, but the funding environment has become increasingly challenging. In the 1980s, the government introduced tuition fees financed by zero-interest government loans, repaid through the income tax system once a student’s income reached a moderate level. Introducing these fees provided universities a sizeable income stream, but not a bonanza, because the government cut its direct funding, while opening the gates to a massive expansion in student numbers over the following decades.

            The result was that academics were met with ever-increasing class sizes. The student-staff ratio increased dramatically, almost doubling in some fields. However, this wasn’t enough to fix the financial squeeze. University managements dealt with it in two main ways.

Students in the Hope Theatre at UOW

            Firstly, they aggressively recruited international students, who had to pay substantial tuition fees. International student fees were used to cross-subsidise other operations. Eventually, international education became Australia’s third largest export industry, after iron ore and coal.

            Secondly, teaching was increasingly carried out by “casual” staff, paid by the hour or on short-term contracts. University teaching was casualised almost as much as the fast food industry.

            Also beginning in the 1980s, the government pushed universities and other higher education institutions to amalgamate. Increased size, through amalgamations and student recruitment, became a goal, augmented by setting up additional campuses in Australia and in other countries. Universities became big businesses, with budgets of many hundreds of millions of dollars.

            In higher management at Australian universities, finances became a preoccupation. All avenues for income were canvassed, though the options remained restricted mainly to government funding, student fees and research grants. The other side of the coin was cost containment, including by increasing class sizes, cutting staff numbers and, as mentioned, relying ever more on casual staff for teaching.

            Unlike in the US, in Australia there is no tradition of private support for universities. Gifts from alumni are welcome but are usually a tiny portion of income. Philanthropy is not prominent.

Enter Ramsay

Paul Ramsay

It was in this context that the Ramsay Centre for Western Civilisation entered the picture. Paul Ramsay made a fortune in private healthcare, including buying and running numerous hospitals.[1] He died in 2014, having bequeathed a portion of his estate to setting up university courses in Western civilisation, run with small classes in which students study great books, in the manner of a few other such courses in the US and elsewhere. The Ramsay Centre was set up to manage this bequest. In 2017, the Centre invited expressions of interest from Australian universities to receive funding to set up and run degrees in Western civilisation.

            The University of Wollongong was the first university to announce an agreement to set up such a degree. From the point of view of university managers, this was an attractive proposition. It would involve the largest ever injection of private money into an Australian university to fund a humanities programme, amounting to many tens of millions of dollars. It was enough to employ ten academics and give scholarships to dozens of undergraduates.

            Early in 2019, Professor Theo Farrell, executive dean of the Faculty of Law, Humanities and the Arts at UOW, outlined the financial benefits of the arrangement in meetings held to discuss the new degree.[2] The faculty was affected by a decline in the number of undergraduate students enrolling in arts degrees, a decline occurring across the state, not just at Wollongong.[3] The Ramsay-funded degree would have both direct and spinoff benefits financially. The students undertaking the degree would have to take a major or a double degree at the university, most likely in the faculty, giving a boost to enrolments.

            A secondary benefit was claimed: because the Ramsay-funded students had to have good results in high school and because they were being paid, they were more likely than other students to finish their degrees. If true, this would aid the faculty’s overall retention rate, something the government would favour.

            The Ramsay money would support the employment of ten academics and two professional staff. One of the academics is Dan Hutto, senior professor of philosophy, appointed head of the new School of Liberal Arts hosting the new degree. There are to be nine newly hired academics, all of them philosophers. Though hired to teach, they will have relatively light teaching loads, freeing them up to do research. Their presence potentially could turn UOW into a philosophy powerhouse, beyond its current dynamism led by Hutto.

            From the point of view of its advocates, the new degree thus brought great advantages to the faculty and the university. It involved the injection of a large amount of money with spinoff benefits for the rest of the faculty. And it would position UOW as a prominent player internationally among great-books programmes and in philosophy.

            Acceptance of the degree was not straightforward. As soon as it was announced, academics and students expressed opposition. Here, I look at the grounds for opposition under several categories: decision-making, the conservative connection, Western civilisation and equality. In practice, these concerns are often mixed together.

Anti-Ramsay protest, UOW, 1 March 2019. Photo: Adam McLean, Illawarra Mercury

Decision-making

Discussions between the centre and UOW were carried out in secret. Only a few people at the university even knew negotiations were occurring. Critics decried the secrecy.

            University officials said, in defence, that these sorts of negotiations are carried out all the time, without any public announcement. Indeed, there are many examples in which major developments have been announced as a fait accompli. For example, in November 2018 an announcement was made that the university had purchased colleges in Malaysia.[4] There was no protest about this; indeed, few took any notice.

            On the other hand, the Ramsay Centre was already controversial elsewhere, separately from Wollongong. As the Australian National University negotiated with the Ramsay Centre, there was considerable publicity, especially when university leaders decided against having a Western civilisation degree because of concerns about academic freedom. At the University of Sydney, major opposition emerged to a Ramsay-funded degree, with protests and much media coverage.

            In this context, the secrecy at UOW seemed anomalous. It was true that university management often proceeded on major initiatives without consultation with academic staff, but this was not a typical case: it was already known to be controversial.

The conservative connection

On the Ramsay Centre board are two prominent political conservatives: former prime ministers John Howard and Tony Abbott. For quite a few staff at UOW, the presence of Howard and Abbott tainted the Ramsay Centre and its funds.

            As explained by Farrell, the board of the Ramsay Centre has no input into what is taught in the degree. Negotiations with the centre were with two academics it employed, Simon Haines and Stephen McInerney, not with the board.

            One of the concerns expressed about the degree was that Ramsay Centre representatives would be members of the selection committees for the newly hired academics. For many academics, the idea of non-academic ideologues sitting on academic selection committees was anathema. Farrell countered by emphasising that members of the Ramsay Centre board, such as Howard and Abbott, would have nothing to do with appointments. Only the Ramsay academics would be involved. A typical selection committee would have the two Ramsay academics, one outside academic and up to six UOW academics, including Farrell as chair. Farrell said that it was not unusual for non-UOW figures to sit on selection committees. In other words, there were many precedents for the processes relating to the new degree.

            Farrell noted that in his experience most selection committees operate by consensus, not voting, but that if it came to a vote, UOW members had the numbers. In response to a question about what the Ramsay academics would be looking for — the worry being that they would want candidates aligned with particular political positions — Farrell said that in his interactions so far with the Ramsay academics, their main concern was that the appointees be good teachers.

            At a meeting for faculty members about the new degree held on 11 February, Marcelo Svirsky, senior lecturer in International Studies, raised a concern about the reputational damage caused by the connection between Ramsay and the university. Farrell said the university’s reputation internationally would be enhanced via connections with Columbia University and other institutions with similar sorts of degrees. Such connections were important given how difficult it was to build affiliations with leading universities. Domestically, Farrell said that information about the content of the UOW degree was gaining traction in the media, counteracting earlier bad publicity about the proposed degrees at other universities. He explicitly denied any risk to reputation.

            It is fascinating to speculate what the response to the Ramsay money would have been had Howard and Abbott not been on the board. Many academics vehemently oppose the political positions of Howard and Abbott, making it difficult to accept any initiative associated with the two politicians. In the wider public, the involvement of Howard and Abbott means the Ramsay Centre is inevitably caught up in the emotions associated with right-wing politics and the so-called culture wars.

            Would there be the same academic opposition to money coming from a centre linked to leading figures from green or socialist politics? This can only be surmised, because if a green-red twin of the Ramsay Centre were funding a degree, it would not be called a degree in Western civilisation.

Western civilisation

For academics in some sections of the humanities and social sciences, “Western civilisation” is a term of opprobrium, not endearment. It is useful to note that in several fields, critique is one of the standard tools: accepted ideas, practices and institutions are subject to critical scrutiny, often with assumptions and beliefs skewered. For example, in my field of science and technology studies, challenges to ideas such as scientific progress and “technology is neutral” are fundamental to much teaching and research. Yet, in the wider public, conventional ideas about science, technology and progress remain dominant. Therefore, teaching in the field necessarily involves questioning conventional thinking.

            For some, “Western civilisation” brings up images of Socrates, Michelangelo, Shakespeare and Einstein: great thinkers and creators from Europe. It also brings up images of parliamentary democracy, human rights and liberation from oppressive systems of domination. These are some of the positives of Western history and politics.

Plato

            There is also a seamier side to Western history and politics. Colonialism and imperialism sponsored by Western European states resulted in massive death, displacement and enslavement of Indigenous peoples. In Australia, white settlement caused death and the destruction of the culture of Aboriginal peoples.

            As well as the legacy of colonialism, the history of Europe has its own dark aspects, for example the Crusades, the Inquisition, the horrors of the industrial revolution and the Nazi genocide. A full account of Western cultures needs to address their damaging as well as their uplifting sides.

            While Western civilisation has been responsible for horrific deeds, these have been carried out with convenient rationales. Colonialism was seen by its defenders as part of a civilising mission, bringing enlightenment to savage peoples. Yet the aftermath of this mission continues to cause suffering. For example, in Rwanda, Belgian colonialists imposed the categories of Tutsi and Hutu on the population, helping set the stage for the 1994 genocide. In Australia, poverty and incarceration of Aboriginal people are among the contemporary consequences of colonialism.

Not on the reading list for the degree in Western civilisation

            For many academics, it is imperative to challenge the glorified myth of the beneficence of Western culture. It is part of the scholarly quest to attain insight into what really happened, not just what is convenient to believe, and this often involves pointing to the unsavoury aspects of history and politics that others would rather ignore or downplay.

            In this context, the very label “Western civilisation” is an insult to some scholars in the area, because the term “civilisation” has positive connotations unlike, for example, “Western barbarism.” For scholars, the label “Western civilisation” suggests a focus only on one side of a complex and contentious past and legacy.

            Hutto, in presenting the subjects to be taught in UOW’s Western civilisation degree, emphasised that about half of them involved studying texts from other cultures, including texts concerning Buddhism, Islam and Indigenous cultures. To fully understand Western culture, it is valuable to appreciate other cultures: a respectful dialogue provides more insights than concentrating on Western items alone.

Printing was invented in China

            As well, some of the texts that Hutto proposed from Western writers offered critical perspectives on Western societies. In these ways, Hutto distanced the degree from Abbott’s claim that it would be for Western civilisation,[5] instead positioning it as something different. In Hutto’s view, the degree uses the study of great works of Western civilisation, in conversation with non-Western traditions, as a way for students to develop their critical capacities, using evidence and argument to back up their views. In short, Hutto’s aim for the degree is that students learn how to think, not what to think. Students are bound to be exposed to critical perspectives, including in the major or degree they are required to take in addition to the one in Western civilisation.

            The degree as designed by Hutto might clash with the conceptions of some Ramsay Centre board members. It might also clash with the public perception, at least as informed by media coverage, that the degree would be one-sided advocacy for Western contributions. Intriguingly, if Howard or Abbott were to express reservations about UOW’s degree, this would temper the media and public perceptions of one-sidedness.

            One of the problems with the concept of Western civilisation is that, in the public debate, it is seldom defined. Some critics might say that to talk of Western civilisation is a category mistake, attributing a reality to an abstraction whose meaning is contested. The variability of the meaning of “Western civilisation” may lie behind some of the disputes over the degree carrying this name.

Equality of resourcing

Ramsay’s large donation seems like a boon to a cash-strapped university, enabling the hiring of staff and the running of small classes that otherwise would be infeasible. On the other hand, UOW’s planned degree creates tensions between the privileged few and the rest.

UOW’s building 19, where the School of Liberal Arts will be located

The academics hired to teach the new degree would seem to have some extra benefits. In particular, they will be teaching small classes of no more than ten high-calibre students. In contrast, their colleagues, namely the rest of the academics in the faculty, are saddled with tutorial classes of 25, plus lectures sometimes with hundreds of students.

            For some academics, this contrast is a source of considerable disquiet. Imagine someone working in a field where offerings cover the same topics as proposed in the Western civilisation degree. They might well say, “We have the expertise and experience in the area. Why are we being squeezed while newcomers are given generous conditions to teach the same topics from a philosophical perspective?”

            There has been no formal response to questions of this type. One reply would be to say that there are all sorts of inequalities between staff, only some of which are related to merit. The most obvious inequality is between permanent and non-permanent teachers. Some of the teachers on casual appointments are just as qualified as those with continuing appointments. There are also inequalities between academics, especially in research. For example, some researchers are exempted from teaching on an official or de facto basis.

            Academics tend to be highly sensitive to inequality in treatment, in part because professional status is so highly valued. There are regular disputes about workloads: seeing a colleague with a lighter teaching load can cause envy or resentment. That a whole group of new academics seems to receive special conditions can bring this sort of resentment to the fore.

            The students selected for scholarships to undertake the Western civilisation degree have to satisfy several conditions. They must be Australian citizens or permanent residents, be young, have recently completed high school, and have obtained a high score in the examinations at the end of high school. In other words, mature-age students and international students are excluded from consideration. Scholarship students will receive an annual stipend of $27,000, paid for up to five years.[6]

            To some, the special privileges for scholarship students are unfair, especially the restriction to young Australian students. To this, a reply might be that inequalities between students are commonplace. The most obvious is between domestic and international students, the latter having to pay large tuition fees. Students on postgraduate scholarships are privileged too. This sometimes can be justified on merit, though the difference between students near the scholarship cut-off point may be tiny.

Tactics

To appreciate the struggle over the Ramsay-Centre-funded degree in Western civilisation at the University of Wollongong, it is useful to think of the key players as using tactics to counter the moves of their opponents. Thinking this way is a convenience and does not imply that players actually think in terms of a strategic encounter.

            The proponents of the degree seem to be driven by two main considerations: the availability of a large amount of private money to be injected into the humanities, and the opportunity to build a world-class philosophy unit. To acquire the Ramsay money and build the philosophy unit, it was useful to counter likely sources of opposition, in particular the opposition of academics in cognate units concerned about the ideological associations with the Ramsay Centre and the concept of Western civilisation.

            To forestall the sort of rancorous public debate that occurred at the Australian National University and Sydney University, which might scuttle the degree before it was agreed, the degree proponents negotiated in secret. This did indeed reduce public debate, but at the expense of a different source of concern, the secrecy itself.

            To counter concerns associated with the ideological associations with Ramsay and Western civilisation, Dan Hutto, designer of the degree, went to considerable effort to include in the core subjects respectful intellectual engagements with non-Western cultures, and to include negative as well as positive sides of Western culture.

One of Western civilisation’s technological innovations

            Critics and opponents of the degree were not mollified. Some simply ignored the innovative aspects of the subject offerings and assumed that any degree labelled “Western civilisation” must be an apologia for Western colonialism. Other opponents, though, focused on procedural matters, for example the fast-track approval of the degree despite its possible risk to the university’s reputation.

            One of the consequences of the degree is the introduction of a privileged stratum of staff, with much lighter teaching loads, and of students given scholarships to undertake the degree. For proponents of the degree, there is no easy way to address the associated staff and student inequality. However, this inequality has not played a significant role in the public debate. There are numerous other inequalities within universities, so perhaps the introduction of one more, despite its high profile, is not a likely trigger for public concern.

            One of the positive outcomes of the new degree is the debate it has stimulated. Hutto has grasped the opportunity by planning to have students, in their first week of the degree beginning in 2020, discuss the debate about the degree itself. For those so inclined, the new degree provides a golden opportunity to articulate critiques of Western civilisation and make them available to staff and students in the new School of Liberal Arts. Although Tony Abbott claimed that the Ramsay-funded degrees would be for Western civilisation, it is quite possible that many of the degree graduates will develop a sophisticated understanding of Western civilisation. Perhaps, along the way, members of the public will learn more about both the high and low aspects of Western cultures.

            What would Paul Ramsay think of the furore over degrees in Western civilisation? Perhaps he would be bemused that his bequest is receiving much more attention than he ever sought for himself during his lifetime.

I thank the many individuals who have discussed the issues with me and who have offered comments on drafts.


[1] In the debate about Ramsay Centre funding, Paul Ramsay and Ramsay Health Care have scarcely been mentioned. Michael Wynne, a vigorous critic of corporate health care, developed an extensive website with information about numerous healthcare corporations in the US and Australia. While being critical of for-profit healthcare, Wynne has relatively generous comments about Paul Ramsay himself and about Ramsay Health Care, at least compared to other players in the corporate scene. See:

• “Corporate Medicine Web Site,” https://www.bmartin.cc/dissent/documents/health/

• “Ramsay Health Care”, https://www.bmartin.cc/dissent/documents/health/ramsay_main.html

• “Paul Ramsay”, https://www.bmartin.cc/dissent/documents/health/ramsay_leaders.html#Paul Ramsay

Wynne’s pages on Ramsay were last updated in 2005, but after this Paul Ramsay played a less direct role in Ramsay Health Care.

Screen shot from Michael Wynne’s website on corporate healthcare

[2] I attended meetings on 16 January and 11 February 2019 held for members of the Faculty of Law, Humanities and the Arts. Theo Farrell and Dan Hutto spoke about plans for the new degree and answered questions.

[3] Another factor, specific to UOW, was the setting up of a Faculty of Social Sciences that, despite its name, does not house the classic social sciences of sociology, political science and economics. This faculty set up a social science degree that is in direct competition with the arts degree, attracting students that otherwise would have contributed to the budget for the Faculty of Law, Humanities and the Arts.

[4] Andrew Herring, “University of Wollongong continues global expansion into Malaysia,” 19 November 2018, https://media.uow.edu.au/releases/UOW253448.html: The media release begins as follows: “The University of Wollongong (UOW) has continued its global expansion by acquiring the university colleges of Malaysian private education provider KDU from long-standing Malaysian investment company Paramount Corporation Berhad (PCB).

Subject to Malaysian Ministry of Education approval, the deal will see UOW wholly-owned subsidiary, UOW Global Enterprises, immediately acquire a substantive majority equity interest in the university colleges in Kuala Lumpur and Penang—including the new campus under construction in Batu Kawan.”

[5] Tony Abbott, “Paul Ramsay’s vision for Australia,” Quadrant Online, 24 May 2018, https://quadrant.org.au/magazine/2018/04/paul-ramsays-vision-australia/. Quite a few commentators blamed Abbott’s article for hindering acceptance of a Ramsay-funded degree at the Australian National University, e.g. Michael Galvin, “Abbott single-handedly destroys Ramsay Centre for Cheering On White People,” The Independent, 17 June 2018; Peter van Onselen, “Ramsay Centre has Tony Abbott to blame for ANU’s rejection,” The Australian, 9 June 2018. Note that the preposition “for” is contained in the full name of the centre: the Ramsay Centre for Western Civilisation.

[6] Entry to the degree course is open to students of any age, and to five non-residents. The conditions mentioned apply only to those receiving Ramsay scholarships, and even then exceptions can be made. An ATAR (Australian Tertiary Admission Rank) of 95 has been mentioned as an expectation for scholarship recipients. Other factors will be taken into account.

Brian Martin
bmartin@uow.edu.au

Snowflake journalists

Some Australian media outlets have been warning that university students are unduly protected from disturbing ideas. But are these same media outlets actually the ones that can’t handle disturbing ideas?

For years, I’ve been seeing stories in The Australian and elsewhere about problems in universities associated with political correctness (PC). The stories tell of students who demand to be warned about disturbing material in their classes, for example discussions of rape in a class on English literature. The students demand “trigger warnings” so they can avoid or prepare for potentially disturbing content. Detractors call them “snowflake students”: they are so delicate that, like a snowflake, they melt on exposure to anything slightly warm.

Former Labor Party leader Mark Latham, for example, referred to “the snowflake safe-space culture of Australian universities.”


Richard King

Richard King, the author of On Offence: The Politics of Indignation, reviewed Claire Fox’s book I Find that Offensive. King says that the principal target of Fox’s book “is ‘the snowflake generation’, which is to say the current crop of students, especially student activists, who keep up a constant, cloying demand for their own and others’ supervision. ‘Safe spaces’, ‘trigger warnings’ and ‘microaggressions’ are all symptoms of this trend.”

I treat these sorts of stories with a fair bit of scepticism. Sure, there are some incidents of over-the-top trigger warnings and demands for excessive protection. But are these incidents representative of what’s happening more generally?

Before accepting that this is a major problem, I want to see a proper study. A social scientist might pick a random selection of universities and classes, then interview students and teachers to find out whether trigger warnings are used, whether class discussions have been censored or inhibited, and so forth. I’ve never heard of any such study.

What remains is anecdote. Media stories are most likely to be about what is unusual and shocking. “Dog bites man” is not newsworthy but “man bites dog” might get a run.

Most of the Australian media stories about trigger warnings and snowflake students are about what’s happening in the US, with the suggestion that Australian students are succumbing to this dire malady of over-sensitivity.


Trigger warnings: Australian movie and video game classifications

My experience

There is a case for trigger warnings. Nevertheless, in thirty years of undergraduate teaching, I never saw any need for them — except when I asked students to use them.

For one assignment in my class “Media, war and peace,” students formed small groups to design an activity for the rest of the class. The activity had to address a concept or theory relating to war or peace, violence or nonviolence. Quite a few student groups chose the more gruesome topics of assassination, torture or genocide, and some of them showed graphic pictures of torture and genocidal killings.

Never did a single student complain about seeing images of torture and killing. Nevertheless, I eventually decided to request that the student groups provide warnings that some images might be disturbing. Thereafter, when groups provided warnings, no students ever excused themselves from the class. I was watching to see their reactions and never noticed anyone looking away.

This is just one teacher’s experience and can’t prove anything general. But it suggests that some Australian students are pretty tough when it comes to seeing images of violence. Perhaps they have been desensitised by watching news coverage of wars and terrorist attacks.

However, appearances can be deceptive. My colleague Ika Willis pointed out to me that students may hide their distress, and that few would ever complain even if they were distressed. So how would I know whether any of my students were trauma survivors and were adversely affected? Probably I wouldn’t. That is an example of why making generalisations about trigger warnings based on limited evidence is unwise.

A journalist attends classes – covertly

On 8 August 2018, Sydney’s Daily Telegraph ran a front-page story attacking three academics at Sydney University for what they had said in their classes. The journalist, Chris Harris, wrote about what he had done this way: “The Daily Telegraph visited top government-funded universities in Sydney for a first-hand look at campus life …” This was a euphemistic way of saying that he attended several classes without informing the teachers that he was attending as a journalist, and covertly recorded lectures without permission. Only in a smallish tutorial class, in which the tutor knows all the students, would an uninvited visitor be conspicuous.


Chris Harris

Harris then wrote an exposé, quoting supposedly outrageous statements made by three teachers. This was a typical example of a beat-up, namely a story based on trivial matters that are blown out of proportion. Just imagine: a teacher says something that, if taken out of context, can be held up to ridicule. Many teachers would be vulnerable to this sort of scandal-mongering.

One issue here is the ethics of covertly attending classes and then writing a story based on statements taken out of context. Suppose an academic covertly went into media newsrooms, recorded conversations and wrote a paper based on comments taken out of context. This would be a gross violation of research ethics and scholarly conventions. To collect information by visiting a newsroom would require approval from a university research ethics committee. Good scholarly practice would involve sending a draft of interview notes or the draft of a paper to those quoted. In a paper submitted for publication, the expectation would be that quotes fairly represent the issues addressed.


A typical Daily Telegraph front page

Where are the snowflake students?

So when Harris attended classes at universities in Sydney, did he discover lots of snowflake students who demanded to be protected by trigger warnings? He didn’t say, but it is clear that at least two individuals were highly offended: a journalist and an editor! They thought the classroom comments by a few academics were scandalous.

In a story by Rebecca Urban in The Australian following up the Telegraph exposé, Fiona Martin’s passing comment about a cartoon by Bill Leak comes in for special attention. According to this story, “The Australian’s editor-in-chief Paul Whittaker described the comment as ‘appalling’ and ‘deeply disrespectful’.”


Paul Whittaker

So apparently News Corp journalists and editors are the real snowflakes, not being able to tolerate a few passing comments by academics that weren’t even intended for them or indeed for anyone outside the classroom. Or perhaps these journalists and editors are outraged on behalf of their readership, who they consider should be alerted to the dangerous and foolish comments being made in university classrooms.

Where in this process did the call for students to be tough and be exposed to vigorous discussion suddenly dissolve?

The contradiction is shown starkly in a 10 August letter to the editor of The Australian by Andrew Weeks. The letter was given the title “Bill Leak’s legacy is his courage in defending the right to free speech”. Weeks begins his letter by saying “I am unsure what is most disturbing about the abuse of sadly departed cartoonist Bill Leak by Fiona Martin.” After canvassing a couple of possibilities, he says “Perhaps it is the fact that Sydney University has supported its staffer, offering lip service in support of freedom of speech when that is exactly what is being endangered by the intolerance characteristic of so many university academics.”

The logic seems to be that freedom of speech of Bill Leak (or those like him) is endangered by an academic’s critical comment in a classroom, and that a university administration should not support academics who make adverse comments about Leak.


Bill Leak

Again it might be asked, what happened to the concern about the snowflake generation? The main snowflakes are, apparently, a journalist, an editor and some readers. Perhaps it would be wise in future for journalists to avoid visiting university classrooms so that they and their readers will not be disturbed by the strong views being expressed.

Final remarks

Universities do have serious problems, including a heavy reliance on casual teaching staff and inadequate support for international students, both due to lack of money. More students report problems with anxiety and depression. There is also the fundamental issue of the purpose of higher education, which should not be reduced to job preparation. Instead of addressing these issues, News Corp newspapers seem more interested in the alleged danger, apparently most virulent in humanities disciplines, of political correctness.

My focus here is on an apparent contradiction or discrepancy in treatments of PC and “snowflake students” in The Australian and the Daily Telegraph. While decrying the rise of the so-called snowflake generation, journalists and editors seemed more upset than most students by comments made in university classrooms.

One other point is worth mentioning. If you want to inhibit vigorous classroom discussions of contentious issues, there’s no better way than spying on these discussions with the aim of exposing them for public condemnation. This suggests the value of a different sort of trigger warning: “There’s a journalist in the classroom!”

Further reading (mass media)

Josh Glancy, “Rise of the snowflake generation,” The Australian, 8-9 September 2018, pp. 15, 19.

Christopher Harris, “Degrees of hilarity” and “Bizarre rants of a class clown,” Daily Telegraph, 8 August 2018, pp. 4-5.

Amanda Hess, “How ‘snowflake’ became America’s inescapable tough-guy taunt,” New York Times Magazine, 13 June 2017.

Richard King, “Fiery blast aimed at ‘snowflake generation’,” The Australian, 1 April 2017, Review p. 22.

Mark Latham, “The parties are over,” Daily Telegraph, 9 January 2018, p. 13.

Bill Leak, “Suck it up, snowflakes,” The Australian, 11 March 2017, p. 15.

Rebecca Urban, “Uni backs staffer on secret suicide advice,” The Australian, 9 August 2018, p. 7; (another version) “University of Sydney stands by media lecturer following Bill Leak attack,” The Australian, 8 August 2018, online.

Further reading (scholarly)

Sigal R. Ben-Porath, Free Speech on Campus (University of Pennsylvania Press, 2017).

Emily J. M. Knox (ed.), Trigger Warnings: History, Theory, Context (Rowman & Littlefield, 2017).

Acknowledgements
Thanks to several colleagues for valuable discussions and to Tonya Agostini, Xiaoping Gao, Lynn Sheridan and Ika Willis for comments on a draft of this post. Chris Harris and Paul Whittaker did not respond to invitations to comment.

Brian Martin
bmartin@uow.edu.au

Write, write, write

Researchers need to write as part of their job. It’s remarkable how stressful this can be. There is help at hand, but you have to be willing to change your habits.

Writing is a core part of what is required to be a productive researcher. Over the years, I’ve discovered that for many of my colleagues it’s an agonising process. This usually goes back to habits we learned in school.

Sport, music and writing

Growing up, I shared a room with my brother Bruce. I was an early riser but he wasn’t. But then, in the 10th grade, he joined the track and cross-country teams. Early every morning he would roll out of bed, still groggy, change into his running gear and go for his daily training run. After school he worked out with the team. He went on to become a star runner. At university, while majoring in physics, he obtained a track scholarship.

As well, Bruce learned the French horn and I learned the clarinet. We had private lessons once a week and took our playing seriously, practising on assigned exercises every day. We each led our sections in the high school band.

I also remember writing essays for English class, postponing the work of writing and then putting in hours the night before an essay was due. At university, this pattern became worse. I pulled a few all-nighters. To stay awake, it was the only time in my life I ever drank coffee.

Back then, in the 1960s, if you wanted to become a good athlete, it was accepted that regular training was the way to go. It would have been considered foolish to postpone training until just before an event and then put in long hours. Similarly, it was accepted that if you wanted to become a better instrumentalist, you needed to practise regularly. It was foolish to imagine practising all night before a performance.

Strangely, we never applied this same idea to writing. Leaving an assignment until the night before was common practice. And it was profoundly dysfunctional.

Boice’s studies

Luckily for me, while doing my PhD I started working regularly. On a good day, I would spend up to four hours on my thesis topic. I also started working on a book. Somewhere along the line I began aiming to write 1000 words per day. It was exceedingly hard work and I couldn’t maintain it for week after week.


Robert Boice

In the 1980s, Robert Boice, a psychologist and education researcher, carried out pioneering studies into writing. He observed that most new academics had a hard time meeting the expectations of their job. They typically put most of their energy into teaching and neglected research, and felt highly stressed about their performance. Boice observed a pattern of procrastination and bingeing: the academics would postpone writing until a deadline loomed and then go into an extended period of getting out the words. However, these binges were so painful and exhausting that writing became associated with discomfort, thereby reinforcing the pattern. If writing is traumatic, then procrastination is the order of the day.

Procrastination and bingeing are just what I did in high school and undergraduate study. It’s what most academics did when they were younger, and they never learned a different pattern.

Boice observed that a small number of new academics were more relaxed and more productive. They didn’t binge. Instead, they would work on research or teaching preparation in brief sessions over many days, gradually moving towards a finished product. Boice had the idea that this approach to academic work could be taught, and carried out a number of experiments comparing different approaches to writing. (See his books Professors as Writers and Advice for New Faculty Members.)

In one study, there were three groups of low-productivity academics. Members of one group were instructed to write in their usual way (procrastinating and bingeing). They ended up with an average of 17 pages of new or revised text – in a year. That’s about half an article and far short of what was required to obtain tenure.

Members of the second group were instructed to write daily for short periods. In a year, they produced on average 64 pages of new or revised text. Members of the third group were instructed to write daily for short periods and were closely monitored by Boice. Their average annual total of new or revised text was 157 pages. This was a stunning improvement, though from a low baseline.

It didn’t surprise me too much. It was the difference between athletes who trained just occasionally, when they felt like it, and athletes who trained daily under the guidance of a coach. It was the difference between musicians who practised when they felt like it and musicians who practised daily on exercises assigned by their private teacher.

Gray and beyond

Decades later, in 2008, I came across Tara Gray’s wonderful book Publish & Flourish: Become a Prolific Scholar. In a brief and engaging style, she took Boice’s approach, extended it and turned it into a twelve-step programme to get away from procrastinating and bingeing. Immediately I tried it out. Instead of taking 90 minutes to write 1000 words, and doing this maybe one week out of three, I aimed at 20 minutes every day, producing perhaps 300 words. It was so easy! And it promised to result in 100,000 words per year, enough for a book or lots of articles.

Gray, adapting advice from Boice, recommends writing from the beginning of a project. This is different from the usual approach of reading everything about a topic and only then writing about it. For me, this actually reduces the amount of reading required, because I know far better what I’m looking for. Over the following years, I gradually changed my writing-research practice. Previously, writing an article happened late in a project. Now I write from the beginning, and there is more follow-up work: looking up references, doing additional reading, and seeking comments on drafts from non-experts and then from experts. It’s much easier and quality is improved.

I introduced this approach to writing to each of my PhD students. Some of them were able to take it up, and for them I could give weekly guidance. I also set up a writing programme for colleagues and PhD students. Through these experiences I learned a lot about what can help researchers to become more productive. An important lesson is that most academics find it extremely difficult to change their writing habits. Many can’t do it at all. Research students seemed better able to change, perhaps because their habits are less entrenched and because they think of themselves as learners.


Tara Gray

With this newfound interest in helping improve research productivity, I looked for other sources of information. There is a lot of advice about how to become a better writer. Our writing programme was based on the work of Boice and Gray, so I looked especially at treatments that would complement their work. Excellent books include Paul Silvia’s How to Write a Lot and W. Brad Johnson and Carol A. Mullen’s Write to the Top! It was encouraging that most of these authors’ advice was similar to Boice’s and Gray’s. However, there seems to be very little research to back up the advice. Boice’s research is still some of the best, with Gray’s findings a welcome addition showing the value of regular writing.

Jensen

To these books, I now add Joli Jensen’s superb Write No Matter What, and not just because it has a wonderful title. Jensen, a media studies scholar at the University of Tulsa, draws on her own experience and years of effort helping her colleagues to become more productive. As I read her book, time after time I said to myself, “Yes, that’s exactly my experience.”

“Writing productivity research and advice can be summarized in a single sentence: In order to be productive we need frequent, low-stress contact with a writing project we enjoy.” (p. xi)      

Jensen excels in her exposition of the psychological barriers that academics experience when trying to write. She approaches this issue — one pioneered by Boice — through a series of myths, fantasies and fears. An example is the “magnum opus myth,” the idea held by many academics that they have to produce a masterpiece. This is profoundly inhibiting, because trying to write a bit of ordinary text feels so inadequate compared to the shining vision of the magnum opus. The way to avoid this discrepancy is to postpone writing, and keep postponing it.

Another damaging idea is that writing will be easier when other bothersome tasks are cleared out of the way. Jensen calls this the “cleared-desk fantasy.” It’s a fantasy because it’s impossible to finish other tasks, and new ones keep arriving: just check your in-box. Jensen says that writing has to take priority, to be done now, irrespective of other tasks that might seem pressing.

Then there is the myth of the perfect first sentence. Some writers spend ages trying to get the first sentence just right, imagining that perfecting it will unleash their energies for the rest of the article. This again is an illusion that stymies writing.

A colleague once told me how she was stuck writing the last sentence of a book review, with her fingers poised over the keyboard for an hour as she imagined what the author of the book she was reviewing would think. This relates to the perfect first sentence problem but also to Jensen’s “hostile reader fear.” Jensen also addresses the imposter syndrome: the fear that colleagues will discover you’re not a real scholar like them. Then there is the problem of comparing your work with others, usually with others who seem to be more productive. Upwards social comparison is a prescription for unhappiness and, in addition, can inhibit researchers. If others are so much better, why bother?


Joli Jensen

Write No Matter What is filled with valuable advice addressing all aspects of the writing process. Jensen offers three “taming techniques” to enable the time, space and energy for doing the craft work of writing. She has all sorts of practical advice to address problems that can arise with research projects, for example when you lose enthusiasm for a topic, when you lose the thread of what you’re trying to do, when your submissions are rejected (and subject to depressingly negative comments), when your project becomes toxic and needs to be dumped, and when you are working on multiple projects.

She says that writing can actually be harder when there’s more unstructured time to do it, something I’ve observed with many colleagues.

“When heading into a much-desired break, let go of the delusion that you will have unlimited time. Let go of vague intentions to write lots every day, or once you’ve cleared the decks, or once you’ve recovered from the semester. Acknowledge that academic writing is sometimes harder when we expect it to be easier, because we aren’t trying to balance it with teaching and service.” (p. 127)

Jensen is open about her own struggles. Indeed, the stories she tells about her challenges, and those of some of her colleagues, make Write No Matter What engaging and authentic. Her personal story is valuable precisely because she has experienced so many of the problems that other academics face.

Having run a writing programme for a decade and helped numerous colleagues and research students with their writing, I find it striking how few are willing to consider a new approach, how few are willing to admit they can learn something new and, for those willing to try, how difficult it is to change habits. Boice’s work has been available since the 1980s yet is not widely known. This would be like a successful sporting coach having superior training techniques and yet being ignored for decades.

To me, this testifies to the power of entrenched myths and practices in the academic system. Write No Matter What is a guide to an academic life that is both easier and more productive, but the barriers to shifting to this sort of life remain strong. In the spirit of moderation advocated by Boice, Gray and Jensen, read their books, but only a few pages per day. And write!

Brian Martin
bmartin@uow.edu.au