Brian Martin is professor of social sciences at the University of Wollongong, Australia, and vice president of Whistleblowers Australia. He is the author of a dozen books and hundreds of articles on dissent, nonviolence, scientific controversies, democracy, information issues, education and other topics.
Who is responsible for fake news? And what can be done about it?
• Trump offering free one-way tickets to Africa & Mexico for those who wanna leave America.
• Police find 19 white female bodies in freezers with “Black Lives Matter” carved into skin.
• Donald Trump protester speaks out: “I was paid $3,500 to protest Trump’s rally”.
During the 2016 US election campaign, teenagers in the town of Veles in Macedonia found a way to make some money by posting material on big social media sites. Facebook gets its income from advertisements, and gives tiny payments to suppliers of content. The teenagers could make money if their material attracted lots of readers, so they made up outrageous stories that they thought would find an audience.
Some fake stories, from the teenagers or others, do find an audience, like the ones listed above about Trump and Black Lives Matter, which were among the top 15 fake news stories in 2016.
Some made-up stories seem so plausible that readers share them with their friends. As the shares and retweets multiply, a story begins trending. It might even be reported in the mainstream media.
So who is responsible? The Macedonian teenagers, for sure. But they wouldn’t bother except for the economic model provided by social media. The advertisers on social media usually don’t care; they benefit when a story generates lots of clicks. The mass media are hurting financially and so do much less fact-checking, so bogus stories sometimes are run. Then there are the readers – that’s us – who think a story is worth sharing and don’t take the trouble to check whether it’s genuine.
James Ball is an
experienced journalist who cares about the news and is alarmed by its
corruption. In his book Post-Truth he describes the problem, those implicated in it, and what can be done about it.
The problem is far deeper than the spread of made-up stories. News can be distorted, one-sided and in other ways misleading. The label “fake news,” when used to refer to manufactured fantasies, is inadequate to capture the full extent of the problem.
Ball provides an informative and often eye-opening tour of the issues, giving numerous examples to illustrate his analysis and recommendations. Ball’s preferred term is “bullshit.” This refers to claims that are neither right nor wrong but rather indifferent to the truth. When bullshit fills the air, audiences may despair of figuring out what’s really going on and start distrusting every source of news, including the more established ones. The subtitle of Post-Truth is How Bullshit Conquered the World.
Ball starts with
the seemingly obligatory stories of Brexit and the election of Donald Trump,
canvassing the use of bullshit in these campaigns. He then examines six groups
involved in spreading bullshit. Politicians are important players, many of whom
want to manipulate audiences and use public relations, spin and other
techniques. Then there are “old media” – newspapers, television, radio – that
play a big role in propagating dubious stories. As the old media are squeezed
financially, they have less capacity to check sources and are more likely to
run fake stories.
Ball continues through
new media, fake media (such as created by the Macedonian teenagers) and social
media. Each of these helps spread bullshit. As in the case of old media,
economic imperatives are involved. Online, at least where advertising reigns
supreme, getting clicks is currency, so it becomes attractive to run or allow
stories with little checking.
The final of the six chapters on who is spreading bullshit is titled “… and you.” Audiences contribute to the problem. Ball cites the experience of a news operation created to provide quality news, with a more positive slant. It drew on research showing that people wanted more of this sort of news. The news operation soon folded. What people say they want (high quality news) is not necessarily what they actually end up buying or reading.
Audiences are attracted by scandals, gore, celebrity gossip and stories that reinforce their pre-existing views. The various forms of media, to survive financially, pander to these audience preferences. In this sort of environment, Ball says, bullshit thrives. Audiences are thus part of the problem.
A study of Twitter found that half the people who retweet a news story never bother even to read it. The implication is that people are reacting so quickly that they are driven by emotion rather than careful reflection. This is fertile ground for bullshit. One need only dream up some claim that appeals to readers’ gut reactions, something they’d like to believe is true, and it starts spreading wildly with little or no scrutiny.
What to do
After laying out the problem, telling who is spreading bullshit and why, Ball turns to
solutions. One of them is fact-checking. Some large media organisations, such
as the New York Times, employ
fact-checkers, and there are now a number of independent bodies undertaking this work. Fact-checking is valuable, but Ball says it’s not a full solution. One shortcoming is that fake
news runs far ahead of fact-checkers. Most news consumers read the politician’s
lie or the fake story and never get around to seeing what fact-checkers say.
There’s also a
deeper matter: manufactured news items are only part of the problem. The majority
of suspect claims are some combination of right and wrong. They may be biased,
selective or misleading, and not easily amenable to fact-checking.
What else? Ball provides advice for politicians, media and news consumers. For example, one piece of advice for politicians is not to explain why the opponent’s claim is wrong, because this just highlights the claim. Explaining why alarms about terrorism are misleading only makes terrorism more salient. It’s better to reframe the issue, namely to provide a different narrative.
One of Ball’s
recommendations for media is to be careful about headlines, making sure they
capture the key ideas in stories. Because many readers share stories based
solely on headlines, some traditional headline-writing techniques need to be rethought.
Finally, Ball has recommendations for readers and voters. One of them is to put effort into thinking about stories and not just reacting to them emotionally. Another is to question the narratives you believe as much as, or more than, the ones you don’t believe. He also suggests learning basic statistics so you can assess claims made in the media.
All of Ball’s suggestions are worthwhile. If taken up, they would do a lot to change the media environment. But what would encourage people to follow his suggestions? Ball’s own analysis of the problem shows that politicians and the media are captives of large-scale processes, especially economic imperatives and audience emotional responses.
For decades, scholars and critics have been examining media cultures, especially the news,
showing all sorts of systemic problems. Journalists and editors treat events as
newsworthy when they conform to what are called “news values.” For example,
prominent people involved in conflicts are more newsworthy than ordinary people
behaving amicably. Hence, Trump’s campaign for a wall receives saturation
coverage while amicable relations between people living near the border between
Mexico and the US seldom warrant front-page media stories.
Ball doesn’t address the systemic biases in mass media coverage that pre-dated the rise in what he calls bullshit. His analysis is illuminating but needs to be supplemented.
Is there any hope?
A few readers of Post-Truth will take
up Ball’s suggestions, but for major change, collective action is necessary.
The lesson from history is that social movements are needed to bring about
change from below. An individual can seek to reduce personal greenhouse gas
emissions, but to tackle global warming, mass action is needed.
What sort of
collective action can make a difference regarding the news? There are signs in
what is already happening in circles where accurate information is vital.
When filter bubbles are needed
The mass media
have a strong preference for reporting events involving violence: “if it
bleeds, it leads.” This is frustrating for proponents of nonviolent action,
especially when there is little media coverage of a large peaceful protest or
the reports are about a minor scuffle rather than the issues at stake.
The methods of nonviolent action include strikes, boycotts, sit-ins, occupations and rallies. Research shows that nonviolent campaigns are more effective in overthrowing repressive regimes than armed struggle. Nonviolent action is the preferred approach of most social movements, including the labour, feminist, environmental and peace movements. Yet despite its effectiveness and widespread use, nonviolent action is marginalised in mass and social media coverage.
There’s an obvious reason for this. Nonviolent activists have no wealthy and powerful backers. In contrast, hundreds of billions of dollars annually are spent on militaries, with the full backing of governments and associated corporations. It is not surprising that media coverage follows power and money. Furthermore, the news values used by journalists to judge newsworthiness lead to a neglect of nonviolent alternatives.
In this context, nonviolent campaigns need to create their own news ecosystem, circulating information through sympathetic newsletters and websites. Getting rid of fake news and bullshit is fine for the dominant military approach but would do little to make audiences more aware of nonviolent options.
Voting is governments’ preferred method of citizen involvement in politics. Besides voting, there are numerous methods that enable citizens to participate in the decisions affecting their lives, such as initiatives and referendums. A method I find appealing is citizens juries, in which randomly selected citizens hear evidence and arguments about a contentious community issue, deliberate about it and make a recommendation.
However, alternatives to representative government have hardly any profile in the media. There is massive coverage of politicians, including their campaigning, policies, foibles and infighting, but almost none about participatory alternatives to the system in which elections and politicians are dominant. This means that campaigners for such alternatives need dedicated sources of information to find out about research and action, and to maintain their commitment.
Many social movements that are today considered progressive have struggled in the face of hostile media environments. Ball’s concerns about the rise of bullshit and the problems in gaining access to information are warranted. But for those seeking to challenge perspectives based on massive money, power and ideology, it has never been easy.
Trust is fundamental to human activities. How is it changing?
On a day-to-day basis, people put a lot of trust in others.
As I walk down a suburban street, I trust that a driver will follow the curve
of the road rather than drive straight into me. The driver trusts the engineers
who designed the car that it will not explode, at least not on purpose. Buying
an aspirin is premised on trusting the chemists and manufacturers that produced it.
When trust is betrayed, it is a major issue. When, last year in Australia, a few needles were discovered in strawberries and other fruit, it was national news. People normally assume that fruit purchased from a shop has not been tampered with.
Paedophilia in the churches was covered up for decades. When it was finally exposed, it destroyed a lot of trust in church leadership and the church as an institution.
Scientific knowledge is based on observation, experiment and theorising, but also relies
heavily on trust between scientists, who need to rely on each other to report
their findings truthfully. This helps explain the enormous condemnation of
scientific fraud, when scientists manipulate or fake their results.
In certain areas, public trust has plummeted in recent decades: trust in public institutions including government, corporations and the mass media. Opinion polls show large declines. In Australia, trust in financial institutions had been dropping due to scandals, and that was before the royal commission revealed widespread corruption. When people can’t trust their financial advisers, what should they do?
In order to ensure fairness and good practice, governments set up watchdog bodies such as ombudsmen, environmental protection authorities, anti-corruption commissions and auditors-general. One of the casualties of the banking royal commission has been the credibility of financial watchdogs such as the Australian Securities and Investments Commission (ASIC). Rather than sniffing out bad practice, they were complacent. Whistleblowers reported problems, but ASIC ignored them. The message is that members of the public cannot rely on watchdog bodies to do their job.
Who can you trust?
Rachel Botsman has written an insightful and engaging book titled Who Can You Trust? She argues that in human history there have been three types of trust.
First came local trust, based on personal experience in small communities. If someone you
know helps, or fails to help, in an hour of need, you can anticipate the same
thing in the future. Local trust is still relevant today, in families and
friendships. People learn who and when to trust through direct experience.
Next came institutional trust, in churches, militaries, governments, and professions such as medicine and engineering. People trusted those with greater authority to do the right thing. In the 1950s, high percentages of people in countries such as the US said they had a great deal of trust in their political leaders. However, institutional trust has taken a battering in recent decades.
“So why is trust in so many elite institutions collapsing at the same time? There are three key, somewhat overlapping, reasons: inequality of accountability (certain people are being punished for wrongdoing while others get a leave pass); twilight of elites and authority (the digital age is flattening hierarchies and eroding faith in experts and the rich and powerful); and segregated echo chambers (living in our cultural ghettoes and being deaf to other voices).” (p. 42)
Botsman writes about the rise of a third type of trust: distributed trust. People trust in systems that involve collective inputs, often anonymous.
Suppose you want to see a recently released film. If you
rely on local trust, you ask your friends what they thought of it. If you rely
on institutional trust, you see what the producers say about their own film:
read the advertisements. Or you can rely on distributed trust. For example, you
can look up the Internet Movie Database (IMDb) and see what different film
critics have said about the film, see what audience members have said about the
film and see the average rating audiences have given the film.
If you take
into account audience ratings from IMDb, you are trusting in two things. First,
you’re assuming that audience members have given honest ratings, and that the
film’s promoters aren’t gaming the system. Second, you’re assuming that IMDb’s
method of collecting and reporting ratings is honest. After all, IMDb might be
getting payoffs from movie producers to alter audience ratings.
Botsman says distributed trust seems to be reliant on technology but, ultimately, human judgement may be required. Of course, people design systems, so it’s necessary to trust the designers. However, after a while, when systems seem to be working, people forget about the designers and trust the technology.
One of Botsman’s examples is the self-driving car. Developers have put a lot of effort into figuring out what will make passenger/drivers feel safe in such cars. This sounds challenging. It turns out that the main problem is not building trust: after riding in a self-driving car, passengers soon feel quite safe. The problem is that drivers become too trusting. Botsman thinks her young children will never learn to drive because self-driving cars will become so common.
Botsman has a fascinating chapter on the darknet, a part of the Internet frequented by buyers and sellers of illegal goods, among other nefarious activities. Suppose you want to buy some illegal drugs. You scroll through the various sellers and select your choice. How can you be sure you’ll receive the drugs you ordered (rather than adulterated goods) or that the seller won’t just run off with your money and not deliver the drugs? Botsman describes the trust-building mechanisms on the darknet. They include a rating service, rather like Amazon’s, and an escrow process: your payment is held by a third party until you’re satisfied with the goods. These darknet trust-enablers aren’t perfect, but they compare favourably with regular services. It turns out that trust is vital even when illegal goods are being bought and sold, and that reliable systems for building and maintaining trust are possible.
In Sydney, a high-rise apartment building called the Opal Tower had to be evacuated after
cracks were found in the construction. Experts debated when it was safe for
residents to return to their units. Some commentators blamed the government’s
system for checking compliance to building codes. Could trust in builders be
improved by learning from the systems used on the darknet?
Botsman’s special interest is in the blockchain. You might
have heard about the electronic currency called bitcoin. Used for purchases
online, it can provide anonymity, yet embedded in the code is a complete record
of every transaction. Furthermore, this record can be made public and inspected
by anyone. It’s as if a bank published online every transaction, with amounts
and dates, but without identifying who made them.
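The bank analogy can be made concrete with a toy sketch. This is purely illustrative and greatly simplified (the record format and pseudonymous identifiers are my own invention, not bitcoin’s actual protocol): each record embeds a hash of the record before it, so anyone can check that the public ledger has not been altered, even though the parties appear only as anonymous identifiers.

```python
import hashlib
import json

def add_transaction(ledger, sender_id, receiver_id, amount, date):
    """Append a transaction to a public, hash-chained ledger.

    Each record embeds the hash of the previous record, so tampering
    with any past entry breaks the chain. Parties appear only as
    pseudonymous identifiers, as in the bank analogy.
    """
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {
        "from": sender_id,      # pseudonym, not a real name
        "to": receiver_id,
        "amount": amount,
        "date": date,
        "prev_hash": prev_hash,
    }
    # The record's own hash covers all its fields, including prev_hash.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return record

def verify(ledger):
    """Anyone can check the whole chain, with no middleman to trust."""
    prev_hash = "0" * 64
    for record in ledger:
        body = {k: v for k, v in record.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["prev_hash"] != prev_hash or record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True

ledger = []
add_transaction(ledger, "a3f1", "b7c2", 25.0, "2019-01-15")
add_transaction(ledger, "b7c2", "c9d4", 10.0, "2019-01-16")
print(verify(ledger))           # the intact chain verifies
ledger[0]["amount"] = 1000.0    # tamper with history...
print(verify(ledger))           # ...and verification fails
```

The point of the sketch is that honesty is enforced by the structure of the record itself: altering any past amount changes that record’s hash, which no longer matches the hash stored in the next record.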
Botsman says bitcoin is a sideshow. The real innovation is the blockchain, the record-keeping code that enables reliable transactions without a middleman, such as a bank, taking a cut. It sounds remarkable, but blockchain-based operations have pitfalls. Botsman describes some disasters. When a new currency system was set up, someone found a glitch in the code and drained $60 million from the currency fund, one third of the total. The programmers and founders of the system were called in to intervene, which they did, preventing the extraction of currency.
The blockchain seems not quite ready to provide a totally reliable trust system, one not
reliant on human intervention. But lots of people are working to achieve this
goal, as Botsman revealingly describes.
For me, the value of Who Can You Trust? is in highlighting the role of trust in contemporary life, especially as trust in institutions declines drastically. It made me think in a different direction: political alternatives.
The political philosophy of anarchism is based on the idea of self-management: people collectively make the crucial decisions affecting their lives without systems of hierarchy, namely without governments, corporations or other systems of domination. The usual idea is that there are assemblies, for example of workers who decide how to organise their work and what to produce. Assemblies elect delegates to higher-level groups for coordination.
This model of self-management relies on two types of trust. The assemblies have to be small enough for dialogue in a meeting and thus rely on local trust. The delegate structure parallels distributed trust, as long as the delegates remain bound by their assemblies and acquire no independent power.
Another model is demarchy, which also dispenses with governments and corporations. In a local community, decision-making is carried out by citizens panels, with maybe 12 to 24 members each, whose members are selected randomly from volunteers. There could be panels for transport, manufacturing, art, education and a host of other topics. In essence, all the issues addressed by governments today are divided according to topic and allocated to randomly selected groups of citizens.
Because they are randomly selected, panel members have no mandate, so their terms are
limited. For coordination, experienced panel members would be elected or
randomly chosen for higher-level panels.
Demarchy relies on local trust, especially on the panels, and on distributed trust, namely trust in the system itself. This distributed trust is similar to the trust we have today in the jury system for criminal justice, in which randomly selected citizens deliberate together and make judgements. People trust a randomly selected person, who has no personal stake in the outcome, more than they are likely to trust a lawyer or a politician.
Botsman’s analysis of trust and technology raises a fascinating option: what would it mean to combine distributed trust based on technology with the local/distributed trust in political systems like anarchism and demarchy?
In December 2018, a partnership was announced between the Ramsay Centre and the University of Wollongong. The university would establish a degree in Western Civilisation funded by the centre.
The new degree was immediately controversial. In the previous months, there had been considerable publicity about proposed Ramsay-funded degrees in Western civilisation at the Australian National University and the University of Sydney. At both universities, many staff were opposed to the degrees. The ANU proposal did not go ahead, while the Sydney proposal was still being debated. Given this background, opposition to the degree at Wollongong was not surprising.
My aim here is to give a perspective on the controversy over the Ramsay-funded Western civilisation degree, especially as it has been played out at the University of Wollongong (UOW). I write as an academic at the university without a strong stake in the new degree, because I am retired and the issues involved do not impinge greatly on my main research areas. However, a number of my immediate colleagues have very strong views, and I have benefited from hearing their arguments, as well as the views of proponents of the degree.
The next section gives a brief overview of the institutional context, which is useful for understanding both incentives and concerns associated with Ramsay funding. Following this is an introduction to the Ramsay Centre. Then I outline the major issues raised at the university: decision-making, the conservative connection, Western civilisation and equality of resourcing. The conclusion offers a few thoughts on the de-facto strategies of key players.
It would be possible to go into much greater depth. Relevant are issues concerning the aims of education, the funding of higher education, the impact of private funding and agendas, the question of Western civilisation and the role of political ideology. Others have more expertise on these and other issues, and I hope some of them will contribute to the discussion.
Australian university sector
Most Australian universities are funded by the federal
government, but the funding environment has become increasingly challenging. In
the 1980s, the government introduced tuition fees based on government
zero-interest loans paid back as part of income tax only when a student’s
income reached a moderate level. Introducing these fees provided universities a
sizeable income stream, but not a bonanza, because the government cut its
direct funding, while opening the gates to a massive expansion in student
numbers over the following decades.
The result was that academics were met with ever-increasing class sizes. The student-staff ratio dramatically increased, almost doubling in some fields. However, this wasn’t enough to fix the financial squeeze. University managements dealt with it in two main ways.
First, universities aggressively recruited international students, who had to pay substantial
tuition fees. International student fees were used to cross-subsidise other
operations. Eventually, international education became Australia’s third largest export
industry, after iron ore and coal.
Second, teaching was increasingly carried out by “casual” staff, paid by the hour or on
short-term contracts. University teaching was casualised almost as much as the
fast food industry.
In addition, beginning in the 1980s, the government pushed universities and other higher
education institutions to amalgamate. Increased size, through amalgamations and
student recruitment, became a goal, augmented by setting up of additional
campuses in Australia and in other countries. Universities became big
businesses, with budgets of many hundreds of millions of dollars.
For management at Australian universities, finances became a preoccupation. All
avenues for income are canvassed, though the options have been restricted
mainly to government funding, student fees and research grants. The other side
of the coin has been cost containment, including by increasing class sizes,
cutting staff numbers and, as mentioned, relying ever more on casual staff for teaching.
Unlike in the US, in Australia there is no tradition of private support for universities.
Gifts from alumni are welcome but are usually a tiny portion of income.
Philanthropy is not prominent.
It was in this context that the Ramsay Centre for Western Civilisation entered the picture. Paul Ramsay made a fortune in private healthcare, including buying and running numerous hospitals. He died in 2014, having bequeathed a portion of his estate to setting up university courses in Western civilisation, run with small classes in which students study great books, in the manner of a few other such courses in the US and elsewhere. The Ramsay Centre was set up to manage this bequest. In 2017, the Centre invited expressions of interest from Australian universities to receive funding to set up and run degrees in Western civilisation.
The University of Wollongong was the first university to announce an agreement to
set up such a degree. From the point of view of university managers, this was
an attractive proposition. It would involve the largest ever injection of
private money into an Australian university to fund a humanities programme,
amounting to many tens of millions of dollars. It was enough to employ ten
academics and give scholarships to dozens of undergraduates.
Early in 2019, Professor Theo Farrell, executive dean of the Faculty of Law, Humanities and the Arts at UOW, outlined the financial benefits of the arrangement in meetings held to discuss the new degree. The faculty was affected by a decline in the number of undergraduate students enrolling in arts degrees, a decline occurring across the state, not just at Wollongong. The Ramsay-funded degree would have both direct and spinoff benefits financially. The students undertaking the degree would have to take a major or a double degree at the university, most likely in the faculty, giving a boost to enrolments.
A further benefit was claimed: because the Ramsay-funded students had to have good results
in high school and because they were being paid, they were more likely than
other students to finish their degrees. If true, this would aid the faculty’s
overall retention rate, something the government would favour.
The Ramsay money would support the employment of ten academics and two professional staff.
One of the academics is Dan Hutto, senior professor of philosophy, appointed
head of the new School of Liberal Arts hosting the new degree. There are to be
nine newly hired academics, all of them philosophers. Though hired for
teaching, their relatively light teaching loads would free them up to do
research. Their presence potentially could turn UOW into a philosophy
powerhouse, beyond its current dynamism led by Hutto.
From the point of view of its advocates, the new degree thus brought great advantages to
the faculty and the university. It involved the injection of a large amount of
money with spinoff benefits for the rest of the faculty. And it would position
UOW as a prominent player internationally among great-books programmes.
Acceptance of the degree was not straightforward. As soon as it was announced, academics and students expressed opposition. Here, I look at the grounds for opposition under several categories: decision-making, the conservative connection, Western civilisation and equality. In practice, these concerns are often mixed together.
Discussions between the centre and UOW were carried out in
secret. Only a few people at the university even knew negotiations were
occurring. Critics decried the secrecy.
University officials said, in defence, that these sorts of negotiations are carried out
all the time, without any public announcement. Indeed, there are many examples
in which major developments have been announced as fait accompli. For example,
in November 2018 an announcement was made that the university had purchased colleges.
There was no protest about this; indeed, few took any notice.
On the other hand, the Ramsay Centre was already controversial elsewhere, separately from Wollongong. As the Australian National University negotiated with the Ramsay Centre, there was considerable publicity, especially when university leaders decided against having a Western civilisation degree because of concerns about academic freedom. At the University of Sydney, major opposition emerged to a Ramsay-funded degree, with protests and much media coverage.
In this context, the secrecy at UOW seemed anomalous. It was true that university management often proceeded on major initiatives without consultation with academic staff, but this was not a typical case: it was already known to be controversial.
On the Ramsay Centre board are two prominent political conservatives: former prime ministers John Howard and Tony Abbott. For quite a few staff at UOW, the presence of Howard and Abbott tainted the Ramsay Centre and its funds.
As explained by Farrell, the board of the Ramsay Centre has no input into what is taught in the degree. Negotiations with the centre were with two academics it employed, Simon Haines and Stephen McInerney, not with the board.
One of the concerns expressed about the degree was that Ramsay Centre representatives would be members of the selection committees for the newly hired academics. For many academics, the idea of non-academic ideologues sitting on academic selection committees was anathema. Farrell countered by emphasising that members of the Ramsay Centre Board, such as Howard and Abbott, would have nothing to do with appointments. Only the Ramsay academics would be involved. A typical selection committee would have the two Ramsay academics, one outside academic, and up to six UOW academics, including Farrell as chair of the committee. Farrell said that it was not unusual for non-UOW figures to sit on selection committees. In other words, there were many precedents for the processes relating to the new degree.
Farrell noted that in his experience most selection committees operate by consensus,
not voting, but that if it came to a vote, UOW members had the numbers. In
response to a question about what the Ramsay academics would be looking for —
the worry being that they would want candidates aligned with particular
political positions — Farrell said that in his interactions so far with the
Ramsay academics, their main concern was that the appointees be good teachers.
At a meeting for faculty members about the new degree held on 11 February, Marcelo
Svirsky, senior lecturer in International Studies, raised a concern about the
reputational damage caused by the connection between Ramsay and the university.
Farrell said the university’s reputation internationally would be enhanced via
connections with Columbia University and other institutions with similar sorts
of degrees. Such connections were important given how difficult it was to build
affiliations with leading universities. Domestically, Farrell said that
information about the content of the UOW degree was gaining traction in the
media, counteracting earlier bad publicity about the proposed degrees at other
universities. He explicitly denied any risk to reputation.
It is fascinating to speculate what the response to the Ramsay money would have been had Howard and Abbott not been on the board. Many academics vehemently oppose the political positions of Howard and Abbott, making it difficult for them to accept any initiative associated with the two politicians. In the wider public, the involvement of Howard and Abbott means the Ramsay Centre is inevitably caught up in the emotions associated with right-wing politics and the so-called culture wars.
Would there be the same academic opposition to money coming from a centre linked to leading
figures from green or socialist politics? This can only be surmised, because if
a green-red twin of the Ramsay Centre were funding a degree, it would not be
called a degree in Western civilisation.
For academics in some sections of the humanities and social
sciences, “Western civilisation” is a term of opprobrium, not endearment. It is
useful to note that in several fields, critique is one of the standard tools:
accepted ideas, practices and institutions are subject to critical scrutiny,
often with assumptions and beliefs skewered. For example, in my field of
science and technology studies, challenges to ideas such as scientific progress
and “technology is neutral” are fundamental to much teaching and research. Yet,
in the wider public, conventional ideas about science, technology and progress
remain dominant. Therefore, teaching in the field necessarily involves
questioning conventional thinking.
For some, “Western civilisation” brings up images of Socrates, Michelangelo, Shakespeare and Einstein: great thinkers and creators from Europe. It also brings up images of parliamentary democracy, human rights and liberation from oppressive systems of domination. These are some of the positives of Western history and politics.
There is also a seamier side to Western history and politics. Colonialism and imperialism
sponsored by Western European states resulted in massive death, displacement
and enslavement of Indigenous peoples. In Australia, white settlement caused
death and the destruction of the culture of Aboriginal peoples.
As well as
the legacy of colonialism, the history of Europe has its own dark aspects, for
example the Crusades, the Inquisition, the horrors of the industrial revolution
and the Nazi genocide. A full account of Western cultures needs to address
their damaging as well as their uplifting sides.
While Western civilisation has been responsible for horrific deeds, these have been carried out with convenient rationales. Colonialism was seen by its defenders as part of a civilising mission, bringing enlightenment to savage peoples. Yet the aftermath of this mission continues to cause suffering. For example, in Rwanda, Belgian colonialists imposed the categories of Tutsi and Hutu on the population, helping set the stage for the 1994 genocide. In Australia, poverty and incarceration of Aboriginal people are among the contemporary consequences of colonialism.
For many academics, it is imperative to challenge the glorified myth of the beneficence
of Western culture. It is part of the scholarly quest to attain insight into
what really happened, not just what is convenient to believe, and this often
involves pointing to the unsavoury aspects of history and politics that others
would rather ignore or downplay.
In this context, the very label “Western civilisation” is an insult to some scholars in
the area, because the term “civilisation” has positive connotations unlike, for
example, “Western barbarism.” For scholars, the label “Western civilisation”
suggests a focus only on one side of a complex and contentious past and legacy.
Hutto, in presenting the subjects to be taught in UOW’s Western civilisation degree, emphasised that about half of them involved studying texts from other cultures, including texts concerning Buddhism, Islam and Indigenous cultures. To fully understand Western culture, it is valuable to appreciate other cultures: a respectful dialogue provides more insights than concentrating on Western items alone.
Furthermore, some of the texts that Hutto proposed from Western writers offered critical perspectives
on Western societies. In these ways, Hutto distanced the degree from Abbott’s
claim that it would be for Western civilisation,
instead positioning it as something different. In Hutto’s view, the degree uses
the study of great works of Western civilisation, in conversation with
non-Western traditions, as a way for students to develop their critical
capacities, using evidence and argument to back up their views. In short,
Hutto’s aim for the degree is that students learn how to think, not what to
think. Students are bound to be exposed to critical perspectives, including in
the major or degree they are required to take in addition to the one in Western civilisation.
The degree as designed by Hutto might clash with the conceptions of some Ramsay Centre board members. It might also clash with the public perception, at least as informed by media coverage, that the degree would be one-sided advocacy for Western contributions. Intriguingly, if Howard or Abbott were to express reservations about UOW’s degree, this would temper the media and public perceptions of one-sidedness.
One of the
problems with the concept of Western civilisation is that, in the public
debate, it is seldom defined. Some critics might say that to talk of Western
civilisation is a category mistake, attributing a reality to an abstraction
whose meaning is contested. The variability of the meaning of “Western
civilisation” may lie behind some of the disputes over the degree carrying this label.
Ramsay’s large donation seems like a boon to a cash-strapped university, enabling the hiring of staff and the running of small classes that otherwise would be infeasible. On the other hand, UOW’s planned degree creates tensions between the privileged few and the rest.
The academics hired to teach the new degree would seem to have some extra benefits. In particular, they will be teaching small classes of no more than ten high-calibre students. In contrast, their colleagues, the rest of the academics in the faculty, are saddled with tutorial classes of 25, plus lectures sometimes with hundreds of students.
For many academics, this contrast is a source of considerable disquiet. Imagine someone
working in a field where offerings cover the same topics as proposed in the
Western civilisation degree. They might well say, “We have the expertise and
experience in the area. Why are we being squeezed while newcomers are given
generous conditions to teach the same topics from a philosophical perspective?”
There has been no formal response to questions of this type. One reply would be to say
that there are all sorts of inequalities between staff, only some of which are
related to merit. The most obvious inequality is between permanent and
non-permanent teachers. Some of the teachers on casual appointments are just as
qualified as those with continuing appointments. There are also inequalities
between academics, especially in research. For example, some researchers are
exempted from teaching on an official or de facto basis.
Academics tend to be highly sensitive to inequality in treatment, in part because
professional status is so highly valued. There are regular disputes about
workloads: seeing a colleague with a lighter teaching load can cause envy or
resentment. That a whole group of new academics seems to receive special
conditions can bring this sort of resentment to the fore.
The students selected for scholarships to undertake the Western civilisation degree
have to satisfy several conditions. They must be Australian citizens or
permanent residents, be young, have recently completed high school and have
obtained a high score in the examinations at the end of high school. In other words,
mature-age students and international students are excluded from consideration.
Scholarship students will receive an annual stipend of $27,000.
To some, the special privileges for scholarship students are unfair, especially the restriction to young Australian students. To this, a reply might be that inequalities between students are commonplace. The most obvious is between domestic and international students, the latter having to pay large tuition fees. Students on postgraduate scholarships are privileged too. This sometimes can be justified on merit, though the difference between students near the scholarship cut-off point may be tiny.
To appreciate the struggle over the Ramsay-Centre-funded degree in Western civilisation at the University of Wollongong, it is useful to think of the key players as using tactics to counter the moves of their opponents. Thinking this way is a convenience and does not imply that players actually think in terms of a strategic encounter.
proponents of the degree seem to be driven by two main considerations: the
availability of a large amount of private money to be injected into the
humanities, and the opportunity to build a world-class philosophy unit. To
acquire the Ramsay money and build the philosophy unit, it was useful to
counter likely sources of opposition, in particular the opposition of academics
in cognate units concerned about the ideological associations with the Ramsay
Centre and the concept of Western civilisation.
To forestall the sort of rancorous public debate that occurred at the Australian
National University and Sydney University, which might scuttle the degree
before it was agreed, the degree proponents negotiated in secret. This did
indeed reduce public debate, but at the expense of a different source of
concern, the secrecy itself.
To counter concerns associated with the ideological associations with Ramsay and Western civilisation, Dan Hutto, designer of the degree, went to considerable effort to include in the core subjects respectful intellectual engagements with non-Western cultures, and to include negative as well as positive sides of Western culture.
However, opponents of the degree were not mollified. Some simply ignored the innovative
aspects of the subject offerings and assumed that any degree labelled “Western
civilisation” must be an apologia for Western colonialism. Other opponents,
though, focused on procedural matters, for example the fast-track approval of
the degree despite its possible risk to the university’s reputation.
One of the consequences of the degree is the introduction of a privileged stratum of staff, with much lighter teaching loads, and of students given scholarships to undertake the degree. For proponents of the degree, there is no easy way to address the associated staff and student inequality. However, this inequality has not played a significant role in the public debate. There are numerous other inequalities within universities, so perhaps the introduction of one more, despite its high profile, is not a likely trigger for public concern.
One of the
positive outcomes of the new degree is the debate it has stimulated. Hutto has
grasped the opportunity by planning to have the students discuss, in their
first week in the degree beginning in 2020, the debate about the degree itself.
For those so inclined, the new degree provides a golden opportunity to articulate
critiques of Western civilisation and make them available to staff and students
in the new School of Liberal Arts. Although Tony Abbott claimed that the
Ramsay-funded degrees would be for Western
civilisation, it is quite possible that many of the degree graduates will develop
a sophisticated understanding of Western civilisation. Perhaps, along the way,
members of the public will learn more about both the high and low aspects of
Western civilisation. What would Paul Ramsay think of the furore over degrees in Western civilisation? Perhaps
he would be bemused that his bequest is receiving much more attention than he
ever sought for himself during his lifetime.
I thank the many individuals who have discussed the issues with me and who have offered comments on drafts.
 In the debate about Ramsay
Centre funding, Paul Ramsay and Ramsay Health Care have scarcely been
mentioned. Michael Wynne, a vigorous critic of corporate health care, developed
an extensive website with information about numerous healthcare corporations in
the US and Australia. While being critical of for-profit healthcare, Wynne has
relatively generous comments about Paul Ramsay himself and about Ramsay Health
Care, at least compared to other players in the corporate scene. See:
Wynne’s pages on Ramsay were last updated in 2005, but after this Paul Ramsay played a less direct role in Ramsay Health Care.
 I attended
meetings on 16 January and 11 February 2019 held for members of the Faculty of
Law, Humanities and the Arts. Theo Farrell and Dan Hutto spoke about plans for
the new degree and answered questions.
 Another factor,
specific to UOW, was the setting up of a Faculty of Social Sciences that,
despite its name, does not house the classic social sciences of sociology,
political science and economics. This faculty set up a social science degree
that is in direct competition with the arts degree, attracting students that
otherwise would have contributed to the budget for the Faculty of Law,
Humanities and the Arts.
 Andrew Herring, “University of Wollongong continues global expansion into Malaysia,” 19 November 2018, https://media.uow.edu.au/releases/UOW253448.html: The media release begins as follows: “The University of Wollongong (UOW) has continued its global expansion by acquiring the university colleges of Malaysian private education provider KDU from long-standing Malaysian investment company Paramount Corporation Berhad (PCB).
Subject to Malaysian Ministry of Education approval,
the deal will see UOW wholly-owned subsidiary, UOW Global Enterprises, immediately
acquire a substantive majority equity interest in the university colleges in
Kuala Lumpur and Penang—including the new campus under construction in Batu
 Tony Abbott, “Paul Ramsay’s vision for Australia,” Quadrant Online, 24 May 2018, https://quadrant.org.au/magazine/2018/04/paul-ramsays-vision-australia/. Quite a few commentators blamed Abbott’s article for hindering acceptance of a Ramsay-funded degree at the Australian National University, e.g. Michael Galvin, “Abbott single-handedly destroys Ramsay Centre for Cheering On White People,” The Independent, 17 June 2018; Peter van Onselen, “Ramsay Centre has Tony Abbott to blame for ANU’s rejection,” The Australian, 9 June 2018. Note that the preposition for is contained in the full name of the centre: the Ramsay Centre for Western Civilisation.
 Entry to the degree course is open to students of any age, and to five non-residents. The conditions mentioned apply only to those receiving Ramsay scholarships, and even then exceptions can be made. An ATAR (Australian Tertiary Admission Rank) of 95 has been mentioned as an expectation for scholarship recipients. Other factors will be taken into account.
Being the subject of news coverage can be both exciting and disturbing.
Have you ever been in the news? If you’re a politician, sports star or celebrity, of course you have, but for others it can be a rare experience. What does it feel like?
There is a vast amount of writing and research about the news. However, most of the research is from the point of view of either journalists or audiences. Surprisingly, few have bothered to interview so-called “ordinary people” appearing in the news. Ruth Palmer, in her new book Becoming the News: How Ordinary People Respond to the Media Spotlight, has addressed this omission. Her findings are fascinating.
My own experience is not typical. For decades I have regularly spoken with journalists, and I’ve written quite a few articles and letters to the editor published in newspapers. However, I do remember one of the first times I was on television. A television crew came to a friend’s house and I was interviewed on camera for half an hour. When the programme was broadcast, less than half a minute of the interview was used. That was when I concluded that television is the most manipulative of the mass media.
Palmer is now a professor of communications at IE University in Spain. In doing her PhD at Columbia University, she set out to study the experiences of people in the US who had been in news stories, usually without any initiative on their part, and this research became the basis for Becoming the News.
After the famous “miracle on the Hudson,” when a pilot landed a damaged passenger plane on the Hudson River with no loss of life, journalists sought the views of survivors. Palmer interviewed one of them, “Albert” (a pseudonym). Also among the 83 people she interviewed in 2009-2011 were “Deanne,” who witnessed an attempted suicide and was approached for comment, and “Alegra,” who miscarried due to a rare syndrome and agreed to speak to the media about it.
There were a variety of reasons why
Palmer’s interviewees had encounters with the media. What the interviewees had
in common was the novel experience of having their words or images conveyed to
a wide public in a story written by someone else — a journalist.
“Subjects imagined that those large audiences not only saw the coverage but also believed it. Based on their subsequent interactions with people who had seen them in the news, this usually proved to be true. This is the final factor that defines news subjecthood: being represented by a journalist in a mainstream news product means being represented in a product that makes authoritative truth claims.” (p. 8)
In many cases,
people make a voluntary choice to be in the news. A journalist rings and asks
for comments on an issue. It’s possible to say no, but many subjects agree, for
a variety of reasons. Some want to inform the public about an issue, like Alegra
who wanted to warn other mothers. Some seek publicity for their business or cause.
Some people have just experienced something dramatic, like a plane crash or being shot in the street. Palmer calls the event leading to journalists being interested a “trigger.” If you’ve just been collateral damage in a shooting incident, or witnessed someone trying to kill themselves, you could be traumatised. It’s not an ideal time to be talking to a journalist, but even so many people agree: they are witnesses and are willing to give their point of view.
Journalists are hardly neutral in this process. They want a story. They learn how to encourage subjects to agree to comment and how to get them talking. In this sort of encounter, the journalist is highly skilled and experienced whereas the subject is unprepared and sometimes traumatised. Some journalists exploit people’s natural inclination to respond to questions.
In general, people agree to be
interviewed if they think they will benefit more than they will be harmed. Some
don’t want their stories told, especially if they have something to hide.
In my own experience, most journalists are straightforward in their dealings and competent in doing their jobs. For example, they ring me to comment about whistleblowing or plagiarism or some other topic about which I’ve written. My situation is different from that of the “ordinary people” Palmer writes about; I’ve never been approached after witnessing a crime or being in an accident.
Palmer’s findings about the accuracy of stories are especially interesting. First consider the point of view of journalists and editors: they put a premium on factual accuracy. Some high-prestige media, like the New York Times, employ fact-checkers to ensure accuracy in stories.
However, Palmer found that most
subjects were not too worried about factual errors, such as giving the wrong
street or even misspelling their names. They were far more disturbed by the
general impression given by the story, especially when it was different from
what, based on an interview, they had anticipated.
For an ordinary person to be
featured in the news means being singled out as special: in most cases, it adds
to the person’s status. This occurs despite the news media having a low
reputation generally. Many subjects were thrilled by the stories. They thought
the journalists had done a good job, and had given them greater visibility than
they could have achieved otherwise.
For subjects who wanted to promote a
business or a cause, media coverage provided more effective advertising and
legitimation than alternatives. Stories were especially credible because they
were written by someone else, a journalist.
Subjects reported that when family
members and friends saw their name in the newspaper, many of them bought copies
and sent congratulations. In quite a few cases, this response seemed
independent of what the story said. Being in the news was enough to be seen as special.
For a small minority, though, news
coverage was a disaster. This was mainly when the story was about something
disreputable, such as a crime, or simply cast them in a bad light. A university
student was quoted out of context in a way that made her look bad, and as a
result received abusive comments from peers and strangers alike.
Many subjects found it strange,
indeed unnerving, to see how they were portrayed in the news. It is difficult
enough for most people to appreciate how others see them. News coverage
provides one avenue.
The strangeness arises from a contrast of perspectives. Subjects knew about their own lives, of course. Then a sliver of their life was interpreted by someone else, a journalist, and presented to the world, so readers would assume that that was what they were like. Subjects could examine the coverage and contrast it with their own self-perception. Added to this was the knowledge that many other people, people who didn’t know them otherwise, were forming their opinions of them based on this particular portrayal.
With time and experience, people can
get used to media coverage of themselves. Palmer’s subjects, though, were
newcomers to the experience.
Journalists, according to Palmer, evaluate their reporting mainly in terms of accuracy and
ethical process. Subjects, while deeming these facets important, were much more
concerned with the overall orientation of the coverage and with its impact on
audiences. These are aspects given less attention in US media coverage. If a
journalist had to worry about the impact of coverage on the life of an
interviewee, this could lead to a type of self-censorship.
Long after journalists have moved on
to other stories, subjects may be coping with the impact of being in the news.
This is exacerbated by the indefinite online availability of stories.
Pre-Internet, media coverage would come and go, with impacts being localised in
time and space. With online stories, the coverage can have a long-term impact
via search engines.
One of Palmer’s subjects, “Rich,” had been arrested for having kidnapped a politician’s wife, a story given local media coverage. Later, he was released because he had nothing to do with the kidnapping. However, his exoneration was not newsworthy. His employer believed Rich was innocent but fired him anyway, because clients might find the damaging media stories online. Three years later, Palmer reports, Rich was still unemployed.
Rich’s disastrous experience with media coverage highlights something relevant to most of Palmer’s subjects: they realised that journalists and editors had far more power than they did. In effect, they were at the mercy of journalists, who could decide how to frame stories, enhancing or damaging their reputations.
Journalists have a lot of power because their stories can have a wide and long-lasting impact. Furthermore, this power is mostly unaccountable: in the face of unwelcome coverage, the ordinary person has little recourse aside from expensive services to manage online reputations. That journalists have a lot of power does not mesh easily with journalists’ own self-image. They feel pressured to produce ever more stories with fewer resources. If anything, they see themselves as courageous champions of the underdog, shining a spotlight on the wrongdoings of powerholders, in the tradition of what is called the fourth estate. With this self-image, it is easy to forget that media coverage can have drastic impacts on the subjects of that coverage and that journalists’ relationship with those subjects is quite unequal.
Insights about the news
Based on her interviews and other research, Palmer offers a set of lessons for journalists and subjects. To these I would add a few suggestions for consumers of the news, reading about someone who is portrayed as, for example, a hero, an innovator, a victim or a crook.
It’s useful to remember that media
portrayals can, at best, capture only one aspect of a person’s life. So try not
to assume that coverage defines a person. This is especially important when the
treatment is negative, and it is a warning not to engage in social media mobbing
without full information.
In one instance in which I came under attack in a newspaper, there were numerous hostile social media comments. I received a number of hostile emails as well as favourable ones. Most disturbingly, I received just one request for more information. The lesson is that if you see negative coverage about someone and don’t know them, then refrain from joining in an attack; instead, ask them for their side of the story. Or ask someone else who might have independent information.
If a friend of yours is in the media, you might congratulate them, assuming the coverage is positive. You might also take extra care and talk to them about the issues involved.
On quite a few occasions, acquaintances have said to me, “I heard you on the radio.” Sometimes, not remembering the interview, I say, “What was I talking about?” Usually they can’t remember. This experience accords with Palmer’s observation that media coverage conveys status independently of the content of the coverage. So when you hear someone you know on the radio, you might like to strike up a conversation about the topic. You might learn something extra. But be careful: they might be sick of the topic and want to talk about anything else.
There’s an old saying in media studies: “Newspapers don’t tell people what to think; they do tell people what to think about.” Keep this in mind when you respond to media stories and try, at least occasionally, to explore what wasn’t in the news.
Here’s the “deep story” that Palmer’s subjects felt was true about the mass media:
“The news media is extremely powerful — much, much more powerful than most citizens. Journalists are primarily motivated by profit and status, rather than public service. And yet, outrageously, journalists claim the mantle of public defender. Thus hypocrisy and the potential for abuse define the news media’s relationship to the public.” (p. 214)
To achieve happiness, can it be useful to pursue pain and discomfort?
Many people make enormous efforts to avoid stress and strain. They will search for a convenient parking space rather than walk a few hundred metres. When the temperature gets too hot or cold, they turn on the cooling or heating. For headaches, there are analgesics. For emotional pain, therapy or maybe a stiff drink.
While avoiding pain, people often pursue pleasure. This can be comfortable chairs, tasty food, thinking positive thoughts and becoming absorbed in social media. Pleasure is commonly seen as the opposite of pain.
But what if much of this quest is misguided? That is the argument presented by Brock Bastian in his new book The Other Side of Happiness. Bastian, a psychology researcher at the University of Melbourne, reports on studies by himself and others that support a seemingly counter-intuitive conclusion: pain can be a route to true happiness.
Bastian begins by noting a curious phenomenon. Despite the apparent vanquishing of both physical and emotional pain, levels of anxiety and depression in young people seem to be increasing. I noticed this among students in my classes. Colleagues who deal with student issues tell me the entire university sector is affected. Richard Eckersley has written about the problems affecting young people who, despite reporting high happiness levels, seem to suffer inordinately high levels of psychological distress.
Bastian reports on something else: the pursuit of pain. You might ask, who, except for masochists, would voluntarily seek painful experiences? Actually, quite a few do. Running a marathon is gruelling, yet surprising numbers of people see this as a worthwhile goal. Likewise climbing mountains. Eating a hot chilli pepper can be bracing. Some people get a thrill out of scary rides or jumping out of aeroplanes, even though (or because) these cause a huge adrenaline rush.
There are also painful emotional experiences. For some, singing in front of others requires enormous courage, yet this is undertaken voluntarily. Others find it nerve-racking to approach someone they revere.
How should a psychologist go about doing controlled studies of how people handle pain, both physical and emotional? It’s hardly feasible to have subjects scale mountain cliffs or have an audience with the Queen.
For physical pain, one ingenious method is to ask subjects to hold their hands in a bucket of ice water. This is quite painful but not harmful. Before or after the ice water treatment (or, for controls, some other activity that isn’t painful), subjects then are asked to do other tasks. The way they react to these tasks reveals something about the role of pain.
For example, one experiment used a task that tested generosity, such as donating to a worthy cause. What do you think: would experiencing physical pain make people more or less generous? (The answer: more generous.)
For emotional pain, a clever technique is to simulate ostracism. In a computer game, subjects find they are being left out of the interaction by the other players. So strong is the urge to be included in a group that even in this short simulation being neglected is a distressing experience.
As well as studies in the lab, psychologists also undertake survey research. For example, one finding is that early stress in a marriage can make it resilient in the face of future challenges, and lead to greater satisfaction.
Based on a wide range of evidence, from lab studies to studies of trauma victims, Bastian concludes that it’s better to encounter some adversity in our lives. It shouldn’t be overwhelming, just enough to build the capacity to overcome it. In this process, we become emotionally stronger. Conversely, hiding from pain gives it extra power to cause distress.
“The key to healthy psychological functioning is exposure. If we want to be happy, we cannot afford to hide from our challenges and surround ourselves in protective layers of comfort. To achieve emotional stability and the capacity to handle challenges when they arise, we may be well advised to occasionally seek out discomfort and to take ourselves outside our proverbial comfort zones more often than we do.” (p. 95)
Bringing people together
In 1980, Lindy Chamberlain’s baby Azaria was taken by a dingo. In television interviews, she put on a brave face, hiding her grief. Unfortunately, this was damaging to her credibility, because not showing emotions makes others think you deserve your pain.
On the other hand, expressing your physical or emotional pain triggers support from others. This is observed in the outpouring of generosity after disasters. It is also observed in combat, which bonds fighters together.
Support from people you know or trust makes a difference: it actually reduces the pain. Bastian notes that even a photo of a loved one can have this effect. It is not surprising, then, that experiencing pain encourages people to seek social connections.
Keep a photo of your loved one handy
There is another fascinating social effect of hardship: studies show it can promote creativity. So perhaps there is some truth in the stereotypical image of the struggling artist. Bastian concludes, “We need to endure the challenge of sometimes stressful, novel and potentially threatening environments to foster true originality.” (p. 125)
This idea might be used to justify unpleasant working conditions, and precarious employment. On the other hand, it could also justify reducing executive salaries and putting political leaders in small, cramped offices.
There’s an important qualification that needs to be emphasised. When discomfort is voluntary, then inhibiting desires can improve performance. An example is uncomfortable yoga postures, which can help train the mind to focus. But involuntary discomfort, for example chronic pain, reduces performance. The implication is that imposed pain should be reduced or relieved, while there should be more opportunities for voluntary discomfort.
Bastian cites eye-opening data showing that people in poorer countries report greater meaning in their lives. Perhaps this should not be such a surprise given the number of well-off people who seem to lack purpose, spending time on fleeting pleasures rather than pursuing deeper connections. Note that country comparisons can be misleading and that having a meaningful life is not the same as being happy.
Negative experiences, including being reminded of death, trigger a search for meaning, leading to a greater sense of purpose than would arise without the suffering. Bastian describes research on an earthquake emergency. People who had thoughts of dying during the earthquake were more likely to shift their priorities from extrinsic to intrinsic ones. This meant, for example, putting less priority on income and possessions and more on relationships and beliefs. Bastian concludes, “The more we consciously engage with our own mortality the more likely we are to focus on things that matter; to seek out things that are ultimately likely to provide more depth in our lives.” (p. 170)
The Other Side of Happiness provides a powerful counter to the usual emphases in society, in which the priority is seeking pleasure and reducing pain. It also puts a somewhat different perspective on happiness research. Happiness researchers have challenged the usual emphasis on possessions, income, good looks and education, saying that, outside of poverty, they have only a limited impact on wellbeing. Instead, changing one’s thoughts and behaviours has greater impact, for example expressing gratitude, being mindful, being optimistic, building relationships and helping others.
However, happiness research gives little attention to the benefits of physical and emotional pain. This is addressed by implication in recommendations for physical activity, building resilience and pursuing a purpose. However, the painful sides to these activities are seldom emphasised, perhaps because it is not easy to sell a recommendation for seeking pain rather than pleasure.
Yet that is exactly Bastian’s recommendation. He says there is a need to recognise that stress, struggle and pain can bring happiness. Examples include intense exercise, having children, working hard and helping others. The key is to recognise the process, namely to see the positive side of negatives.
The takeaway message: seek out calculated risks and challenges, and let your children do the same. Search for discomfort and embrace feelings of sorrow and loss. Recognise that experiencing and valuing unpleasant experiences can be a path to greater satisfaction.
Some Australian media outlets have been warning that university students are unduly protected from disturbing ideas. But are these same media outlets actually the ones that can’t handle disturbing ideas?
For years, I’ve been seeing stories in The Australian and elsewhere about problems in universities associated with political correctness (PC). The stories tell of students who demand to be warned about disturbing material in their classes, for example discussions of rape in a class on English literature. The students demand “trigger warnings” so they can avoid or prepare for potentially disturbing content. Detractors call them “snowflake students”: they are so delicate that, like a snowflake, they melt at exposure to anything slightly warm.
Former Labor Party leader Mark Latham, for example, referred to “the snowflake safe-space culture of Australian universities.”
Richard King, the author of On Offence: The Politics of Indignation, reviewed Claire Fox’s book I Find that Offensive. King says that the principal target of Fox’s book “is ‘the snowflake generation’, which is to say the current crop of students, especially student activists, who keep up a constant, cloying demand for their own and others’ supervision. ‘Safe spaces’, ‘trigger warnings’ and ‘microaggressions’ are all symptoms of this trend.”
I treat these sorts of stories with a fair bit of scepticism. Sure, there are some incidents of over-the-top trigger warnings and demands for excessive protection. But are these incidents representative of what’s happening more generally?
Before accepting that this is a major problem, I want to see a proper study. A social scientist might pick a random selection of universities and classes, then interview students and teachers to find out whether trigger warnings are used, whether class discussions have been censored or inhibited, and so forth. I’ve never heard of any such study.
What remains is anecdote. Media stories are most likely to be about what is unusual and shocking. “Dog bites man” is not newsworthy but “man bites dog” might get a run.
Most of the Australian media stories about trigger warnings and snowflake students are about what’s happening in the US, with the suggestion that Australian students are succumbing to this dire malady of over-sensitivity.
Trigger warnings: Australian movie and video game classifications
There is a case for trigger warnings. Nevertheless, in thirty years of undergraduate teaching, I never saw any need for them — except when I asked students to use them.
For one assignment in my class “Media, war and peace,” students formed small groups to design an activity for the rest of the class. The activity had to address a concept or theory relating to war or peace, violence or nonviolence. Quite a few student groups chose the more gruesome topics of assassination, torture or genocide, and some of them showed graphic pictures of torture and genocidal killings.
Never did a single student complain about seeing images of torture and killing. Nevertheless, I eventually decided to request that the student groups provide warnings that some images might be disturbing. Thereafter, when groups provided warnings, no students ever excused themselves from the class. I was watching to see their reactions and never noticed anyone looking away.
This is just one teacher’s experience and can’t prove anything general. Still, it suggests that many Australian students are pretty tough when it comes to seeing images of violence. Perhaps they have been desensitised by watching news coverage of wars and terrorist attacks.
However, appearances can be deceptive. My colleague Ika Willis pointed out to me that students may hide their distress, and that few would ever complain even if they were distressed. So how would I know whether any of my students were trauma survivors and were adversely affected? Probably I wouldn’t. That is an example of why making generalisations about trigger warnings based on limited evidence is unwise.
A journalist attends classes – covertly
On 8 August 2018, Sydney’s Daily Telegraph ran a front-page story attacking three academics at Sydney University for what they had said in their classes. The journalist, Chris Harris, wrote about what he had done this way: “The Daily Telegraph visited top government-funded universities in Sydney for a first-hand look at campus life …” This was a euphemistic way of saying that he attended several classes without informing the teachers that he was attending as a journalist, and covertly recorded lectures without permission. Only in a smallish tutorial class, in which the tutor knows all the students, would an uninvited visitor be conspicuous.
Harris then wrote an exposé, quoting supposedly outrageous statements made by three teachers. This was a typical example of a beat-up, namely a story based on trivial matters that are blown out of proportion. Just imagine: a teacher says something that, if taken out of context, can be held up to ridicule. Many teachers would be vulnerable to this sort of scandal-mongering.
One issue here is the ethics of covertly attending classes and then writing a story based on statements taken out of context. Suppose an academic covertly went into media newsrooms, recorded conversations and wrote a paper based on comments taken out of context. This would be a gross violation of research ethics and scholarly conventions. To collect information by visiting a newsroom would require approval from a university research ethics committee. Good scholarly practice would involve sending a draft of interview notes or the draft of a paper to those quoted. In a paper submitted for publication, the expectation would be that quotes fairly represent the issues addressed.
A typical Daily Telegraph front page
Where are the snowflake students?
So when Harris attended classes at universities in Sydney, did he discover lots of snowflake students who demanded to be protected by trigger warnings? He didn’t say, but it is clear that at least two individuals were highly offended: a journalist and an editor! They thought the classroom comments by a few academics were scandalous.
In a story by Rebecca Urban in The Australian following up the Telegraph exposé, Fiona Martin’s passing comment about a cartoon by Bill Leak comes in for special attention. According to this story, “The Australian’s editor-in-chief Paul Whittaker described the comment as ‘appalling’ and ‘deeply disrespectful’.”
So apparently News Corp journalists and editors are the real snowflakes, not being able to tolerate a few passing comments by academics that weren’t even intended for them or indeed for anyone outside the classroom. Or perhaps these journalists and editors are outraged on behalf of their readership, who they consider should be alerted to the dangerous and foolish comments being made in university classrooms.
Where in this process did the call for students to be tough and be exposed to vigorous discussion suddenly dissolve?
The contradiction is shown starkly in a 10 August letter to the editor of The Australian by Andrew Weeks. The letter was given the title “Bill Leak’s legacy is his courage in defending the right to free speech”. Weeks begins his letter by saying “I am unsure what is most disturbing about the abuse of sadly departed cartoonist Bill Leak by Fiona Martin.” After canvassing a couple of possibilities, he says “Perhaps it is the fact that Sydney University has supported its staffer, offering lip service in support of freedom of speech when that is exactly what is being endangered by the intolerance characteristic of so many university academics.”
The logic seems to be that freedom of speech of Bill Leak (or those like him) is endangered by an academic’s critical comment in a classroom, and that a university administration should not support academics who make adverse comments about Leak.
Again it might be asked, what happened to the concern about the snowflake generation? The main snowflakes are, apparently, a journalist, an editor and some readers. Perhaps it would be wise in future for journalists to avoid visiting university classrooms so that they and their readers will not be disturbed by the strong views being expressed.
Universities do have serious problems, including a heavy reliance on casual teaching staff and lack of support for international students, both due to lack of money. More students report problems with anxiety and depression. There is also the fundamental issue of the purpose of higher education, which should not be reduced to job preparation. Instead of addressing these issues, News Corp newspapers seem more interested in the alleged danger, apparently most virulent in humanities disciplines, of political correctness.
My focus here is on an apparent contradiction or discrepancy in treatments of PC and “snowflake students” in The Australian and the Daily Telegraph. While decrying the rise of the so-called snowflake generation, journalists and editors seemed more upset than most students by comments made in university classrooms.
One other point is worth mentioning. If you want to inhibit vigorous classroom discussions of contentious issues, there’s no better way than spying on these discussions with the aim of exposing them for public condemnation. This suggests the value of a different sort of trigger warning: “There’s a journalist in the classroom!”
Further reading (mass media)
Josh Glancy, “Rise of the snowflake generation,” The Australian, 8-9 September 2018, pp. 15, 19.
Christopher Harris, “Degrees of hilarity” and “Bizarre rants of a class clown,” Daily Telegraph, 8 August 2018, pp. 4-5.
Richard King, “Fiery blast aimed at ‘snowflake generation’,” The Australian, 1 April 2017, Review p. 22.
Mark Latham, “The parties are over,” Daily Telegraph, 9 January 2018, p. 13.
Bill Leak, “Suck it up, snowflakes,” The Australian, 11 March 2017, p. 15.
Rebecca Urban, “Uni backs staffer on secret suicide advice,” The Australian, 9 August 2018, p. 7; (another version) “University of Sydney stands by media lecturer following Bill Leak attack,” The Australian, 8 August 2018, online.
Further reading (scholarly)
Sigal R. Ben-Porath, Free Speech on Campus (University of Pennsylvania Press, 2017).
Emily J. M. Knox (ed.), Trigger Warnings: History, Theory, Context (Rowman & Littlefield, 2017).
Acknowledgements Thanks to several colleagues for valuable discussions and to Tonya Agostini, Xiaoping Gao, Lynn Sheridan and Ika Willis for comments on a draft of this post. Chris Harris and Paul Whittaker did not respond to invitations to comment.
According to mainstream scientists, HIV transmission in Africa operates differently than elsewhere. An alternative view has been systematically ignored and silenced.
HIV prevalence in Africa
AIDS is the most deadly new disease in humans, with the estimated death toll exceeding 30 million. In order to restrain the spread of the infective agent HIV, scientists have tried to figure out how it spreads. The consensus is that HIV is most contagious via blood-to-blood exposures, such as through shared injecting needles, and in comparison the risks of transmission via heterosexual sex and childbirth are small.
However, there’s a mystery in relation to Africa. The scientific consensus is that in Africa, unlike elsewhere, HIV spreads mainly through heterosexual sex. Why should this be?
My own interest in research on AIDS derives from a different controversy, the one over the origin of AIDS. The standard view is that AIDS first appeared in Africa and was due to a chimpanzee virus, called a simian immunodeficiency virus or SIV, that got into a human, where it was called a human immunodeficiency virus or HIV. Chimps have quite a few SIVs, but these don’t hurt them presumably because they have been around long enough for the population to adapt to them, in the usual evolutionary manner. There are various species of chimps, and when a chimp is exposed to an unfamiliar SIV, it can develop AIDS-like symptoms.
So the question is, how did a chimp SIV enter the human species and become transmissible? The orthodox view is that this occurred when a hunter was butchering a chimp and got chimp blood in a cut, or perhaps when a human was bitten by a chimp, or perhaps through rituals in which participants injected chimp blood.
In 1990, I began corresponding with an independent scholar named Louis Pascal who had written papers arguing that transmissible HIV could have entered humans through a polio vaccination campaign in what is present-day Congo, in which nearly a million people were given a live-virus polio vaccine that had been grown on monkey kidneys. The campaign’s time, 1957 to 1960, and location, central Africa, coincided with the earliest known HIV-positive blood samples and the earliest known AIDS cases.
Despite the plausibility and importance of Pascal’s ideas, no journal would publish his articles, so I arranged for his major article to be published in a working-paper series at the University of Wollongong. Independently of this, the polio-vaccine theory became big news. Later, writer Edward Hooper carried out exhaustive investigations, collected much new evidence and wrote a mammoth book, The River, that put the theory on the scientific agenda. Over the years, I wrote quite a few articles about the theory, not to endorse it but to argue that it deserved attention and that scientific and medical researchers were treating it unfairly.
In the course of this lengthy controversy — which is not over — I became increasingly familiar with the techniques used by mainstream scientists to discredit a rival, unwelcome alternative view. I had been studying this, on and off, since the early 1980s; the origin-of-AIDS saga made me even more attuned to how dissenting ideas and researchers can be discredited.
With this background, when I read John Potterat’s chapter “Why Africa?” it was like he was providing a front-row seat for a tutorial on how an unwelcome view can be marginalised. I saw one familiar technique after another.
I’m not here to say that Potterat’s view is correct. Furthermore, unlike the origin-of-AIDS debate, I haven’t studied writings about HIV transmission in Africa. What I do here is outline Potterat’s account of his experiences and comment on the techniques used to dismiss or discredit the ideas he and his collaborators presented to the scientific community.
HIV is infectious, so it is important to know exactly how it gets from one person to another. Knowing transmission routes is the basis for developing policies and advice to prevent the spread of the virus.
In Seeking the Positives, Potterat tells about his personal journey in scientific work. It was unusual. With a degree in medieval history, he ended up with a job in Colorado Springs (a moderate-sized town in Colorado) tracking down networks of people with sexually transmitted diseases (STDs). Learning from his mentors, the approach he developed and pursued with vigour was to interview infected individuals, find out their sexual or injecting-drug partners and proceed to build up a database revealing the interactions that spread the disease. The military base near the city meant there were lots of prostitutes (some permanent, some seasonal) and STDs to track. This sort of shoe-leather investigation (seeking those positive for disease) led to many insights reflected in a vigorous publication programme. For the Colorado Springs research team, AIDS became a key focus from the 1980s on.
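The contact-tracing approach Potterat learned amounts to exploring a network: start with an infected person, interview them to learn their partners, then follow up each partner in turn. As a minimal sketch, this is a breadth-first search over a contact graph. The names and links below are invented for illustration; real tracing of course involves interviews, consent and confidentiality, not a lookup table.

```python
# A minimal sketch of contact tracing as breadth-first search over a
# contact network. All names and links here are hypothetical.
from collections import deque

# Each person maps to the contacts they reported.
contacts = {
    "case0": ["a", "b"],
    "a": ["case0", "c"],
    "b": ["case0"],
    "c": ["a", "d"],
    "d": ["c"],
}

def trace(index_case, contacts):
    """Return everyone reachable from the index case via reported contacts."""
    seen = {index_case}
    queue = deque([index_case])
    while queue:
        person = queue.popleft()
        for partner in contacts.get(person, []):
            if partner not in seen:
                seen.add(partner)     # newly identified contact
                queue.append(partner) # follow up their contacts too
    return seen

print(sorted(trace("case0", contacts)))  # ['a', 'b', 'c', 'case0', 'd']
```

Building up such a database over years is what let the Colorado Springs team reveal the interaction patterns that spread disease.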
When submitting a paper to a scientific journal, editors and reviewers are supposed to assess it on its merits. It should not matter whether an author has a PhD in epidemiology from Oxford or no degree at all. The test is the quality of the paper. Potterat became the author of dozens of scientific papers. However, his unusual background may have been held against him in certain circles.
In Seeking the Positives, Potterat doesn’t tell that much about his team’s clients/informants. Sensitively interviewing prostitutes, partners of prostitutes, drug users, gay men and others would have been a fascinating topic in itself, but Potterat focuses on the research side of the story.
A diagram from one of Potterat’s papers
You might think that contact tracing is an obvious way to study the transmission of disease, especially a new disease for which the patterns of contagion are not fully understood. But what Potterat’s team was doing was unusual: mainstream AIDS researchers pursued other approaches. Because the mainstream researchers had lots of research money, they didn’t take kindly to a small, non-prestigious team doing something different.
Mainstream groups, both researchers and activists, raised a series of objections to HIV contact tracing. First they said there was no reason for contact tracing unless there was a test for HIV. Second, after a test became available in 1985, they said tracing would allow the government to compile lists of homosexuals. Third, they said that without effective treatment, notifying individuals would distress them and lead to suicides. Fourth, after the drug AZT became available in 1987, they said contact tracing would be too expensive.
The interesting thing here is that none of the objections was backed by any evidence. Potterat says that in his team’s studies nearly all of those approached for contact tracing were very helpful.
“Contact tracing was generally opposed by AIDS activists, by civil libertarians, and (disappointingly) by many public health workers, who were often influenced by political correctness and by not wanting to offend strident constituencies.” (pp. 68-69)
Later, mainstream public health officials in the US took the line that AIDS was a danger to the heterosexual population, not just to gays and injecting drug users. If HIV was highly contagious in the wider population, this lowered the stigma attached to gays and injecting drug users, and coincidentally made it possible to attract more funding to counter the disease, a worthy objective. However, contact tracing showed that HIV transmission was far higher in specific populations. This was another reason the research by Potterat’s group, published in mainstream journals, didn’t lead to changes in research priorities more generally.
HIV transmission in Africa
In 2000, Potterat was approached by David Gisselquist about the spread of AIDS in Africa, questioning the usual explanations for why the mechanisms were claimed to be different from those in Western countries. After his retirement the following year, Potterat and some of his collaborators joined with Gisselquist in examining the studies that had been made.
The orthodox view was that in Africa, uniquely, HIV transmission occurs primarily through heterosexual sexual activity. This, according to Potterat et al., was based on assumptions about high frequencies of sexual interactions and high numbers of partners, neither of which was supported by evidence. They said the evidence suggested that sexual activity in Africa was much like elsewhere in the world.
If this was the case, the orthodox view couldn’t explain HIV transmission in Africa, so what could? The answer, according to Potterat and his collaborators, was skin-puncture transmission that occurred when contaminated needles were reused during health-care interventions such as blood testing, vaccinations and dental work, plus tattooing and traditional medical practices. This was heresy. It was also important for public health. Potterat writes, “Only when people have accurate knowledge of HIV modes of transmission can they make good decisions to protect themselves and their families from inadvertent infection.” (p. 200)
Potterat’s team wrote dozens of papers, but they had a hard time getting them published in top journals, where orthodoxy had its strongest grip. Nevertheless, they were quite successful in publishing in reputable journals of slightly lower standing.
The most common response was to ignore their work. Even though Potterat et al. had poked large holes in the orthodox view, orthodoxy was safe if the critique was given no attention.
Another response was to try to prevent publication of orthodoxy-challenging research. One study was by a team, not Potterat’s, involving Janet St. Lawrence, then at the Centers for Disease Control and Prevention (CDC), and her colleagues. According to Potterat, St. Lawrence’s CDC superiors asked her not to publish the paper, but she refused. The paper was rejected by several journals, and then submitted to the International Journal of STD & AIDS. After peer review and acceptance, the CDC applied pressure on the editor to withdraw acceptance, but he refused. This is just one example of efforts made to block publication of dissenting research findings.
Janet St. Lawrence
“… it does not engender trust in the official view to know that our informal group has solid evidence of several instances by international health agencies actively working to suppress findings supportive of non-sexual transmission and to discourage research into non-sexual transmission.” (p. 221)
Another tactic was to misrepresent views. On 14 March 2003, the World Health Organisation held a meeting of experts to, as stated in a memo to participants, “bring together the leading epidemiological and modeling experts with Gisselquist and Potterat.” Potterat was dismayed by the consultation: data disagreeing with the orthodox view was dismissed. After the meeting, WHO put out a statement presented as representing a consensus. In fact, this so-called consensus statement did not represent everyone’s viewpoints, and had been finalised before the meeting concluded. (This was an exact parallel to what happened at an origin-of-AIDS conference.)
Potterat was surprised and disappointed to be subject to ad hominem comments, otherwise known as verbal abuse. He writes:
“Among other, less printable, things I was called ‘Africa’s Newest Plague’; ‘Core Stigmatizer’; ‘Linus Pauling—in his later years’ (when Pauling was thought to be advancing crackpot ideas); and [a reward being offered] ‘for his head on a platter’.” (pp. 193-194)
Potterat was surprised at this invective because none of his team had imagined the resistance and anger their work would trigger among mainstream agencies and researchers. He was disappointed because many of the comments came from colleagues he had previously admired.
Researchers into the dynamics of science have coined the term “undone science” to refer to research that could be done and that people are asking to be done, but nevertheless is not carried out. A common reason is that the findings might turn out to be unwelcome to powerful groups. Governments and industry, through their control over most research funding, can stifle a potential challenge to orthodoxy by refusing to do or fund relevant research.
Undone science is most common in areas where citizen groups are calling out for investigations, for example on the environmental effects of mining in a particular area or the health effects of a new chemical. Three research students whom I supervised used the idea of undone science as a key framework for their theses, on drugs for macular degeneration, on vaccination policy, and on the cause of the cancer afflicting Tasmanian devils. My former PhD student Jody Warren and I, drawing on our previous work, wrote a paper pointing to undone science in relation to three new diseases. With this experience, I was attuned to notice cases of undone science in whatever I read. In Potterat’s chapter “Why Africa?” there were many striking examples.
In their papers, Potterat and his colleagues presented findings but, as is usual in scientific papers, acknowledged shortcomings. In one case, to counter criticisms, they reviewed research on the efficiency of HIV transmission by skin-puncturing routes, while admitting that new studies were needed to obtain better data. Potterat concludes, “To my knowledge, such studies have not been fielded.” (p. 199)
In another study, on discrepancies in studies of Hepatitis-C strains and patterns, Potterat writes, “In the intervening decade, however, no studies had been fielded to resolve these uncertainties.” (p. 199)
Potterat and his collaborators were unable to obtain external funding to carry out studies to test their hypotheses. So Potterat used his own money for a small study of HIV transmission in Africa. “Yet this pilot study supported our contentions and should have provoked the conducting of larger studies to confirm our findings. Regrettably, this did not happen.” (p. 205)
As stated earlier, I am not in a position to judge research about transmission of HIV in Africa. I approach the issue through Potterat’s account of the tactics used by supporters of orthodoxy against a contrary perspective. The tactics, according to him, included ignoring contrary findings, denigrating the researchers who presented them, putting out a misleading consensus statement, and refusing to fund research to investigate apparent discrepancies. I was struck by the remarkable similarity of these tactics to those used against other challenges to scientific and public-health orthodoxy. This does not prove that the dissident viewpoint is correct but is strong evidence that it has not been treated fairly. To be treated fairly is usually all that dissident scientists ask for. The hostile treatment and failure to undertake research (“undone science”) suggest that defenders of orthodoxy are, at some level, afraid the challengers might be right.
Potterat nicely summarises the multiple reasons why the findings by him and his colleagues were resisted.
“By their own admission, the international agencies feared that our work would cause Africans to lose trust in modern health care, especially childhood immunizations, as well as undermine safer sex initiatives. (Recall that their condom campaigns were also aimed at curtailing rapid population growth in sub-Saharan Africa.) We speculate that disbelief on the part of HIV researchers that medical care in Africa could be harming patients may have been a significant factor in their defensive posture. We were also impugning the quality of their scientific research and potentially threatening their livelihoods. In addition, our analyses also directly threatened the politically correct view that AIDS was not just a disease of gay men and injecting drug users, but also of heterosexuals. Lastly, our data were undermining the time-honored belief about African promiscuity, a notion that may well have initially contributed to the (pre)conception that AIDS was thriving in Africa because of it.” (p. 194)
The depressing lesson from this saga, and from the many others like it, is that science can be subject to the same sorts of groupthink, intolerance of dissent, and defence of privilege that afflict other domains such as politics. To get to the bottom of long-standing scientific disputes by trying to understand the research is bound to be time-consuming and very difficult, something few people have the time or interest to pursue. I aim at something easier: observation of the tactics used in the dispute. This doesn’t enable me to determine which side is right but does give a strong indication of whether the dispute is being pursued fairly.
Some of the things we learn in economics classes may not be as simple as they seem.
One of the basics taught in economics classes is that the price of a good is determined by supply and demand. There’s a curve showing how the supply varies with the price and another curve showing how the demand varies with the price. Where the two curves intersect determines the price.
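The textbook story can be made concrete with a tiny calculation. Assume, purely for illustration, linear curves: quantity supplied rises with price and quantity demanded falls with it, and the equilibrium price is where the two quantities are equal. The numbers below are made up, not from any textbook.

```python
# Hypothetical linear supply and demand curves (illustrative numbers only):
#   supply:  Qs = -10 + 4*p   (quantity supplied rises with price)
#   demand:  Qd =  50 - 2*p   (quantity demanded falls with price)
# Equilibrium is where Qs == Qd.

def supply(p):
    return -10 + 4 * p

def demand(p):
    return 50 - 2 * p

# Solve -10 + 4p = 50 - 2p for p:
p_eq = (50 - (-10)) / (4 + 2)
q_eq = supply(p_eq)

print(p_eq, q_eq)  # 10.0 30.0: curves intersect at price 10, quantity 30
```

This is the tidy picture the textbooks present; the question below is whether real markets behave this way.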
This seems plausible, but I must admit I never thought deeply about these curves. So I thought I’d try to figure out what the supply curve might look like for a product I know something about: printed books.
What happens to the supply of a book when the price goes up? Well, most books are sold at the same price until they go out of print: the price doesn’t go up at all. Sometimes remaining print copies are “remaindered”: they are sold off to discount booksellers at a rock-bottom price, maybe $1 each, and the discount booksellers mark them up a bit, maybe to $2 or $5, though they are still a bargain compared to the original retail price.
When the book goes out of print, then it’s usually possible to buy used copies, for example via booksellers who sell through Amazon. If there’s still a demand, then some suppliers jack up the price. On the other hand, if the demand is great enough, the publisher may do another print run, so then the supply is suddenly increased.
So what does the supply curve look like? Here’s what the textbooks say.
Whoops. The textbook curve doesn’t seem to fit what happens with printed books. Instead of supply increasing as the price increases, the price stays the same and then, if there’s still a demand for the book as supply dwindles, the price goes up.
Ryan on economics
If this is as confusing to you as it was to me, then read Michael Ryan’s book The Truth about Economics. Ryan was a high tech executive who decided to become a school teacher in Texas and, after having to teach economics, became sceptical about some of the basics. To him, the claims in the standard textbooks used in the US didn’t make sense. So he wrote a book to explain, in simple terms, what’s going on.
According to Ryan, the supply and demand curves in textbooks like Paul Samuelson’s Economics and Gregory Mankiw’s Principles of Economics simply don’t apply in many situations. The texts say that the curves apply when other things are equal. The trouble is that other things often aren’t equal.
Even worse, Ryan shows that some of the data provided in the textbooks to show the operation of supply and demand are made up. Rather than using actual data from markets, the numbers in the texts are chosen to give the right answer, namely the answer that agrees with the theory. In other words, the authors work backwards from the theory to generate data that shows that the theory works.
If your mind goes blank at the sight of a table of numbers or a graph, you will find Ryan’s book challenging. Actually, though, it is easier than most economics texts, not to mention econometrics research papers, because Ryan patiently explains what is going on.
The Truth about Economics made me think for myself about supply and demand. I already knew about some of the abuses involving pharmaceutical drugs. Some companies exploit their patent-protected monopolies over drugs by raising prices unscrupulously, even though it costs no more to make the drug than before. When patients desperately need the drug, the demand is inelastic: it doesn’t change much even when the price goes up.
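The notion of inelastic demand can be put in numbers. Here is a minimal sketch with made-up figures for a hypothetical essential drug (the quantities and prices are illustrative assumptions): when demand barely responds to price, raising the price increases total revenue.

```python
# Illustrative sketch of price elasticity of demand (numbers are made up).
# Elasticity = (% change in quantity demanded) / (% change in price).

def elasticity(q0, q1, p0, p1):
    pct_quantity = (q1 - q0) / q0
    pct_price = (p1 - p0) / p0
    return pct_quantity / pct_price

# Hypothetical essential drug: the price doubles (+100%) but demand
# falls only 5% -- demand is inelastic (|elasticity| < 1).
e = elasticity(q0=1000, q1=950, p0=10, p1=20)
print(e)  # -0.05: far inside the inelastic range

# Revenue before and after the price rise: with inelastic demand,
# the seller gains by raising the price.
print(1000 * 10)  # 10000
print(950 * 20)   # 19000
```

This is the arithmetic behind the pharmaceutical price-gouging example: a patent-protected monopoly facing inelastic demand loses almost no sales by raising prices.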
Another example Ryan uses is buying a car. He notes that there are different price ranges. You might be in the market for a low-cost Nissan Versa or a top-of-the-range Porsche. Most buyers only want one car, and want it within a particular price range. The result is a very different sort of supply-and-demand diagram.
As well as markets for goods and services, there are also markets for labour. Ryan analyses what US textbooks say and is withering in his criticism. He presents arguments showing why raising the minimum wage makes almost no difference to unemployment rates. Instead, raising the minimum wage benefits workers at the expense of owners and managers. Ryan points to the ideological role of conventional economic theory, at least as presented in US textbooks. He quotes from standard texts to show the authors’ hostility to trade unions. This raises the suspicion that some facets of economics texts are more a glorification of capitalism than a neutral presentation.
Could it be that generations of students have studiously learned about supply and demand curves and never questioned whether they actually described what happens in real markets? That is what Ryan claims. He presents ideas about groupthink to explain the economics profession’s continued commitment to a model based on questionable assumptions and for which there are so many counterexamples — such as book sales. As for students, Ryan believes most are too young and inexperienced to question textbooks, or they just suppress their rebellious thoughts.
I’m not here to endorse Ryan’s critique. Instead, I recommend it as a way to encourage you to think for yourself about markets.
Ryan argues that high-school students should be given the option of taking courses in financial literacy, learning the basics of bookkeeping and profit-and-loss statements. Financial literacy, he says, is far more relevant to the lives of students when they are in jobs and perhaps running their own businesses. However, Ryan is excessively optimistic to imagine that, based on his analysis, a movement will arise to introduce financial literacy courses throughout the US.
Learning about economics
You may have no interest in economics, but if you do, what’s the best way to learn about it? I’d say there are three things to look at.
First are expositions of the dominant neoclassical perspective. Currently in the US the most popular textbook is Gregory Mankiw’s Principles of Economics, so you could start there, but just about any basic text would be fine.
Second, to avoid simply accepting standard ideas without question, you can also look at critiques. Ryan’s The Truth about Economics is one possibility. Steve Keen’s Debunking Economics is a more advanced analysis.
Contrary perspectives have been put forward for decades. Ryan quotes from early economists like H. L. Moore, who challenged Alfred Marshall’s dominant ideas about supply and demand. That was in the early 1900s.
You can dip into the large literature on political economy, which is based on the idea that the economic system cannot be understood separately from the political system. John Kenneth Galbraith in a number of books, for example Economics and the Public Purpose, showed the value judgements built into orthodox economics.
Those with a mathematical bent can appreciate John Blatt’s 1983 book Dynamical Economic Systems in which he showed that the assumption of equilibrium in markets, an assumption that underlies a vast body of econometric theory, is untenable. Blatt also wrote a paper, “The utility of being hanged on the gallows,” that challenged the assumptions underlying utility theory, central to much work in economics.
Third, it is illuminating to look at alternatives to standard economic theory. For example, you can read about local currencies, which provide a radically different way of thinking about markets.
A more radical alternative is the sharing economy based on an expansion of the commons, with production done collaboratively without pay, as with free software. Then there are models or visions of economic systems that avoid reliance on organised violence. Current market systems do not qualify because the power of the state is required to protect private property. I have found only four models or visions of economic systems that could operate without organised violence: Gandhian economics, anarchism, voluntaryism and demarchy.
If you’re going to study standard views, critiques and alternatives, what’s the best order to approach them? My suggestion would be to look at all three in tandem, because each throws light on the others.
It is not surprising that the discipline of economics is, to a considerable extent, a reflection and legitimation of the existing economic system. Challenges are needed to orthodox economic theory as part of challenges to the dominant economic system. That means there is still quite a lot worth investigating, indeed entire realms. Most likely, though, improved theory will depend on economic alternatives becoming a reality.
Researchers need to write as part of their job. It’s remarkable how stressful this can be. There is help at hand, but you have to be willing to change your habits.
Writing is a core part of what is required to be a productive researcher. Over the years, I’ve discovered that for many of my colleagues it’s an agonising process. This usually goes back to habits we learned in school.
Sport, music and writing
Growing up, I shared a room with my brother Bruce. I was an early riser but he wasn’t. But then, in the 10th grade, he joined the track and cross-country teams. Early every morning he would roll out of bed, still groggy, change into his running gear and go for his daily training run. After school he worked out with the team. He went on to become a star runner. At university, while majoring in physics, he obtained a track scholarship.
As well, Bruce learned the French horn and I learned the clarinet. We had private lessons once a week and took our playing seriously, practising on assigned exercises every day. We each led our sections in the high school band.
I also remember writing essays for English class, postponing the work of writing and then putting in hours the night before an essay was due. At university, this pattern became worse. I pulled a few all-nighters. To stay awake, it was the only time in my life I ever drank coffee.
Back then, in the 1960s, if you wanted to become a good athlete, it was accepted that regular training was the way to go. It would have been considered foolish to postpone training until just before an event and then put in long hours. Similarly, it was accepted that if you wanted to become a better instrumentalist, you needed to practise regularly. It was foolish to imagine practising all night before a performance.
Strangely, we never applied this same idea to writing. Leaving an assignment until the night before was common practice. And it was profoundly dysfunctional.
Luckily for me, while doing my PhD I started working regularly. On a good day, I would spend up to four hours on my thesis topic. I also started working on a book. Somewhere along the line I began aiming to write 1000 words per day. It was exceedingly hard work and I couldn’t maintain it for week after week.
In the 1980s, Robert Boice, a psychologist and education researcher, carried out pioneering studies into writing. He observed that most new academics had a hard time meeting the expectations of their job. They typically put most of their energy into teaching and neglected research, and felt highly stressed about their performance. Boice observed a pattern of procrastination and bingeing: the academics would postpone writing until a deadline loomed and then go into an extended period of getting out the words. However, these binges were so painful and exhausting that writing became associated with discomfort, thereby reinforcing the pattern. If writing is traumatic, then procrastination is the order of the day.
Procrastination and bingeing is just what I did in high school and undergraduate study. It’s what most academics did when they were younger, and they never learned a different pattern.
Boice observed that a small number of new academics were more relaxed and more productive. They didn’t binge. Instead, they would work on research or teaching preparation in brief sessions over many days, gradually moving towards a finished product. Boice had the idea that this approach to academic work could be taught, and carried out a number of experiments comparing different approaches to writing. (See his books Professors as Writers and Advice for New Faculty Members.)
In one study, there were three groups of low-productivity academics. Members of one group were instructed to write in their usual way (procrastinating and bingeing). They ended up with an average of 17 pages of new or revised text – in a year. That’s about half an article and far short of what was required to obtain tenure.
Members of the second group were instructed to write daily for short periods. In a year, they produced on average 64 pages of new or revised text. Members of the third group were instructed to write daily for short periods and were closely monitored by Boice. Their average annual total of new or revised text was 157 pages. This was a stunning improvement, though from a low baseline.
It didn’t surprise me too much. It was the difference between athletes who trained just occasionally, when they felt like it, and athletes who trained daily under the guidance of a coach. It was the difference between musicians who practised when they felt like it and musicians who practised daily on exercises assigned by their private teacher.
Gray and beyond
Decades later, in 2008, I came across Tara Gray’s wonderful book Publish & Flourish: Become a Prolific Scholar. In a brief and engaging style, she took Boice’s approach, extended it and turned it into a twelve-step programme to get away from procrastinating and bingeing. Immediately I tried it out. Instead of taking 90 minutes to write 1000 words, and doing this maybe one week out of three, I aimed at 20 minutes every day, producing perhaps 300 words. It was so easy! And it promised to result in 100,000 words per year, enough for a book or lots of articles.
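The arithmetic behind that promise is easy to check, using the figures just mentioned and assuming the session really does happen every day:

```python
# Back-of-envelope check of the daily-writing arithmetic,
# using the figures mentioned in the text.

words_per_day = 300   # from a 20-minute daily session
days_per_year = 365   # the approach is to write every day

total = words_per_day * days_per_year
print(total)          # 109500 -- roughly the promised 100,000 words a year
```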
Gray, adapting advice from Boice, recommends writing from the beginning of a project. This is different from the usual approach of reading everything about a topic and only then writing about it. For me, this actually reduces the amount of reading required, because I know far better what I’m looking for. Over the following years, I gradually changed my writing-research practice. Previously, writing an article happened late in a project. Now I write from the beginning, and there is more follow-up work. The follow-up work includes looking up references, doing additional reading, seeking comments on drafts from non-experts and then from experts. It’s much easier and quality is improved.
I introduced this approach to writing to each of my PhD students. Some of them were able to take it up, and for them I could give weekly guidance. I also set up a writing programme for colleagues and PhD students. Through these experiences I learned a lot about what can help researchers to become more productive. An important lesson is that most academics find it extremely difficult to change their writing habits. Many can’t do it at all. Research students seemed better able to change, perhaps because their habits are less entrenched and because they think of themselves as learners.
With this newfound interest in helping improve research productivity, I looked for other sources of information. There is a lot of advice about how to become a better writer. Our writing programme was based on the work of Boice and Gray, so I looked especially at treatments that would complement their work. Excellent books include Paul Silvia’s How to Write a Lot and W. Brad Johnson and Carol A. Mullen’s Write to the Top! It was encouraging that most of these authors’ advice was similar to Boice’s and Gray’s. However, there seems to be very little research to back up the advice. Boice’s is still some of the best, with Gray’s research findings a welcome addition showing the value of regular writing.
To these books, I now add Joli Jensen’s superb Write No Matter What, and not just because it has a wonderful title. Jensen, a media studies scholar at the University of Tulsa, draws on her own experience and years of effort helping her colleagues to become more productive. As I read her book, time after time I said to myself, “Yes, that’s exactly my experience.”
“Writing productivity research and advice can be summarized in a single sentence: In order to be productive we need frequent, low-stress contact with a writing project we enjoy.” (p. xi)
Jensen excels in her exposition of the psychological barriers that academics experience when trying to write. She approaches this issue — one pioneered by Boice — through a series of myths, fantasies and fears. An example is the “magnum opus myth,” the idea held by many academics that they have to produce a masterpiece. This is profoundly inhibiting, because trying to write a bit of ordinary text feels so inadequate compared to the shining vision of the magnum opus. The way to avoid this discrepancy is to postpone writing, and keep postponing it.
Another damaging idea is that writing will be easier when other bothersome tasks are cleared out of the way. Jensen calls this the “cleared-desk fantasy.” It’s a fantasy because it’s impossible to finish other tasks, and new ones keep arriving: just check your in-box. Jensen says that writing has to take priority, to be done now, irrespective of other tasks that might seem pressing.
Then there is the myth of the perfect first sentence. Some writers spend ages trying to get the first sentence just right, imagining that perfecting it will unleash their energies for the rest of the article. This again is an illusion that stymies writing.
A colleague once told me how she was stuck writing the last sentence of a book review, with her fingers poised over the keyboard for an hour as she imagined what the author of the book she was reviewing would think. This relates to the perfect first sentence problem but also to Jensen’s “hostile reader fear.” Jensen also addresses the imposter syndrome: the fear that colleagues will discover you’re not a real scholar like them. Then there is the problem of comparing your work with others, usually with others who seem to be more productive. Upwards social comparison is a prescription for unhappiness and, in addition, can inhibit researchers. If others are so much better, why bother?
Write No Matter What is filled with valuable advice addressing all aspects of the writing process. Jensen offers three “taming techniques” to enable the time, space and energy for doing the craft work of writing. She has all sorts of practical advice to address problems that can arise with research projects, for example when you lose enthusiasm for a topic, when you lose the thread of what you’re trying to do, when your submissions are rejected (and subject to depressingly negative comments), when your project becomes toxic and needs to be dumped, and when you are working on multiple projects.
She says that writing can actually be harder when there’s more unstructured time to do it, something I’ve observed with many colleagues.
“When heading into a much-desired break, let go of the delusion that you will have unlimited time. Let go of vague intentions to write lots every day, or once you’ve cleared the decks, or once you’ve recovered from the semester. Acknowledge that academic writing is sometimes harder when we expect it to be easier, because we aren’t trying to balance it with teaching and service.” (p. 127)
Jensen is open about her own struggles. Indeed, the stories she tells about her challenges, and those of some of her colleagues, make Write No Matter What engaging and authentic. Her personal story is valuable precisely because she has experienced so many of the problems that other academics face.
With my experience of running a writing programme for a decade and helping numerous colleagues and research students with their writing, it is striking how few are willing to consider a new approach, how few are willing to admit they can learn something new and, for those willing to try, how difficult it is to change habits. Boice’s work has been available since the 1980s yet is not widely known. This would be like a successful sporting coach having superior training techniques and yet being ignored for decades.
To me, this testifies to the power of entrenched myths and practices in the academic system. Write No Matter What is a guide to an academic life that is both easier and more productive, but the barriers to shifting to this sort of life remain strong. In the spirit of moderation advocated by Boice, Gray and Jensen, read their books, but only a few pages per day. And write!
On 1 June this year, I received an email from Hildie Spautz. She wrote that her father, Michael E. Spautz, had died the previous day.
Michael Spautz, 1970s
I had only met Michael once, in 1981, and had not corresponded with him for a decade. But I knew a lot about his story.
Hildie was writing to me because she had found articles I had written about Michael’s difficulties at the University of Newcastle. I was one of the few who showed any sympathy for Michael’s concerns.
Hildie and her sister Laura, who each live in the US, were going through Michael’s belongings. He had vast numbers of paper files. Would I like to have them, or did I know anyone who would? My immediate response to both questions was no.
Michael’s death made me reflect on the events that derailed much of his life. Be prepared. This story does not have a happy ending. It is a story of wasted effort and dysfunction. There are, though, some useful lessons. I for one learned a lot from it.
The Spautz case
Spautz was originally from the US. He took a job in Australia at the University of Newcastle, where he was a senior lecturer in the Commerce Department. There were no particular dramas until 1978, after the appointment of a second professor in the department, Alan Williams.
Alan J Williams
In Australia at the time, the main academic ranks were lecturer, senior lecturer, associate professor, and professor. Relatively few academics reach the rank of professor, and decades ago it often came along with the role of the head of a department. To be a professor usually meant having an outstanding record in research or sometimes administration.
Williams, though, had far less than an outstanding record. He had recently received his PhD and had published two articles in management journals. Even though commerce was not then as research-intensive as disciplines like chemistry or sociology, nevertheless Williams’ record was decidedly lightweight for a professorial appointment. The back story was that the department was having trouble finding a suitable candidate and, it was suggested, made an inferior appointment rather than lose funding for the position.
Spautz had not been an applicant for the position when Williams applied, but had applied for it in earlier rounds when no appointment was made. Initially, there were no tensions between Spautz and Williams. However, after Williams was made head of a section within the department, Spautz began raising concerns. Alerted by two colleagues to problems with Williams’ research, Spautz started digging further.
Williams, in his PhD thesis, had studied the owners of small businesses, in particular their psychological problems. His argument was that such problems made the businesses more likely to fail. Spautz – who had a background in psychology – argued that the reverse process could have been responsible: when businesses struggle and fail, their owners are more likely to suffer psychologically. Spautz therefore claimed that Williams’ research was flawed due to “inverted causality”: he had mixed up cause and effect. Spautz also questioned some of the statistical methods used by Williams.
It is nothing special that scholarly research has shortcomings. Many academics exert great efforts in trying to find flaws in previous studies. This is part of the process of testing data and theory that is supposed to lead to reliable knowledge. In this context, Spautz’s critique of Williams’ research was nothing out of the ordinary.
However, it is uncommon for an academic to undertake a detailed critique of the work of an immediate colleague and then to do something about it. Academics often gripe about the weaknesses, irrelevance or unwarranted recognition of their colleagues’ research, especially colleagues who are arrogant or who seem to have gained unfair preferment. But griping is usually the extent of it. To openly criticise the work of an immediate colleague can be seen as disloyal. In some cases in which an academic speaks out about a colleague’s scientific fraud, it is the whistleblower who comes under attack by administrators.
Spautz, though, seemed to have few inhibitions in challenging the quality of Williams’ research. Spautz began his challenge in a conventional, scholarly way. He took his criticisms directly to Williams and to others in the Commerce Department, but obtained no support. He wrote a rebuttal of Williams’ published papers and sent it to the journals where those papers had been published. However, the editors were not interested. This should not have been surprising. If an article has had no particular impact, few editors would be keen on publishing a detailed rebuttal years later. This might be considered a shortcoming of the system of journal publication. It is far easier to publish an original study, with new data and findings, than a replication of a previous study, whether or not the replication supports the original study.
Williams had recently received his PhD from the University of Western Australia. Later on, Spautz wrote to UWA raising his concerns about shortcomings in Williams’ thesis. The Vice-Chancellor replied saying that this was a matter for the examiners of the thesis. Neither the identity of the examiners nor their reports were publicly available, as is usual in Australian universities. There is no standard institutional process for questioning the work in a thesis.
Spautz was stymied. He had tried the official channels for questioning Williams’ work and been blocked. This was long before the Internet, otherwise he could have posted his criticisms online.
There was one other institutional channel to be tried: the University of Newcastle itself. But Spautz’s complaints led nowhere.
Plagiarism, a scholarly sin
Along the way, Spautz added another claim to his allegations about Williams’ thesis: that it involved plagiarism, namely the use of other people’s words or ideas without appropriate acknowledgement. In the eyes of many academics, plagiarism is a cardinal sin, deserving the most severe condemnation. When undergraduate students are detected plagiarising in their assignments, they may be given a mark of zero or even referred to a student misconduct committee. (On the other hand, some teachers treat much undergraduate student plagiarism not as cheating but as a matter of not understanding proper citation practices.)
The plagiarism in Williams’ thesis is a subtle type, which can be called plagiarism of secondary sources. Williams gave references to a range of articles and books. Spautz was able to deduce that in quite a few cases Williams apparently had not actually looked at these articles and books himself, but had instead copied the references from a later publication, a “secondary source.” This sort of plagiarism basically involves copying references used by another author but not citing that author. It’s a common sort of plagiarism in many academic works. It is hard to prove, but in this instance Spautz was a super-sleuth, finding secondary sources and subtle clues that Williams had relied on these secondary sources, as I verified for myself.
Personally, having studied plagiarism, I don’t think this should be a hanging offence. However, because plagiarism has such a terrible reputation, especially plagiarism by academics, it would have been embarrassing for a university inquiry into Williams’ thesis to acknowledge any sort of plagiarism at all.
The snowflake campaign
Spautz started writing memos, in the form of typed or handwritten statements, mimeographed or photocopied. He put them in the mailboxes of academics on campus. This was his “campaign for justice.” It is accurately described as a campaign, because Spautz produced memo after memo, sometimes every day. He also called his efforts the “snowflake campaign” because there were so many white memos that they could be likened to flakes of snow landing on (or littering) the campus.
Spautz’s efforts drew the attention of the administration, and an inquiry was set up. Spautz’s aim was for his allegations about Williams’ research to be investigated. However, the inquiry instead focused on Spautz’s behaviour. Basically, he was told to shut up.
Spautz was not deterred by the admonitions from the inquiry, and continued his campaign. There was a second inquiry. Then in May 1980 the Council, the university’s governing body, dismissed Spautz. This was news: in Australia it is quite rare for a tenured academic to be fired. Furthermore, the circumstances in Spautz’s case were quite unusual.
From Spautz’s point of view, he had concerns about Williams’ research, had tried to raise them with Williams, journal editors and university administrators, and had been fobbed off, told to shut up and then dismissed. He wasn’t going to shut up, and dismissal just made him more determined to expose what he saw as injustice.
From the point of view of university administrators, Spautz was an annoyance. The solution was to go through some formal processes and then, when Spautz didn’t cooperate, to take the ultimate step of dismissing him. If administrators thought that this would be the end of the matter, they were wrong. Most dismissed academics are humiliated and go quietly. Others take legal action over their dismissal, hoping to receive some compensation. (Reinstatement is exceedingly rare.)
Spautz never hired a plane to distribute his memos
Spautz was not like most other academics. He continued his campaign, and greatly expanded it. He continued production of memos, distributed to people on campus and numerous others beyond, including journalists. He heard about my work on suppression of dissent and contacted me in June 1980. I was henceforth on his mailing list.
Spautz expanded his allegations, claiming that various individuals were involved in a criminal conspiracy. He launched court cases, and more court cases. In the following years, at one point he was unable to pay court costs and was sent to prison. After 56 days, a judge found he had been falsely imprisoned. This was grist for more legal actions, and he later obtained compensation. Eventually he was declared a vexatious litigant. This was the only thing that stopped his decades of legal cases against various individuals he accused of wrong actions.
Michael Spautz, 1980s
The verdict: what a waste!
There are no winners in this story. From the time of his dismissal in 1980 until his death this year, Spautz devoted most of his effort to his self-styled campaign for justice. For four decades he was obsessed, initially with the shortcomings of Williams’ research and then with the aftermath of his dismissal. Prior to this quest, Spautz had been a productive scholar, teaching undergraduates and authoring quite a few publications.
When I met him in 1981, I told him it would be better to put effort into writing up his story, and that pursuing action through the courts was likely to be futile. Others told him similar things. But he didn’t listen. He was convinced his course of action was the right one.
Alan Williams was another victim. He was unlucky to become the target of Spautz’s campaign. In another way, Williams was unlucky to have been appointed as a professor at the University of Newcastle on a thin research record, which made him vulnerable.
The University of Newcastle paid a severe penalty too. Spautz’s campaign brought it unwelcome attention, and several senior figures at the university had to spend considerable time dealing with Spautz’s charges against them. There were occasional news reports about Spautz’s legal cases. For a university administration, this is not a desired sort of media coverage.
University of Newcastle campus: a desired image
More damaging was the effect of the dismissal on the academic culture at the university. Although many staff found Spautz’s behaviour objectionable, many also were disturbed by his dismissal. The executive of the staff association produced an informative report.
When I visited the campus in 1981, a year after Spautz had been dismissed, I could sense fear. Some staff did not want even to discuss Spautz, as if that would taint them and make them vulnerable. Openly expressing disagreement with the dismissal was felt to be risky, perhaps because they might be next. Spautz was unbowed by his dismissal, but it frightened many others.
Social, academic and legal systems are not designed to address cases such as this. When Spautz started raising concerns about Williams’ research, there was no one in a position of authority who was able or willing to step in and cut to the core issues he raised. At the University of Newcastle, all that administrators did was set up committees of inquiry that focused on Spautz’s behaviour. In many cases, such committees work well for their purposes, but they were manifestly inadequate to address Spautz and his campaign. The individuals involved in all these arenas were well meaning and following typical protocols. It was not a failure by individuals so much as a failure of the system.
Similarly, the legal system was not a good place to address Spautz’s concerns. It’s possible to imagine a more flexible system that would refer Spautz to a wise intervener who would look at the original grievance, namely the one not addressed by the university, and deal with it at the source. But of course the legal system is about applying the law, not about finding creative solutions to problems. As a result, the legal system suffered, with lawyers, judges and others spending a huge amount of time and money dealing with Spautz’s unending cases and appeals.
Would mediation have helped?
If systems are ill designed, then even the most well-meaning individuals can be caught up in them. Most people are likely to blame Spautz, but blame doesn’t provide any answers, just a feeling of superiority.
Occasionally in any society, there will be individuals who become obsessed about particular things. There is still much to be learned about how to channel such obsessions in productive directions.
What I learned
Though the saga of Spautz’s ill-fated campaign for justice had no winners, I learned a lot from it. I studied Spautz’s allegations about Williams’ plagiarism, and to put them in context I read a lot about plagiarism more generally. I wrote a paper titled “Plagiarism, incompetence and responsibility” (and have now added links to numerous relevant documents). That paper was rejected by the first nine journals to which I submitted it. The tenth journal accepted a drastically revised version. From this experience, I learned how difficult it is to publish, in a scholarly journal, a discussion of an actual case involving allegations of incompetence and plagiarism. I talked with one journal editor on the phone. He told me that he would have liked to publish my article but the editorial committee, taking into account legal advice, decided not to proceed. They were worried about being sued.
I wrote a different (and less felicitous) article about the way Spautz’s actions were dealt with at the University of Newcastle. This was published in Vestes, the journal of the Federation of Australian University Staff Associations, FAUSA (which later became a union, the National Tertiary Education Union). It was delayed for a year due to concerns about legal action. It seems that writing about actual cases can be worrisome.
Most of all I learned about the failure of official channels. Spautz tried quite a few: journals, university administrations, courts. None of them worked well, certainly not for him. This was my first immersion in a case that showed clearly the shortcomings of formal procedures. This stood me in good stead when, over a decade later, I became involved in Whistleblowers Australia and talked to numerous whistleblowers. They told the same story: when they took their concerns to bosses, boards of management, ombudsmen and courts, they were regularly disappointed.
Official channels work fine in many circumstances, and most of the people on appeal committees and working in agencies are concerned and hard-working. But when a person with less power tries to challenge one with more power, or challenge the entire system, it is usually a hopeless cause. So that’s what for many years I have told whistleblowers and what I’ve written in my book giving advice to whistleblowers. Yes, you might be very lucky and find justice in official channels, but don’t count on it. Indeed, you should assume they won’t provide the justice you’re looking for. Although Spautz never learned that lesson, he taught it to me, and for that I am thankful.
Michael Spautz, 2011
Michael’s daughters Hildie and Laura had the unwelcome and overwhelming task of clearing his belongings from his unit, including accumulated files about his campaign that filled seven book cases (that’s cases, not shelves). Perhaps, whimsically, the files could have been placed as a display in a museum as a testament to the futility of spending years seeking justice through formal channels, with the message for those who might follow his path, “If at first you don’t succeed, then try something else.”
Don Parkes in his book Doctored!, mentioned above, made the following comments (page 12).
During the mid 1980s and through the 1990s, if one had an academic problem that required administrative attention, then at the University of Newcastle NSW, too often, one became ‘the problem’. As a serious enough problem one could end up in gaol, as was the case for Dr. Michael Spautz. Vice Chancellors and others will not give much attention to you, will not treat you as a colleague, or pay much real attention to the problem that you have raised: you become the problem and that is how they relate to you. Nevertheless, it is really quite easy to overcome the predicament: cooperate; just leave it to the powers that be: promotion and positive references await for such cooperation.
At about the time that our story was kicking in, Dr. Michael Spautz was sent to prison for 76 days in the high security, 150-year-old Maitland NSW gaol. He was an American, a Senior Lecturer in the Faculty of Economics and Commerce. Spautz fought the University all the way to the High Court of Australia because he was not satisfied that due process had been followed in the handling of reports of alleged plagiarism in the work of a newly appointed professor. Spautz was required to undergo psychiatric assessment and was eventually dismissed. He continued the fight.
Maitland gaol was a nasty place, high security prisons are nasty places, usually for nasty people. Dr. Spautz was not a nasty person. I knew him for many years and have often looked back, with some shame at my 'bystander role': though he was always openly welcome in my office; we met where and as we wished and together with my good friend Richard Dear from the university's computer centre, we gave him many sheets of computer print-out paper on which to 'roneo' copy his 'in vita veritas' letters distributed to hundreds of staff and students. The reason for his imprisonment was claimed to be non-payment of an account. That's believable? Technically probably 'yes', it is believable: but it was draconian, a 'teach him a lesson' sort of punishment. The university was well connected.
Fourteen years later, in 1996, he received a paltry sum of $75,000 for wrongful imprisonment; he was never reinstated in the University.