In higher education, being smart is greatly prized. But over-valuing smartness has downsides.
Alexander Astin is a US academic with vast experience with higher education in the country. During his career, he visited hundreds of campuses and talked with thousands of students, academics and administrators. He became convinced that there is a fundamental malady in the system: an obsession with smartness.
Astin summarises his concerns in a readable book titled Are You Smart Enough? How Colleges’ Obsession with Smartness Shortchanges Students, published in 2016. His focus is entirely on the US, but many of his assessments apply to Australia too.
Is your university prestigious?
University leaders greatly prize the status of their institutions. No surprise here. There is a widely known pecking order. Astin says that if you ask people in the US to name the best universities, they regularly come up with the same ones: Harvard, Yale, Princeton, Berkeley and so forth. The exact rankings might shift a bit over time, but the same ones appear in the top group. What is remarkable is that this order has hardly varied in half a century.
In Australia, the same thing applies: those commonly considered the best are the Australian National University, Melbourne, Sydney and so on down the list. The stability of the priority order is remarkable when you compare it to corporations. Apple, Amazon and Google are near the top of the pile but didn’t exist decades ago. Not a single new university has shot into the top group.
Next consider students. Most of them want to go to a prestigious university. They would rather go to Harvard than Idaho State, at least if they can get into Harvard. In Australia, students are attracted by the status of a university but also by the exclusiveness of a faculty. It’s higher status to study medicine or law than nursing or chemistry. Many high school students want to undertake the most exclusive degree they can. Why “waste” an ATAR (Australian Tertiary Admission Rank) of 99.9 on studying visual arts when you can do medicine?
Student preferences are driven mostly by status, with very little attention to the quality of the education provided. The student quest for status is misguided in several ways. One misapprehension is that a high-status university provides a better education. Because status is built mostly on research performance, it does not necessarily correlate with the quality of teaching and the richness of the university experience.
A second misapprehension is that getting a degree from a high-status university is a worthwhile investment. Universities regularly tout figures showing that graduates earn more over their lifetime than non-graduates. However, this is not a valid comparison, because if those who graduated had chosen a different path, they might have been just as successful. The point is that the qualities of the student may do more to determine their career success than the status of the university they attended or the advantages of the learning that it provided. The pay-off for attending a more selective university or undertaking a more exclusive degree may not be much at all.
The message for students is straightforward: instead of pursuing status, develop your skills and productive habits.
Are you attracting the best students?
Every university seeks to recruit the best students it can. At the University of Wollongong, this is obvious enough. The prestige of degrees accords with how restrictive they are. Faculties make special pitches to students with high ATARs, who can become a “Dean’s Scholar” with special advantages. Universities with more money offer undergraduate scholarships to top-performing students. Astin summarises the collective experience: “Every college and university, no matter its size or research emphasis, seeks out smart students.”
So what? Astin has three responses. First, he notes that the mad scramble to recruit top students is silly from a system point of view. If the students are going to go somewhere, why not just allocate them randomly? The reason is that a university’s status depends on the perception that its students are smart.
Astin’s second response is that universities have become so obsessed with smartness that they pay more attention to recruiting top students than educating them. As he puts it, “if you look at our higher education system from an educational perspective, this preoccupation with enrolling smart students makes little sense, because the emphasis seems to be more on acquiring smart students than on educating them well …” He provides many telling examples. More on this later.
His third response is to provide an analogy with the health system. If you are ill and go to a hospital’s emergency department, you will encounter a triage process. Your health problem will be assessed. If it is serious and urgent, you will be taken straight in for treatment. If it is not serious and not urgent, you will have to wait until the urgent cases have been dealt with. If it is nothing to worry about, you’ll be sent home. The health system puts most of its resources towards helping those with the worst health.
This orientation can be criticised by arguing that far more should be spent on preventive health measures, for example addressing pollution and unhealthy diets. But even in preventive health areas, the emphasis is on measures that help the greatest number of people at lowest cost.
In contrast, in higher education, most resources are directed towards those who are the highest performing, which means those who need the least support for learning. This is true in university entry, in provision of scholarships and in higher degrees. It is also true in classrooms where teachers give more attention and encouragement to the best students.
Astin points out that most teachers give more attention to what students know than to how much they have improved. Few teachers give tests at both the beginning and conclusion of courses in order to see what students have learned. Instead, they give tests to rank students, with the emphasis on seeing who is superior rather than focusing on improvement.
He notes that giving grades on assignments “is of limited usefulness in helping students improve their performance.” Many of my colleagues give extensive comments on assignments, not just grades. But I’ve also noted that many students focus on the grades, not on using comments to improve.
Another shortcoming of most classes is that teachers do not require students to keep working on the same assignment. When students are assigned to write an essay, usually it is marked and then the student moves on to the next assignment. There is more learning when students are expected to consider feedback and work on improving the essay, submitting it again and, if needed, yet again. On the few occasions when I used this approach, I could see its great value. But alas, this requires more time and effort by the teacher and is more challenging when class sizes expand.
Are you a smart academic?
Among academics too, there is a cult of smartness. Those researchers who bring in loads of external money and build up empires of research students and postdocs are highly prized. There is no such glorification of outstanding teachers.
The emphasis on being smart manifests itself in various ways. Astin says some academics are “maximisers” who seek to display how smart they are. Their questions at seminars are designed to show off their knowledge. Maximisers, when on committees, may become blockers. It’s easier to show your critical acumen by attacking someone else’s proposal than by presenting one’s own.
Other academics, Astin’s “minimisers,” put a priority on hiding any suggestion that they lack intelligence. This is related to the “imposter syndrome,” in which individuals feel they are faking it and don’t really deserve to be among all those other brilliant colleagues.
How nice it would be if it were easier to acknowledge weaknesses and lack of knowledge, to say “I don’t know” and “I need to improve my skills.”
Astin lists a whole range of ways that the obsession with smartness affects academic work. It:
• “limits prospects for educational equity
• limits social welfare
• hinders academic governance
• limits recognition of different forms of intelligence
• limits development of creativity, leadership, empathy, social responsibility, citizenship, self-understanding
• limits finding better methods of assessment” (pp. 100-101)
What to do?
To get away from the obsession with smartness and help the students who need it most, Astin offers four principles for assisting “underprepared students.”
The first is to promote engagement with learning, so students are motivated to study. Second is to foster peer interaction, so students learn from each other, including from more advanced students. Third is to have more interaction with academics. The fourth is to emphasise writing skills.
All these are worthwhile. It’s possible to imagine a university that pioneers systematic peer learning, with students in classes helping each other learn, students in upper-level classes assisting those in lower-level classes, and all spending time assisting disadvantaged students in the community. There are elements of each of these in some places, but shifting universities in this sort of direction seems a mammoth task. As Astin shows all too well, the prestige ranking of US universities is built on and helps perpetuate the obsession with smartness, an obsession that affects students, academics and administrators.
As critics have argued for decades, the education system serves not just to promote learning but to provide a rationale for social stratification. In other words, it justifies inequality: if you don’t succeed, it’s because you’re not smart enough. The implication of this critique is that changing the role of universities has to go hand in hand with challenging economic inequality. That’s a big task!
It is still possible to innovate in small ways within universities, and there are options for individuals. Students can choose to attend less prestigious institutions or to undertake less exclusive degrees, thereby questioning the smartness hierarchy. Academics can introduce peer learning in their classes, expand outcomes beyond cognitive tasks and measure learning before and after teaching.
Then there is the wider issue of the role of universities in society. If learning is the goal, why are degrees needed for certification? The radical alternative of de-schooling — learning by being part of a community designed for that purpose — can be reintroduced and updated for the digital age, in which access to abundant information is possible, and sorting through it and making sense of it are the greater challenges.
In a way, the biggest indictment of higher education is that it is so difficult to promote educational alternatives, to test out different ways of organising learning and to imagine different ways of pursuing greater knowledge for social benefit. Nevertheless, there remains hope for change when critics like Astin offer the insights of a lifetime and encourage the rest of us to see what is all too familiar with different eyes.
In December 2018, a partnership was announced between the Ramsay Centre and the University of Wollongong. The university would establish a degree in Western Civilisation funded by the centre.
The new degree was immediately controversial. In the previous months, there had been considerable publicity about proposed Ramsay-funded degrees in Western civilisation at the Australian National University and the University of Sydney. At both universities, many staff were opposed to the degrees. The ANU proposal did not go ahead, while the Sydney proposal was still being debated. Given this background, opposition to the degree at Wollongong was not surprising.
My aim here is to give a perspective on the controversy over the Ramsay-funded Western civilisation degree, especially as it has been played out at the University of Wollongong (UOW). I write as an academic at the university without a strong stake in the new degree, because I am retired and the issues involved do not impinge greatly on my main research areas. However, a number of my immediate colleagues have very strong views, and I have benefited from hearing their arguments, as well as the views of proponents of the degree.
The next section gives a brief overview of the institutional context, which is useful for understanding both incentives and concerns associated with Ramsay funding. Following this is an introduction to the Ramsay Centre. Then I outline the major issues raised at the university: decision-making, the conservative connection, Western civilisation and equality of resourcing. The conclusion offers a few thoughts on the de-facto strategies of key players.
It would be possible to go into much greater depth. Relevant are issues concerning the aims of education, the funding of higher education, the impact of private funding and agendas, the question of Western civilisation and the role of political ideology. Others have more expertise on these and other issues, and I hope some of them will contribute to the discussion.
Australian university sector
Most Australian universities are funded by the federal
government, but the funding environment has become increasingly challenging. In
the 1980s, the government introduced tuition fees based on government
zero-interest loans paid back as part of income tax only when a student’s
income reached a moderate level. Introducing these fees provided universities a
sizeable income stream, but not a bonanza, because the government cut its
direct funding, while opening the gates to a massive expansion in student
numbers over the following decades.
The result was that academics were met with ever-increasing class sizes. The student-staff ratio dramatically increased, almost doubling in some fields. However, this wasn’t enough to fix the financial squeeze. University managements dealt with it in two main ways.
First, they aggressively recruited international students, who had to pay substantial tuition fees. International student fees were used to cross-subsidise other operations. Eventually, this income became Australia’s third largest export industry, after iron ore and coal.
Second, teaching was increasingly carried out by “casual” staff, paid by the hour or on short-term contracts. University teaching was casualised almost as much as the fast food industry.
In addition, beginning in the 1980s, the government pushed universities and other higher education institutions to amalgamate. Increased size, through amalgamations and student recruitment, became a goal, augmented by the setting up of additional campuses in Australia and in other countries. Universities became big businesses, with budgets of many hundreds of millions of dollars.
For management at Australian universities, finances became a preoccupation. All avenues for income are canvassed, though the options have been restricted mainly to government funding, student fees and research grants. The other side of the coin has been cost containment, including by increasing class sizes, cutting staff numbers and, as mentioned, relying ever more on casual staff for teaching.
Unlike in the US, in Australia there is no tradition of private support for universities. Gifts from alumni are welcome but are usually a tiny portion of income. Philanthropy is not prominent.
It was in this context that the Ramsay Centre for Western Civilisation entered the picture. Paul Ramsay made a fortune in private healthcare, including buying and running numerous hospitals. He died in 2014, having bequeathed a portion of his estate to setting up university courses in Western civilisation, run with small classes in which students study great books, in the manner of a few other such courses in the US and elsewhere. The Ramsay Centre was set up to manage this bequest. In 2017, the Centre invited expressions of interest from Australian universities to receive funding to set up and run degrees in Western civilisation.
The University of Wollongong was the first university to announce an agreement to
set up such a degree. From the point of view of university managers, this was
an attractive proposition. It would involve the largest ever injection of
private money into an Australian university to fund a humanities programme,
amounting to many tens of millions of dollars. It was enough to employ ten
academics and give scholarships to dozens of undergraduates.
Early in 2019, Professor Theo Farrell, executive dean of the Faculty of Law, Humanities and the Arts at UOW, outlined the financial benefits of the arrangement in meetings held to discuss the new degree. The faculty was affected by a decline in the number of undergraduate students enrolling in arts degrees, a decline occurring across the state, not just at Wollongong. The Ramsay-funded degree would have both direct and spinoff benefits financially. The students undertaking the degree would have to take a major or a double degree at the university, most likely in the faculty, giving a boost to enrolments.
A further benefit was claimed: because the Ramsay-funded students had to have good results
in high school and because they were being paid, they were more likely than
other students to finish their degrees. If true, this would aid the faculty’s
overall retention rate, something the government would favour.
The Ramsay money would support the employment of ten academics and two professional staff.
One of the academics is Dan Hutto, senior professor of philosophy, appointed
head of the new School of Liberal Arts hosting the new degree. There are to be
nine newly hired academics, all of them philosophers. Though hired primarily for teaching, they would have relatively light teaching loads, freeing them up to do research. Their presence could potentially turn UOW into a philosophy
powerhouse, beyond its current dynamism led by Hutto.
From the point of view of its advocates, the new degree thus brought great advantages to the faculty and the university. It involved the injection of a large amount of money with spinoff benefits for the rest of the faculty. And it would position UOW as a prominent player internationally among great-books programmes.
Acceptance of the degree was not straightforward. As soon as it was announced, academics and students expressed opposition. Here, I look at the grounds for opposition under several categories: decision-making, the conservative connection, Western civilisation and equality. In practice, these concerns are often mixed together.
Discussions between the centre and UOW were carried out in
secret. Only a few people at the university even knew negotiations were
occurring. Critics decried the secrecy.
University officials said, in defence, that these sorts of negotiations are carried out all the time, without any public announcement. Indeed, there are many examples in which major developments have been announced as faits accomplis. For example, in November 2018 an announcement was made that the university had purchased colleges. There was no protest about this; indeed, few took any notice.
On the other hand, the Ramsay Centre was already controversial elsewhere, separately from Wollongong. As the Australian National University negotiated with the Ramsay Centre, there was considerable publicity, especially when university leaders decided against having a Western civilisation degree because of concerns about academic freedom. At the University of Sydney, major opposition emerged to a Ramsay-funded degree, with protests and much media coverage.
In this context, the secrecy at UOW seemed anomalous. It was true that university management often proceeded on major initiatives without consultation with academic staff, but this was not a typical case: it was already known to be controversial.
On the Ramsay Centre board are two prominent political conservatives: former prime ministers John Howard and Tony Abbott. For quite a few staff at UOW, the presence of Howard and Abbott tainted the Ramsay Centre and its funds.
As explained by Farrell, the board of the Ramsay Centre would have no input into what was taught in the degree. Negotiations with the centre were with two academics whom it employed, Simon Haines and Stephen McInerney, not with the board.
One of the concerns expressed about the degree was that Ramsay Centre representatives would be members of the selection committees for the newly hired academics. For many academics, the idea of non-academic ideologues sitting on academic selection committees was anathema. Farrell countered by emphasising that members of the Ramsay Centre board, such as Howard and Abbott, would have nothing to do with appointments. Only the Ramsay academics would be involved. A typical selection committee would have the two Ramsay academics, one outside academic and up to six UOW academics, including Farrell as chair. Farrell said that it was not unusual for non-UOW figures to sit on selection committees. In other words, there were many precedents for the processes relating to the new degree.
Farrell noted that in his experience most selection committees operate by consensus,
not voting, but that if it came to a vote, UOW members had the numbers. In
response to a question about what the Ramsay academics would be looking for —
the worry being that they would want candidates aligned with particular
political positions — Farrell said that in his interactions so far with the
Ramsay academics, their main concern was that the appointees be good teachers.
At a meeting for faculty members about the new degree held on 11 February, Marcelo
Svirsky, senior lecturer in International Studies, raised a concern about the
reputational damage caused by the connection between Ramsay and the university.
Farrell said the university’s reputation internationally would be enhanced via
connections with Columbia University and other institutions with similar sorts
of degrees. Such connections were important given how difficult it was to build
affiliations with leading universities. Domestically, Farrell said that
information about the content of the UOW degree was gaining traction in the
media, counteracting earlier bad publicity about the proposed degrees at other
universities. He explicitly denied any risk to reputation.
It is fascinating to speculate what the response to the Ramsay money would have been had Howard and Abbott not been on the board. Many academics vehemently oppose the political positions of Howard and Abbott, making it difficult for them to accept any initiative associated with the two politicians. In the wider public, the involvement of Howard and Abbott means the Ramsay Centre is inevitably caught up in the emotions associated with right-wing politics and the so-called culture wars.
Would there be the same academic opposition to money coming from a centre linked to leading
figures from green or socialist politics? This can only be surmised, because if
a green-red twin of the Ramsay Centre were funding a degree, it would not be
called a degree in Western civilisation.
For academics in some sections of the humanities and social
sciences, “Western civilisation” is a term of opprobrium, not endearment. It is
useful to note that in several fields, critique is one of the standard tools:
accepted ideas, practices and institutions are subject to critical scrutiny,
often with assumptions and beliefs skewered. For example, in my field of
science and technology studies, challenges to ideas such as scientific progress
and “technology is neutral” are fundamental to much teaching and research. Yet,
in the wider public, conventional ideas about science, technology and progress
remain dominant. Therefore, teaching in the field necessarily involves
questioning conventional thinking.
For some, “Western civilisation” brings up images of Socrates, Michelangelo, Shakespeare and Einstein: great thinkers and creators from Europe. It also brings up images of parliamentary democracy, human rights and liberation from oppressive systems of domination. These are some of the positives of Western history and politics.
There is also a seamier side to Western history and politics. Colonialism and imperialism
sponsored by Western European states resulted in massive death, displacement
and enslavement of Indigenous peoples. In Australia, white settlement caused
death and the destruction of the culture of Aboriginal peoples.
As well as
the legacy of colonialism, the history of Europe has its own dark aspects, for
example the Crusades, the Inquisition, the horrors of the industrial revolution
and the Nazi genocide. A full account of Western cultures needs to address
their damaging as well as their uplifting sides.
Western civilisation has been responsible for horrific deeds, and these have been carried out with convenient rationales. Colonialism was seen by its defenders as part of a civilising mission, bringing enlightenment to savage peoples. Yet the aftermath of this mission continues to cause suffering. For example, in Rwanda, Belgian colonialists imposed the categories of Tutsi and Hutu on the population, helping set the stage for the 1994 genocide. In Australia, poverty and incarceration of Aboriginal people are among the contemporary consequences of colonialism.
For such academics, it is imperative to challenge the glorified myth of the beneficence
of Western culture. It is part of the scholarly quest to attain insight into
what really happened, not just what is convenient to believe, and this often
involves pointing to the unsavoury aspects of history and politics that others
would rather ignore or downplay.
In this context, the very label “Western civilisation” is an insult to some scholars in
the area, because the term “civilisation” has positive connotations unlike, for
example, “Western barbarism.” For scholars, the label “Western civilisation”
suggests a focus only on one side of a complex and contentious past and legacy.
Hutto, in presenting the subjects to be taught in UOW’s Western civilisation degree, emphasised that about half of them involved studying texts from other cultures, including texts concerning Buddhism, Islam and Indigenous cultures. To fully understand Western culture, it is valuable to appreciate other cultures: a respectful dialogue provides more insights than concentrating on Western items alone.
Furthermore, some of the texts that Hutto proposed from Western writers offered critical perspectives on Western societies. In these ways, Hutto distanced the degree from Abbott’s claim that it would be for Western civilisation, instead positioning it as something different.
the study of great works of Western civilisation, in conversation with
non-Western traditions, as a way for students to develop their critical
capacities, using evidence and argument to back up their views. In short,
Hutto’s aim for the degree is that students learn how to think, not what to
think. Students are bound to be exposed to critical perspectives, including in
the major or degree they are required to take in addition to the one in Western civilisation.
The degree as designed by Hutto might clash with the conceptions of some Ramsay Centre board members. It might also clash with the public perception, at least as informed by media coverage, that the degree would be one-sided advocacy for Western contributions. Intriguingly, if Howard or Abbott were to express reservations about UOW’s degree, this would temper the media and public perceptions of one-sidedness.
One of the
problems with the concept of Western civilisation is that, in the public
debate, it is seldom defined. Some critics might say that to talk of Western
civilisation is a category mistake, attributing a reality to an abstraction
whose meaning is contested. The variability of the meaning of “Western
civilisation” may lie behind some of the disputes over the degree carrying this label.
Ramsay’s large donation seems like a boon to a cash-strapped university, enabling the hiring of staff and the running of small classes that otherwise would be infeasible. On the other hand, UOW’s planned degree creates tensions between the privileged few and the rest.
The academics hired to teach the new degree would seem to have some extra benefits. In particular, they will be teaching small classes of no more than ten high-calibre students. In contrast, their colleagues, the rest of the academics in the faculty, are saddled with tutorial classes of 25, plus lectures sometimes with hundreds of students.
For many academics, this contrast is a source of considerable disquiet. Imagine someone
working in a field where offerings cover the same topics as proposed in the
Western civilisation degree. They might well say, “We have the expertise and
experience in the area. Why are we being squeezed while newcomers are given
generous conditions to teach the same topics from a philosophical perspective?”
There has been no formal response to questions of this type. One reply would be to say
that there are all sorts of inequalities between staff, only some of which are
related to merit. The most obvious inequality is between permanent and
non-permanent teachers. Some of the teachers on casual appointments are just as
qualified as those with continuing appointments. There are also inequalities
between academics, especially in research. For example, some researchers are
exempted from teaching on an official or de-facto basis.
Academics tend to be highly sensitive to inequality in treatment, in part because
professional status is so highly valued. There are regular disputes about
workloads: seeing a colleague with a lighter teaching load can cause envy or
resentment. That a whole group of new academics seems to receive special
conditions can bring this sort of resentment to the fore.
The students selected for scholarships to undertake the Western civilisation degree have to satisfy several conditions. They must be Australian citizens or permanent residents, be young, have recently completed high school and have obtained a high score in the examinations at the end of high school. In other words,
mature-age students and international students are excluded from consideration.
Scholarship students will receive an annual stipend of $27,000.
To some, the special privileges for scholarship students are unfair, especially the restriction to young Australian students. To this, a reply might be that inequalities between students are commonplace. The most obvious is between domestic and international students, the latter having to pay large tuition fees. Students on postgraduate scholarships are privileged too. This sometimes can be justified on merit, though the difference between students near the scholarship cut-off point may be tiny.
To appreciate the struggle over the Ramsay-Centre-funded degree in Western civilisation at the University of Wollongong, it is useful to think of the key players as using tactics to counter the moves of their opponents. Thinking this way is a convenience and does not imply that players actually think in terms of a strategic encounter.
The proponents of the degree seem to be driven by two main considerations: the
availability of a large amount of private money to be injected into the
humanities, and the opportunity to build a world-class philosophy unit. To
acquire the Ramsay money and build the philosophy unit, it was useful to
counter likely sources of opposition, in particular the opposition of academics
in cognate units concerned about the ideological associations with the Ramsay
Centre and the concept of Western civilisation.
To forestall the sort of rancorous public debate that occurred at the Australian
National University and Sydney University, which might scuttle the degree
before it was agreed, the degree proponents negotiated in secret. This did
indeed reduce public debate, but at the expense of a different source of
concern, the secrecy itself.
To counter concerns associated with the ideological associations with Ramsay and Western civilisation, Dan Hutto, designer of the degree, went to considerable effort to include in the core subjects respectful intellectual engagements with non-Western cultures, and to include negative as well as positive sides of Western culture.
Opponents of the degree were not mollified. Some simply ignored the innovative
aspects of the subject offerings and assumed that any degree labelled “Western
civilisation” must be an apologia for Western colonialism. Other opponents,
though, focused on procedural matters, for example the fast-track approval of
the degree despite its possible risk to the university’s reputation.
One of the consequences of the degree is the introduction of a privileged stratum of staff, with much lighter teaching loads, and of students given scholarships to undertake the degree. For proponents of the degree, there is no easy way to address the associated staff and student inequality. However, this inequality has not played a significant role in the public debate. There are numerous other inequalities within universities, so perhaps the introduction of one more, despite its high profile, is not a likely trigger for public concern.
One of the positive outcomes of the new degree is the debate it has stimulated. Hutto has grasped the opportunity: when the degree begins in 2020, students in their first week will discuss the debate about the degree itself.
For those so inclined, the new degree provides a golden opportunity to articulate
critiques of Western civilisation and make them available to staff and students
in the new School of Liberal Arts. Although Tony Abbott claimed that the Ramsay-funded degrees would be “for” Western civilisation, it is quite possible that many of the degree graduates will develop a sophisticated understanding of Western civilisation. Perhaps, along the way, members of the public will learn more about both the high and low aspects of Western civilisation.
What would Paul Ramsay think of the furore over degrees in Western civilisation? Perhaps he would be bemused that his bequest is receiving much more attention than he ever sought for himself during his lifetime.
I thank the many individuals who have discussed the issues with me and who have offered comments on drafts.
 In the debate about Ramsay
Centre funding, Paul Ramsay and Ramsay Health Care have scarcely been
mentioned. Michael Wynne, a vigorous critic of corporate health care, developed
an extensive website with information about numerous healthcare corporations in
the US and Australia. While critical of for-profit healthcare, Wynne has
relatively generous comments about Paul Ramsay himself and about Ramsay Health
Care, at least compared to other players in the corporate scene. See:
Wynne’s pages on Ramsay were last updated in 2005, but after this Paul Ramsay played a less direct role in Ramsay Health Care.
 I attended
meetings on 16 January and 11 February 2019 held for members of the Faculty of
Law, Humanities and the Arts. Theo Farrell and Dan Hutto spoke about plans for
the new degree and answered questions.
 Another factor,
specific to UOW, was the setting up of a Faculty of Social Sciences that,
despite its name, does not house the classic social sciences of sociology,
political science and economics. This faculty set up a social science degree
that is in direct competition with the arts degree, attracting students who
otherwise would have contributed to the budget for the Faculty of Law,
Humanities and the Arts.
 Andrew Herring, “University of Wollongong continues global expansion into Malaysia,” 19 November 2018, https://media.uow.edu.au/releases/UOW253448.html: The media release begins as follows: “The University of Wollongong (UOW) has continued its global expansion by acquiring the university colleges of Malaysian private education provider KDU from long-standing Malaysian investment company Paramount Corporation Berhad (PCB).
Subject to Malaysian Ministry of Education approval,
the deal will see UOW wholly-owned subsidiary, UOW Global Enterprises, immediately
acquire a substantive majority equity interest in the university colleges in
Kuala Lumpur and Penang—including the new campus under construction in Batu
 Tony Abbott, “Paul Ramsay’s vision for Australia,” Quadrant Online, 24 May 2018, https://quadrant.org.au/magazine/2018/04/paul-ramsays-vision-australia/. Quite a few commentators blamed Abbott’s article for hindering acceptance of a Ramsay-funded degree at the Australian National University, e.g. Michael Galvin, “Abbott single-handedly destroys Ramsay Centre for Cheering On White People,” The Independent, 17 June 2018; Peter van Onselen, “Ramsay Centre has Tony Abbott to blame for ANU’s rejection,” The Australian, 9 June 2018. Note that the preposition for is contained in the full name of the centre: the Ramsay Centre for Western Civilisation.
 Entry to the degree course is open to students of any age, and to five non-residents. The conditions mentioned apply only to those receiving Ramsay scholarships, and even then exceptions can be made. An ATAR (Australian Tertiary Admission Rank) of 95 has been mentioned as an expectation for scholarship recipients. Other factors will be taken into account.
Some Australian media outlets have been warning that university students are unduly protected from disturbing ideas. But are these same media outlets actually the ones that can’t handle disturbing ideas?
For years, I’ve been seeing stories in The Australian and elsewhere about problems in universities associated with political correctness (PC). The stories tell of students who demand to be warned about disturbing material in their classes, for example discussions of rape in a class on English literature. The students demand “trigger warnings” so they can avoid or prepare for potentially disturbing content. Detractors call them “snowflake students”: they are so delicate that, like a snowflake, they melt at the slightest warmth.
Former Labor Party leader Mark Latham, for example, referred to “the snowflake safe-space culture of Australian universities.”
Richard King, the author of On Offence: The Politics of Indignation, reviewed Claire Fox’s book I Find that Offensive. King says that the principal target of Fox’s book “is ‘the snowflake generation’, which is to say the current crop of students, especially student activists, who keep up a constant, cloying demand for their own and others’ supervision. ‘Safe spaces’, ‘trigger warnings’ and ‘microaggressions’ are all symptoms of this trend.”
I treat these sorts of stories with a fair bit of scepticism. Sure, there are some incidents of over-the-top trigger warnings and demands for excessive protection. But are these incidents representative of what’s happening more generally?
Before accepting that this is a major problem, I want to see a proper study. A social scientist might pick a random selection of universities and classes, then interview students and teachers to find out whether trigger warnings are used, whether class discussions have been censored or inhibited, and so forth. I’ve never heard of any such study.
What remains is anecdote. Media stories are most likely to be about what is unusual and shocking. “Dog bites man” is not newsworthy but “man bites dog” might get a run.
Most of the Australian media stories about trigger warnings and snowflake students are about what’s happening in the US, with the suggestion that Australian students are succumbing to this dire malady of over-sensitivity.
Trigger warnings: Australian movie and video game classifications
There is a case for trigger warnings. Nevertheless, in thirty years of undergraduate teaching, I never saw any need for them — except when I asked students to use them.
For one assignment in my class “Media, war and peace,” students formed small groups to design an activity for the rest of the class. The activity had to address a concept or theory relating to war or peace, violence or nonviolence. Quite a few student groups chose the more gruesome topics of assassination, torture or genocide, and some of them showed graphic pictures of torture and genocidal killings.
Never did a single student complain about seeing images of torture and killing. Nevertheless, I eventually decided to request that the student groups provide warnings that some images might be disturbing. Thereafter, when groups provided warnings, no students ever excused themselves from the class. I was watching to see their reactions and never noticed anyone looking away.
This is just one teacher’s experience and can’t prove anything general. Still, it suggests that some Australian students are pretty tough when it comes to seeing images of violence. Perhaps they have been desensitised by watching news coverage of wars and terrorist attacks.
However, appearances can be deceptive. My colleague Ika Willis pointed out to me that students may hide their distress, and that few would ever complain even if they were distressed. So how would I know whether any of my students were trauma survivors and were adversely affected? Probably I wouldn’t. That is an example of why making generalisations about trigger warnings based on limited evidence is unwise.
A journalist attends classes – covertly
On 8 August 2018, Sydney’s Daily Telegraph ran a front-page story attacking three academics at Sydney University for what they had said in their classes. The journalist, Chris Harris, described what he had done this way: “The Daily Telegraph visited top government-funded universities in Sydney for a first-hand look at campus life …” This was a euphemistic way of saying that he attended several classes without informing the teachers that he was attending as a journalist, and covertly recorded lectures without permission. Only in a smallish tutorial class, in which the tutor knows all the students, would an uninvited visitor be conspicuous.
Harris then wrote an exposé, quoting supposedly outrageous statements made by three teachers. This was a typical example of a beat-up, namely a story based on trivial matters that are blown out of proportion. Just imagine: a teacher says something that, if taken out of context, can be held up to ridicule. Many teachers would be vulnerable to this sort of scandal-mongering.
One issue here is the ethics of covertly attending classes and then writing a story based on statements taken out of context. Suppose an academic covertly went into media newsrooms, recorded conversations and wrote a paper based on comments taken out of context. This would be a gross violation of research ethics and scholarly conventions. To collect information by visiting a newsroom would require approval from a university research ethics committee. Good scholarly practice would involve sending a draft of interview notes or the draft of a paper to those quoted. In a paper submitted for publication, the expectation would be that quotes fairly represent the issues addressed.
A typical Daily Telegraph front page
Where are the snowflake students?
So when Harris attended classes at universities in Sydney, did he discover lots of snowflake students who demanded to be protected by trigger warnings? He didn’t say, but it is clear that at least two individuals were highly offended: a journalist and an editor! They thought the classroom comments by a few academics were scandalous.
In a story by Rebecca Urban in The Australian following up the Telegraph exposé, Fiona Martin’s passing comment about a cartoon by Bill Leak comes in for special attention. According to this story, “The Australian’s editor-in-chief Paul Whittaker described the comment as ‘appalling’ and ‘deeply disrespectful’.”
So apparently News Corp journalists and editors are the real snowflakes, not being able to tolerate a few passing comments by academics that weren’t even intended for them or indeed for anyone outside the classroom. Or perhaps these journalists and editors are outraged on behalf of their readership, who they consider should be alerted to the dangerous and foolish comments being made in university classrooms.
Where in this process did the call for students to be tough and be exposed to vigorous discussion suddenly dissolve?
The contradiction is shown starkly in a 10 August letter to the editor of The Australian by Andrew Weeks. The letter was given the title “Bill Leak’s legacy is his courage in defending the right to free speech”. Weeks begins his letter by saying “I am unsure what is most disturbing about the abuse of sadly departed cartoonist Bill Leak by Fiona Martin.” After canvassing a couple of possibilities, he says “Perhaps it is the fact that Sydney University has supported its staffer, offering lip service in support of freedom of speech when that is exactly what is being endangered by the intolerance characteristic of so many university academics.”
The logic seems to be that freedom of speech of Bill Leak (or those like him) is endangered by an academic’s critical comment in a classroom, and that a university administration should not support academics who make adverse comments about Leak.
Again it might be asked, what happened to the concern about the snowflake generation? The main snowflakes are, apparently, a journalist, an editor and some readers. Perhaps it would be wise in future for journalists to avoid visiting university classrooms so that they and their readers will not be disturbed by the strong views being expressed.
Universities do have serious problems, including a heavy reliance on casual teaching staff and lack of support for international students, both due to lack of money. More students report problems with anxiety and depression. There is also the fundamental issue of the purpose of higher education, which should not be reduced to job preparation. Instead of addressing these issues, News Corp newspapers seem more interested in the alleged danger, apparently most virulent in humanities disciplines, of political correctness.
My focus here is on an apparent contradiction or discrepancy in treatments of PC and “snowflake students” in The Australian and the Daily Telegraph. While decrying the rise of the so-called snowflake generation, journalists and editors seemed more upset than most students by comments made in university classrooms.
One other point is worth mentioning. If you want to inhibit vigorous classroom discussions of contentious issues, there’s no better way than spying on these discussions with the aim of exposing them for public condemnation. This suggests the value of a different sort of trigger warning: “There’s a journalist in the classroom!”
Further reading (mass media)
Josh Glancy, “Rise of the snowflake generation,” The Australian, 8-9 September 2018, pp. 15, 19.
Christopher Harris, “Degrees of hilarity” and “Bizarre rants of a class clown,” Daily Telegraph, 8 August 2018, pp. 4-5.
Richard King, “Fiery blast aimed at ‘snowflake generation’,” The Australian, 1 April 2017, Review p. 22.
Mark Latham, “The parties are over,” Daily Telegraph, 9 January 2018, p. 13.
Bill Leak, “Suck it up, snowflakes,” The Australian, 11 March 2017, p. 15.
Rebecca Urban, “Uni backs staffer on secret suicide advice,” The Australian, 9 August 2018, p. 7; (another version) “University of Sydney stands by media lecturer following Bill Leak attack,” The Australian, 8 August 2018, online.
Further reading (scholarly)
Sigal R. Ben-Porath, Free Speech on Campus (University of Pennsylvania Press, 2017).
Emily J. M. Knox (ed.), Trigger Warnings: History, Theory, Context (Rowman & Littlefield, 2017).
Acknowledgements: Thanks to several colleagues for valuable discussions and to Tonya Agostini, Xiaoping Gao, Lynn Sheridan and Ika Willis for comments on a draft of this post. Chris Harris and Paul Whittaker did not respond to invitations to comment.
Researchers need to write as part of their job. It’s remarkable how stressful this can be. There is help at hand, but you have to be willing to change your habits.
Writing is a core part of what is required to be a productive researcher. Over the years, I’ve discovered that for many of my colleagues it’s an agonising process. This usually goes back to habits we learned in school.
Sport, music and writing
Growing up, I shared a room with my brother Bruce. I was an early riser but he wasn’t. But then, in the 10th grade, he joined the track and cross-country teams. Early every morning he would roll out of bed, still groggy, change into his running gear and go for his daily training run. After school he worked out with the team. He went on to become a star runner. At university, while majoring in physics, he obtained a track scholarship.
As well, Bruce learned the French horn and I learned the clarinet. We had private lessons once a week and took our playing seriously, practising on assigned exercises every day. We each led our sections in the high school band.
I also remember writing essays for English class, postponing the work of writing and then putting in hours the night before an essay was due. At university, this pattern became worse. I pulled a few all-nighters. To stay awake, it was the only time in my life I ever drank coffee.
Back then, in the 1960s, if you wanted to become a good athlete, it was accepted that regular training was the way to go. It would have been considered foolish to postpone training until just before an event and then put in long hours. Similarly, it was accepted that if you wanted to become a better instrumentalist, you needed to practise regularly. It was foolish to imagine practising all night before a performance.
Strangely, we never applied this same idea to writing. Leaving an assignment until the night before was common practice. And it was profoundly dysfunctional.
Luckily for me, while doing my PhD I started working regularly. On a good day, I would spend up to four hours on my thesis topic. I also started working on a book. Somewhere along the line I began aiming to write 1000 words per day. It was exceedingly hard work and I couldn’t maintain it for week after week.
In the 1980s, Robert Boice, a psychologist and education researcher, carried out pioneering studies into writing. He observed that most new academics had a hard time meeting the expectations of their job. They typically put most of their energy into teaching and neglected research, and felt highly stressed about their performance. Boice observed a pattern of procrastination and bingeing: the academics would postpone writing until a deadline loomed and then go into an extended period of getting out the words. However, these binges were so painful and exhausting that writing became associated with discomfort, thereby reinforcing the pattern. If writing is traumatic, then procrastination is the order of the day.
Procrastination and bingeing is just what I did in high school and undergraduate study. It’s what most academics did when they were younger, and they never learned a different pattern.
Boice observed that a small number of new academics were more relaxed and more productive. They didn’t binge. Instead, they would work on research or teaching preparation in brief sessions over many days, gradually moving towards a finished product. Boice had the idea that this approach to academic work could be taught, and carried out a number of experiments comparing different approaches to writing. (See his books Professors as Writers and Advice for New Faculty Members.)
In one study, there were three groups of low-productivity academics. Members of one group were instructed to write in their usual way (procrastinating and bingeing). They ended up with an average of 17 pages of new or revised text – in a year. That’s about half an article and far short of what was required to obtain tenure.
Members of the second group were instructed to write daily for short periods. In a year, they produced on average 64 pages of new or revised text. Members of the third group were instructed to write daily for short periods and were closely monitored by Boice. Their average annual total of new or revised text was 157 pages. This was a stunning improvement, though from a low baseline.
It didn’t surprise me too much. It was the difference between athletes who trained just occasionally, when they felt like it, and athletes who trained daily under the guidance of a coach. It was the difference between musicians who practised when they felt like it and musicians who practised daily on exercises assigned by their private teacher.
Gray and beyond
Decades later, in 2008, I came across Tara Gray’s wonderful book Publish & Flourish: Become a Prolific Scholar. In a brief and engaging style, she took Boice’s approach, extended it and turned it into a twelve-step programme to get away from procrastinating and bingeing. Immediately I tried it out. Instead of taking 90 minutes to write 1000 words, and doing this maybe one week out of three, I aimed at 20 minutes every day, producing perhaps 300 words. It was so easy! And it promised to result in 100,000 words per year, enough for a book or lots of articles.
Gray, adapting advice from Boice, recommends writing from the beginning of a project. This is different from the usual approach of reading everything about a topic and only then writing about it. For me, this actually reduces the amount of reading required, because I know far better what I’m looking for. Over the following years, I gradually changed my writing-research practice. Previously, writing an article happened late in a project. Now I write from the beginning, and there is more follow-up work. The follow-up work includes looking up references, doing additional reading, and seeking comments on drafts from non-experts and then from experts. It’s much easier and quality is improved.
I introduced this approach to writing to each of my PhD students. Some of them were able to take it up, and for them I could give weekly guidance. I also set up a writing programme for colleagues and PhD students. Through these experiences I learned a lot about what can help researchers to become more productive. An important lesson is that most academics find it extremely difficult to change their writing habits. Many can’t do it at all. Research students seemed better able to change, perhaps because their habits are less entrenched and because they think of themselves as learners.
With this newfound interest in helping improve research productivity, I looked for other sources of information. There is a lot of advice about how to become a better writer. Our writing programme was based on the work of Boice and Gray, so I looked especially at treatments that would complement their work. Excellent books include Paul Silvia’s How to Write a Lot and W. Brad Johnson and Carol A. Mullen’s Write to the Top! It was encouraging that most of these authors’ advice was similar to Boice’s and Gray’s. However, there seems to be very little research to back up the advice. Boice’s is still some of the best, with Gray’s research findings a welcome addition showing the value of regular writing.
To these books, I now add Joli Jensen’s superb Write No Matter What, and not just because it has a wonderful title. Jensen, a media studies scholar at the University of Tulsa, draws on her own experience and years of effort helping her colleagues to become more productive. As I read her book, time after time I said to myself, “Yes, that’s exactly my experience.”
“Writing productivity research and advice can be summarized in a single sentence: In order to be productive we need frequent, low-stress contact with a writing project we enjoy.” (p. xi)
Jensen excels in her exposition of the psychological barriers that academics experience when trying to write. She approaches this issue — one pioneered by Boice — through a series of myths, fantasies and fears. An example is the “magnum opus myth,” the idea held by many academics that they have to produce a masterpiece. This is profoundly inhibiting, because trying to write a bit of ordinary text feels so inadequate compared to the shining vision of the magnum opus. The way to avoid this discrepancy is to postpone writing, and keep postponing it.
Another damaging idea is that writing will be easier when other bothersome tasks are cleared out of the way. Jensen calls this the “cleared-desk fantasy.” It’s a fantasy because it’s impossible to finish other tasks, and new ones keep arriving: just check your in-box. Jensen says that writing has to take priority, to be done now, irrespective of other tasks that might seem pressing.
Then there is the myth of the perfect first sentence. Some writers spend ages trying to get the first sentence just right, imagining that perfecting it will unleash their energies for the rest of the article. This again is an illusion that stymies writing.
A colleague once told me how she was stuck writing the last sentence of a book review, with her fingers poised over the keyboard for an hour as she imagined what the author of the book she was reviewing would think. This relates to the perfect first sentence problem but also to Jensen’s “hostile reader fear.” Jensen also addresses the imposter syndrome: the fear that colleagues will discover you’re not a real scholar like them. Then there is the problem of comparing your work with others, usually with others who seem to be more productive. Upwards social comparison is a prescription for unhappiness and, in addition, can inhibit researchers. If others are so much better, why bother?
Write No Matter What is filled with valuable advice addressing all aspects of the writing process. Jensen offers three “taming techniques” to enable the time, space and energy for doing the craft work of writing. She has all sorts of practical advice to address problems that can arise with research projects, for example when you lose enthusiasm for a topic, when you lose the thread of what you’re trying to do, when your submissions are rejected (and subject to depressingly negative comments), when your project becomes toxic and needs to be dumped, and when you are working on multiple projects.
She says that writing can actually be harder when there’s more unstructured time to do it, something I’ve observed with many colleagues.
“When heading into a much-desired break, let go of the delusion that you will have unlimited time. Let go of vague intentions to write lots every day, or once you’ve cleared the decks, or once you’ve recovered from the semester. Acknowledge that academic writing is sometimes harder when we expect it to be easier, because we aren’t trying to balance it with teaching and service.” (p. 127)
Jensen is open about her own struggles. Indeed, the stories she tells about her challenges, and those of some of her colleagues, make Write No Matter What engaging and authentic. Her personal story is valuable precisely because she has experienced so many of the problems that other academics face.
With my experience of running a writing programme for a decade and helping numerous colleagues and research students with their writing, it is striking how few are willing to consider a new approach, how few are willing to admit they can learn something new and, for those willing to try, how difficult it is to change habits. Boice’s work has been available since the 1980s yet is not widely known. This would be like a successful sporting coach having superior training techniques and yet being ignored for decades.
To me, this testifies to the power of entrenched myths and practices in the academic system. Write No Matter What is a guide to an academic life that is both easier and more productive, but the barriers to shifting to this sort of life remain strong. In the spirit of moderation advocated by Boice, Gray and Jensen, read their books, but only a few pages per day. And write!
On 1 June this year, I received an email from Hildie Spautz. She wrote that her father, Michael E. Spautz, had died the previous day.
Michael Spautz, 1970s
I had only met Michael once, in 1981, and had not corresponded with him for a decade. But I knew a lot about his story.
Hildie was writing to me because she had found articles I had written about Michael’s difficulties at the University of Newcastle. I was one of the few who showed any sympathy for Michael’s concerns.
Hildie and her sister Laura, who each live in the US, were going through Michael’s belongings. He had vast numbers of paper files. Would I like to have them, or did I know anyone who would? My immediate response to both questions was no.
Michael’s death made me reflect on the events that derailed much of his life. Be prepared. This story does not have a happy ending. It is a story of wasted effort and dysfunction. There are, though, some useful lessons. I for one learned a lot from it.
The Spautz case
Spautz was originally from the US. He took a job in Australia at the University of Newcastle, where he was a senior lecturer in the Commerce Department. There were no particular dramas until 1978, after the appointment of a second professor in the department, Alan Williams.
Alan J Williams
In Australia at the time, the main academic ranks were lecturer, senior lecturer, associate professor, and professor. Relatively few academics reach the rank of professor, and decades ago it often came along with the role of the head of a department. To be a professor usually meant having an outstanding record in research or sometimes administration.
Williams, though, had far less than an outstanding record. He had recently received his PhD and had published two articles in management journals. Even though commerce was not then as research-intensive as disciplines like chemistry or sociology, nevertheless Williams’ record was decidedly lightweight for a professorial appointment. The back story was that the department was having trouble finding a suitable candidate and, it was suggested, made an inferior appointment rather than lose funding for the position.
Spautz had not been an applicant for the position when Williams applied, but had applied for it in earlier rounds when no appointment was made. Initially, there were no tensions between Spautz and Williams. However, after Williams was made head of a section within the department, Spautz began raising concerns. Alerted by two colleagues to problems with Williams’ research, Spautz started digging further.
Williams, in his PhD thesis, had studied the owners of small businesses, in particular their psychological problems. His argument was that such problems made the businesses more likely to fail. Spautz – who had a background in psychology – argued that the reverse process could have been responsible: when businesses struggle and fail, their owners are more likely to suffer psychologically. Spautz therefore claimed that Williams’ research was flawed due to “inverted causality”: he had mixed up cause and effect. Spautz also questioned some of the statistical methods used by Williams.
It is nothing special that scholarly research has shortcomings. Many academics exert great efforts in trying to find flaws in previous studies. This is part of the process of testing data and theory that is supposed to lead to reliable knowledge. In this context, Spautz’ critique of Williams’ research was nothing out of the ordinary.
However, it is uncommon for an academic to undertake a detailed critique of the work of an immediate colleague and then to do something about it. Academics often gripe about the weaknesses, irrelevance or unwarranted recognition of their colleagues’ research, especially colleagues who are arrogant or who seem to have gained unfair preferment. But griping is usually the extent of it. To openly criticise the work of an immediate colleague can be seen as disloyal. In some cases in which an academic speaks out about a colleague’s scientific fraud, it is the whistleblower who comes under attack by administrators.
Spautz, though, seemed to have few inhibitions in challenging the quality of Williams’ research. Spautz began his challenge in a conventional, scholarly way. He took his criticisms directly to Williams and to others in the Commerce Department, but obtained no support. He wrote a rebuttal of Williams’ published papers and sent it to the journals where those papers had been published. However, the editors were not interested. This should not have been surprising. If an article has had no particular impact, few editors would be keen on publishing a detailed rebuttal years later. This might be considered a shortcoming of the system of journal publication. It is far easier to publish an original study, with new data and findings, than a replication of a previous study, whether or not the replication supports the original study.
Williams had recently received his PhD from the University of Western Australia. Later on, Spautz wrote to UWA raising his concerns about shortcomings in Williams' thesis. The Vice-Chancellor replied saying that this was a matter for the examiners of the thesis. Neither the identities of the examiners nor their reports were publicly available, as is usual in Australian universities. There is no standard institutional process for questioning the work in a thesis.
Spautz was stymied. He had tried the official channels for questioning Williams’ work and been blocked. This was long before the Internet, otherwise he could have posted his criticisms online.
There was one other institutional channel to be tried: the University of Newcastle itself. But Spautz's complaints led nowhere.
Plagiarism, a scholarly sin
Along the way, Spautz added another claim to his allegations about Williams’ thesis: that it involved plagiarism, namely the use of other people’s words or ideas without appropriate acknowledgement. In the eyes of many academics, plagiarism is a cardinal sin, deserving the most severe condemnation. When undergraduate students are detected plagiarising in their assignments, they may be given a mark of zero or even referred to a student misconduct committee. (On the other hand, some teachers treat much undergraduate student plagiarism not as cheating but as a matter of not understanding proper citation practices.)
The plagiarism in Williams’ thesis is a subtle type, which can be called plagiarism of secondary sources. Williams gave references to a range of articles and books. Spautz was able to deduce that in quite a few cases Williams apparently had not actually looked at these articles and books himself, but had instead copied the references from a later publication, a “secondary source.” This sort of plagiarism basically involves copying references used by another author but not citing that author. It’s a common sort of plagiarism in many academic works. It is hard to prove, but in this instance Spautz was a super-sleuth, finding secondary sources and subtle clues that Williams had relied on these secondary sources, as I verified for myself.
Personally, having studied plagiarism, I don’t think this should be a hanging offence. However, because plagiarism has such a terrible reputation, especially plagiarism by academics, it would have been embarrassing for a university inquiry into Williams’ thesis to acknowledge any sort of plagiarism at all.
The snowflake campaign
Spautz started writing memos, in the form of typed or handwritten statements, mimeographed or photocopied. He put them in the mailboxes of academics on campus. This was his “campaign for justice.” It is accurately described as a campaign, because Spautz produced memo after memo, sometimes every day. He also called his efforts the “snowflake campaign” because there were so many white memos that they could be likened to flakes of snow landing on (or littering) the campus.
Spautz’s efforts drew the attention of the administration, and an inquiry was set up. Spautz’s aim was for his allegations about Williams’ research to be investigated. However, the inquiry instead focused on Spautz’s behaviour. Basically, he was told to shut up.
Spautz was not deterred by the admonitions from the inquiry, and continued his campaign. There was a second inquiry. Then in May 1980 the Council, the university’s governing body, dismissed Spautz. This was news: in Australia it is quite rare for a tenured academic to be fired. Furthermore, the circumstances in Spautz’s case were quite unusual.
From Spautz’s point of view, he had concerns about Williams’ research, had tried to raise them with Williams, journal editors and university administrators, and had been fobbed off, told to shut up and then dismissed. He wasn’t going to shut up, and dismissal just made him more determined to expose what he saw as injustice.
From the point of view of university administrators, Spautz was an annoyance. The solution was to go through some formal processes and then, when Spautz didn't cooperate, to take the ultimate step of dismissing him. If administrators thought that this would be the end of the matter, they were wrong. Most dismissed academics are humiliated and go quietly. Others take legal action over their dismissal, hoping to receive some compensation. (Reinstatement is exceedingly rare.)
Spautz never hired a plane to distribute his memos
Spautz was not like most other academics. He continued his campaign, and greatly expanded it. He continued production of memos, distributed to people on campus and numerous others beyond, including journalists. He heard about my work on suppression of dissent and contacted me in June 1980. I was henceforth on his mailing list.
Spautz expanded his allegations, claiming that various individuals were involved in a criminal conspiracy. He launched court cases, and more court cases. In the following years, at one point he was unable to pay court costs and was sent to prison. After 56 days, a judge found he had been falsely imprisoned. This was grist for more legal actions, and he later obtained compensation. Eventually he was declared a vexatious litigant. This was the only thing that stopped his decades of legal cases against various individuals he accused of wrong actions.
Michael Spautz, 1980s
The verdict: what a waste!
There are no winners in this story. From the time of his dismissal in 1980 until his death this year, Spautz devoted most of his effort to his self-styled campaign for justice. For four decades he was obsessed, initially with the shortcomings of Williams' research and then with the aftermath of his dismissal. Prior to this quest, Spautz had been a productive scholar, teaching undergraduates and authoring quite a few publications.
When I met him in 1981, I told him it would be better to put effort into writing up his story, and that pursuing action through the courts was likely to be futile. Others told him similar things. But he didn’t listen. He was convinced his course of action was the right one.
Alan Williams was another victim. He was unlucky to become the target of Spautz’s campaign. In another way, Williams was unlucky to have been appointed as a professor at the University of Newcastle on a thin research record, which made him vulnerable.
The University of Newcastle paid a severe penalty too. Spautz’s campaign brought it unwelcome attention, and several senior figures at the university had to spend considerable time dealing with Spautz’s charges against them. There were occasional news reports about Spautz’s legal cases. For a university administration, this is not a desired sort of media coverage.
University of Newcastle campus: a desired image
More damaging was the effect of the dismissal on the academic culture at the university. Although many staff found Spautz's behaviour objectionable, many also were disturbed by his dismissal. The executive of the staff association produced an informative report.
When I visited the campus in 1981, a year after Spautz had been dismissed, I could sense fear. Some staff did not want even to discuss Spautz, as if that would taint them and make them vulnerable. Openly expressing disagreement with the dismissal was felt to be risky, perhaps because they might be next. Spautz was unbowed by his dismissal, but it frightened many others.
Social, academic and legal systems are not designed to address cases such as this. When Spautz started raising concerns about Williams' research, there was no one in a position of authority who was able or willing to step in and cut to the core issues he raised. At the University of Newcastle, all that administrators did was set up committees of inquiry that focused on Spautz's behaviour. In many cases, such committees work well for their purposes, but they were manifestly inadequate to address Spautz and his campaign. The individuals involved in all these arenas were well-meaning and following typical protocols. It was not a failure by individuals so much as a failure of the system.
Similarly, the legal system was not a good place to address Spautz’s concerns. It’s possible to imagine a more flexible system that would refer Spautz to a wise intervener who would look at the original grievance, namely the one not addressed by the university, and deal with it at the source. But of course the legal system is about applying the law, not about finding creative solutions to problems. As a result, the legal system suffered, with lawyers, judges and others spending a huge amount of time and money dealing with Spautz’s unending cases and appeals.
Would mediation have helped?
If systems are ill-designed, then even the most well-meaning individuals can be caught up in them. Most people are likely to blame Spautz, but blame doesn't provide any answers, just a feeling of superiority.
Occasionally in any society, there will be individuals who become obsessed with particular things. There is still much to be learned about how to channel obsessions in productive directions.
What I learned
Though the saga of Spautz’s ill-fated campaign for justice had no winners, I learned a lot from it. I studied Spautz’s allegations about Williams’ plagiarism, and to put them in context I read a lot about plagiarism more generally. I wrote a paper titled “Plagiarism, incompetence and responsibility” (and have now added links to numerous relevant documents). That paper was rejected by the first nine journals to which I submitted it. The tenth journal accepted a drastically revised version. From this experience, I learned how difficult it is to publish, in a scholarly journal, a discussion of an actual case involving allegations of incompetence and plagiarism. I talked with one journal editor on the phone. He told me that he would have liked to publish my article but the editorial committee, taking into account legal advice, decided not to proceed. They were worried about being sued.
I wrote a different (and less felicitous) article about the way Spautz’s actions were dealt with at the University of Newcastle. This was published in Vestes, the journal of the Federation of Australian University Staff Associations, FAUSA (which later became a union, the National Tertiary Education Union). It was delayed for a year due to concerns about legal action. It seems that writing about actual cases can be worrisome.
Most of all I learned about the failure of official channels. Spautz tried quite a few: journals, university administrations, courts. None of them worked well, certainly not for him. This was my first immersion in a case that showed clearly the shortcomings of formal procedures. This stood me in good stead when, over a decade later, I became involved in Whistleblowers Australia and talked to numerous whistleblowers. They told the same story: when they took their concerns to bosses, boards of management, ombudsmen and courts, they were regularly disappointed.
Official channels work fine in many circumstances, and most of the people on appeal committees and working in agencies are concerned and hard-working. But when a person with less power tries to challenge one with more power, or challenge the entire system, it is usually a hopeless cause. So that’s what for many years I have told whistleblowers and what I’ve written in my book giving advice to whistleblowers. Yes, you might be very lucky and find justice in official channels, but don’t count on it. Indeed, you should assume they won’t provide the justice you’re looking for. Although Spautz never learned that lesson, he taught it to me, and for that I am thankful.
Michael Spautz, 2011
Michael’s daughters Hildie and Laura had the unwelcome and overwhelming task of clearing his belongings from his unit, including accumulated files about his campaign that filled seven bookcases (that’s cases, not shelves). Perhaps, whimsically, the files could have been placed as a display in a museum as a testament to the futility of spending years seeking justice through formal channels, with the message for those who might follow his path, “If at first you don’t succeed, then try something else.”
Don Parkes in his book Doctored!, mentioned above, made the following comments (page 12).
During the mid 1980s and through the 1990s, if one had an academic problem that required administrative attention, then at the University of Newcastle NSW, too often, one became ‘the problem’. As a serious enough problem one could end up in gaol, as was the case for Dr. Michael Spautz. Vice Chancellors and others will not give much attention to you, will not treat you as a colleague, or pay much real attention to the problem that you have raised: you become the problem and that is how they relate to you. Nevertheless, it is really quite easy to overcome the predicament: cooperate; just leave it to the powers that be: promotion and positive references await for such cooperation.
At about the time that our story was kicking in, Dr. Michael Spautz was sent to prison for 76 days in the high security, 150-year-old Maitland NSW gaol. He was an American, a Senior Lecturer in the Faculty of Economics and Commerce. Spautz fought the University all the way to the High Court of Australia because he was not satisfied that due process had been followed in the handling of reports of alleged plagiarism in the work of a newly appointed professor. Spautz was required to undergo psychiatric assessment and was eventually dismissed. He continued the fight.
Maitland gaol was a nasty place, high security prisons are nasty places, usually for nasty people. Dr. Spautz was not a nasty person. I knew him for many years and have often looked back, with some shame at my ‘bystander role’: though he was always openly welcome in my office; we met where and as we wished and together with my good friend Richard Dear from the university’s computer centre, we gave him many sheets of computer print-out paper on which to ‘roneo’ copy his ‘in vita veritas’ letters distributed to hundreds of staff and students. The reason for his imprisonment was claimed to be non-payment of an account. That’s believable? Technically probably ‘yes’, it is believable: but it was draconian, a ‘teach him a lesson’ sort of punishment. The university was well connected.
Fourteen years later, in 1996, he received a paltry sum of $75,000 for wrongful imprisonment; he was never reinstated in the University.
You’re active in an organisation and you’d like to help it become more effective. How do you proceed? You can work harder yourself. You can try to recruit others to support the cause. You can set up a website, run an advertisement, or invite some friends to a meeting. What’s the most effective thing to do?
This question is relevant to a wide range of organisations, including sporting clubs, corporations, government departments, environmental groups, churches and political parties. Despite the importance of the question, surprisingly, most organisation members simply rely on what they’ve always done.
For insight, it’s worth learning from the 2014 book How Organizations Develop Activists by Hahrie Han. To try to assess what methods worked better, Han looked at the different chapters of two US national organisations that she calls People for the Environment and the National Association of Doctors. Some chapters were more effective than others. Han interviewed members and observed strategies, and came up with a framework.
Some chapters relied on lone wolves. A lone wolf in this context is someone who takes action on their own. These individuals became committed to the cause, studied the issues, became very knowledgeable and wrote submissions and personally lobbied politicians. The lone wolf approach is usually not very effective because very few individuals maintain a commitment on their own and because collective action is vital for some purposes.
Lone wolf at work
Other chapters relied on a second approach that Han calls mobilising. Core members would decide on actions, such as a meeting, petition drive or rally, and try to recruit people to join the action, for example by sending emails or ringing. Sometimes a mobilising strategy can bring huge numbers onto the street, especially when there is an event triggering outrage. This happened just before the 2003 invasion of Iraq, when campaigners were amazed by huge turnouts at rallies. But other times there is little response to the messages calling for action.
Han calls mobilising a “transactional” exchange between the organisation and the activist. The organisation seeks to make action as easy as possible so that, for the activist, the benefits of acting outweigh the costs.
Mass mobilisation: London, 15 February 2003
Yet other chapters relied on a third approach that Han calls organising. Experienced members, in their role as organisers, try to identify members or supporters who might take a leadership role, and spend time helping them to develop their skills and motivation. In this model, organisers identify and train others to become autonomous leaders.
Han calls organising “transformational” because it aims to change individuals, developing their understanding, perspectives and emotional investments. Through this process, activists become more knowledgeable and involved, and start thinking strategically of how the organisation can achieve its goals.
Han says the most effective chapters use a combination of mobilising and organising. They use mobilising, for example getting people to public events, to achieve the goals of the organisation, and to identify potential leaders. Then organising methods are used to develop possible leaders, who go on to train others, building the capacity to mobilise many more people.
Although mobilising and organising are used in the most effective chapters, organising is the most easily neglected. In the heat of a campaign, core members may focus on getting out the numbers rather than the slower, long-term effort in helping others develop skills and motivations. Organising requires much hard work.
Another factor is that media technologies now make mobilising easier than before. With databases giving the demographics of community members, it is straightforward to tap into pre-existing commitments. One consequence is that organising is sidelined.
Han’s analysis of civic organisations deals with US environmental and medical campaigning groups, and is oriented to influencing politicians. Whether her observations apply more widely is uncertain. Even with this caveat, I think Han’s conceptual division of organisational development into lone wolf, mobilising and organising approaches is immensely valuable. It provides an insight into the strengths and weaknesses of a range of organisations well outside the domain studied by Han.
“Distinct philosophies about transactional mobilizing and transformational organizing underlie these choices about how to engage with volunteers. In transactional mobilizing, the chapters were most focused on minimizing costs to maximize the numbers of people involved. In transformational organizing, the chapters were focused on creating experiences for volunteers that would begin to transform their affects and orientations towards activism. Thus, they were more likely to create work that brought people into contact with each other, and support that work through extensive coaching.” (p. 122)
Large unions have paid staff, and often the paid officials take on the bulk of union work, from holding meetings with employers to deciding on industrial action. There may not be much sustained effort to select workers who can become effective labour activists, thinking strategically, acting autonomously and in turn recruiting others to become activists. Why not? One reason is that unions have a natural constituency, the workers, with common interests, so it’s far easier to call on workers to take action than to develop more organisers.
Recently, I attended a campaign forum held by the local branch of my union, the National Tertiary Education Union. The presidents of branches at two other Australian universities — Damien Cahill at Sydney University and Vince Caughley at the University of Technology Sydney — described their branches’ efforts to protect and improve staff conditions. They described how union membership had declined in the aftermath of enterprise bargaining. Many university employees don’t see the point of being union members because they receive all the benefits of union efforts without having to pay union dues.
Vince and Damien at the University of Wollongong
Damien and Vince spoke about the importance of face-to-face meetings with individuals, of encouraging members to help in small ways (like putting up a notice about a meeting) and of identifying potential leaders. What they described fits perfectly in the organising mode. Because unions have a natural constituency for mobilising, organising is all the more important.
In Australia, political parties are poor at organising. Party memberships have been shrinking for decades, and ever more activity is driven by political staffers. One factor is compulsory voting. There is no need to “get out the vote,” and therefore less incentive to employ either mobilising or organising strategies.
Universities, for the most part, do not do much organising. Most of the effort at marketing is done by paid staff. There are quite a few people willing to be volunteers, especially alumni and retired staff, but at most universities it is not a priority to identify and develop volunteers who will become ambassadors for the university. As a result, most of the efforts are by lone wolves, individuals who take the initiative themselves.
Learning via organising
Consider education and the challenge of helping people learn. Imagine there is an independent campaign group that tries to promote learning. This is not a lobbying group, seeking more government or private funding, but a group that directly engages with eager learners. How can such a group become more effective?
Following Han’s insights, the most promising model is a combination of mobilising and organising. But are there any such groups? In Australia, they exist only on the margins. One place is refugee support groups. In Wollongong there is a group called SCARF (Strategic Community Assistance to Refugee Families). Among its activities is a tutoring programme for refugee children. SCARF can extend this programme through recruiting more tutors and by more systematic mentoring of tutors so they can become leaders to recruit and train others.
Another place for direct learning is the home. Many parents take it upon themselves to assist their children’s learning. Home schoolers take a much heavier responsibility. Campaigners for home schooling can use the mobilising and organising methods described by Han.
However, there seems to be no wide-scale campaign in Australia to foster learning. The best examples of such campaigns have been in countries with low literacy, where efforts by social movements link learning with understanding of oppression and resistance. Paulo Freire’s efforts are most well known.
Some Western social movements see learning as part of their brief. They can form reading groups, study circles and other processes to build understanding. But such efforts are often seen as low priority because it’s easier to draw on people who have developed their skills through formal education. Movements are thus likely to neglect organising for learning.
Citizen advocacy as organising
In the disability sector in Australia, there is an important role for advocacy, in which an individual supports a person with a disability, helping them to meet their needs. An advocate is different from a service provider, who directly helps by providing food, transport, housing and other essentials. An advocate, in contrast, essentially speaks on behalf of the person with a disability to make sure the service system operates properly on their behalf.
Alice has an intellectual disability. Abandoned by her family, she lives in a group home where she has been subject to abuse by other residents. She has no friends. Jo, an advocate for Alice, puts pressure on the managers of the group home to place her in a safer residence. Jo introduces Alice to a few others who might become friends, uses contacts to get her a job, and helps her develop living skills.
In practice, family members, especially parents, most commonly act as advocates. But in some cases the family is unwilling or unable to help and the service system is overloaded or dysfunctional, so some other form of advocacy is valuable.
Jo could be a paid advocate, who acts on behalf of several people with disabilities. Another possibility is that Jo is a citizen advocate, taking action on behalf of Alice out of a personal commitment.
Citizen advocacy programmes were set up to promote this form of advocacy. Typically they have a few staff paid by government or private donations. The staff search the local community for people with disabilities who have significant unmet needs, like Alice, called protégés, then seek to recruit someone like Jo who will be an advocate, often on an ongoing basis. The staff then support the advocate by providing advice, training and encouragement.
Citizen advocacy in essence operates using an organising model, with a highly specific focus. The paid staff do not do advocacy themselves but devote most of their efforts to finding protégés and a suitable advocate for each protégé, and then supporting the advocates. However, citizen advocacy has only a limited capacity for expansion because it does not recruit or train new coordinators, namely people who could become match-makers themselves, though without pay. As well, mobilising methods could be valuable to expand citizen advocacy.
In contrast, paid advocacy is more analogous to the lone wolf model of activism. Individual advocates may be very good at their jobs, but cannot expand their efforts more broadly because the methods of mobilising and organising are not used.
The methods of the lone wolf, mobilising and organising seem to apply most obviously to campaigning, which is Han’s focus. But what about actually doing jobs? Han studied a doctors’ advocacy organisation. But is there any organisation that tries to build a community capacity for health care? In China under Mao, “barefoot doctors,” who learned basic skills but were not professionally trained, served the rural poor. However, where the medical profession is well established, there is little or no fostering of the capacity of people outside the formal structures to contribute. About the most that anyone does is take a first-aid course, or perhaps volunteer at a hospice.
Barefoot doctoring in rural China
By excluding non-trained individuals, occupations maintain a monopoly over service, preventing competition and maintaining salaries and conditions for those accepted into the occupation. This applies in professionalised domains such as medicine, dentistry, law and engineering. The same phenomenon applies to most large employers. A company, to get a job done, hires workers and spends little effort at developing the skills of non-workers to do the same job. To do so would be heresy: it would be seen as undermining the work of those paid to do it. Within government departments, the same applies. There is little effort at recruiting unpaid helpers and developing their skills. That would be a threat to the paid workers and seen as exploitation of the unpaid helpers, even if they were keen to contribute.
Things would be different if everyone was guaranteed a decent annual income, as proposed by advocates of a universal basic income (UBI). If paid work were a voluntary extra, then mobilising and organising would become more important to encourage people to make contributions to worthwhile causes.
Han points out that in practice few organisations rely entirely on one approach. The lone wolf, mobilising and organising approaches are “ideal types” that are helpful for better understanding what happens in actual organisations. One of Han’s most important messages is that organising is often neglected. One reason for this is that so many social institutions are set up to protect those with skills and to marginalise outsiders. Thus, it is bound to be an uphill battle to expand the role of organising. And to do this, the most obvious method is — organising!
Thanks to Damien Cahill, Sharon Callaghan, Julie Dunn and Jan Kent for valuable feedback.
Comment from Sharon Callaghan
I liked the pointers to longer-term solutions on building activism. Workers in disability services in Australia who are active in their union came together and said they wanted access to quality training and recognition of the skills they bring to their work. The Australian Services Union, as the union for these workers, commissioned a report. Workers in disability are now seeking “A Portable Training Entitlement Scheme for the Disability Support Services Sector”, to give the title of the report authored by Drs Rose Ryan and Jim Stanford. This campaign, if successful, will address other gaps in this sector. Quality training and supervision, whistleblower protections and strong workplace safety mechanisms are important to workers who often have extraordinary responsibilities caring for vulnerable service users. Organising and supporting workers to speak out and demand their entitlements has long-lasting flow-on benefits for the service and sector.
I was interested in the idea of organising both inside and outside formal structures and accept some forms of professionalisation are not open to those seen as “non expert.” Personal activism with the freedom to speak out may still be limited when lacking the resources, skill development and support that formal groups can provide. Somehow finding ways to allow the authentic voice of the activist to come through, with the assistance of the formal structure of their union, university or community group, may be a good way to go.
Surviving and getting ahead as an academic can sometimes be detrimental to scholarly goals.
Universities are supposed to foster the creation and dissemination of knowledge. However, the academic system of funding and careers can get in the way. Here I’ll focus on research, looking at grants and publications in Australia.
To do research, academics can use resources provided by their universities, including libraries, computing, labs and equipment. In addition, it’s possible to obtain research grants from external sources to pay for equipment and personnel. I was once employed by a colleague’s research grant, and grateful for it. However, the research grant system has some damaging side-effects.
When success rates for grant applications are low, efforts to make applications persuasive can lead to dubious practices. Hyping the importance of research is commonplace. Some supervisors in scientific fields, in order to add to their publication record and thereby improve their chances in grant applications, put their names on papers done almost entirely by their research students.
In Australia, obtaining a grant has become so highly valued that it is prized above the research outputs it is supposed to enable. Absurdly, having a grant and producing some publications is more prestigious than producing those exact same publications without a grant, even though the researcher without a grant is more efficient.
This inversion of values is acute in the humanities, where extra money is seldom needed. Because obtaining grants is prestigious, scholars may apply for them even when they’re more trouble than they’re worth.
This especially applies to grants from the Australian Research Council, the main external source of funds for humanities and social sciences. Based on successes with direct grants, the ARC gives money to universities for scholarships and infrastructure. Therefore, university administrations have a financial incentive to encourage scholars to apply for ARC grants.
This was my experience. I have never needed extra funding for my research, but applied because it would look good on my CV and would bring additional funding to the university. I obtained four major grants from the ARC and its predecessor and used the money to hire assistants (usually becoming co-authors) to work on my projects. It was satisfying to collaborate but there were many administrative hassles.
The strange thing is that I don’t think my own productivity increased. I plotted my publications against the years in which I did or didn’t have a research grant and found no evidence of more outputs resulting from the grants. (The people employed by my grants contributed to publications, so what the grants did was fund their outputs.)
Writing grant applications is labour-intensive. My guess is that the work involved in writing a new ARC application is roughly the same as required to write a paper for publication. In a quest to obtain the money to do research, it is thus necessary to sacrifice vast quantities of time and effort that might otherwise be devoted to the research itself.
Applicants for grants often play it safe by pitching proposals that do not challenge standard views within the field. Many of those who receive grants feel obliged to follow a fixed research agenda. In these ways, grant systems discourage innovation.
Time, money and facilities are needed to do research. That’s not in question. The issue is whether grant schemes are the most effective way to foster research, especially path-breaking research. There seems to be little retrospective assessment of whether grants are providing value for money.
Where to publish?
Is the point of publishing to say something worthwhile or to get ahead? In practice, it can be a mixture.
The academic system fosters career-oriented publication. Academics are encouraged to publish in the “top” journals, the ones most prestigious in their fields. Papers in top journals are much more likely to be seen by other researchers and cited in their own papers. Receiving lots of citations is another measure of scholarly performance.
For decades in Australia, publishing in top journals has been a way of getting jobs and promotions. Because appointment committees usually have members from the same discipline, publishing in big-name journals is likely to impress them.
The Australian government through its scheme ERA (Excellence in Research for Australia) has institutionalised the preference for top journals. Universities are rated in different disciplines for “excellence,” which is largely judged on the basis of publications and citations. The more high-status the publications put forward for ERA assessments, and the more they are cited, the higher the rating.
This sounds sensible but it has pernicious effects. Many academics are now encouraged or even instructed to publish only in top journals, and new appointments are made with an eye to contributions to ERA. The result is that academics, even more than before, write primarily for each other, because most of the top journals are oriented to other researchers. People outside the field are unlikely to want to read them, as they require specialist knowledge and vocabulary and are often filled with jargon. Even people in the field may find reading published articles unappealing. Furthermore, people outside academia often do not have cheap and easy access to scholarly papers.
For books, an important currency in some fields, academics are encouraged to find prestigious publishers. Most academic books are very expensive and mainly purchased by libraries. Some of my colleagues are reluctant to recommend that others buy their own books.
Most of the top journals are owned by big publishers that make enormous profits from their control over scholarly outputs. It’s a strange situation: academics do the research, write the papers, review the submissions of their peers and edit the journals, but the resulting publications are controlled by commercial publishers. These publishers extract money in several ways. Many articles sit behind paywalls, so if you’re not at a subscribing institution you have to pay to read them. Academic libraries pay large annual fees for databases bundling journals, so that their students and researchers can read scholarly outputs. Then there is the “open access” option provided by some journals, requiring authors (or their institutions) to pay a large fee to the publisher so the article becomes free on the web.
The open access movement has been pushing to make all scholarly publications free online, but big publishers have mounted a strong resistance. Unfortunately, many authors continue to submit articles to journals owned by exploitative publishers. The reason: to get ahead in the academic game, it’s important to publish in high-prestige, high-impact journals, and most of these are controlled by the big publishers. What is called “socially just publishing” is a budding challenge to big-publisher domination.
The overall impact is that academics are encouraged to write articles in a style that alienates most readers and to publish in venues that limit access to people not at universities.
Some intellectuals are independent of universities and have the freedom to write in a readable style and publish where they like. Academics can do this too if they can resist the pressures to play the game.
A decade ago, I was able to carry out a comparison. First, I looked up the number of citations received by each of my publications. Second, I looked up the number of views of my publications on the university website. I then compiled two top-20 lists, one ranked by citations and the other by views. Lo and behold, the publications on the two lists were almost entirely different.
My most frequently viewed publication was “Defamation law and free speech”. It wasn’t published in a refereed journal, or indeed in any journal at all. I put it on my website as an aid to whistleblowers, many of whom are threatened with being sued for defamation. The article received twice as many views as anything else I had written. Meanwhile, few of my most highly cited publications received many views.
The implication was that the publications people wanted to read were systematically different from those seen as important by academics.
Why are academics so reluctant to write in places and ways that are accessible and understandable? No doubt part of the reason is the pressure to impress peers to obtain jobs, grants and promotions. But why are so many peers, namely the academics who sit on selection, grant and promotion committees, so enamoured with esoteric publications in journals and books that few people ever want to read?
I see this as an unconscious process by which academics control their fields of study, specifically to protect them from outsiders, and thereby gain resources and status. When only those in the field can understand research outputs, this provides support for the claim that only those in the field can judge who is a good scholar and who is making a useful contribution to knowledge.
Consider the alternative. If publications were expected to be readable by non-specialists, either entirely or with explanatory supplements, this would open the field to interlopers, namely scholars in neighbouring fields, and perhaps even some well-read non-academics. By keeping outputs esoteric, those in the inner sanctum are protected from competition.
Scientists have been prominent in the push for open access. They have the least to fear from competition, because most of their publications are understandable only by specialists anyway. In the humanities and social sciences, and some applied fields, things are different: even when concepts are easy to grasp, access to a field can still be restricted by the proliferation of jargon and obscure theorising.
What to do?
Junior scholars usually feel the need to play the game for the sake of their careers. Nevertheless, it’s always possible to deviate from the expected path, though at some sacrifice or risk.
One option is to put the full text of all publications free online, on a personal website, an institutional repository or a platform like academia.edu. This alleviates the financial barrier to access.
Access is one thing; understandability is another. Open access is not all that helpful for writing that is opaque to outsiders. More readable treatments can be posted in blogs, published in the mass media and in popular magazines. For those seeking to rise in the system, writing for general audiences is usually an added burden, and undertaken at the risk of being seen to be unscholarly.
Academics can even rewrite their job descriptions, laying out a commitment to open access and to socially just publishing.
None of this is easy or can be tackled singlehandedly. This is why movements for change, like the open access movement, are so important. Contributing to these movements is probably the most valuable way to help promote long-term change.
Are there things everyone should be required to learn? If so, what are they?
A page of logarithms from the Handbook of Chemistry and Physics, 44th edition, 1962–1963
There are lots of things that are useful to know or be able to do. Reading and writing are fundamental, and so is knowing how to count, add and subtract. Grammar can be useful, and spelling too, as is recognising street signs. The list could go on.
These are things that are useful to know, but they are not identical to things students have to study. In high school in the US, I had to take two years of a foreign language in order to get into a good university. French was my worst subject. Then, at Rice University, I had to take two years of a language to graduate, even though my major was physics. I chose German this time around, and despite studying hard, was lucky to pass. For me, studying foreign languages was challenging, and I retained little of what I learned.
I vaguely remember some of the things learned in school mathematics classes, like interpolating in a table of logarithms. To multiply or divide numbers, we would look up the logarithm of each number, add or subtract the logarithms and then find the number corresponding to the result. For greater accuracy, we would interpolate in the tables, namely estimate the number between two entries in the table.
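The procedure can be sketched in a few lines of Python. The table entries and the numbers multiplied here are invented for illustration; real tables listed base-10 logarithms to four or five decimal places.

```python
import math

# To multiply without a calculator: add the logarithms, then invert.
# A printed table gives log10 at fixed points, e.g. 2.30 and 2.40;
# interpolation estimates the logarithm of a number in between.
def interpolated_log10(x, lo, hi):
    """Linearly interpolate log10(x) between table entries at lo and hi."""
    return math.log10(lo) + (x - lo) / (hi - lo) * (math.log10(hi) - math.log10(lo))

# Multiply 2.34 * 5.67 by adding the two (interpolated) logarithms.
log_product = interpolated_log10(2.34, 2.30, 2.40) + interpolated_log10(5.67, 5.60, 5.70)
product = 10 ** log_product
print(round(product, 2))  # close to the true product, 2.34 * 5.67 = 13.27
```

Linear interpolation slightly underestimates a logarithm (the curve is concave), which is why the result is close to, but not exactly, the true product.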
I learned how to use a slide rule, which is basically two rulers with logarithmic scales that can be used to multiply and divide. I remember in year 8 daring to use my slide rule in an exam, and then checking the result by calculating it longhand.
These skills became outdated decades ago, after the introduction of pocket calculators. No one says today that anyone should have to learn how to interpolate in tables of logarithms or to use a slide rule. Most young people have never heard of a slide rule.
Some knowledge becomes obsolete and other knowledge is never used. So is there anything that everyone must study and learn?
The math myth
These reflections are stimulated by Andrew Hacker’s new book The Math Myth. He is greatly disturbed by the requirement that all US students must study math (or maths as we say in Australia) to a level far beyond what is required in most people’s lives and jobs.
Hacker, a political scientist at Queens College in New York City, actually loves maths, and shows his knowledge of the field by dropping references to polynomials and Kolmogorov equations. He is ardent in his support of learning maths, primarily arithmetic (addition, subtraction, multiplication and division) and a practical understanding of real-world problems. His target for criticism is requirements for learning algebra, trigonometry and calculus that damage the morale and careers of many otherwise capable students.
In the US, according to Hacker, the most common reason students fail to complete high school or university is a maths requirement. Everyone has to pass maths courses, and learn how to solve quadratic equations, whether they are going to become a hairdresser, truck driver or ballet dancer. His argument is that many people have talents they are prevented from fully developing because of an absurd requirement to pass courses in mathematics. Even when students pass, many of them quickly forget what they learned because they never use it.
Hacker makes a bolder claim. He says that in many professions in which maths might seem essential, actually most practitioners use only arithmetic. This includes engineering. Hacker interviewed many engineers who told him that they never needed to solve algebraic equations or use trigonometric functions.
On the flip side, Hacker cites studies of some occupations, like carpet laying, in which workers in essence solve difficult equations, but they do it in a way passed down from experienced workers. The irony is that many of these workers never passed the maths classes mandated for finishing high school.
The resulting picture is damning. Millions of students struggle through maths classes, some of them falling by the wayside, others developing maths anxiety, yet few of them ever use the knowledge presented in these classes.
Why maths requirements?
How has this situation arisen? Hacker puts the blame on leaders of the mathematics profession, mostly elite pure mathematicians, who sit on panels that advise on high school and university syllabuses. Few of these research stars have any expertise in teaching, and indeed few of them spend much time with beginning students. Not only do they seldom visit a high school classroom, but most avoid teaching large first-year university maths classes. Educational administrators defer to these gurus rather than consulting with teachers who actually know what is happening with students.
It might be argued that being able to do well in maths is a good indicator of doing well in other subjects. Perhaps so, but this is not a good argument for imposing maths on all students. Research on expert performance shows that years of dedicated practice are required to become extremely good at just about any skill, including music, sports, chess and maths. The sort of practice required, called deliberate practice, involves focused attention on challenges at the limits of one’s ability. This sort of practice can compensate for and indeed supersede many shortcomings in so-called general intelligence. In other words, you don’t need to be good at maths to become highly talented in other fields.
Hacker argues that the test most commonly used for entry to US universities, the SAT, is unfairly biased towards maths, to the detriment of students with other capabilities. Not only do maths classes screen out many students with talents in other areas, but selection mechanisms for the most prestigious universities, whose degrees are tickets to lucrative careers, unfairly discriminate against those whose interests and aptitudes are in other areas.
Education as screening
Hacker’s analysis of maths is compatible with a wider critique of education as a screening mechanism. Randall Collins in his classic book The Credential Society argued that US higher education served more to justify social stratification than to stimulate learning. In other words, students go through the ritual of courses, and those with privileged backgrounds have the advantage in obtaining degrees that give them access to restricted professions.
In another classic critique, Samuel Bowles and Herbert Gintis in Schooling in Capitalist America argued that schooling reproduces the class structure. Their Marxist analysis gives the same general conclusion as Collins’ approach. Then there is The Diploma Disease by Ronald Dore, who described education systems worldwide, but especially in developing countries, as irrelevant in terms of producing skills that can be applied in jobs.
Schooling, up to teenage years, remains one of the few compulsory activities in contemporary societies, along with taxation. (In some countries, military service, jury duty and voting are compulsory.) There is no doubt that education can be a liberating process in the right circumstances, but for many it is drudgery with little compensating benefit, aside from obtaining a certificate needed for obtaining a job, while what is learned has little practical relevance.
A different system would be to set up entry processes to occupations, ones closely related to actual skills used in practice. Exams and apprenticeships are examples. Attendance at schools and universities then would be optional, chosen for their value in learning. There is one big problem: attendance would plummet.
Some teachers set themselves the task of stimulating a love of learning. Rather than trying to convey particular facts and frameworks, they see that learning facts and frameworks is a way of learning how to learn. The ideal in this picture is lifelong learning.
The trouble with schooling systems is that they undermine a love of learning by imposing syllabi and assessments. Students, rather than studying a topic because they are fascinated by it, instead learn that studying is tedious and to be avoided, and only undertaken under the whip of assessment.
How many students do you know who keep studying after the final exam? On the other hand, people who are passionate about a topic will put in hours of concentrated effort day after day in a quest for improvement and in the engaged mental state called flow.
The paradox of educational systems is that they are designed to foster learning yet, by subjecting students to arbitrary requirements, can actually hinder learning and create feelings of inadequacy. The more that everyone is put through exactly the same hoops — the same learning tasks at the same time — the more acute the paradox.
A different sort of education
Taking this argument a step further leads to a double implication. The first is that education should be designed around the needs of individual students, as attempted in free schools and in some forms of home schooling. The second is that work should be designed around the jointly articulated needs of workers and consumers. Rather than students having to compete for fixed job slots, work would be reorganised around the freely expressed needs and capacities of workers and local communities.
Whether this ideal could ever be reached is unknown, but it nonetheless provides a useful goal for restructuring education — including maths education. This brings us back to Hacker’s The Math Myth. There are two sides to his argument. The first, as I’ve described it, is that US maths requirements are damaging because few people ever need maths beyond arithmetic and the requirements screen talented people out of careers where they could make valuable contributions.
The second element in Hacker’s argument is that for the bulk of the population, there are useful things to learn about maths and that these can be made accessible using a practical problem-solving approach. To show what’s involved, Hacker describes a course he taught in which students tackled everyday challenges.
Hacker’s course shows his capacity for innovative thinking. The Math Myth is not an attack on mathematics. Quite the contrary. Hacker wants everyone to engage with maths by designing tasks that relate to their lives.
Whether Hacker’s powerful critique will lead to changes in US educational requirements remains to be seen. Although Hacker talks only about pointless maths requirements, his arguments challenge the usual basis for screening that helps maintain social inequality. If maths cannot be used to legitimise inequality in educational outcomes, what will be the substitute?
Whether you respond to maths with affection or anxiety, it’s worth reading The Math Myth and thinking about its implications.
Be careful about data you encounter every day, especially in the news.
If you watch the news, you are exposed to all sorts of numbers, intended to provide information. Some might be reliable, such as football scores, but with others it’s harder to know, for example the number of people killed in a bomb attack in Syria, the percentage of voters supporting a policy, the proportion of the federal budget spent on welfare, or the increase in the average global temperature.
Should you trust the figures or be sceptical? If you want to probe further, what should you ask?
To answer these questions, it’s useful to understand statistics. Taking a course or reading a textbook is one approach, but that will mainly give you the mathematical side. To develop a practical understanding, there are various articles and books aimed at the general reader. Demystifying Social Statistics gives a left-wing perspective, a tradition continued by the Radstats Group. Joel Best has written several books, for example Damned Lies and Statistics, providing valuable examinations of statistics about contested policy issues. The classic treatment is the 1954 book How to Lie with Statistics.
Most recently, I’ve read the new book Everydata by John H. Johnson and Mike Gluck. It’s engaging, informative and ideal for readers who want a practical understanding without encountering any formulas. It is filled with examples, mostly from the US.
You might have heard about US states being labelled red or blue. Red states are where people vote Republican and blue states are where people vote Democrat. Johnson and Gluck use this example to illustrate aggregated data and how it can be misleading. Just because Massachusetts is a blue state doesn’t mean no one there votes Republican. In fact, quite a lot of people in Massachusetts vote Republican, just not a majority. Johnson and Gluck show pictures of the US with the data broken down by county rather than by state, and a very different picture emerges.
In Australia, aggregated data is commonly used in figures for economic growth. Typically, a figure is given for gross domestic product or GDP, which might have grown by 2 per cent in the past year. But this figure hides all sorts of variation. The economy in different states can grow at different rates, and different industries grow at different rates, and indeed some industries contract. When the economy grows, this doesn’t mean everyone benefits. In recent decades, most of the increased income goes to the wealthiest 1% and many in the 99% are no better off, or go backwards.
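A tiny made-up example shows how an aggregate growth figure can coexist with no gain at all for the typical household. All the numbers below are invented.

```python
# Hypothetical incomes for five households, before and after a year
# of "2 per cent growth": the entire gain goes to the top earner.
before = [40_000, 50_000, 60_000, 70_000, 500_000]
after  = [40_000, 50_000, 60_000, 70_000, 514_400]

growth = (sum(after) - sum(before)) / sum(before)
print(f"aggregate growth: {growth:.1%}")                  # 2.0%
print(f"median household gain: {after[2] - before[2]}")   # 0
```

The headline figure is perfectly accurate, yet tells you nothing about how the gains are distributed.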
The lesson here is that when you hear a figure, think about what it applies to and whether there is underlying variation.
In the Australian real estate market, figures are published for the median price of houses sold. The median is the middle figure. If three houses were sold in a suburb, for $400,000, $1 million and $10 million, the median is $1 million: one house sold for less and one for more. The average, calculated as total sales prices divided by the number of sales, is far greater: it is $3.8 million, namely $0.4m + $1m + $10m divided by 3.
The median price is a reasonable first stab at the cost of housing, but it can be misleading in several ways. What if most of those selling are the low-priced or the high-priced houses? If just three houses sold, how reliable is the median? If the second house sold for $2 million rather than $1 million, the median would become $2 million, quite a jump.
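The arithmetic in the three-house example is easy to check with Python’s statistics module:

```python
from statistics import mean, median

sales = [400_000, 1_000_000, 10_000_000]
print(median(sales))  # 1000000: the middle sale
print(mean(sales))    # 3800000: the average, pulled up by the $10m sale

# A single different sale shifts the median by a million dollars.
sales = [400_000, 2_000_000, 10_000_000]
print(median(sales))  # 2000000
```

With only three sales, both statistics are fragile: the mean is dominated by the outlier, and the median jumps with any change to the middle value.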
Is the average or median house price misleading?
In working on Everydata, Johnson and Gluck contacted many experts and have used quotes from them to good effect. For example, they quote Emily Oster, author of Expecting Better: Why the Conventional Pregnancy Wisdom is Wrong, saying “I think the biggest issue we all face is over-interpreting anecdotal evidence” and “It is difficult to force yourself to ignore these anecdotes – or, at a minimum, treat them as just one data point – and draw conclusions from data instead.” (p. 6)
Everydata addresses sampling, averages, correlations and much else, indeed too much to summarise here. If Johnson and Gluck have a central message, it is to be sceptical of data and, if necessary, investigate in more depth. This applies especially to data encountered in the mass media. For example, the authors comment, “We’ve seen many cases in which a finding is reported in the news as causation, even though the underlying study notes that it is only correlation.” (p. 46) Few readers ever check the original research papers to see whether the findings have been reported accurately. Johnson and Gluck note that data coming from scientific papers can also be dodgy, especially when vested interests are involved.
The value of a university education
For decades, I’ve read stories about the benefits of a university education. Of course there can be many sorts of benefits, for example acquiring knowledge and skills, but the stories often present a figure for increased earnings through a graduate’s lifetime.
This is an example of aggregated data. Not everyone benefits financially from having a degree: if you’re already retired, for example, a degree brings no boost to lifetime earnings.
There’s definitely a cost involved, both fees and income forgone: you could be out earning a salary instead. So for a degree to help financially, you forgo income while studying and hope to earn more afterwards.
The big problem with calculations about benefits is that they don’t compare like with like. They compare the lifetime earnings of those who obtained degrees to the lifetime earnings of those who didn’t, but these groups aren’t random samples from the same population. Compared to those who don’t go to university, those who do are systematically different: they tend to come from well-off backgrounds, to have had higher performance in high school and to have a greater capacity for studying and deferred gratification.
Where’s the study of groups with identical attributes, for example identical twins, comparing the options of careers in the same field with and without a degree? Then there’s another problem. For some occupations, it is difficult or impossible to enter or advance without a degree. How many doctors or engineers do you know without degrees? It’s hardly fair to calculate the economic benefits of university education when occupational barriers are present. A fair comparison would look only at occupations where degrees are not important for entry or advancement, and only performance counts.
A final example
For those who want to go straight to takeaway messages, Johnson and Gluck provide convenient summaries of key points at the end of each chapter. However, there is much to savour in the text, with many revealing examples helping to make the ideas come alive. The following is one of my favourites (footnotes omitted).
Americans are bad at math. Like, really bad. In one study, the U.S. ranked 21st out of 23 countries. Perhaps that explains why A&W Restaurants’ burger was a flop.
As reported in the New York Times Magazine, back in the early 1980s, the A&W restaurant chain wanted to compete with McDonald’s and its famous Quarter Pounder. So A&W decided to come out with the Third Pounder. Customers thought it tasted better, but it just wasn’t selling. Apparently people thought a quarter pound (1/4) was bigger than a third of a pound (1/3).
Why would they think 1/4 is bigger than 1/3? Because 4 is bigger than 3.
People misinterpreted the size of a burger because they couldn’t understand fractions. (p. 101)
John H. Johnson
John H. Johnson and Mike Gluck, Everydata: The Misinformation Hidden in the Little Data You Consume Every Day (Brookline, MA: Bibliomotion, 2016)
Imagine you are a teacher and you decide to try an innovative teaching technique. However, it goes horribly wrong. The technique didn’t work the way you expected, and furthermore numerous students make complaints to your supervisor. Luckily, your supervisor is sympathetic to your efforts and your job is secure.
What do you do next?
1. Avoid innovative techniques: they’re too risky.
2. Keep innovating, but be much more careful.
3. Tell a few close colleagues so they can learn from your experience.
4. Write an article for other teachers telling what went wrong, so they can learn from your experience.
5. Invite some independent investigators to analyse what went wrong and to write a report for others to learn from.
The scenario of innovative teaching gone wrong has happened to me several times in my decades of teaching undergraduates. Each time, through no particular fault of my own, what I attempted ended up disastrously. It even happened one time when I designed a course that worked brilliantly one year but failed miserably the next.
So what did I do? Mainly options 2 and 3: I kept innovating, more carefully, and told a few colleagues. I never imagined writing about these teaching disasters, even using a pseudonym, much less inviting others to investigate and publish a report. It would be humiliating, might invite additional unwanted scrutiny, and might even make innovation more difficult in the future.
Aviation: a learning culture
These thoughts came to mind as a result of reading Matthew Syed’s new book Black Box Thinking. The title refers to the flight recorders in commercial aircraft, called black boxes, that record data about the flight, including conversations among the pilots. When there is a crash or a near miss, these boxes are vital for learning from the failure. Rather than automatically blaming the pilots, an independent team of experts investigates accidents and incidents and publishes its findings so the whole industry can learn from what happened.
Some of the greatest improvements in aircraft safety have resulted from studies of disasters. The improvement might be redesigning instruments so confusion is less likely or changing protocols for interactions between pilots. One important lesson from disasters is that the flight engineer and co-pilot need to be more assertive to prevent the pilot from losing perspective during tense situations. The investigations using black-box information occasionally end up blaming pilots, for example when they are drunk, but usually the cause of errors is not solely individual failure, but a combination of human, procedural and technical factors.
Cover-up cultures: medicine and criminal justice
Syed contrasts this learning culture in aviation with a culture of cover-up in medicine. There is a high rate of failure in hospitals, and indeed medical error is responsible for a huge number of injuries and deaths. But, as the saying goes, surgeons bury their mistakes. Errors are seldom treated as opportunities for learning. In a blame culture, everyone seeks to protect their jobs and reputations, so the same sorts of errors recur.
Syed tells about some hospitals in which efforts are made to change the culture so that errors are routinely reported, without blame attached. This can quickly lead to fixing sources of error, for example by differently labelling drugs or by using checklists. In these hospitals, reported error rates greatly increase because cover-up is reduced, while actual harm due to errors drops dramatically: fewer patients are harmed. Furthermore, costs due to patient legal actions also drop, saving money.
So why don’t more hospitals follow the same path? And why don’t more occupations follow the example of aviation? Syed addresses several factors: cultures of blame, excess power at the top of organisations, and belief systems resistant to testing.
In the criminal justice system, one of the most egregious errors is convicting an innocent person of a crime. Police and prosecutors sometimes decide that a particular suspect is the guilty party and ignore evidence to the contrary, or don’t bother to find any additional evidence. Miscarriages of justice are all too common, yet police, prosecutors and judges are reluctant to admit it.
In some cases, after a person has been convicted and spent years in jail, DNA evidence emerges showing the person’s innocence. Yet in quite a few cases, the police involved in the original investigation refuse to change their minds, going through incredible intellectual contortions to explain how the person they charged could actually be guilty. Syed comments, “DNA evidence is indeed strong, but not as strong as the desire to protect one’s self-esteem.” (p. 89)
When I heard about Black Box Thinking, I decided to buy it because I had read Matthew Syed’s previous book Bounce, about which I wrote a comment. Syed was the British table tennis champion for many years and became a media commentator. Bounce is a popularisation of work on expert performance, and is highly engaging. In Black Box Thinking, Syed has tackled a related and broader subject: how to achieve high performance in collective endeavours.
The title had me confused at first, because in other disciplines a black box refers to a system whose internal mechanisms are hidden: only inputs and outputs can be observed. In contrast, flight recorders in aircraft, which actually are coloured orange, not black, are sources of information.
Syed’s book might have been titled “Learning from failure,” because this is its theme throughout. He presents stories from medicine, aviation, business, criminal justice, sport and social policy, all to make the point that failures should be treated as opportunities for learning rather than occasions for blame. Individuals can heed Syed’s important message, but bringing about change in systems is another matter.
Another theme in the book is the importance of seeking marginal gains, namely small improvements. Syed describes Formula One racing, in which tiny changes here and there add up to superior performance. Another example comes from the company Unilever, which manufactured soap powder – laundry detergent – and wanted the powder to come out of the nozzle more consistently.
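Marginal gains matter because they compound. A back-of-envelope calculation makes the point (the numbers here are purely illustrative, not Syed’s):

```python
# If each iteration yields just a 1% improvement, then after
# 45 iterations the cumulative effect is multiplicative:
gain = 1.01 ** 45
print(round(gain, 2))  # roughly 1.56, i.e. a 56% overall improvement
```

Each individual gain looks negligible, but repeated small improvements multiply into a substantial advantage.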
Unilever’s initial nozzle
Unilever hired a group of mathematicians, experts in fluid dynamics and high-pressure systems, to come up with an answer, but they failed. Unilever then hired a group of biologists – yes, biologists – who used a process modelled on evolution. They tried a variety of designs and determined which one worked best. Then they took the best-performing design and tested slight modifications of it. Repeating this process led to a design that worked well but could never have been imagined in advance.
Unilever’s final nozzle, after 45 trial-and-error iterations
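The biologists’ procedure is, in essence, a simple evolutionary search: mutate the current best design, test the variants, and keep whichever performs best. A minimal sketch of the idea in Python (the fitness function here is a toy stand-in for a real nozzle test, not Unilever’s actual method):

```python
import random

def evolve(fitness, initial, iterations=45, variants=10, step=0.1):
    """Trial-and-error search in the spirit of the nozzle story:
    mutate the current best design, test each variant, keep the winner."""
    best, best_score = initial, fitness(initial)
    for _ in range(iterations):
        for _ in range(variants):
            # Slight random modification of the current best design
            candidate = [x + random.uniform(-step, step) for x in best]
            score = fitness(candidate)
            if score > best_score:
                best, best_score = candidate, score
    return best, best_score

# Toy stand-in for "how well does this design perform?":
# performance peaks at parameters (2.0, -1.0).
def toy_fitness(design):
    return -((design[0] - 2.0) ** 2 + (design[1] + 1.0) ** 2)

random.seed(1)  # make the random mutations reproducible
best, score = evolve(toy_fitness, initial=[0.0, 0.0])
```

The point of the sketch is that no step requires understanding *why* a design works: blind variation plus selective retention is enough to climb toward solutions that could not have been derived in advance.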
Learning from mistakes in science
Syed presents science as a model for learning from error, seeing the experimental method as a great advance over adherence to dogma. Science certainly has led to revolutionary changes in human understanding and, in tandem with technology, to dramatic improvements in human welfare, as well as to unprecedented threats to human life (nuclear weapons and climate change). However, Syed notes that science students mainly study the latest ideas, spending little or no time examining “failed” theories such as aether or astrology: “By looking only at the theories that have survived, we don’t notice the failures that made them possible.” (p. 52)
Even so, Syed’s overall view of science is an idealistic image of how research is supposed to work, with scientists continually trying to falsify their hypotheses. Historian of science Thomas Kuhn argued in The Structure of Scientific Revolutions that most research is problem-solving within a framework of unquestioned assumptions called a paradigm. Rather than trying to falsify fundamental assumptions, scientists treat them as dogma. Sociologist Robert Merton proposed that science is governed by a set of norms, one of which is “organised scepticism.” However, the relevance of these norms has been challenged. Ian Mitroff, based on his studies of practising scientists, proposed that science is equally well described by a corresponding set of counter-norms, one of which is “organised dogmatism.”
Although science is incredibly dynamic due to theoretical innovation and experimental testing, it is also resistant to change in some ways, and can be shaped by various interests, including corporate funding, government imperatives and the self-interest of elite scientists.
Therefore, while there is much to learn from the power of the scientific method, there is also quite a bit that scientists can learn from aviation and other fields that learn systematically from error. It would be possible to examine occasions when scientists were resistant to new ideas that were later accepted as correct, for example continental drift, mad cow disease or the cause of ulcers, and spell out the lessons for researchers. But it is hard to find any analyses of these apparent collective failures that are well known to scientists. Similarly, there are many cases in which dissident scientists have had great difficulty in challenging views backed by commercial interests, for example the scandals involving the pharmaceutical drugs thalidomide and Vioxx. There is much to learn from these failures, but again the lessons, whatever they may be, have not led to any systematic changes in the way science is carried out. If anything, the subordination of science to powerful groups with vested interests is increasing, so there is little incentive to institutionalise learning from disasters.
Failure: still a dirty word
Although Syed is enthusiastic about the prospects of learning from failure, he is well aware of the obstacles. While he lauds aviation for its safety culture, in one chapter he describes how the drive to attribute blame took over and a conscientious pilot was pilloried. Blaming seems to be the default mode in most walks of life. In politics, assigning blame has become an art form: opposition politicians and vulnerable groups are regularly blamed for society’s problems, and it is a brave politician indeed who would own up to mistakes as a tool for collective learning. In fact, political dynamics seem to involve a different form of learning: how to be ever more effective in blaming others for problems.
I regularly hear from whistleblowers in all sorts of occupations: teachers, police, public servants, corporate employees and others. In nearly every case, there is something going wrong in a workplace, a failure if you want to call it that, and hence a potential opportunity to learn. However, organisational learning seems to be the least likely thing going on. Instead, many whistleblowers are subject to reprisals, sending a message to their co-workers that speaking out about problems is career suicide. Opportunities for learning are regularly squandered. Of course, I’m seeing a one-sided perspective: in workplaces where failure does not automatically lead to blame or cover-up, there is little need for whistleblowing. When those who speak out about problems are encouraged or even rewarded, no one is likely to contact me for advice. Even so, it would seem that such workplaces are the exception rather than the rule.
The more controversial the issue, the more difficult it can be to escape blaming as a mode of operation. On issues such as abortion, climate change, fluoridation and vaccination, partisans on either side of the debate are reluctant to admit any weakness in their views because opponents will seize on it as an avenue for attack. Each side becomes defensive, never admitting error while continually seeking to expose the other side’s shortcomings, including pathologies in reasoning and links to groups with vested interests. These sorts of confrontations seem designed to prevent learning from failure. Therefore it is predictable that such debates will continue largely unchanged.
Although the obstacles to learning from failures might seem insurmountable, there is hope. Black Box Thinking is a powerful antidote to complacency, showing what is possible and identifying the key obstacles to change. The book deserves to be read and its lessons taken to heart. A few courageous readers may take the risk of resisting the stampede to blame and instead foster a learning culture.
“The basic proposition of this book is that we have an allergic attitude to failure. We try to avoid it, cover it up and airbrush it from our lives. We have looked at cognitive dissonance, the careful use of euphemisms, anything to divorce us from the pain we feel when we are confronted with the realisation that we have underperformed.” (p. 196)