Brian Martin is professor of social sciences at the University of Wollongong, Australia, and vice president of Whistleblowers Australia. He is the author of a dozen books and hundreds of articles on dissent, nonviolence, scientific controversies, democracy, information issues, education and other topics.
If you want to succeed in your career, it’s useful to study what it takes.
Albert-László Barabási’s book The Formula: The Universal Laws of Success was published in 2018. The title sounds presumptuous. Can there be laws of success, much less universal ones? It turns out there’s much to learn from this book.
Barabási is a complex-networks researcher. He took his toolkit and applied it to the issue of performance and success, collaborating with others to produce a string of scientific papers. The Formula is a popular account of research on the topic.
To clarify: “success” here refers to careers and is measured by recognition and income, in other words fame and fortune. Success in other ways, for example being a good parent, being honest or helping others behind the scenes, is not covered because it is too hard to study mathematically.
Is this your idea of success?
A fundamental idea behind Barabási’s laws is that individual success derives from the community’s response to an individual’s performance, not from the performance itself. Barabási calls his five laws “universal,” but whether they apply outside the US requires further investigation. In any case, The Formula is fascinating. It is informative and engagingly written, and worth reading even if success in conventional terms is not your personal goal.
Performance and networks
The first law is “Performance drives success, but when performance can’t be measured, networks drive success.” In some fields, for example chess and competitive individual sports, performance can be measured by observing who wins. If you want to succeed in chess, there’s no substitute for becoming a high-level performer.
In most fields, however, performance can’t be measured in a straightforward way. Barabási uses the example of art, describing visual artists who are highly talented yet languish in relative obscurity because they exhibit only in local galleries. Artists who take the initiative to promote their work more widely then have opportunities to be exhibited in higher-profile venues, leading to ever more recognition.
Another example is the Mona Lisa. Did you ever wonder whether its fame is due to its unique artistic merit, or something else? Barabási tells how the Mona Lisa went from obscurity to world recognition.
The implication is that if you’re a very hard worker, willing to put in tens of thousands of hours of dedicated practice at your chosen craft, and willing to wait decades for recognition, then you have a chance in a field where performance can be measured. On the other hand, if you’re keen on networking and don’t want to work quite so hard, then pick a field where measuring performance involves a lot of subjectivity.
Barabási cites a study of elite classical music competitions. In piano competitions, each player performs difficult works, some assigned, some their own choice. There might be a dozen expert judges, who make their assessments independently. Everything seems fair. However, the trouble is that in classical music performance, the standard is so very high that it’s hard to tell the players apart. The judges might actually choose in part according to who looks like a virtuoso. The differences between elite performers are so small that a break — for example, a competition prize — can launch someone on a solo career, while others of equal calibre are left behind.
This is an example of Barabási’s second law, which is “Performance is bounded, but success is unbounded.” Saying performance is bounded means that top performers, like the pianists, are all so good that it’s hard to tell their performances apart. But if you are the chosen one, getting a few lucky breaks, in particular endorsements from gatekeepers, then your fame and fortune can be enormous. Unbounded success like this comes only to a few, and it’s unfair, in the sense that so much depends on luck.
Think of the Olympic games. The gold medal winner in a popular event can become a household name. The silver medallist might be just a second slower but receives only a fraction of the glory and opportunities.
Success breeds success
Barabási’s third law is that previous success, combined with fitness (his term for an idea’s or product’s inherent appeal), predicts future success. The academic term for success breeding success is “preferential attachment.” It has an amazingly strong influence.
One experiment involved people listening to unfamiliar pop songs. Members of one group of subjects gave ratings to each song without knowing what other group members thought of the same songs. In a different group, subjects were able to see the ratings of other listeners. The experimenters were tricky: they seeded the ratings, giving some songs a head start. These songs ended up being the most popular.
The message is that most people go along with the crowd. Their preferences are influenced by what others rate highly.
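This herding dynamic is easy to illustrate with a toy simulation (my own sketch, not a model from Barabási’s book): songs of identical quality, listeners who mostly copy earlier listeners’ choices, and one song seeded with an artificial head start, much like the seeded ratings in the experiment.

```python
import random

def simulate(num_songs=10, listeners=5000, social_weight=0.9,
             seed_song=0, seed_boost=50, rng_seed=1):
    """Toy herding model: each listener either copies the crowd
    (probability social_weight) or judges songs independently."""
    random.seed(rng_seed)
    fitness = [1.0] * num_songs      # every song is equally good
    plays = [1] * num_songs          # start at 1 so weights are never all zero
    plays[seed_song] += seed_boost   # artificial head start for one song
    for _ in range(listeners):
        if random.random() < social_weight:
            weights = plays          # follow the crowd: popular songs get picked
        else:
            weights = fitness        # independent judgement: quality only
        choice = random.choices(range(num_songs), weights=weights)[0]
        plays[choice] += 1
    return plays

plays = simulate()
print(plays)  # the seeded song typically ends up far ahead of the rest
```

Despite identical fitness, the seeded song almost always finishes with by far the most plays: early popularity compounds, which is preferential attachment in miniature.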
In academia, this is what’s going on when certain theories and theorists are favoured. If lots of researchers are citing Foucault, then the common assumption is that Foucault’s ideas are more incisive or fruitful — better than those of other theorists. There’s a good article about this sort of favouritism, titled “How to become a dominant French philosopher: the case of Jacques Derrida.”
Michel Foucault: a beneficiary of preferential attachment?
Preferential attachment is important in business. A start-up known to have received funding is likely to receive more funding. One way to rig the system is to pretend your own money is from someone else, giving the impression of financial endorsement.
Because of preferential attachment, the first public rating of a product, for example a book on Amazon, is a better indication of its value than later ratings. Later reviewers are likely to follow the crowd, so the more ratings a product accumulates, the further its overall rating can drift from its fitness. So when you read a book, don’t read the endorsements first — judge it for yourself.
Judge for yourself: is this painting worth $100 million?
The team: who gets the credit for success?
Barabási’s fourth law is that when a team needs diversity and balance to succeed, an individual receives credit for the team’s achievements. Unfair!
Barabási has quite a few suggestions about how to make a team effective. He says, “Trust someone to be in charge and build an expert, diverse support group around him or her.” This is essential for breakthroughs. Top-rate individual team members are not enough, and can actually derail a group. “What matters is that people are offered opportunities to build rapport and contribute in equal measure.”
There’s an obvious tension here between building a top team and the merit principle in recruitment. In hiring employees, selection is supposed to be on the basis of merit (though there are lots of deviations from this). But choosing on the basis of individual merit isn’t always the best way to develop a productive team, at least one that has autonomy and is expected to be innovative.
Barabási says that credit for teamwork is based on perception, not contribution, and that a single individual receives credit for team success. So if you’re an aspiring newcomer, part of a productive team, at some point you will need to venture out on your own. There are additional biases involved. Studies of academic economists show that men lose nothing by collaborating, but women gain little, and women who collaborate with men gain nothing at all.
Among mathematicians, there is a common belief that to make a great breakthrough, you have to be young. Some famous mathematicians, like Évariste Galois, made their mark when quite young. By age 35, so the belief goes, you’re over the hill.
Barabási’s fifth law challenges this belief. The law states that with persistence, success can come at any age. The key is persistence. Barabási found that young researchers are more productive: they write more articles each year. However, articles written at older ages are just as likely to be breakthroughs. Older researchers make fewer breakthroughs simply because they produce fewer papers, not because their ideas decline. (Maybe they become jaded or go into administration.)
Making a breakthrough is a matter of luck. You don’t know in advance which idea or project will be highly successful, so you just have to keep trying. For Barabási, this finding is encouraging. He’s getting older but now knows it’s worth persisting.
Should success be a goal?
For me, reading about Barabási’s “universal laws of success” raises the question of whether the conventional idea of success, as fame and fortune, is an appropriate goal. Who benefits from your success?
Surely you benefit from your own success. That’s obvious enough — or is it? Research shows that acquiring a lot of money is not a particularly promising way to increase happiness; other routes, such as physical activity, relationships, gratitude and optimism are more reliable for promoting happiness. Fame is not a reliable road to happiness, either. It can create some relationships but undermine others.
Some Olympic athletes fall into depression after they achieve their goal of a gold medal. Many athletes, and non-athletes, achieve more satisfaction from striving towards a goal than actually achieving it. As the saying goes, “There is no road to happiness; happiness is the road.”
Do others benefit from your success? That depends quite a lot. You might be very generous in helping and supporting others, using your skills and networks to assist those who are less fortunate. You might be a role model for others. On the other hand, you might have trampled over others in your efforts to get ahead, and become a heartless exploiter, continually on the lookout for challengers who must be crushed. Some business leaders are generous; others are better known for their ruthlessness and bullying.
Developing skills to a high level seems like a good thing. Surely it’s worth becoming an outstanding teacher or violinist. Again, it all depends. Some people with advanced skills, and who achieve success as a result, may be causing more harm than good. A soldier can become highly skilled at killing. Is that good if the bad guys are being killed, or bad if the good guys are the target? A politician can become highly skilled at manipulating public perceptions. It might be for a higher cause or it might be just to obtain power.
The implication is that success alone is not necessarily a worthy goal. It can be better to have worthwhile goals, such as being ethical, enjoying life and helping others. If, in pursuing such goals, you achieve success, that’s just a little added bonus.
In higher education, being smart is greatly prized. But over-valuing smartness has downsides.
Alexander Astin is a US academic with vast experience of higher education in that country. During his career, he visited hundreds of campuses and talked with thousands of students, academics and administrators. He became convinced that there is a fundamental malady in the system: an obsession with smartness.
Astin summarises his concerns in a readable book titled Are you smart enough? How colleges’ obsession with smartness shortchanges students, published in 2016. His focus is entirely on the US but many of his assessments apply to Australia too.
Is your university prestigious?
University leaders greatly prize the status of their institutions. No surprise here. There is a widely known pecking order. Astin says that if you ask people in the US to name the best universities, they regularly come up with the same ones: Harvard, Yale, Princeton, Berkeley and so forth. The exact rankings might shift a bit over time, but the same ones appear in the top group. What is remarkable is that this order has hardly varied in half a century.
In Australia, the same thing applies: those commonly considered the best are the Australian National University, Melbourne, Sydney and so on down the list. The stability of the priority order is remarkable when you compare it to corporations. Apple, Amazon and Google are near the top of the pile but didn’t exist decades ago. Not a single new university has shot into the top group.
Next consider students. Most of them want to go to a prestigious university. They would rather go to Harvard than Idaho State, at least if they can get into Harvard. In Australia, students are attracted by the status of a university but also by the exclusiveness of a faculty. It’s higher status to study medicine or law than nursing or chemistry. Many high school students want to undertake the most exclusive degree they can. Why “waste” an ATAR (Australian Tertiary Admission Rank) of 99.9 on studying visual arts when you can do medicine?
Student preferences are driven mostly by status, with very little attention to the quality of the education provided. The student quest for status is misguided in several ways. One misapprehension is that a high-status university provides a better education. Because status is built mostly on research performance, it does not necessarily correlate with the quality of teaching and the richness of the university experience.
A second misapprehension is that getting a degree from a high-status university is a worthwhile investment. Universities regularly tout figures showing that graduates earn more over their lifetime than non-graduates. However, this is not a valid comparison, because if those who graduated had chosen a different path, they might have been just as successful. The point is that the qualities of the student may do more to determine their career success than the status of the university they attended or the advantages of the learning that it provided. The pay-off for attending a more selective university or undertaking a more exclusive degree may not be much at all.
The message for students is straightforward: instead of pursuing status, develop your skills and productive habits.
Are you attracting the best students?
Every university seeks to recruit the best students it can. At the University of Wollongong, this is obvious enough. The prestige of degrees accords with how restrictive they are. Faculties make special pitches to students with high ATARs, who can become a “Dean’s Scholar” with special advantages. Universities with more money offer undergraduate scholarships to top-performing students. Astin summarises the collective experience: “Every college and university, no matter its size or research emphasis, seeks out smart students.”
So what? Astin has three responses. First, he notes that the mad scramble to recruit top students is silly from a system point of view. If the students are going to go somewhere, why not just allocate them randomly? The reason is that a university’s status depends on the perception that its students are smart.
Astin’s second response is that universities have become so obsessed with smartness that they pay more attention to recruiting top students than educating them. As he puts it, “if you look at our higher education system from an educational perspective, this preoccupation with enrolling smart students makes little sense, because the emphasis seems to be more on acquiring smart students than on educating them well …” He provides many telling examples. More on this later.
His third response is to provide an analogy with the health system. If you are ill and go to a hospital’s emergency department, you will encounter a triage process. Your health problem will be assessed. If it is serious and urgent, you will be taken straight in for treatment. If it is not serious and not urgent, you will have to wait until the urgent cases have been dealt with. If it is nothing to worry about, you’ll be sent home. The health system puts most of its resources towards helping those with the worst health.
This orientation can be criticised by arguing that far more should be spent on preventive health measures, for example addressing pollution and unhealthy diets. But even in preventive health areas, the emphasis is on measures that help the greatest number of people at lowest cost.
In contrast, in higher education, most resources are directed towards those who are the highest performing, which means those who need the least support for learning. This is true in university entry, in provision of scholarships and in higher degrees. It is also true in classrooms where teachers give more attention and encouragement to the best students.
Astin points out that most teachers give more attention to what students know than to how much they have improved. Few teachers give tests at both the beginning and conclusion of courses in order to see what students have learned. Instead, they give tests to rank students, with the emphasis on seeing who is superior rather than focusing on improvement.
He notes that giving grades on assignments “is of limited usefulness in helping students improve their performance.” Many of my colleagues give extensive comments on assignments, not just grades. But I’ve also noted that many students focus on the grades, not on using comments to improve.
Another shortcoming of most classes is that teachers do not require students to keep working on the same assignment. When students are assigned to write an essay, usually it is marked and then the student moves on to the next assignment. There is more learning when students are expected to consider feedback and work on improving the essay, submitting it again and, if needed, yet again. On the few occasions when I used this approach, I could see its great value. But alas, this requires more time and effort by the teacher and is more challenging when class sizes expand.
Are you a smart academic?
Among academics too, there is a cult of smartness. Those researchers who bring in loads of external money and build up empires of research students and postdocs are highly prized. There is no such glorification of outstanding teachers.
The emphasis on being smart manifests itself in various ways. Astin says some academics are “maximisers” who seek to display how smart they are. Their questions at seminars are designed to show off their knowledge. Maximisers, when on committees, may become blockers. It’s easier to show your critical acumen by attacking someone else’s proposal than by presenting your own.
Other academics, Astin’s “minimisers,” put a priority on hiding any suggestion that they lack intelligence. This is related to the “imposter syndrome,” in which individuals feel they are faking it and don’t really deserve to be among all those other brilliant colleagues.
How nice it would be if it were easier to acknowledge weaknesses and lack of knowledge, to say “I don’t know” and “I need to improve my skills.”
Astin lists a whole range of ways that the obsession with smartness affects academic work. It:
• “limits prospects for educational equity
• limits social welfare
• hinders academic governance
• limits recognition of different forms of intelligence
• limits development of creativity, leadership, empathy, social responsibility, citizenship, self-understanding
• limits finding better methods of assessment” (pp. 100-101)
What to do?
For getting away from the obsession with smartness and helping students who need help the most, Astin offers four principles for helping “underprepared students.”
The first is to promote engagement with learning, so students are motivated to study. Second is to foster peer interaction, so students learn from each other, including from more advanced students. Third is to have more interaction with academics. The fourth is to emphasise writing skills.
All these are worthwhile. It’s possible to imagine a university that pioneers systematic peer learning, with students in classes helping each other learn, students in upper-level classes assisting those in lower-level classes, and all spending time assisting disadvantaged students in the community. There are elements of each of these in some places, but shifting universities in this sort of direction seems a mammoth task. As Astin shows all too well, the prestige ranking of US universities is built on and helps perpetuate the obsession with smartness, an obsession that affects students, academics and administrators.
As critics have argued for decades, the education system serves not just to promote learning but to provide a rationale for social stratification. In other words, it justifies inequality: if you don’t succeed, it’s because you’re not smart enough. The implication of this critique is that changing the role of universities has to go hand in hand with challenging economic inequality. That’s a big task!
It is still possible to innovate in small ways within universities, and there are options for individuals. Students can choose to attend less prestigious institutions or to undertake less exclusive degrees, thereby questioning the smartness hierarchy. Academics can introduce peer learning in their classes, expand outcomes beyond cognitive tasks and measure learning before and after teaching.
Then there is the wider issue of the role of universities in society. If learning is the goal, why are degrees needed for certification? The radical alternative of de-schooling — learning by being part of a community designed for that purpose — can be reintroduced and updated for the digital age, in which access to abundant information is possible, and sorting through it and making sense of it are the greater challenges.
In a way, the biggest indictment of higher education is that it is so difficult to promote educational alternatives, to test out different ways of organising learning and to imagine different ways of pursuing greater knowledge for social benefit. Nevertheless, there remains hope for change when critics like Astin offer the insights of a lifetime and encourage the rest of us to see what is all too familiar with different eyes.
Glyphosate is the world’s most widely used herbicide. Is it as safe as its manufacturer claims?
Glyphosate is the principal ingredient in the herbicide named Roundup. It seems miraculous. It is deadly to weeds, yet harmless to humans, or so says Monsanto, the massive chemical company that manufactures it. (In 2018, Monsanto was purchased by Bayer.)
Glyphosate is used on crops such as soybeans, corn and canola. It is used by local governments to control weeds in public areas. It is used on golf courses. It is used by householders to maintain beautiful lawns.
The biggest use is on crops. Glyphosate is deadly to all growing things, so initially Roundup had to be applied to the weeds but not the crops. However, when the patent on Roundup was about to expire, Monsanto developed a brilliant way to maintain sales. Using genetic engineering techniques, it spliced a gene into crops, such as soybeans, that made them resistant to glyphosate. As a result, Roundup could be sprayed directly on the crops. Weeds would be killed, but genetically modified crops would not be harmed. Such crops are called Roundup Ready.
What happened when farmers started reporting disease and scientists started finding problems? If you want the inside story, get Carey Gillam’s book Whitewash: the story of a weed killer, cancer, and the corruption of science. Gillam is an experienced journalist who was put on the agriculture beat and began looking behind the scenes. The picture isn’t pretty. She is now research director at U.S. Right to Know.
The victims and the regulators
Monsanto long insisted that Roundup was safe, so safe that you could probably drink it without harm. But what about farmers who had used Roundup for decades and then developed non-Hodgkin’s lymphoma? There seemed to be a pattern, especially given experiments with mice.
What about government regulators? The US Environmental Protection Agency (EPA) is supposed to protect the health of both people and the environment. Yet the EPA has seemed to be in the pocket of Monsanto, in all sorts of ways.
The EPA can set upper limits to the intake of chemicals. However, when it came to glyphosate, the limits it set were high, and were increased in line with increased use of the herbicide. This was despite the applications of glyphosate becoming ten times as great over a period of two decades.
You might expect that with glyphosate being the most heavily used herbicide in the world, there would be numerous studies of its prevalence and its impacts. Quite the contrary. For years, no figures were collected of the levels of glyphosate in different crops. The reason: because it was presumed to be safe, there was no need to see what levels were appearing in foods. For years, studies were not carried out on glyphosate’s possible health hazards. Again, the rationale was that it was so safe that there was no need for testing.
Much that Gillam reports relies on documents obtained using the discovery process in court cases, in which parties are required to provide relevant documents to the other side. In this way, Monsanto’s activities in subverting scientific research have been exposed.
Monsanto cultivated allies within the EPA and used them to block introduction of regulations. It cultivated tame scientists who would go on the attack against anyone who criticised glyphosate. These tame scientists were given “talking points” so they would know what to say, and given guidance on venues for giving talks and submitting articles. These tame scientists did not reveal their links to Monsanto. In this way, Monsanto could get out its message via seemingly independent scientists.
Resistance – by pests
According to its promoters and defenders, glyphosate is a miracle chemical, so safe to humans that it can be used widely with little or no impact on human health. However, it is not pure glyphosate that is applied to crops, gardens and walkways, but Roundup, which contains additional chemicals, including one called polyethoxylated tallow amine or POEA. The combination of glyphosate and POEA is what needs to be tested, but this is hardly ever done.
However safe Roundup might be, there’s another problem. Pests can develop resistance to it. This is evolution in action: a few pest species have or acquire resistance to the pesticide, so they are the ones that start growing and spreading.
Because Roundup has been so remarkably effective in eliminating pests, farmers have become complacent. Instead of rotating crops – a traditional practice that reduces pest problems and replenishes the soil – farmers have planted the same crops year after year, relying on Roundup rather than other methods to keep pests at bay.
When Roundup-resistant pests started appearing, what was the solution? Farmers turned to other pesticides, using them in addition to Roundup. Some of these other pesticides are more highly toxic. This is the pesticide treadmill, in which the only solution to pests, even when they become resistant, is more pesticides.
Some farmers had nearly forgotten how to grow crops in traditional ways. Others, though, have turned towards alternatives, including organic agriculture.
Is Gillam’s treatment of the glyphosate saga accurate? Her account rings true, because it is history repeating. Monsanto’s response to criticisms of Roundup is remarkably similar to the response by earlier pesticide manufacturers to criticisms.
Rachel Carson’s famous book Silent Spring, published in 1962, raised the alarm about the effect of pesticides on wildlife and, tentatively, on human health. Many people have heard of Silent Spring, which is often credited with inspiring the modern environmental movement. Less well known is that Carson and Silent Spring came under fierce attack by chemical corporations. This is documented in a revealing 1970 book by Frank Graham, Jr., titled Since Silent Spring.
In 1978, biologist Robert van den Bosch’s book The Pesticide Conspiracy appeared. Van den Bosch told about the strong-arm tactics of the pesticide manufacturers, recounting case after case of scientists whose research and careers were attacked after they reported findings critical of pesticides.
Gillam’s story of Monsanto’s tactics to attack any threat to its highly profitable Roundup is eerily similar to the tactics used by pesticide companies since the 1960s. It seems little has changed since, decades ago, I investigated the suppression of scientists who questioned pesticides. Given that the tactics are predictable, it is plausible to work backwards and assume that the presence of these tactics indicates the likelihood of shortcomings in the pesticide paradigm. So what are the tell-tale signs?
* Attacks on scientists who report research results showing dangers or limitations of pesticides.
* Regulatory agency dependence on industry testing of pesticides.
* Testing only of the active ingredient, not of the pesticide actually used.
* Company ghostwriting of research papers.
* The failure of companies to release documents except through freedom-of-information requests or court discovery processes.
* A revolving door between company jobs and jobs in the corporate regulator.
* Presence on expert panels of members with conflicts of interest.
* Failure to carry out relevant research or collect relevant data, such as amounts and locations of pesticides used.
The presence of these tell-tale signs does not prove that a pesticide, or some other product or practice, is dangerous, but it does point to areas where extra scrutiny is warranted.
If you start investigating the likelihood that corporations and regulators are not serving the public interest, be prepared to be ignored or, if you start having an impact, being the target of dirty tactics.
“Monsanto Company and many leading chemical industry experts tell us that we should trust them and that more research is not needed. The safety of glyphosate and Roundup is proven, they say. But trust is hard to come by when the government does not require robust long-term safety data for a finished product such as Roundup, only for the active ingredient. There have long been concerns that the end product is more dangerous than glyphosate alone, and scientists say it is well-known that extra ingredients in pesticide products not only may themselves be toxic but also may enhance or supplement the toxic effects of the active ingredient. Extra ingredients in pesticides commonly include surfactants that help chemicals stick to the leaves of plants, antifoam compounds, and more. Yet the bulk of industry-sponsored toxicology tests are done using only the active ingredient. As well, there is very little long-term epidemiology data on glyphosate exposure, and there is no established base of information about just how much of the pesticide is in the products we eat and drink because the U.S. Food and Drug Administration (FDA) and the U.S. Department of Agriculture (USDA) have so steadfastly avoided including glyphosate in their testing regimes. And despite industry assurances of safety, there is an international body of published research that contradicts those claims.” (pp. 79-80)
Who is responsible for fake news? And what can be done about it?
• Trump offering free one-way tickets to Africa & Mexico for those who wanna leave America.
• Police find 19 white female bodies in freezers with “Black Lives Matter” carved into skin.
• Donald Trump protester speaks out: “I was paid $3,500 to protest Trump’s rally”.
During the 2016 US election campaign, teenagers in the town of Veles in Macedonia found a way to make some money by posting material on big social media sites. Facebook gets its income from advertisements, and gives tiny payments to suppliers of content. The teenagers could make money if their material attracted lots of readers, so they made up outrageous stories that they thought would find an audience.
Some fake stories, from the teenagers or others, do find an audience, like the ones listed above about Trump and Black Lives Matter, which were among the top 15 fake news stories in 2016.
Some made-up stories seem so plausible that readers share them with their friends. As the shares and retweets multiply, a story begins trending. It might even be reported in the mainstream media.
So who is responsible? The Macedonian teenagers, for sure. But they wouldn’t bother except for the economic model provided by social media. The advertisers on social media usually don’t care; they benefit when a story generates lots of clicks. The mass media are hurting financially and so do much less fact-checking, so bogus stories sometimes are run. Then there are the readers – that’s us – who think a story is worth sharing and don’t take the trouble to check whether it’s genuine.
James Ball is an experienced journalist who cares about the news and is alarmed by its corruption. In his book Post-Truth he tells about the problem, those implicated in it, and what can be done about it.
The problem is far deeper than the spread of made-up stories. News can be distorted, one-sided and in other ways misleading. The label “fake news,” when used to refer to manufactured fantasies, is inadequate to capture the full extent of the problem.
Ball provides an informative and often eye-opening tour of the issues, giving numerous examples to illustrate his analysis and recommendations. Ball’s preferred term is “bullshit.” This refers to claims that are neither right nor wrong but rather indifferent to the truth. When bullshit fills the air, audiences may despair of figuring out what’s really going on and start distrusting every source of news, including the more established ones. The subtitle of Post-Truth is How Bullshit Conquered the World.
Ball starts with the seemingly obligatory stories of Brexit and the election of Donald Trump, canvassing the use of bullshit in these campaigns. He then examines six groups involved in spreading bullshit. Politicians are important players, many of whom want to manipulate audiences and use public relations, spin and other techniques. Then there are “old media” – newspapers, television, radio – that play a big role in propagating dubious stories. As the old media are squeezed financially, they have less capacity to check sources and are more likely to run fake stories.
Ball continues through new media, fake media (such as that created by the Macedonian teenagers) and social media. Each of these helps spread bullshit. As in the case of old media, economic imperatives are involved. Online, at least where advertising reigns supreme, clicks are currency, so it becomes attractive to run or allow stories with little checking.
The final of the six chapters on who is spreading bullshit is titled “… and you.” Audiences contribute to the problem. Ball cites the experience of a news operation created to provide quality news, with a more positive slant. It drew on research showing that people wanted more of this sort of news. The news operation soon folded. What people say they want (high quality news) is not necessarily what they actually end up buying or reading.
Audiences are attracted by scandals, gore, celebrity gossip and stories that reinforce their pre-existing views. The various forms of media, to survive financially, pander to these audience preferences. In this sort of environment, Ball says, bullshit thrives. Audiences are thus part of the problem.
A study of Twitter found that half the people who retweet a news story never even read it. The implication is that people react so quickly that they are driven by emotion rather than careful reflection. This is fertile ground for bullshit. One only has to dream up a claim that appeals to readers’ gut reactions – something they’d like to believe is true – and it can spread wildly with little or no scrutiny.
What to do
After laying out the problem, telling who is spreading bullshit and why, Ball turns to solutions. One of them is fact-checking. Some large media organisations, such as the New York Times, employ fact-checkers, and there are now a number of independent bodies undertaking this work. Fact-checking is valuable, but Ball says it’s not a full solution. One shortcoming is that fake news runs far ahead of fact-checkers. Most news consumers read the politician’s lie or the fake story and never get around to seeing what fact-checkers say.
There’s also a deeper matter: manufactured news items are only part of the problem. The majority of suspect claims are some combination of right and wrong. They may be biased, selective or misleading, and not easily amenable to fact-checking.
What else? Ball provides advice for politicians, media and news consumers. For example, one piece of advice for politicians is not to explain why the opponent’s claim is wrong, because this just highlights the claim. Explaining why alarms about terrorism are misleading only makes terrorism more salient. It’s better to reframe the issue, namely to provide a different narrative.
One of Ball’s recommendations for media is to be careful about headlines, making sure they capture the key ideas in stories. Because many readers share stories based solely on headlines, some traditional headline-writing techniques need to be rethought.
Finally, Ball has recommendations for readers and voters. One of them is to put effort into thinking about stories and not just reacting to them emotionally. Another is to question the narratives you believe as much as or more than the ones you don’t. He also suggests learning basic statistics so you can assess claims made in the media.
All of Ball’s suggestions are worthwhile. If taken up, they would do a lot to change the media environment. But what would encourage people to follow his suggestions? Ball’s own analysis of the problem shows that politicians and the media are captives of large-scale processes, especially economic imperatives and audience emotional responses.
For decades, scholars and critics have been examining media cultures, especially the news, showing all sorts of systemic problems. Journalists and editors treat events as newsworthy when they conform to what are called “news values.” For example, prominent people involved in conflicts are more newsworthy than ordinary people behaving amicably. Hence, Trump’s campaign for a wall receives saturation coverage while amicable relations between people living near the border between Mexico and the US seldom warrant front-page media stories.
Ball doesn’t address the systemic biases in mass media coverage that pre-dated the rise in what he calls bullshit. His analysis is illuminating but needs to be supplemented.
Is there any hope?
A few readers of Post-Truth will take up Ball’s suggestions, but for major change, collective action is necessary. The lesson from history is that social movements are needed to bring about change from below. An individual can seek to reduce personal greenhouse gas emissions, but to tackle global warming, mass action is needed.
What sort of collective action can make a difference regarding the news? There are signs in what is already happening in circles where accurate information is vital.
When filter bubbles are needed
The mass media have a strong preference for reporting events involving violence: “if it bleeds, it leads.” This is frustrating for proponents of nonviolent action, especially when there is little media coverage of a large peaceful protest or the reports are about a minor scuffle rather than the issues at stake.
The methods of nonviolent action include strikes, boycotts, sit-ins, occupations and rallies. Research shows that nonviolent campaigns are more effective in overthrowing repressive regimes than armed struggle. Nonviolent action is the preferred approach of most social movements, including the labour, feminist, environmental and peace movements. Yet despite its effectiveness and widespread use, nonviolent action is marginalised in mass and social media coverage.
There’s an obvious reason for this. Nonviolent activists have no wealthy and powerful backers. In contrast, hundreds of billions of dollars annually are spent on militaries, with the full backing of governments and associated corporations. It is not surprising that media coverage follows power and money. Furthermore, the news values used by journalists to judge newsworthiness lead to a neglect of nonviolent alternatives.
In this context, nonviolent campaigns need to create their own news ecosystem, circulating information through sympathetic newsletters and websites. Getting rid of fake news and bullshit is fine for the dominant military approach but would do little to make audiences more aware of nonviolent options.
Voting is governments’ preferred method of citizen involvement in politics. Besides voting, there are numerous methods that enable citizens to participate in the decisions affecting their lives, such as initiatives and referendums. A method I find appealing is citizens’ juries, in which randomly selected citizens hear evidence and arguments about a contentious community issue, deliberate about it and make a recommendation.
However, alternatives to representative government have hardly any profile in the media. There is massive coverage of politicians, including their campaigning, policies, foibles and infighting, but almost none about participatory alternatives to the system in which elections and politicians are dominant. This means that campaigners for such alternatives need dedicated sources of information to find out about research and action, and to maintain their commitment.
Many social movements that are today considered progressive have struggled in the face of hostile media environments. Ball’s concerns about the rise of bullshit and the problems in gaining access to information are warranted. But for those seeking to challenge perspectives based on massive money, power and ideology, it has never been easy.
Trust is fundamental to human activities. How is it changing?
On a day-to-day basis, people put a lot of trust in others. As I walk down a suburban street, I trust that a driver will follow the curve of the road rather than drive straight into me. The driver trusts the engineers who designed the car that it will not explode, at least not on purpose. Buying an aspirin is premised on trusting the chemists and manufacturers that produced it.
When trust is betrayed, it is a major issue. When, last year in Australia, a few needles were discovered in strawberries and other fruit, it was national news. People normally assume that fruit purchased from a shop has not been tampered with.
Paedophilia in the churches was covered up for decades. When it was finally exposed, it destroyed a lot of trust in church leadership and the church as an institution.
Scientific knowledge is based on observation, experiment and theorising, but it also relies heavily on trust between scientists, who need to rely on each other to report their findings truthfully. This helps explain the enormous condemnation of scientific fraud, when scientists manipulate or fake their results.
In certain areas, public trust has plummeted in recent decades: trust in public institutions including government, corporations and the mass media. Opinion polls show large declines. In Australia, trust in financial institutions had been dropping due to scandals, and that was before the royal commission revealed widespread corruption. When people can’t trust their financial advisers, what should they do?
In order to ensure fairness and good practice, governments set up watchdog bodies such as ombudsmen, environmental protection authorities, anti-corruption commissions and auditors-general. One of the casualties of the banking royal commission has been the credibility of financial watchdogs such as the Australian Securities and Investments Commission (ASIC). Rather than sniffing out bad practice, they were complacent. Whistleblowers reported problems, but ASIC ignored them. The message is that members of the public cannot rely on watchdog bodies to do their job.
Who can you trust?
Rachel Botsman has written an insightful and engaging book titled Who Can You Trust? She argues that in human history there have been three types of trust.
First came local trust, based on personal experience in small communities. If someone you know helps, or fails to help, in an hour of need, you can anticipate the same thing in the future. Local trust is still relevant today, in families and friendships. People learn who and when to trust through direct experience.
Next came institutional trust, in churches, militaries, governments, and professions such as medicine and engineering. People trusted those with greater authority to do the right thing. In the 1950s, high percentages of people in countries such as the US said they had a great deal of trust in their political leaders. However, institutional trust has taken a battering in recent decades.
“So why is trust in so many elite institutions collapsing at the same time? There are three key, somewhat overlapping, reasons: inequality of accountability (certain people are being punished for wrongdoing while others get a leave pass); twilight of elites and authority (the digital age is flattening hierarchies and eroding faith in experts and the rich and powerful); and segregated echo chambers (living in our cultural ghettoes and being deaf to other voices).” (p. 42)
Botsman writes about the rise of a third type of trust: distributed trust. People trust in systems that involve collective inputs, often anonymous.
Suppose you want to see a recently released film. If you rely on local trust, you ask your friends what they thought of it. If you rely on institutional trust, you see what the producers say about their own film: you read the advertisements. Or you can rely on distributed trust. For example, you can look up the Internet Movie Database (IMDb) and see what different film critics have said about the film, see what audience members have said about the film and see the average rating audiences have given the film.
If you take into account audience ratings from IMDb, you are trusting in two things. First, you’re assuming that audience members have given honest ratings, and that the film’s promoters aren’t gaming the system. Second, you’re assuming that IMDb’s method of collecting and reporting ratings is honest. After all, IMDb might be getting payoffs from movie producers to alter audience ratings.
Botsman says distributed trust seems to be reliant on technology but, ultimately, human judgement may be required. Of course, people design systems, so it’s necessary to trust the designers. However, after a while, when systems seem to be working, people forget about the designers and trust the technology.
One of Botsman’s examples is the self-driving car. Developers have put a lot of effort into figuring out what will make passenger/drivers feel safe in such cars. This sounds challenging. It turns out that the main problem is not building trust, because after being in a self-driving car it seems quite safe. The problem is that drivers become too trusting. Botsman thinks her young children will never learn to drive because self-driving cars will become so common.
Botsman has a fascinating chapter on the darknet, a part of the Internet frequented by buyers and sellers of illegal goods, among other nefarious activities. Suppose you want to buy some illegal drugs. You scroll through the various sellers and select your choice. How can you be sure you’ll receive the drugs you ordered (rather than adulterated goods) or that the seller won’t just run off with your money and not deliver the drugs? Botsman describes the trust-building mechanisms on the darknet. They include a rating service, rather like Amazon’s, and an escrow process: your payment is held by a third party until you’re satisfied with the goods. These darknet trust-enablers aren’t perfect, but they compare favourably with regular services. It turns out that trust is vital even when illegal goods are being bought and sold, and that reliable systems for building and maintaining trust are possible.
In Sydney, a high-rise apartment building called the Opal Tower had to be evacuated after cracks were found in the construction. Experts debated when it was safe for residents to return to their units. Some commentators blamed the government’s system for checking compliance with building codes. Could trust in builders be improved by learning from the systems used on the darknet?
Botsman’s special interest is in the blockchain. You might have heard about the electronic currency called bitcoin. Used for purchases online, it can provide anonymity, yet embedded in the code is a complete record of every transaction. Furthermore, this record can be made public and inspected by anyone. It’s as if a bank published online every transaction, with amounts and dates, but without identifying who made them.
Botsman says bitcoin is a sideshow. The real innovation is the blockchain, the record-keeping code that enables reliable transactions without a middleman, such as a bank, taking a cut. It sounds remarkable, but blockchain-based operations have pitfalls. Botsman describes some disasters. When a new currency system was set up, someone found a glitch in the code and drained $60 million from the currency fund, one third of the total. The programmers and founders of the system were called in to intervene, which they did, preventing the extraction of currency.
The blockchain seems not quite ready to provide a totally reliable trust system, one not reliant on human intervention. But lots of people are working to achieve this goal, as Botsman revealingly describes.
For me, the value of Who Can You Trust? is in highlighting the role of trust in contemporary life, especially as trust in institutions declines drastically. It made me think in a different direction: political alternatives.
The political philosophy of anarchism is based on the idea of self-management: people collectively make the crucial decisions affecting their lives without systems of hierarchy, namely without governments, corporations or other systems of domination. The usual idea is that there are assemblies, for example of workers who decide how to organise their work and what to produce. Assemblies elect delegates to higher-level groups for coordination.
This model of self-management relies on two types of trust. The assemblies have to be small enough for dialogue in a meeting and thus rely on local trust. The delegate structure parallels distributed trust, as long as the delegates remain bound by their assemblies and acquire no independent power.
Another model is demarchy, which also dispenses with governments and corporations. In a local community, decision-making is carried out by citizens’ panels, with maybe 12 to 24 members each, selected randomly from volunteers. There could be panels for transport, manufacturing, art, education and a host of other topics. In essence, all the issues addressed by governments today are divided according to topic and allocated to randomly selected groups of citizens.
Because they are randomly selected, panel members have no mandate, so their terms are limited. For coordination, experienced panel members would be elected or randomly chosen for higher-level panels.
Demarchy relies on local trust, especially on the panels, and on distributed trust, namely trust in the system itself. This distributed trust is similar to the trust we have today in the jury system for criminal justice, in which randomly selected citizens deliberate together and make judgements. People trust a randomly selected person, who has no personal stake in the outcome, more than they are likely to trust a lawyer or a politician.
Botsman’s analysis of trust and technology raises a fascinating option: what would it mean to combine distributed trust based on technology with the local/distributed trust in political systems like anarchism and demarchy?
In December 2018, a partnership was announced between the Ramsay Centre and the University of Wollongong. The university would establish a degree in Western Civilisation funded by the centre.
The new degree was immediately controversial. In the previous months, there had been considerable publicity about proposed Ramsay-funded degrees in Western civilisation at the Australian National University and the University of Sydney. At both universities, many staff were opposed to the degrees. The ANU proposal did not go ahead, while the Sydney proposal was still being debated. Given this background, opposition to the degree at Wollongong was not surprising.
My aim here is to give a perspective on the controversy over the Ramsay-funded Western civilisation degree, especially as it has been played out at the University of Wollongong (UOW). I write as an academic at the university without a strong stake in the new degree, because I am retired and the issues involved do not impinge greatly on my main research areas. However, a number of my immediate colleagues have very strong views, and I have benefited from hearing their arguments, as well as the views of proponents of the degree.
The next section gives a brief overview of the institutional context, which is useful for understanding both incentives and concerns associated with Ramsay funding. Following this is an introduction to the Ramsay Centre. Then I outline the major issues raised at the university: decision-making, the conservative connection, Western civilisation and equality of resourcing. The conclusion offers a few thoughts on the de-facto strategies of key players.
It would be possible to go into much greater depth. Relevant are issues concerning the aims of education, the funding of higher education, the impact of private funding and agendas, the question of Western civilisation and the role of political ideology. Others have more expertise on these and other issues, and I hope some of them will contribute to the discussion.
Australian university sector
Most Australian universities are funded by the federal government, but the funding environment has become increasingly challenging. In the 1980s, the government introduced tuition fees based on government zero-interest loans, paid back as part of income tax only when a student’s income reached a moderate level. Introducing these fees provided universities with a sizeable income stream, but not a bonanza, because the government cut its direct funding while opening the gates to a massive expansion in student numbers over the following decades.
The result was that academics were met with ever-increasing class sizes. The student-staff ratio dramatically increased, almost doubling in some fields. However, this wasn’t enough to fix the financial squeeze. University managements dealt with it in two main ways.
First, they aggressively recruited international students, who had to pay substantial tuition fees. International student fees were used to cross-subsidise other operations. Eventually, this income became Australia’s third largest export industry, after iron ore and coal.
Second, teaching was increasingly carried out by “casual” staff, paid by the hour or on short-term contracts. University teaching was casualised almost as much as the fast-food industry.
Meanwhile, beginning in the 1980s, the government pushed universities and other higher education institutions to amalgamate. Increased size, through amalgamations and student recruitment, became a goal, augmented by the setting up of additional campuses in Australia and in other countries. Universities became big businesses, with budgets of many hundreds of millions of dollars.
For management at Australian universities, finances became a preoccupation. All avenues for income were canvassed, though the options have been restricted mainly to government funding, student fees and research grants. The other side of the coin has been cost containment, including by increasing class sizes, cutting staff numbers and, as mentioned, relying ever more on casual staff for teaching.
Unlike in the US, in Australia there is no tradition of private support for universities. Gifts from alumni are welcome but are usually a tiny portion of income. Philanthropy is not prominent.
It was in this context that the Ramsay Centre for Western Civilisation entered the picture. Paul Ramsay made a fortune in private healthcare, including buying and running numerous hospitals. He died in 2014, having bequeathed a portion of his estate to setting up university courses in Western civilisation, run with small classes in which students study great books, in the manner of a few other such courses in the US and elsewhere. The Ramsay Centre was set up to manage this bequest. In 2017, the Centre invited expressions of interest from Australian universities to receive funding to set up and run degrees in Western civilisation.
The University of Wollongong was the first university to announce an agreement to set up such a degree. From the point of view of university managers, this was an attractive proposition. It would involve the largest-ever injection of private money into an Australian university to fund a humanities programme, amounting to many tens of millions of dollars. It was enough to employ ten academics and give scholarships to dozens of undergraduates.
Early in 2019, Professor Theo Farrell, executive dean of the Faculty of Law, Humanities and the Arts at UOW, outlined the financial benefits of the arrangement in meetings held to discuss the new degree. The faculty was affected by a decline in the number of undergraduate students enrolling in arts degrees, a decline occurring across the state, not just at Wollongong. The Ramsay-funded degree would have both direct and spinoff benefits financially. The students undertaking the degree would have to take a major or a double degree at the university, most likely in the faculty, giving a boost to enrolments.
Another benefit was claimed: because the Ramsay-funded students had to have good results in high school and because they were being paid, they were more likely than other students to finish their degrees. If true, this would aid the faculty’s overall retention rate, something the government would favour.
The Ramsay money would support the employment of ten academics and two professional staff. One of the academics is Dan Hutto, senior professor of philosophy, appointed head of the new School of Liberal Arts hosting the new degree. There are to be nine newly hired academics, all of them philosophers. Though hired for teaching, their relatively light teaching loads would free them up to do research. Their presence could potentially turn UOW into a philosophy powerhouse, beyond its current dynamism led by Hutto.
From the point of view of its advocates, the new degree thus brought great advantages to the faculty and the university. It involved the injection of a large amount of money with spinoff benefits for the rest of the faculty. And it would position UOW as a prominent player internationally among great-books programmes and in philosophy.
Acceptance of the degree was not straightforward. As soon as it was announced, academics and students expressed opposition. Here, I look at the grounds for opposition under several categories: decision-making, the conservative connection, Western civilisation and equality. In practice, these concerns are often mixed together.
Discussions between the centre and UOW were carried out in secret. Only a few people at the university even knew negotiations were occurring. Critics decried the secrecy.
University officials said, in defence, that these sorts of negotiations are carried out all the time, without any public announcement. Indeed, there are many examples in which major developments have been announced as a fait accompli. For example, in November 2018 an announcement was made that the university had purchased colleges. There was no protest about this; indeed, few took any notice.
On the other hand, the Ramsay Centre was already controversial elsewhere, separately from Wollongong. As the Australian National University negotiated with the Ramsay Centre, there was considerable publicity, especially when university leaders decided against having a Western civilisation degree because of concerns about academic freedom. At the University of Sydney, major opposition emerged to a Ramsay-funded degree, with protests and much media coverage.
In this context, the secrecy at UOW seemed anomalous. It was true that university management often proceeded on major initiatives without consultation with academic staff, but this was not a typical case: the Ramsay funding was already known to be controversial.
On the Ramsay Centre board are two prominent political conservatives: former prime ministers John Howard and Tony Abbott. For quite a few staff at UOW, the presence of Howard and Abbott tainted the Ramsay Centre and its funds.
As explained by Farrell, the board of the Ramsay Centre has no input into what is taught in the degree. Negotiations with the centre were with two academics it employed, Simon Haines and Stephen McInerney, not with the board.
One of the concerns expressed about the degree was that Ramsay Centre representatives would be members of the selection committees for the newly hired academics. For many academics, the idea of non-academic ideologues sitting on academic selection committees was anathema. Farrell countered by emphasising that members of the Ramsay Centre board, such as Howard and Abbott, would have nothing to do with appointments. Only the Ramsay academics would be involved. A typical selection committee would have the two Ramsay academics, one outside academic, and up to six UOW academics, including Farrell as chair. Farrell said that it was not unusual for non-UOW figures to sit on selection committees. In other words, there were many precedents for the processes relating to the new degree.
Farrell noted that in his experience most selection committees operate by consensus, not voting, but that if it came to a vote, UOW members had the numbers. In response to a question about what the Ramsay academics would be looking for — the worry being that they would want candidates aligned with particular political positions — Farrell said that in his interactions so far with the Ramsay academics, their main concern was that the appointees be good teachers.
At a meeting for faculty members about the new degree held on 11 February, Marcelo Svirsky, senior lecturer in International Studies, raised a concern about the reputational damage caused by the connection between Ramsay and the university. Farrell said the university’s reputation internationally would be enhanced via connections with Columbia University and other institutions with similar sorts of degrees. Such connections were important given how difficult it is to build affiliations with leading universities. Domestically, Farrell said that information about the content of the UOW degree was gaining traction in the media, counteracting earlier bad publicity about the proposed degrees at other universities. He explicitly denied any risk to reputation.
It is fascinating to speculate what the response to the Ramsay money would have been had Howard and Abbott not been on the board. Many academics vehemently oppose the political positions of Howard and Abbott, making it difficult to accept any initiative associated with the two politicians. In the wider public, the involvement of Howard and Abbott means the Ramsay Centre is inevitably caught up in the emotions associated with right-wing politics and the so-called culture wars.
Would there be the same academic opposition to money coming from a centre linked to leading figures from green or socialist politics? This can only be surmised, because if a green-red twin of the Ramsay Centre were funding a degree, it would not be called a degree in Western civilisation.
For academics in some sections of the humanities and social sciences, “Western civilisation” is a term of opprobrium, not endearment. It is useful to note that in several fields, critique is one of the standard tools: accepted ideas, practices and institutions are subject to critical scrutiny, often with assumptions and beliefs skewered. For example, in my field of science and technology studies, challenges to ideas such as scientific progress and “technology is neutral” are fundamental to much teaching and research. Yet, in the wider public, conventional ideas about science, technology and progress remain dominant. Therefore, teaching in the field necessarily involves questioning conventional thinking.
For some, “Western civilisation” brings up images of Socrates, Michelangelo, Shakespeare and Einstein: great thinkers and creators from Europe. It also brings up images of parliamentary democracy, human rights and liberation from oppressive systems of domination. These are some of the positives of Western history and politics.
There is also a seamier side to Western history and politics. Colonialism and imperialism
sponsored by Western European states resulted in massive death, displacement
and enslavement of Indigenous peoples. In Australia, white settlement caused
death and the destruction of the culture of Aboriginal peoples.
As well as
the legacy of colonialism, the history of Europe has its own dark aspects, for
example the Crusades, the Inquisition, the horrors of the industrial revolution
and the Nazi genocide. A full account of Western cultures needs to address
their damaging as well as their uplifting sides.
While Western civilisation has been responsible for horrific deeds, these have been carried out with convenient rationales. Colonialism was seen by its defenders as part of a civilising mission, bringing enlightenment to savage peoples. Yet the aftermath of this mission continues to cause suffering. For example, in Rwanda, Belgian colonialists imposed the categories of Tutsi and Hutu on the population, helping set the stage for the 1994 genocide. In Australia, poverty and incarceration of Aboriginal people are among the contemporary consequences of colonialism.
For many academics, it is imperative to challenge the glorified myth of the beneficence
of Western culture. It is part of the scholarly quest to attain insight into
what really happened, not just what is convenient to believe, and this often
involves pointing to the unsavoury aspects of history and politics that others
would rather ignore or downplay.
In this context, the very label “Western civilisation” is an insult to some scholars in
the area, because the term “civilisation” has positive connotations unlike, for
example, “Western barbarism.” For scholars, the label “Western civilisation”
suggests a focus only on one side of a complex and contentious past and legacy.
Hutto, in presenting the subjects to be taught in UOW’s Western civilisation degree, emphasised that about half of them involved studying texts from other cultures, including texts concerning Buddhism, Islam and Indigenous cultures. To fully understand Western culture, it is valuable to appreciate other cultures: a respectful dialogue provides more insights than concentrating on Western items alone.
In addition, some of the texts that Hutto proposed from Western writers offered critical perspectives
on Western societies. In these ways, Hutto distanced the degree from Abbott’s
claim that it would be for Western civilisation,
instead positioning it as something different. In Hutto’s view, the degree uses
the study of great works of Western civilisation, in conversation with
non-Western traditions, as a way for students to develop their critical
capacities, using evidence and argument to back up their views. In short,
Hutto’s aim for the degree is that students learn how to think, not what to
think. Students are bound to be exposed to critical perspectives, including in
the major or degree they are required to take in addition to the one in Western civilisation.
The degree as designed by Hutto might clash with the conceptions of some Ramsay Centre board members. It might also clash with the public perception, at least as informed by media coverage, that the degree would be one-sided advocacy for Western contributions. Intriguingly, if Howard or Abbott were to express reservations about UOW’s degree, this would temper the media and public perceptions of one-sidedness.
One of the
problems with the concept of Western civilisation is that, in the public
debate, it is seldom defined. Some critics might say that to talk of Western
civilisation is a category mistake, attributing a reality to an abstraction
whose meaning is contested. The variability of the meaning of “Western
civilisation” may lie behind some of the disputes over the degree carrying this label.
Ramsay’s large donation seems like a boon to a cash-strapped university, enabling the hiring of staff and the running of small classes that otherwise would be infeasible. On the other hand, UOW’s planned degree creates tensions between the privileged few and the rest.
The academics hired to teach the new degree would seem to have some extra benefits. In particular, they will be teaching small classes of no more than ten high-calibre students. In contrast, their colleagues, namely the rest of the academics in the faculty, are saddled with tutorial classes of 25, plus lectures sometimes with hundreds of students.
For many academics, this contrast is a source of considerable disquiet. Imagine someone
working in a field where offerings cover the same topics as proposed in the
Western civilisation degree. They might well say, “We have the expertise and
experience in the area. Why are we being squeezed while newcomers are given
generous conditions to teach the same topics from a philosophical perspective?”
There has been no formal response to questions of this type. One reply would be to say
that there are all sorts of inequalities between staff, only some of which are
related to merit. The most obvious inequality is between permanent and
non-permanent teachers. Some of the teachers on casual appointments are just as
qualified as those with continuing appointments. There are also inequalities
between academics, especially in research. For example, some researchers are
exempted from teaching on an official or de facto basis.
Academics tend to be highly sensitive to inequality in treatment, in part because
professional status is so highly valued. There are regular disputes about
workloads: seeing a colleague with a lighter teaching load can cause envy or
resentment. That a whole group of new academics seems to receive special
conditions can bring this sort of resentment to the fore.
The students selected for scholarships to undertake the Western civilisation degree
have to satisfy several conditions. They must be Australian citizens or
permanent residents, be young, have recently completed high school and have
obtained a high score in the examinations at the end of high school. In other words,
mature-age students and international students are excluded from consideration.
Scholarship students will receive an annual stipend of $27,000, paid for up to
To some, the special privileges for scholarship students are unfair, especially the restriction to young Australian students. To this, a reply might be that inequalities between students are commonplace. The most obvious is between domestic and international students, the latter having to pay large tuition fees. Students on postgraduate scholarships are privileged too. This sometimes can be justified on merit, though the difference between students near the scholarship cut-off point may be tiny.
To appreciate the struggle over the Ramsay-Centre-funded degree in Western civilisation at the University of Wollongong, it is useful to think of the key players as using tactics to counter the moves of their opponents. Thinking this way is a convenience and does not imply that players actually think in terms of a strategic encounter.
The proponents of the degree seem to be driven by two main considerations: the
availability of a large amount of private money to be injected into the
humanities, and the opportunity to build a world-class philosophy unit. To
acquire the Ramsay money and build the philosophy unit, it was useful to
counter likely sources of opposition, in particular the opposition of academics
in cognate units concerned about the ideological associations with the Ramsay
Centre and the concept of Western civilisation.
To forestall the sort of rancorous public debate that occurred at the Australian
National University and Sydney University, which might scuttle the degree
before it was agreed, the degree proponents negotiated in secret. This did
indeed reduce public debate, but at the expense of a different source of
concern, the secrecy itself.
To counter concerns associated with the ideological associations with Ramsay and Western civilisation, Dan Hutto, designer of the degree, went to considerable effort to include in the core subjects respectful intellectual engagements with non-Western cultures, and to include negative as well as positive sides of Western culture.
However, opponents of the degree were not mollified. Some simply ignored the innovative
aspects of the subject offerings and assumed that any degree labelled “Western
civilisation” must be an apologia for Western colonialism. Other opponents,
though, focused on procedural matters, for example the fast-track approval of
the degree despite its possible risk to the university’s reputation.
One of the consequences of the degree is the introduction of a privileged stratum of staff, with much lighter teaching loads, and of students given scholarships to undertake the degree. For proponents of the degree, there is no easy way to address the associated staff and student inequality. However, this inequality has not played a significant role in the public debate. There are numerous other inequalities within universities, so perhaps the introduction of one more, despite its high profile, is not a likely trigger for public concern.
One of the
positive outcomes of the new degree is the debate it has stimulated. Hutto has
grasped the opportunity by planning to have the students discuss, in their
first week in the degree beginning in 2020, the debate about the degree itself.
For those so inclined, the new degree provides a golden opportunity to articulate
critiques of Western civilisation and make them available to staff and students
in the new School of Liberal Arts. Although Tony Abbott claimed that the
Ramsay-funded degrees would be for Western
civilisation, it is quite possible that many of the degree graduates will develop
a sophisticated understanding of Western civilisation. Perhaps, along the way,
members of the public will learn more about both the high and low aspects of Western civilisation.
What would Paul Ramsay think of the furore over degrees in Western civilisation? Perhaps
he would be bemused that his bequest is receiving much more attention than he
ever sought for himself during his lifetime.
I thank the many individuals who have discussed the issues with me and who have offered comments on drafts.
 In the debate about Ramsay
Centre funding, Paul Ramsay and Ramsay Health Care have scarcely been
mentioned. Michael Wynne, a vigorous critic of corporate health care, developed
an extensive website with information about numerous healthcare corporations in
the US and Australia. While being critical of for-profit healthcare, Wynne has
relatively generous comments about Paul Ramsay himself and about Ramsay Health
Care, at least compared to other players in the corporate scene. See:
Wynne’s pages on Ramsay were last updated in 2005, but after this Paul Ramsay played a less direct role in Ramsay Health Care.
 I attended
meetings on 16 January and 11 February 2019 held for members of the Faculty of
Law, Humanities and the Arts. Theo Farrell and Dan Hutto spoke about plans for
the new degree and answered questions.
 Another factor,
specific to UOW, was the setting up of a Faculty of Social Sciences that,
despite its name, does not house the classic social sciences of sociology,
political science and economics. This faculty set up a social science degree
that is in direct competition with the arts degree, attracting students who
otherwise would have contributed to the budget for the Faculty of Law,
Humanities and the Arts.
 Andrew Herring, “University of Wollongong continues global expansion into Malaysia,” 19 November 2018, https://media.uow.edu.au/releases/UOW253448.html: The media release begins as follows: “The University of Wollongong (UOW) has continued its global expansion by acquiring the university colleges of Malaysian private education provider KDU from long-standing Malaysian investment company Paramount Corporation Berhad (PCB).
Subject to Malaysian Ministry of Education approval,
the deal will see UOW wholly-owned subsidiary, UOW Global Enterprises, immediately
acquire a substantive majority equity interest in the university colleges in
Kuala Lumpur and Penang—including the new campus under construction in Batu
 Tony Abbott, “Paul Ramsay’s vision for Australia,” Quadrant Online, 24 May 2018, https://quadrant.org.au/magazine/2018/04/paul-ramsays-vision-australia/. Quite a few commentators blamed Abbott’s article for hindering acceptance of a Ramsay-funded degree at the Australian National University, e.g. Michael Galvin, “Abbott single-handedly destroys Ramsay Centre for Cheering On White People,” The Independent, 17 June 2018; Peter van Onselen, “Ramsay Centre has Tony Abbott to blame for ANU’s rejection,” The Australian, 9 June 2018. Note that the preposition “for” is contained in the full name of the centre: the Ramsay Centre for Western Civilisation.
 Entry to the degree course is open to students of any age, and to five non-residents. The conditions mentioned apply only to those receiving Ramsay scholarships, and even then exceptions can be made. An ATAR (Australian Tertiary Admission Rank) of 95 has been mentioned as an expectation for scholarship recipients. Other factors will be taken into account.
Being the subject of news coverage can be both exciting and disturbing.
Have you ever been in the news? If you’re a politician, sports star or celebrity, of course you have, but for others it can be a rare experience. What does it feel like?
There is a vast amount of writing and research about the news. However, most of the research is from the point of view of either journalists or audiences. Surprisingly, few have bothered to interview so-called “ordinary people” appearing in the news. Ruth Palmer, in her new book Becoming the News: How Ordinary People Respond to the Media Spotlight, has addressed this omission. Her findings are fascinating.
My own experience is not typical. For decades I have regularly spoken with journalists, and I’ve written quite a few articles and letters to the editor published in newspapers. However, I do remember one of the first times I was on television. A television crew came to a friend’s house and I was interviewed on camera for half an hour. When the programme was broadcast, less than half a minute of the interview was used. That was when I concluded that television is the most manipulative of the mass media.
Palmer is now a professor of communications at IE University in Spain. In doing her PhD at Columbia University, she set out to study the experiences of people in the US who had been in news stories, usually without any initiative on their part, and this research became the basis for Becoming the News.
After the famous “miracle on the Hudson,” when a pilot landed a damaged passenger plane on the Hudson River with no loss of life, journalists sought the views of survivors. Palmer interviewed one of them, “Albert” (a pseudonym). Also among the 83 people she interviewed in 2009-2011 were “Deanne,” who witnessed an attempted suicide and was approached for comment, and “Alegra,” who miscarried due to a rare syndrome and agreed to speak to the media about it.
There were a variety of reasons why
Palmer’s interviewees had encounters with the media. What the interviewees had
in common was the novel experience of having their words or images conveyed to
a wide public in a story written by someone else — a journalist.
“Subjects imagined that those large audiences not only saw the coverage but also believed it. Based on their subsequent interactions with people who had seen them in the news, this usually proved to be true. This is the final factor that defines news subjecthood: being represented by a journalist in a mainstream news product means being represented in a product that makes authoritative truth claims.” (p. 8)
In many cases,
people make a voluntary choice to be in the news. A journalist rings and asks
for comments on an issue. It’s possible to say no, but many subjects agree, for
a variety of reasons. Some want to inform the public about an issue, like Alegra
who wanted to warn other mothers. Some seek publicity for their business or cause.
Some people have just experienced something dramatic, like a plane crash or being shot in the street. Palmer calls the event leading to journalists being interested a “trigger.” If you’ve just been collateral damage in a shooting incident, or witnessed someone trying to kill themselves, you could be traumatised. It’s not an ideal time to be talking to a journalist, but even so many people agree: they are witnesses and are willing to give their point of view.
Journalists are hardly neutral in this process. They want a story. They learn how to encourage subjects to agree to comment and how to get them talking. In this sort of encounter, the journalist is highly skilled and experienced whereas the subject is unprepared and sometimes traumatised. Some journalists exploit people’s natural inclination to respond to questions.
In general, people agree to be
interviewed if they think they will benefit more than they will be harmed. Some
don’t want their stories told, especially if they have something to hide.
In my own experience, most journalists are straightforward in their dealings and competent in doing their jobs. For example, they ring me to comment about whistleblowing or plagiarism or some other topic about which I’ve written. My situation is different from that of the “ordinary people” Palmer writes about; I’ve never been approached after witnessing a crime or being in an accident.
Palmer’s findings about the accuracy of stories are especially interesting. First consider the point of view of journalists and editors: they put a premium on factual accuracy. Some high-prestige media, like the New York Times, employ fact-checkers to ensure accuracy in stories.
However, Palmer found that most
subjects were not too worried about factual errors, such as giving the wrong
street or even misspelling their names. They were far more disturbed by the
general impression given by the story, especially when it was different from
what, based on an interview, they had anticipated.
For an ordinary person to be
featured in the news means being singled out as special: in most cases, it adds
to the person’s status. This occurs despite the news media having a low
reputation generally. Many subjects were thrilled by the stories. They thought
the journalists had done a good job, and had given them greater visibility than
they could have achieved otherwise.
For subjects who wanted to promote a
business or a cause, media coverage provided more effective advertising and
legitimation than alternatives. Stories were especially credible because they
were written by someone else, a journalist.
Subjects reported that when family
members and friends saw their name in the newspaper, many of them bought copies
and sent congratulations. In quite a few cases, this response seemed
independent of what the story said. Being in the news was enough to be seen as special.
For a small minority, though, news
coverage was a disaster. This was mainly when the story was about something
disreputable, such as a crime, or simply cast them in a bad light. A university
student was quoted out of context in a way that made her look bad, and as a
result received abusive comments from peers and strangers alike.
Many subjects found it strange,
indeed unnerving, to see how they were portrayed in the news. It is difficult
enough for most people to appreciate how others see them. News coverage
provides one avenue.
The strangeness arises from a contrast of perspectives. Subjects knew about their own lives, of course. Then a sliver of their lives was interpreted by someone else, a journalist, and presented to the world, so readers would assume that that was what they were like. Subjects could examine the coverage and contrast it with their own self-perception. Added to this was the knowledge that many other people, people who didn’t know them otherwise, were forming their opinions of them based on this particular portrayal.
With time and experience, people can
get used to media coverage of themselves. Palmer’s subjects, though, were
newcomers to the experience.
Journalists, according to Palmer, evaluate their reporting mainly in terms of accuracy and
ethical process. Subjects, while deeming these facets important, were much more
concerned with the overall orientation of the coverage and with its impact on
audiences. These are aspects given less attention in US media coverage. If a
journalist had to worry about the impact of coverage on the life of an
interviewee, this could lead to a type of self-censorship.
Long after journalists have moved on
to other stories, subjects may be coping with the impact of being in the news.
This is exacerbated by the indefinite online availability of stories.
Pre-Internet, media coverage would come and go, with impacts being localised in
time and space. With online stories, the coverage can have a long-term impact
via search engines.
One of Palmer’s subjects, “Rich,” had been arrested for having kidnapped a politician’s wife, a story given local media coverage. Later, he was released because he had nothing to do with the kidnapping. However, his exoneration was not newsworthy. His employer believed Rich was innocent but fired him anyway, because clients might find the damaging media stories online. Three years later, Palmer reports, Rich was still unemployed.
Rich’s disastrous experience with media coverage highlights something relevant to most of Palmer’s subjects: they realised that journalists and editors had far more power than they did. In effect, they were at the mercy of journalists, who could decide how to frame stories, enhancing or damaging their reputations.
Journalists have a lot of power because their stories can have a wide and long-lasting impact. Furthermore, this power is mostly unaccountable: in the face of unwelcome coverage, the ordinary person has little recourse aside from expensive services to manage online reputations. That journalists have a lot of power does not mesh easily with journalists’ own self-image. Pressured to produce ever more stories with fewer resources, they see themselves, if anything, as courageous champions of the underdog, shining a spotlight on the wrongdoings of powerholders, in the tradition of what is called the fourth estate. With this self-image, it is easy to forget that media coverage can have drastic impacts on the subjects of that coverage and that journalists’ relationship with those subjects is quite unequal.
Insights about the news
Based on her interviews and other research, Palmer offers a set of lessons for journalists and subjects. To these I would add a few suggestions for consumers of the news, reading about someone who is portrayed as, for example, a hero, an innovator, a victim or a crook.
It’s useful to remember that media
portrayals can, at best, capture only one aspect of a person’s life. So try not
to assume that coverage defines a person. This is especially important when the
treatment is negative. It is also a warning not to engage in social media mobbing
without full information.
In one instance in which I came under attack in a newspaper, there were numerous hostile social media comments. I received a number of hostile emails as well as favourable ones. Most disturbingly, I received just one request for more information. The lesson is that if you see negative coverage about someone and don’t know them, then refrain from joining in an attack; instead, ask them for their side of the story. Or ask someone else who might have independent information.
If a friend of yours is in the media, you might congratulate them, assuming the coverage is positive. You might also take extra care and talk to them about the issues involved.
On quite a few occasions, acquaintances have said to me, “I heard you on the radio.” Sometimes, not remembering the interview, I say, “What was I talking about?” Usually they can’t remember. This experience accords with Palmer’s observation that media coverage conveys status independently of the content of the coverage. So when you hear someone you know on the radio, you might like to strike up a conversation about the topic. You might learn something extra. But be careful: they might be sick of the topic and want to talk about anything else.
There’s an old saying in media studies: “Newspapers don’t tell people what to think; they do tell people what to think about.” Keep this in mind when you respond to media stories and try, at least occasionally, to explore what wasn’t in the news.
Here’s the “deep story” that Palmer’s subjects felt was true about the mass media:
“The news media is extremely powerful — much, much more powerful than most citizens. Journalists are primarily motivated by profit and status, rather than public service. And yet, outrageously, journalists claim the mantle of public defender. Thus hypocrisy and the potential for abuse define the news media’s relationship to the public.” (p. 214)
To achieve happiness, can it be useful to pursue pain and discomfort?
Many people make enormous efforts to avoid stress and strain. They will search for a convenient parking space rather than walk a few hundred metres. When the temperature gets too hot or cold, they turn on the cooling or heating. For headaches, there are analgesics. For emotional pain, therapy or maybe a stiff drink.
While avoiding pain, people often pursue pleasure. This can be comfortable chairs, tasty food, thinking positive thoughts and becoming absorbed in social media. Pleasure is commonly seen as the opposite of pain.
But what if much of this quest is misguided? That is the argument presented by Brock Bastian in his new book The Other Side of Happiness. Bastian, a psychology researcher at the University of Melbourne, reports on studies by himself and others that support a seemingly counter-intuitive conclusion: pain can be a route to true happiness.
Bastian begins by noting a curious phenomenon. Despite the apparent vanquishing of both physical and emotional pain, levels of anxiety and depression in young people seem to be increasing. I noticed this among students in my classes. Colleagues who deal with student issues tell me the entire university sector is affected. Richard Eckersley has written about the problems affecting young people who, despite reporting high happiness levels, seem to suffer inordinately high levels of psychological distress.
Bastian reports on something else: the pursuit of pain. You might ask, who, except for masochists, would voluntarily seek painful experiences? Actually, quite a few do. Running a marathon is gruelling, yet surprising numbers of people see this as a worthwhile goal. Likewise climbing mountains. Eating a hot chilli pepper can be bracing. Some people get a thrill out of scary rides or jumping out of aeroplanes, even though (or because) these cause a huge adrenaline rush.
There are also painful emotional experiences. For some, singing in front of others requires enormous courage, yet this is undertaken voluntarily. Others find it nerve-racking to approach someone they revere.
How should a psychologist go about doing controlled studies of how people handle pain, both physical and emotional? It’s hardly feasible to have subjects scale mountain cliffs or have an audience with the Queen.
For physical pain, one ingenious method is to ask subjects to hold their hands in a bucket of ice water. This is quite painful but not harmful. Before or after the ice water treatment (or, for controls, some other activity that isn’t painful), subjects then are asked to do other tasks. The way they react to these tasks reveals something about the role of pain.
For example, one experiment used a task that tested generosity, such as donating to a worthy cause. What do you think: would experiencing physical pain make people more or less generous? (The answer: more generous.)
For emotional pain, a clever technique is to simulate ostracism. In a computer game, subjects find they are being left out of the interaction by the other players. So strong is the urge to be included in a group that even in this short simulation being neglected is a distressing experience.
As well as studies in the lab, psychologists also undertake survey research. For example, one finding is that early stress in a marriage can make it resilient in the face of future challenges, and lead to greater satisfaction.
Based on a wide range of evidence, from lab studies to studies of trauma victims, Bastian concludes that it’s better to encounter some adversity in our lives. It shouldn’t be overwhelming, just enough to build the capacity to overcome it. In this process, we become emotionally stronger. Conversely, hiding from pain gives it extra power to cause distress.
“The key to healthy psychological functioning is exposure. If we want to be happy, we cannot afford to hide from our challenges and surround ourselves in protective layers of comfort. To achieve emotional stability and the capacity to handle challenges when they arise, we may be well advised to occasionally seek out discomfort and to take ourselves outside our proverbial comfort zones more often than we do.” (p. 95)
Bringing people together
In 1980, Lindy Chamberlain’s baby Azaria was taken away by a dingo. In television interviews, she put on a brave face, hiding her grief. Unfortunately, this was damaging to her credibility, because not showing emotions makes others think you deserve your pain.
On the other hand, expressing your physical or emotional pain triggers support from others. This is observed in the outpouring of generosity after disasters. It is also observed in combat, which bonds fighters together.
Support from people you know or trust makes a difference: it actually reduces the pain. Bastian notes that even a photo of a loved one can have this effect. It is not surprising, then, that experiencing pain encourages people to seek social connections.
Keep a photo of your loved one handy
There is another fascinating social effect of hardship: studies show it can promote creativity. So perhaps there is some truth in the stereotypical image of the struggling artist. Bastian concludes, “We need to endure the challenge of sometimes stressful, novel and potentially threatening environments to foster true originality.” (p. 125)
This idea might be used to justify unpleasant working conditions and precarious employment. On the other hand, it could also justify reducing executive salaries and putting political leaders in small, cramped offices.
There’s an important qualification that needs to be emphasised. When discomfort is voluntary, inhibiting desires can improve performance. An example is uncomfortable yoga postures, which can help train the mind to focus. But involuntary discomfort, for example chronic pain, reduces performance. The implication is that imposed pain should be reduced or relieved, while there should be more opportunities for voluntary discomfort.
Bastian cites eye-opening data showing that people in poorer countries report greater meaning in their lives. Perhaps this should not be such a surprise given the number of well-off people who seem to lack purpose, spending time on fleeting pleasures rather than pursuing deeper connections. Note that country comparisons can be misleading and that having a meaningful life is not the same as being happy.
Negative experiences, including being reminded of death, trigger a search for meaning, leading to a greater sense of purpose that isn’t there when there is no suffering. Bastian describes research on an earthquake emergency. People who had thoughts of dying during the earthquake were more likely to shift their priorities from extrinsic to intrinsic ones. This meant, for example, putting less priority on income and possessions and more on relationships and beliefs. Bastian concludes, “The more we consciously engage with our own mortality the more likely we are to focus on things that matter; to seek out things that are ultimately likely to provide more depth in our lives.” (p. 170)
The Other Side of Happiness provides a powerful counter to the usual emphases in society, in which the priority is seeking pleasure and reducing pain. It also puts a somewhat different perspective on happiness research. Happiness researchers have challenged the usual emphasis on possessions, income, good looks and education, saying that, outside of poverty, they have only a limited impact on wellbeing. Instead, changing one’s thoughts and behaviours has greater impact, for example expressing gratitude, being mindful, being optimistic, building relationships and helping others.
However, happiness research gives little attention to the benefits of physical and emotional pain. This is addressed by implication in recommendations for physical activity, building resilience and pursuing a purpose. However, the painful sides to these activities are seldom emphasised, perhaps because it is not easy to sell a recommendation for seeking pain rather than pleasure.
Yet that is exactly Bastian’s recommendation. He says there is a need to recognise that stress, struggle and pain can bring happiness. Examples include intense exercise, having children, working hard and helping others. The key is to recognise the process, namely to see the positive side of negatives.
The takeaway message: seek out calculated risks and challenges, and let your children do the same. Search for discomfort and embrace feelings of sorrow and loss. Recognise that experiencing and valuing unpleasant experiences can be a path to greater satisfaction.
Some Australian media outlets have been warning that university students are unduly protected from disturbing ideas. But are these same media outlets actually the ones that can’t handle disturbing ideas?
For years, I’ve been seeing stories in The Australian and elsewhere about problems in universities associated with political correctness (PC). The stories tell of students who demand to be warned about disturbing material in their classes, for example discussions of rape in a class on English literature. The students demand “trigger warnings” so they can avoid or prepare for potentially disturbing content. Detractors call them “snowflake students”: they are so delicate that, like a snowflake, they melt on exposure to anything slightly warm.
Former Labor Party leader Mark Latham, for example, referred to “the snowflake safe-space culture of Australian universities.”
Richard King, the author of On Offence: The Politics of Indignation, reviewed Claire Fox’s book I Find that Offensive. King says that the principal target of Fox’s book “is ‘the snowflake generation’, which is to say the current crop of students, especially student activists, who keep up a constant, cloying demand for their own and others’ supervision. ‘Safe spaces’, ‘trigger warnings’ and ‘microaggressions’ are all symptoms of this trend.”
I treat these sorts of stories with a fair bit of scepticism. Sure, there are some incidents of over-the-top trigger warnings and demands for excessive protection. But are these incidents representative of what’s happening more generally?
Before accepting that this is a major problem, I want to see a proper study. A social scientist might pick a random selection of universities and classes, then interview students and teachers to find out whether trigger warnings are used, whether class discussions have been censored or inhibited, and so forth. I’ve never heard of any such study.
What remains is anecdote. Media stories are most likely to be about what is unusual and shocking. “Dog bites man” is not newsworthy but “man bites dog” might get a run.
Most of the Australian media stories about trigger warnings and snowflake students are about what’s happening in the US, with the suggestion that Australian students are succumbing to this dire malady of over-sensitivity.
Trigger warnings: Australian movie and video game classifications
There is a case for trigger warnings. Nevertheless, in thirty years of undergraduate teaching, I never saw any need for them — except when I asked students to use them.
For one assignment in my class “Media, war and peace,” students formed small groups to design an activity for the rest of the class. The activity had to address a concept or theory relating to war or peace, violence or nonviolence. Quite a few student groups chose the more gruesome topics of assassination, torture or genocide, and some of them showed graphic pictures of torture and genocidal killings.
Never did a single student complain about seeing images of torture and killing. Nevertheless, I eventually decided to request that the student groups provide warnings that some images might be disturbing. Thereafter, when groups provided warnings, no students ever excused themselves from the class. I was watching to see their reactions and never noticed anyone looking away.
This is just one teacher’s experience and can’t prove anything general. Still, it suggests that many Australian students are pretty tough when it comes to seeing images of violence. Perhaps they have been desensitised by watching news coverage of wars and terrorist attacks.
However, appearances can be deceptive. My colleague Ika Willis pointed out to me that students may hide their distress, and that few would ever complain even if they were distressed. So how would I know whether any of my students were trauma survivors and were adversely affected? Probably I wouldn’t. That is an example of why making generalisations about trigger warnings based on limited evidence is unwise.
A journalist attends classes – covertly
On 8 August 2018, Sydney’s Daily Telegraph ran a front-page story attacking three academics at Sydney University for what they had said in their classes. The journalist, Chris Harris, wrote about what he had done this way: “The Daily Telegraph visited top government-funded universities in Sydney for a first-hand look at campus life …” This was a euphemistic way of saying that he attended several classes without informing the teachers that he was attending as a journalist, and covertly recorded lectures without permission. Only in a smallish tutorial class, in which the tutor knows all the students, would an uninvited visitor be conspicuous.
Harris then wrote an exposé, quoting supposedly outrageous statements made by three teachers. This was a typical example of a beat-up, namely a story based on trivial matters that are blown out of proportion. Just imagine: a teacher says something that, if taken out of context, can be held up to ridicule. Many teachers would be vulnerable to this sort of scandal-mongering.
One issue here is the ethics of covertly attending classes and then writing a story based on statements taken out of context. Suppose an academic covertly went into media newsrooms, recorded conversations and wrote a paper based on comments taken out of context. This would be a gross violation of research ethics and scholarly conventions. To collect information by visiting a newsroom would require approval from a university research ethics committee. Good scholarly practice would involve sending a draft of interview notes or the draft of a paper to those quoted. In a paper submitted for publication, the expectation would be that quotes fairly represent the issues addressed.
A typical Daily Telegraph front page
Where are the snowflake students?
So when Harris attended classes at universities in Sydney, did he discover lots of snowflake students who demanded to be protected by trigger warnings? He didn’t say, but it is clear that at least two individuals were highly offended: a journalist and an editor! They thought the classroom comments by a few academics were scandalous.
In a story by Rebecca Urban in The Australian following up the Telegraph exposé, Fiona Martin’s passing comment about a cartoon by Bill Leak comes in for special attention. According to this story, “The Australian’s editor-in-chief Paul Whittaker described the comment as ‘appalling’ and ‘deeply disrespectful’.”
So apparently News Corp journalists and editors are the real snowflakes, not being able to tolerate a few passing comments by academics that weren’t even intended for them or indeed for anyone outside the classroom. Or perhaps these journalists and editors are outraged on behalf of their readership, who they consider should be alerted to the dangerous and foolish comments being made in university classrooms.
Where in this process did the call for students to be tough and be exposed to vigorous discussion suddenly dissolve?
The contradiction is shown starkly in a 10 August letter to the editor of The Australian by Andrew Weeks. The letter was given the title “Bill Leak’s legacy is his courage in defending the right to free speech”. Weeks begins his letter by saying “I am unsure what is most disturbing about the abuse of sadly departed cartoonist Bill Leak by Fiona Martin.” After canvassing a couple of possibilities, he says “Perhaps it is the fact that Sydney University has supported its staffer, offering lip service in support of freedom of speech when that is exactly what is being endangered by the intolerance characteristic of so many university academics.”
The logic seems to be that freedom of speech of Bill Leak (or those like him) is endangered by an academic’s critical comment in a classroom, and that a university administration should not support academics who make adverse comments about Leak.
Again it might be asked, what happened to the concern about the snowflake generation? The main snowflakes are, apparently, a journalist, an editor and some readers. Perhaps it would be wise in future for journalists to avoid visiting university classrooms so that they and their readers will not be disturbed by the strong views being expressed.
Universities do have serious problems, including a heavy reliance on casual teaching staff and lack of support for international students, both due to lack of money. More students report problems with anxiety and depression. There is also the fundamental issue of the purpose of higher education, which should not be reduced to job preparation. Instead of addressing these issues, News Corp newspapers seem more interested in the alleged danger, apparently most virulent in humanities disciplines, of political correctness.
My focus here is on an apparent contradiction or discrepancy in treatments of PC and “snowflake students” in The Australian and the Daily Telegraph. While decrying the rise of the so-called snowflake generation, journalists and editors seemed more upset than most students by comments made in university classrooms.
One other point is worth mentioning. If you want to inhibit vigorous classroom discussions of contentious issues, there’s no better way than spying on these discussions with the aim of exposing them for public condemnation. This suggests the value of a different sort of trigger warning: “There’s a journalist in the classroom!”
Further reading (mass media)
Josh Glancy, “Rise of the snowflake generation,” The Australian, 8-9 September 2018, pp. 15, 19.
Christopher Harris, “Degrees of hilarity” and “Bizarre rants of a class clown,” Daily Telegraph, 8 August 2018, pp. 4-5.
Richard King, “Fiery blast aimed at ‘snowflake generation’,” The Australian, 1 April 2017, Review p. 22.
Mark Latham, “The parties are over,” Daily Telegraph, 9 January 2018, p. 13.
Bill Leak, “Suck it up, snowflakes,” The Australian, 11 March 2017, p. 15.
Rebecca Urban, “Uni backs staffer on secret suicide advice,” The Australian, 9 August 2018, p. 7; (another version) “University of Sydney stands by media lecturer following Bill Leak attack,” The Australian, 8 August 2018, online.
Further reading (scholarly)
Sigal R. Ben-Porath, Free Speech on Campus (University of Pennsylvania Press, 2017).
Emily J. M. Knox (ed.), Trigger Warnings: History, Theory, Context (Rowman & Littlefield, 2017).
Acknowledgements Thanks to several colleagues for valuable discussions and to Tonya Agostini, Xiaoping Gao, Lynn Sheridan and Ika Willis for comments on a draft of this post. Chris Harris and Paul Whittaker did not respond to invitations to comment.
According to mainstream scientists, HIV transmission in Africa operates differently than elsewhere. An alternative view has been systematically ignored and silenced.
HIV prevalence in Africa
AIDS is the most deadly new disease in humans, with the estimated death toll exceeding 30 million. To restrain the spread of the infective agent HIV, scientists have tried to figure out how it is transmitted. The consensus is that HIV is most contagious via blood-to-blood exposures, such as through shared injecting needles, and in comparison the risks of transmission via heterosexual sex and childbirth are small.
However, there’s a mystery in relation to Africa. The scientific consensus is that in Africa, unlike elsewhere, HIV spreads mainly through heterosexual sex. Why should this be?
My own interest in research on AIDS derives from a different controversy, the one over the origin of AIDS. The standard view is that AIDS first appeared in Africa and was due to a chimpanzee virus, called a simian immunodeficiency virus or SIV, that got into a human, where it was called a human immunodeficiency virus or HIV. Chimps have quite a few SIVs, but these don’t hurt them presumably because they have been around long enough for the population to adapt to them, in the usual evolutionary manner. There are various species of chimps, and when a chimp is exposed to an unfamiliar SIV, it can develop AIDS-like symptoms.
So the question is, how did a chimp SIV enter the human species and become transmissible? The orthodox view is that this occurred when a hunter was butchering a chimp and got chimp blood in a cut, or perhaps when a human was bitten by a chimp, or perhaps through rituals in which participants injected chimp blood.
In 1990, I began corresponding with an independent scholar named Louis Pascal who had written papers arguing that transmissible HIV could have entered humans through a polio vaccination campaign in what is present-day Congo, in which nearly a million people were given a live-virus polio vaccine that had been grown on monkey kidneys. The campaign’s time, 1957 to 1960, and location, central Africa, coincided with the earliest known HIV-positive blood samples and the earliest known AIDS cases.
Despite the plausibility and importance of Pascal’s ideas, no journal would publish his articles, so I arranged for his major article to be published in a working-paper series at the University of Wollongong. Independently of this, the polio-vaccine theory became big news. Later, writer Edward Hooper carried out exhaustive investigations, collected much new evidence and wrote a mammoth book, The River, that put the theory on the scientific agenda. Over the years, I wrote quite a few articles about the theory, not to endorse it but to argue that it deserved attention and that scientific and medical researchers were treating it unfairly.
In the course of this lengthy controversy — which is not over — I became increasingly familiar with the techniques used by mainstream scientists to discredit a rival, unwelcome alternative view. I had been studying this, on and off, since the early 1980s; the origin-of-AIDS saga made me even more attuned to how dissenting ideas and researchers can be discredited.
With this background, when I read John Potterat’s chapter “Why Africa?” it was like he was providing a front-row seat for a tutorial on how an unwelcome view can be marginalised. I saw one familiar technique after another.
I’m not here to say that Potterat’s view is correct. Furthermore, unlike the origin-of-AIDS debate, I haven’t studied writings about HIV transmission in Africa. What I do here is outline Potterat’s account of his experiences and comment on the techniques used to dismiss or discredit the ideas he and his collaborators presented to the scientific community.
HIV is infectious, so it is important to know exactly how it gets from one person to another. Knowing transmission routes is the basis for developing policies and advice to prevent the spread of the virus.
In Seeking the Positives, Potterat tells about his personal journey in scientific work. It was unusual. With a degree in medieval history, he ended up with a job in Colorado Springs (a moderate-sized town in Colorado) tracking down networks of people with sexually transmitted diseases (STDs). Learning from his mentors, the approach he developed and pursued with vigour was to interview infected individuals, find out their sexual or injecting-drug partners and proceed to build up a database revealing the interactions that spread the disease. The military base near the city meant there were lots of prostitutes (some permanent, some seasonal) and STDs to track. This sort of shoe-leather investigation (seeking those positive for disease) led to many insights reflected in a vigorous publication programme. For the Colorado Springs research team, AIDS became a key focus from the 1980s on.
When submitting a paper to a scientific journal, editors and reviewers are supposed to assess it on its merits. It should not matter whether an author has a PhD in epidemiology from Oxford or no degree at all. The test is the quality of the paper. Potterat became the author of dozens of scientific papers. However, his unusual background may have been held against him in certain circles.
In Seeking the Positives, Potterat doesn’t tell that much about his team’s clients/informants. Sensitively interviewing prostitutes, partners of prostitutes, drug users, gay men and others would have been a fascinating topic in itself, but Potterat focuses on the research side of the story.
A diagram from one of Potterat’s papers
You might think that contact tracing is an obvious way to study the transmission of disease, especially a new disease for which the patterns of contagion are not fully understood. But what Potterat’s team was doing was unusual: mainstream AIDS researchers pursued other approaches. Because the mainstream researchers had lots of research money, they didn’t take kindly to a small, non-prestigious team doing something different.
Mainstream groups, both researchers and activists, raised a series of objections to HIV contact tracing. First they said there was no reason for contact tracing unless there was a test for HIV. Second, after a test became available in 1985, they said tracing would allow the government to compile lists of homosexuals. Third, they said that without effective treatment, notifying individuals would distress them and lead to suicides. Fourth, after the drug AZT became available in 1987, they said contact tracing would be too expensive.
The interesting thing here is that none of the objections was backed by any evidence. Potterat says that in his team’s studies nearly all of those approached for contact tracing were very helpful.
“Contact tracing was generally opposed by AIDS activists, by civil libertarians, and (disappointingly) by many public health workers, who were often influenced by political correctness and by not wanting to offend strident constituencies.” (pp. 68-69)
Later, mainstream public health officials in the US took the line that AIDS was a danger to the heterosexual population, not just to gays and injecting drug users. If HIV was highly contagious in the wider population, this lowered the stigma attached to gays and injecting drug users, and coincidentally made it possible to attract more funding to counter the disease, a worthy objective. However, contact tracing showed that HIV transmission was far higher in specific populations. This was another reason the research by Potterat’s group, published in mainstream journals, didn’t lead to changes in research priorities more generally.
HIV transmission in Africa
In 2000, Potterat was approached by David Gisselquist, who questioned the usual explanations of why the mechanisms of HIV transmission in Africa were claimed to differ from those in Western countries. After his retirement the following year, Potterat and some of his collaborators joined with Gisselquist in examining the studies that had been made.
The orthodox view was that in Africa, uniquely, HIV transmission occurs primarily through heterosexual sexual activity. This, according to Potterat et al., was based on assumptions about high frequencies of sexual interactions and high numbers of partners, neither of which were supported by evidence. They said the evidence suggested that sexual activity in Africa was much like elsewhere in the world.
If this was the case, the orthodox view couldn’t explain HIV transmission in Africa, so what could? The answer, according to Potterat and his collaborators, was skin-puncture transmission that occurred when contaminated needles were reused during health-care interventions such as blood testing, vaccinations and dental work, plus tattooing and traditional medical practices. This was heresy. It was also important for public health. Potterat writes, “Only when people have accurate knowledge of HIV modes of transmission can they make good decisions to protect themselves and their families from inadvertent infection.” (p. 200)
Potterat’s team wrote dozens of papers, but they had a hard time getting them published in top journals, where orthodoxy had its strongest grip. Nevertheless, they were quite successful in publishing in reputable journals of slightly lower standing.
The most common response was to ignore their work. Even though Potterat et al. had poked large holes in the orthodox view, orthodoxy was safe if the critique was given no attention.
Another response was to try to prevent publication of orthodoxy-challenging research. One study was by a team, not Potterat’s, involving Janet St. Lawrence, then at the Centers for Disease Control and Prevention (CDC), and her colleagues. According to Potterat, St. Lawrence’s CDC superiors asked her not to publish the paper, but she refused. The paper was rejected by several journals, and then submitted to the International Journal of STD & AIDS. After peer review and acceptance, the CDC applied pressure on the editor to withdraw acceptance, but he refused. This is just one example of efforts made to block publication of dissenting research findings.
Janet St. Lawrence
“… it does not engender trust in the official view to know that our informal group has solid evidence of several instances by international health agencies actively working to suppress findings supportive of non-sexual transmission and to discourage research into non-sexual transmission.” (p. 221)
Another tactic was to misrepresent views. On 14 March 2003, the World Health Organisation held a meeting of experts to, as stated in a memo to participants, “bring together the leading epidemiological and modeling experts with Gisselquist and Potterat.” Potterat was dismayed by the consultation: data disagreeing with the orthodox view were dismissed. After the meeting, WHO put out a statement presented as representing a consensus. In fact, this so-called consensus statement did not represent everyone’s viewpoints and was finalised before the meeting concluded. (This was an exact parallel to what happened at an origin-of-AIDS conference.)
Potterat was surprised and disappointed to be subject to ad hominem comments, otherwise known as verbal abuse. He writes:
“Among other, less printable, things I was called ‘Africa’s Newest Plague’; ‘Core Stigmatizer’; ‘Linus Pauling—in his later years’ (when Pauling was thought to be advancing crackpot ideas); and [a reward being offered] ‘for his head on a platter’.” (pp. 193-194)
Potterat was surprised at this invective because none of his team had imagined the resistance and anger their work would trigger among mainstream agencies and researchers. He was disappointed because many of the comments came from colleagues he had previously admired.
Researchers into the dynamics of science have coined the term “undone science” to refer to research that could be done and that people are asking to be done, but nevertheless is not carried out. A common reason is that the findings might turn out to be unwelcome to powerful groups. Governments and industry, through their control over most research funding, can stifle a potential challenge to orthodoxy by refusing to do or fund relevant research.
Undone science is most common in areas where citizen groups are calling out for investigations, for example on the environmental effects of mining in a particular area or the health effects of a new chemical. Three research students whom I supervised used the idea of undone science as a key framework for their theses, on drugs for macular degeneration, on vaccination policy, and on the cause of the cancer afflicting Tasmanian devils. My former PhD student Jody Warren and I, drawing on our previous work, wrote a paper pointing to undone science in relation to three new diseases. With this experience, I was attuned to notice cases of undone science in whatever I read. In Potterat’s chapter “Why Africa?” there were many striking examples.
In their papers, Potterat and his colleagues presented findings but, as is usual in scientific papers, acknowledged shortcomings. In one case, to counter criticisms, they reviewed research on the efficiency of HIV transmission by skin-puncturing routes, while admitting that new studies were needed to obtain better data. Potterat concludes, “To my knowledge, such studies have not been fielded.” (p. 199)
In another case, concerning discrepancies in studies of hepatitis C strains and patterns, Potterat writes, “In the intervening decade, however, no studies had been fielded to resolve these uncertainties.” (p. 199)
Potterat and his collaborators were unable to obtain external funding to carry out studies to test their hypotheses. So Potterat used his own money for a small study of HIV transmission in Africa. “Yet this pilot study supported our contentions and should have provoked the conducting of larger studies to confirm our findings. Regrettably, this did not happen.” (p. 205)
As stated earlier, I am not in a position to judge research about transmission of HIV in Africa. I approach the issue through Potterat’s account of the tactics used by supporters of orthodoxy against a contrary perspective. The tactics, according to him, included ignoring contrary findings, denigrating the researchers who presented them, putting out a misleading consensus statement, and refusing to fund research to investigate apparent discrepancies. I was struck by the remarkable similarity of these tactics to those used against other challenges to scientific and public-health orthodoxy. This does not prove that the dissident viewpoint is correct but is strong evidence that it has not been treated fairly. To be treated fairly is usually all that dissident scientists ask for. The hostile treatment and failure to undertake research (“undone science”) suggest that defenders of orthodoxy are, at some level, afraid the challengers might be right.
Potterat nicely summarises the multiple reasons why the findings by him and his colleagues were resisted.
“By their own admission, the international agencies feared that our work would cause Africans to lose trust in modern health care, especially childhood immunizations, as well as undermine safer sex initiatives. (Recall that their condom campaigns were also aimed at curtailing rapid population growth in sub-Saharan Africa.) We speculate that disbelief on the part of HIV researchers that medical care in Africa could be harming patients may have been a significant factor in their defensive posture. We were also impugning the quality of their scientific research and potentially threatening their livelihoods. In addition, our analyses also directly threatened the politically correct view that AIDS was not just a disease of gay men and injecting drug users, but also of heterosexuals. Lastly, our data were undermining the time-honored belief about African promiscuity, a notion that may well have initially contributed to the (pre)conception that AIDS was thriving in Africa because of it.” (p. 194)
The depressing lesson from this saga, and from the many others like it, is that science can be subject to the same sorts of groupthink, intolerance of dissent, and defence of privilege that afflict other domains such as politics. To get to the bottom of long-standing scientific disputes by trying to understand the research is bound to be time-consuming and very difficult, something few people have the time or interest to pursue. I aim at something easier: observation of the tactics used in the dispute. This doesn’t enable me to determine which side is right but does give a strong indication of whether the dispute is being pursued fairly.