psychology • Topic • Inside Story
https://insidestory.org.au/topic/psychology/
Current affairs and culture from Australia and beyond

Virtual anxiety
https://insidestory.org.au/virtual-anxiety/ | Mon, 18 Mar 2024

Jonathan Haidt probes the causes of young people’s mental distress with refreshing humility

It’s now common knowledge that we are in the grip of a mental health crisis. Stories about rising rates of diagnosis, surging demand for treatment and straining clinical services abound. It is hard to avoid feeling that the psychological state of the nation is grim and getting grimmer.

The truth of the matter is more nuanced. The National Study of Mental Health and Wellbeing, carried out between 2020 and 2022 by the Australian Bureau of Statistics, tells us that 22 per cent of Australians had a mental disorder in the previous twelve months and 43 per cent within their lifetime. Large numbers, no doubt, but no larger than the 20 per cent and 45 per cent figures obtained when the study was last conducted in 2007.

But hidden in these aggregated figures is a worrying trend. Among young people aged sixteen to twenty-four, the twelve-month prevalence of mental disorder rose from 26 per cent to 39 per cent, and that increase was especially steep for young women, up from 30 per cent to 46 per cent. When half of this group has a diagnosable mental illness — an underestimate, because the study only counts a subset of the most prevalent conditions — something is clearly very wrong.

A similar story of age- and gender-biased deterioration is told by the Household, Income and Labour Dynamics in Australia survey. When an index of mental health is tracked across iterations of the survey from 2001 to 2021, older and middle-aged adults hold relatively steady but people aged fifteen to thirty-four, and especially young women, show a relentless decline beginning around 2014. The pandemic, the usual all-purpose explanation for recent social trends, can’t be held responsible for a rise in psychiatric misery that preceded it by several years, so what can?

Jonathan Haidt’s The Anxious Generation offers a provocative but compelling answer to this question. Haidt, an American social psychologist known for influential books on well-being (The Happiness Hypothesis), moral psychology and political polarisation (The Righteous Mind) and upheavals on US college campuses (The Coddling of the American Mind, written with Greg Lukianoff), argues that some of the usual explanatory suspects are innocent. They don’t account for why declining mental health disproportionately affects young women, why it is occurring now or why the trendline started to dive in the early 2010s after a period of stability.

The prospect of ecological catastrophe, for example, weighs most heavily on younger people, but every generation has experienced existential threats. Wars, natural disasters and economic crises are conspicuous reasons for distress and despair, but world events have always been terrible. It is not obvious why they should disproportionately make young women anxious and depressed while leaving older people and men relatively unaffected. The stigma of mental illness may have declined so that people have become more willing to acknowledge it, but increases in the prevalence of mental ill-health among young people are not confined to subjective reports; they are also found in rates of hospitalisation and suicide.

The chief culprit, Haidt proposes, is technological. Smartphones and social media have rewired young minds to an unprecedented degree, replacing “play-based childhood” with “phone-based childhood.” Portable devices with addictive apps and algorithms engineered to harvest attention and expose children to damaging content have wrought havoc on young people’s mental health. They have done so in ways that are gendered and most severely affect generation Z. Born after 1995, these young people are the first to have gone through puberty in the virtual world.

Haidt marshals high-quality evidence for the decline in young people’s wellbeing over the past decade. Graph upon graph shows inflection points in the early 2010s when mental health and related phenomena such as feelings of social connection or meaning in life start to trend downward. These trends are not limited to the United States but occur more or less in lockstep around the Western world. Their timing indicates that it is not the internet or social networking sites themselves that are damaging, but the transformation that resulted from the advent of smartphones, increased interactivity, image posting, likes-chasing, algorithmic feeds, front-facing cameras and the proliferation of apps engaged in a race to the bottom to ensnare new users.

Haidt argues that the near-universal use of smartphones among children, and especially pre-teens, is driving the increase in mental health problems among young people. Coupled with over-protective parenting around physical risks in the real world has been an under-protection around virtual risks that leaves children with near-unfettered access to age-inappropriate sites. Like Big Tobacco, the developers of social media platforms have designed them to be maximally addictive, have known about the harms likely to result, have made bad-faith denials of that knowledge, and have dragged their heels when it comes to mitigating known risks that would have commercial consequences.

There are many reasons why phone-based childhood has damaging effects. It facilitates social comparisons around appearance and popularity, enables bullying and exclusion, exposes young children to adult-focused material, and serves individualized content that exploits their vulnerabilities. It fragments attention and disrupts sleep, with implications for schooling as much as for mental health. Smartphones also function as “experience blockers,” reducing unstructured time with friends and the opportunities for developing skills in synchronous social interaction, conflict resolution and everyday independence.

Haidt is emphatic that the problem of phone-based childhood is not just the direct harms it brings but also the opportunity costs: the time not spent acquiring real-world capabilities and connections. Added to a prevailing culture of safetyism that attempts to eradicate risk and prescribes structured activity at the expense of free play and exploration, the outcome is a generation increasingly on the back foot, worried about what could go wrong and feeling ill-equipped to deal with it. Well-documented developmental delays in a range of independent and risky behaviours are one consequence, and the rise of anxiety is another.

When many children and adolescents report that they are almost constantly on their phones we should therefore not be surprised that they feel disconnected, lonely, exhausted, inattentive and overwhelmed. Haidt argues that many of these emotional and social effects are common to young people as a group, but some are gendered. Girls are more likely to be entrapped by image-focused networking sites that promote perfectionist norms, decrease their satisfaction with their bodies, and expose them to bullying, trolling and unwanted attention from older men. Boys are more often drawn into videogames and pornography, which foster social detachment, pessimism and a sense of meaninglessness, sometimes combined with bitter misogyny.

Haidt reminds us not to think of children as miniature adults, but as works in progress whose brains are malleable and developmentally primed for cultural learning. “Rewiring” may be an overstatement — brains never set like plaster and cultural learning continues through life — but the preteen years are a sensitive period for figuring out who and what to look up to, a bias easily hijacked by influencers and algorithm-driven video feeds. Older adults can be moralistic about adolescents who won’t disengage from their phones, but when those phones are where life happens, and when the brain’s executive functions are only half-formed, we should understand why shiny rectangles of metal and glass become prosthetic.


What to do? Haidt has a range of prescriptions for parents, schools, tech firms and governments. Parents should band together to encourage free play, promote real-world and nature-based activities that build a sense of competence and community, limit screen time for younger children, use parental controls, and delay the opening of social media accounts until age sixteen. Schools should ban phones for the entirety of the school day, lengthen recess, encourage unstructured play, renormalise childhood independence and push back against helicopter parenting. There is a social justice imperative here, Haidt observes, as smartphone use seems to disproportionately affect the academic performance of low-income students.

Responsibility for intervening can’t be left to individuals and local institutions alone. Governments and tech firms must recognise their duty of care and come to see the current state of affairs as a public health issue, much like tobacco, seat belts, sun exposure or leaded petrol. Tech firms must get serious about age verification and increasing the age of “internet adulthood” at which young people can make contracts with corporations hell-bent on extracting their time and attention. Governments can legislate these requirements, design more child-friendly public spaces, and remove penalties for healthy forms of child autonomy such as going to a playground without a parent, currently criminalised in the United States as “neglect.”

The Anxious Generation is a passionate book, coming from a place of deep concern, but most of it is written with the cool intonation of social science. The work is accessible and clearly intended for a wide readership, each chapter ending with a bulleted summary of key points. There is a refreshing humility about the empirical claims, which Haidt accepts can be challenged and may sometimes turn out to be wrong, referring the reader on to a website where updates on the state of the evidence will appear.

The part social media plays in mental ill-health is in dispute, for example, although the evidence of a correlation with heavy use is not. Haidt offers up studies supporting the causal interpretation but acknowledges that nothing is straightforward where human behaviour is concerned. Nevertheless, he is justified in arguing that his “Great Rewiring” hypothesis is now the leading account of the origins of the youth mental health crisis. No other contender appears capable of explaining why things seemed to start going wrong around the globe somewhere between 2010 and 2015.

Critics of The Anxious Generation are likely to argue that Haidt’s hypothesis is simplistic or that it amounts to a moral panic. Both charges would be unfair. A single explanatory factor rarely accounts for something as complex as a major social trend, of course, but identifying a dominant cause has the pragmatic benefit of prioritising interventions. If phone-based childhood is the problem then we have a clear target for possible solutions.

As explanations go, Haidt’s isn’t quite as simple as it might seem in any case. The advent of smartphones and all-consuming social media may take centre stage, but earlier cultural shifts that amplified the sense of risk and promoted over-protection set the scene and compounded young people’s vulnerability. Haidt’s account of the elements of smartphone use that are most damaging is also highly specified rather than a wholesale rejection of the virtual world.

The mental health field often extols the complexity of its subject matter, which sits at the jumbled intersection of mind, brain and culture, but that recognition can hamper the search for agreed interventions. The usual calls to boost clinical services are understandable, but solutions that address individual distress in the present fail to tackle the collective, institutional and developmental sources of the problem.

Some proposed solutions, such as efforts to build online social connections, may be ineffective because they fail to foster the embodied, real-world connections that matter. Other supposedly compassionate responses, such as accommodating student anxiety with diluted academic requirements and on-demand extensions, may make anxiety worse by enabling and rewarding avoidance. Haidt arguably overlooks how much mental ill-health among young people is being inadvertently made worse by well-meaning attempts to accommodate it and by backfiring efforts to boost awareness and illness-based identities.

The charge of moral panic is equally problematic and doesn’t stick for three reasons. First, evidence for the harmful consequences of phone-based childhood is now documented in a way that past worries about new technologies were not. Second, Haidt’s proposal focuses on the welfare of young people rather than social decay. Although he argues that phone-based life can cause a form of spiritual degradation, his critique is primarily expressed in the register of health rather than morality. Third, although Haidt articulates a significant threat, with the partial exception of social media companies he is not in the business of lashing villains so much as promoting positive, collective responses and a sense of urgency.

The youth mental health crisis is real, and it shows no signs of abating. The human cost is enormous. If rates of mental illness among Australians aged sixteen to twenty-four had remained steady since 2007, around 350,000 fewer young Australians would be experiencing one this year. The Anxious Generation is vital reading for anyone who wants a sense of the scale of the problem and a clear-eyed vision of what it will take to tackle it. •

The Anxious Generation: How the Great Rewiring of Childhood is Causing an Epidemic of Mental Illness
By Jonathan Haidt | Penguin | $36.99 | 400 pages

“An unfathomable, shapeshifting thing”
https://insidestory.org.au/an-unfathomable-shapeshifting-thing/ | Wed, 13 Mar 2024

Writer Adele Dumont charts trichotillomania — compulsive hair-pulling — from the inside out

When she was a teenager Adele Dumont’s hair was so thick and heavy she felt shame at how it looked undone — “it didn’t work with gravity like other girls’ hair, it took up too much space.” Then, at age seventeen, The Pulling began. From peeling apart split ends — an ordinary ritual for the long-haired — Dumont “started to do this other thing, an arresting thing…” She would pull out individual hairs, “curled and coarse,” stretch them out and inspect them, taking special interest in the “hidden bits” that grew out of the central part of her scalp.

“The whole process was mysteriously painless,” Dumont recounts in her new book, The Pulling. She discovered that the hairs on her head “sit as shallowly as birthday candles on a cake” and “can be removed as effortlessly as a grape can from its stem.”

More than a decade later, Dumont has been pulling out strands and roots of hair from her scalp for so long that she invests in an expensive, custom-made hairpiece, especially designed to blend inconspicuously into the patchy hair that remains. The catalyst is the publication of her first book, No Man Is An Island (2016), an account of her time teaching English to asylum seekers on Christmas Island. Her motivation, she writes, was not “wanting to look nice” on the publicity circuit but the desire “to be able to stop thinking about my hair altogether.”

Like every other essay in Dumont’s finely wrought collection, “The Piece” stands alone while also working in unison with the others as memoir. The themes of shame and secrecy, evocatively rendered, pervade The Pulling. Entering the building for her first “hair transition” appointment, Dumont “felt the kind of edginess that I imagine a married man might feel visiting a brothel.” She is assigned Andrew, whose “dispassionate” approach and knowledge of her “problem” put her at relative ease. After her partner M, Andrew is “the second person on the planet to witness my scalp in this state: naked and defenceless.”

Dumont’s “problem” has had a name, “trichotillomania,” since 1987, when it was categorised in the revised third edition of the Diagnostic and Statistical Manual of Mental Disorders, known as the DSM, under the “dubious heading Impulse-Control Disorders Not Classified Elsewhere.” In DSM-5, the current edition, trichotillomania has been reclassified under Obsessive-Compulsive and Related Disorders but, as Dumont notes, there is no medical consensus. Some professionals liken “the disorder to a substance addiction” while others “see it as a form of self-harm.” Like her own attempts to “get my head around the problem,” the condition, writes Dumont, “seems to resist the medical world’s attempt to categorise it. An unfathomable, shapeshifting thing.”

In The Pulling Dumont sets herself the challenge of putting into words what can’t be captured in an official diagnosis. She begins with her family of origin, and an early onset nail-biting habit, suggesting her condition has its roots in some formative trauma, but from there she avoids the obvious route. There is life before The Pulling but not yet after: hers is not a recovery memoir. If there is a dividing line it is circa 2005, when Dumont finds a book in her university library, published in 1989, by a “Distinguished Psychiatrist” who documents cases of clients with “pointless disorders.” She recognises herself in its pages and furtively photocopies the relevant section.

As the outside knowledge accumulates and she comes to know her condition through authorities other than herself, Dumont initially feels more resistance than relief. She “felt robbed” and wanting “to reclaim my singularity, I decided that even if my condition might align to others’ conditions in its generalities, surely how it manifested in me was unique.” Dumont cycles through numerous therapists, theories and key texts and while she finds some solace, insight and direction, she also remains protective of the enduring mysteries, paradoxes and specifics of her condition.

Some of the most exquisite sentences and passages, in a book full of them, detail what it is like for Dumont inside or in the immediate wake of a “ravenous episode.” To give in is a kind of surrender, what she describes as “a turning.” Then comes “the deepest pleasure and fullest absorption” of being “inside the experience, when the world is reduced to teeth and touch, and taste.” At the end of an episode, Dumont feels “that I’ve been shipwrecked: dazed and conspicuously fragile.”

On the flipside, Dumont speculates on the view from outside, shifting between awe and shame as the dominant registers. Perhaps, from above, it might appear that “my fingers must be moving in accordance with some greater design, like a needleworker’s, or like a spider darting from point to point to build her web.” Elsewhere, she is convinced that her behaviour “must look masochistic, deviant, repulsive.”

The beauty and power of The Pulling resides in how artfully Dumont balances two sometimes competing concerns — filling a gap and sharing a secret. Dumont makes fathomable and palpable a neglected condition estimated to affect around one in fifty people — more common than bipolar disorder or schizophrenia. Readers with trichotillomania will surely be drawn in, as will any of us who have or have had a compulsive habit dating back to childhood that began, as it did for Dumont, as “just something that I did.”

Yet Dumont is as much a writer as she is a person with trichotillomania, and accordingly The Pulling exhibits the propulsive and exacting qualities of a book that had to be written and had been brewing for a long time. Here and there, she addresses the reader directly to tell us that this is not easy, or to reflect on her own motivations. “I ought to say,” she writes, “I am finding it hard to tell you, harder than even I anticipated.” In less skilled hands, such self-reflexivity could easily grate, but Dumont succeeds in creating intimacy with her imagined reader and audience. We come to learn what it has meant for the author to carry her secret, and now to share it.

Beyond liberating herself as a writer, Dumont stakes a powerful claim for all people who have been diagnosed with a condition having the authority to tell their own stories and comprehend their own experience. As she persuasively writes, “my not-knowing that my illness existed was a precondition for coming to know it as intimately as I have.” •

The Pulling: Essays
By Adele Dumont | Scribe | $29.99 | 288 pages

Jagged solitude
https://insidestory.org.au/jagged-solitude/ | Thu, 18 Jan 2024

A German writer’s candid account of the shifting boundary between solitude and loneliness

“I had increasingly been feeling as if something had gone wrong,” writes Daniel Schreiber in his affecting examination of living alone. As a younger man he hadn’t intended to wind up uncoupled but at a certain point that state became habitual. It wasn’t as if he had been left behind, unlucky in love. Instead, after a youthful flurry of relationships he began to seek out solitude and wondered if there was enough room in his life to accommodate a partner. His book, Alone, is an extended meditation on the solitary life, set against the backdrop of the Covid-19 pandemic.

Schreiber is the author of several literary essays and a biographer of Susan Sontag. The original edition of this memoir was a bestseller in Germany and stands out among the flood of books addressing our supposed epidemic of loneliness. The idea that social connection is the key to happiness and disconnection the root of mental ill health has become the new commonsense, stamped into our consciousness by lockdowns and compelled distancing. Many writers have offered diagnoses of the problems of loneliness and prescriptions for overcoming it, but few provide such a vivid first-person account and fewer still bring such an erudite sensibility to the task.

The backbone of Alone is life-historical. In 2019 Schreiber has an epiphany that things are going wrong; by Christmas “I stop believing that this life, as I live it, as I live it alone, is a good life.” He is partly restored by a writing trip to Switzerland, struggles through the compounding isolation of the pandemic, and embarks on a trip with friends to the Canary Islands that turns into a sort of sabbatical. Along the way he finds a series of therapeutic diversions, all of them physical activities that relieve some of his self-consciousness: gardening, hiking, knitting and yoga.

Off this narrative spine Schreiber hangs a series of meditations on the solitary condition, heavily supported by big thinkers. His intellectual tastes run philosophical and French, and anyone with a passing acquaintance with the humanities in the 1980s and 1990s will recognise many of his theoretical muses: Barthes, Derrida, Foucault, Lacan, Lyotard. At times the insights are rich, especially in contrast to the shallowness of some psychological and sociological accounts of loneliness, but at other times the price seems high. I had hoped never again to encounter the word “phallogocentrism,” but there it was. And do we need a deconstructionist to tell us that we should not insist our friends conform to all of our wishes?

Schreiber’s analysis of friendship is powerful, pointing out its many forms and virtues — it is non-exclusive, voluntary, enduring — but showing how often it is made to defer to coupledom and the “grand narrative” of romantic love. Friendship is often pictured as a stage of life prior to nesting, and Schreiber notes with some disdain how many couples withdraw inward, a dynamic especially evident during the pandemic. Even so, he is scathing about how friendship has been misrepresented in the Western philosophical tradition, in which he claims it is portrayed as a quest for similarity and equality, the perfect friend idealised as “another oneself.” There is an element of straw-manning (and patriarchy-bashing) here, but Schreiber is adamant that friendship needs to be celebrated for its embrace of diversity rather than sameness, a conviction that resonates with his emphasis on the importance of friendship in queer communities.

He is equally incisive and contentious on the topic of loneliness. Drawing the standard distinction between loneliness and being alone, the latter an objective lack of social contact, the former subjective distress over the degree or quality of contact, Schreiber writes of the pleasures and benefits of solitude, admitting to enjoying some aspects of pandemic isolation. Being alone can be good and loneliness is not all bad. Though painful, it is not a disease, and the important lessons about loss and compassion to be learned from it mean it should not be dreaded. Some hand-wringing about the loneliness epidemic is reactionary, he suggests, motivated by nostalgia for the traditional family.

But Schreiber also muddies the conceptual waters. Solitude would normally be understood as positively valued aloneness, but he criticises it as “the presentable, dignified version of loneliness,” a word people use to deny the shameful reality of their true but taboo feelings. Although he sometimes confuses the picture by using solitude and loneliness interchangeably, Schreiber adds some useful complexities here. Loneliness may be distressing but also ethically and existentially desirable, and solitude may be a pathway to self-knowledge but also a cover for self-deception.

Schreiber makes no attempt to hide his ambivalence about being alone. He can present himself as bravely fronting the challenges of solitude and rising above coupled conformity but also admit to holding petty resentments and vulnerabilities. He can clothe his loneliness in grand ideas and social critique but also express his unhappiness with naked honesty. In one breath he flays romantic relationships and claims not to want them anymore, and in the next he confesses to feeling unlovable.

At times these vacillations suggest a cerebral Schreiber who reads the loftiest French theory and criticises the idea of self-care as “the ultimate victory of neoliberal late capitalism” while coexisting uneasily with a visceral Schreiber who likes to watch Friends, loves yoga and acknowledges that his neurotic misery may be due more to a lack of sunshine and exercise than existential angst.

Schreiber is a perceptive and relatable writer. He grapples with many of the same trials of social life that we all face, trials that became significantly more challenging in recent history. As a character in this memoir, he is not so much rounded in the literary sense — deep, complex and many-faceted — as he is jagged. The conflicts and quirks that other writers might edit out are on candid display in this book. It is well worth spending a few hours of quality solitude with it. •

Alone: Reflections on Solitary Living
By Daniel Schreiber | Reaktion Books | $34.99 | 152 pages

The one who told them who they were
https://insidestory.org.au/the-one-who-told-them-who-they-were/ | Thu, 19 Oct 2023

A writer and activist explores the changing seasons of grief

When her mother killed herself at the age of seventy-five, Natasha Walter suffered more than the usual burden of filial guilt. Ruth had frequently talked of suicide, but Natasha hadn’t always listened, brushing off warnings with a mix of impatience and anxious deflection. In retrospect there were other signs: jewellery given away, an unexpected family lunch convened, frequent laments about developing unendurable dementia.

Walter, a British feminist writer and activist, has written an evocative book, Before the Light Fades, that begins as an exploration of the emotional aftermath and turns into an examination of the lives of Ruth and her refugee parents. At first her grief is raw, turning her into “a little scuttling mollusc without armour.” It is compounded by the lingering stigma of suicide, which somehow coexists with a new cultural openness to talking about it.

Walter struggles against the view that her mother had the worst kind of death and that her suicide could only be attributed to mental illness. She bristles at the modern tendency to see all dark emotions through a psychiatric lens, but also worries that towards the end Ruth was in a state of despair rather than Socratic composure.

Friends who use the well-meaning but “off the shelf” language of self-care to comfort her provoke the same irritation. She tries out a range of healing distractions — yoga, swimming, running, gardening — but the idea that we should soothe and coddle ourselves in times of loss seems to her self-absorbed and alien to the generation that is being lost.

The book’s account of the changing seasons of grief is intense and unsparing. Walter has tears, self-reproach and regret, as our current bereavement script leads us to expect, but also times of anger, bitterness and misanthropy. Mourning does not always deepen or ennoble. At times it leads her to resent the living and become hardened, cauterising her empathy to stem the flow of pain. “I am becoming less human, the more I grieve.”

Walter captures the experience of having an ageing parent beautifully. Her relationship with Ruth is about as solid and unambivalent as two strong personalities can have, but she confesses to a creeping annoyance with her mother’s growing vulnerability. Ruth’s preoccupation with dementia, amplified by experiencing her own father’s illness and her work in aged care, seemed out of proportion. Walter is saddened by the loss of Ruth’s independence, fearlessness and rebellious spirit, but her sadness is mingled with an implied criticism of her slide into weakness, as if Ruth should have tried harder to embody the maternal ideal she represented as a younger woman.

Walter reclaims that younger self in a compelling retelling of Ruth’s past, from the horror of her parents’ early life as Jews in Nazi Germany, to their circuitous escape into an unwelcoming England that sent them to internment camps, their shrinking into postwar suburban anonymity, and their upset when the young Ruth resurrects her father’s abandoned radicalism in the fight for nuclear disarmament in the 1960s. Georg knew where dissent could lead.

Ruth’s involvement in Bertrand Russell’s Committee of 100 is a mix of clerical tedium — so much typing, copying and mailing — and daring escapades, peaking when she helps uncover evidence that the British government had built bunkers to house the great and good in the event of nuclear apocalypse.

Ruth’s politics extended beyond the nuclear issue, leading her into a brand of feminism that would later conflict with her daughter’s. Walter recounts how the power feminism she embraced in the 1980s rejected Ruth’s critique of femininity. She believed she could remain glamorous while the last few glass ceilings were quickly shattered. That former self was naive, Walter writes, failing to anticipate that “objectification would be sold back to us as an empty mirage of empowerment.”

This realisation becomes part of a broader and more sympathetic re-evaluation of Ruth’s unorthodox and sometimes puzzling life choices. Even the suicide becomes intelligible, “like leaving a party when you’ve had enough.”

Before the Light Fades reveals not only the courage and creativity of Ruth’s generation of protesters, but also how the disarmament movement’s mission to avert global disaster is echoed in the climate emergency movement of today. Ruth herself comes across as a free spirit who retained her own parents’ sense of displacement and never became entirely settled. Marriage to a fellow activist burns brightly for a while but ends badly. She throws herself into study, social work with refugees, and being a mother and grandmother: “the myth maker of the family, the one who told us who we were.”

Walter’s writerly voice is distinctive without being showy: she is humane, curious and allergic to cliché, but also sceptical, half in the world and half on the sidelines looking askance. She is a deep thinker but not a wallower or a theorist. As her grief starts to lift, she recommits to political action as if carrying forward a family tradition. Her book is a moving meditation on ageing and loss, the persistence of the past, and the necessity of hope in spite of it all.

It’s a funny kind of hope, peeking through a cloud of pessimism, but it seems a fitting tribute to Walter’s lineage of brave and beleaguered radicals. •

Before the Light Fades: A Memoir of Grief and Resistance
By Natasha Walter | Hachette | $32.99 | 256 pages

The post The one who told them who they were appeared first on Inside Story.

Personality problems
Mon, 11 Sep 2023

When does a type become a disorder?

Personality disorder is among the most controversial topics in mental health. Personality itself is a rather slippery concept, and deciding where to draw the line between “normal” and disturbed personality is even slipperier. The fact that personality disorders often seem to have a greater impact on other people than on the person with the disorder makes them even more contentious.

Personality is usually defined as an individual’s typical way of thinking, feeling and behaving. The word comes from the Latin persona, meaning “mask,” referring to what actors in classical theatre once wore to play different characters. We now think about personality as our psychological uniqueness, closely related to our sense of self or identity. It is usually described as a particular set of enduring characteristics or traits.

Every one of us has a distinctive combination of these traits. Personality researchers have identified as many as 4500 trait words in the English language, such as sociable, curious, callous, irresponsible, optimistic, immoral, warm and impulsive. But, you might ask, are there really so many ways people differ from one another? Or is there, perhaps, a limited number of basic personality types or dimensions amid all this complexity?

Sure enough, many attempts have been made to distil personality down to a more manageable number. The psychologist Raymond Cattell, a pioneer, applied sophisticated statistical techniques to construct a set of sixteen factors, each on a continuum stretching between two poles. Your position on each continuum, such as reserved–warm, shy–bold, relaxed–tense and serious–lively, summarises your unique personality. For example, someone might be very low on the reserved–warm factor, average on the shy–bold factor, well above average on the relaxed–tense factor, and slightly above average on the serious–lively factor.

Although boiling 4500 distinct words down to sixteen basic personality factors was an impressive feat, other psychologists found that number to still be unwieldy. Over time, researchers suggested that an even more basic set of five personality factors lurked beneath these sixteen, unimaginatively dubbed the Big Five: neuroticism, extraversion, openness, agreeableness and conscientiousness. This has become the pre-eminent model of personality structure, and as we shall see, it sheds light on the concept of personality disorder.

Once we appreciate that personality factors fall on a spectrum, we must grapple with the question of when a personality pattern crosses the boundary from normal to problematic, or even whether such a boundary exists at all. If we fall at the extreme end of one or more of the Big Five factors, we may tend to think, feel and behave in ways that cause problems in our everyday lives. For example, if we score extremely highly on neuroticism, we are likely to suffer storms of negative emotion and self-criticism that will adversely affect our close relationships, capacity to work and sense of general wellbeing.

If we are extremely low on agreeableness, on the other hand, we will tend to distrust and act in a hostile way towards others, leading to conflict with them and social rejection. If we are extremely low on extraversion, we are bound to be painfully shy and isolated, unable to reach out for the social connections that all of us need and incapable of developing interpersonal skills.

Mental health professionals adopt a pragmatic position in diagnosis: if we show longstanding inflexible, maladaptive patterns of functioning that adversely affect our lives in the spheres of family and general social relationships, work and recreation, then the possibility of our personality being dysfunctional arises.

We ourselves have reservations about the term “personality disorder” since it has often been used in a derogatory and stigmatising way. Mental health professionals are obliged to understand why people behave the way they do, and not to trivialise or dismiss their problems with a throwaway label. It is regrettable that these attitudes prevail given that people with personality problems are vulnerable to a full range of mental illnesses and are candidates for treatment, and stigma can affect their response to treatment.


Interest in types of problematic personalities goes back to ancient Greece. The Greek philosopher Theophrastus, in the fourth century BCE, described no fewer than twenty-nine troublesome types, including the flatterer, the grumbler, the boor, the buffoon, the tactless and the patron of rascals.

More objective and methodical efforts were only initiated in the early nineteenth century. People who showed a pattern of behaviour that suggested the absence of a conscience — that is, lying, stealing, assaulting, even killing, without remorse — were the first to be studied. In 1835, James Prichard, an English physician, postulated that what he called “moral insanity” might represent an illness, related to an aberrant moral centre in the brain. He included among its features “angry and malicious feelings, which arise without provocation” and elicit “the greatest disgust and abhorrence.” It later came to be called “psychopathic personality.”

It may strike you as questionable to define such behaviour as a psychiatric condition rather than as a moral failing or a criminal propensity. This thorny question has confronted both the mental health and the legal systems for many years: are people labelled as psychopathic responsible for their actions? That is, do people who kill without remorse deserve punishment or medical treatment?

The “psychopath” was soon joined by many other categories of problematic personality, culminating in the 1920s when a renowned German psychiatrist, Kurt Schneider, grasped the nettle and attempted to bring order to the chaos. Some of the ten personality disorders he identified have continued to be applied in the sphere of mental health, particularly by the American Psychiatric Association, which groups them into three clusters.

We examine each of the ten disorders in our new book, but for the moment let’s familiarise ourselves with the clusters and their underlying themes:

• Cluster A is marked by odd, eccentric behaviour: schizoid, paranoid and schizotypal personalities.

• Cluster B is typified by dramatic, explosive, emotional and erratic behaviour: histrionic, antisocial, narcissistic and borderline personalities.

• Cluster C is characterised by anxious and fearful behaviour: avoidant, dependent and obsessive-compulsive personalities.

Although these three clusters are helpful for getting a general sense of the personality patterns, the specific conditions remain somewhat controversial for a few reasons. They have no distinctive biological features, such as unique patterns of genes or brain chemistry, their causes remain unclear, the boundaries between them are blurred, and a person may satisfy the diagnostic criteria for more than a single disorder.

Because personality exists on a spectrum, determining whether someone's personality problems are severe enough to warrant a diagnosis is often a source of disagreement among clinicians. How many personality disorders exist is hotly debated and will no doubt remain controversial. Moreover, even in the modern era, some have already been ditched. The proposed "self-defeating personality disorder" is one example, removed on the grounds that it might be inappropriately applied to survivors of domestic violence.

Given this plethora of reservations, some influential systems for classifying personality problems, such as the World Health Organization’s ICD-11, have dispensed entirely with the idea of distinct personality disorders. Instead, personality disorder is diagnosed according to one or more problematic traits, and is deemed mild, moderate or severe depending on how much the traits disrupt a person’s life (being called, for example, “moderate personality disorder with detachment” or “severe personality disorder with negative affectivity and disinhibition”).

Although it is clear from our comments that distinguishing between personality disorders is a conundrum, mental health professionals need a workable framework. Such a system enables people whose personalities cause them difficulties to receive professional treatment.


Roughly one in ten adults, both men and women, meet the diagnostic criteria of a personality disorder. The rate jumps to one in three among those who have other psychiatric conditions, such as depression, anxiety and substance use. And about two-thirds of people with a personality disorder also have at least one of these conditions.

In the general population, the prevalence of specific personality disorders is around 1 to 2 per cent. Rates tend to decline with age, although middle-aged adults with a past diagnosis may continue to lead troubled lives. Although early signs of a problematic personality can be observed among children and adolescents, mental health professionals are reluctant to apply a diagnosis to them since these features may be a manifestation of another psychiatric disorder.

The factors that lead to personality disorders are complex and imperfectly understood. Studies of identical and fraternal twins point to a substantial genetic component. Interestingly, genetic influences overlap a great deal with the genetic factors underpinning the Big Five factors of (high) neuroticism, (low) agreeableness and (low) extraversion.

Biological factors besides genetic influences are also implicated. Although research findings tend to be inconsistent and hard to summarise, abnormalities in specific chemical messengers in the brain (neurotransmitters) and in the size or activity of certain brain regions have been detected.

Narcissistic individuals, for example, have been found to have smaller brain structures associated with empathy. And in those with an avoidant personality, the amygdala, a structure involved in fear and anxiety, appears to be more reactive in social situations. Problematic personality traits can also emerge after physical damage to the brain, caused, for instance, by tumours, head injuries or the emergence of dementia.

Psychoanalytically oriented theorists assert that problematic personalities are mostly psychological in origin. Sigmund Freud and Erik Erikson, among others, posit that conflicted interactions between children and their carers can create problematic ways of coping and relating. For example, a deep lack of trust might result when the child’s need for consistent care is not met. A tendency to blame others or seek attention in response to conflict as a child might become the foundation for paranoid and histrionic personalities, respectively. When parents fail to be empathically attuned to their infants, their children may fail to develop a stable sense of self, paving the way for borderline or narcissistic patterns of personality.

A related approach, based on John Bowlby's attachment theory, proposes that children who do not develop a stable emotional bond with their carers are vulnerable to personality disturbance in adulthood. Maltreatment in childhood, particularly physical and sexual abuse, is a noteworthy risk factor for certain types of personality disorders.


People with problematic personalities may seek professional help but just as often are urged to do so by distraught relatives or friends at a time of crisis such as a family or workplace conflict, breakdown of a key relationship, or excessive substance use.

Offences such as physical assault, drink driving and shoplifting may lead to the police or courts initiating the process. An impulsive overdose or self-inflicted bodily lacerations are often the route to clinical attention for people with borderline personalities; they tend to present repeatedly in this way. Alternatively, help may be sought for psychiatric difficulties associated with a problematic personality. The failure of a psychiatric disorder to respond to treatment may indicate a previously undetected problematic personality.

Less dramatically, a person may request help from a mental health professional for persistent low morale, anxiety, self-doubt, failed relationships, preoccupation with bodily symptoms, or other personal difficulties that they feel unable to change. An unrecognised personality disorder may also reveal itself in the wake of a failure to engage in treatment, a clash with a therapist, or even a threat of litigation.

Mental health professionals aim to capture as complete a picture as possible of a patient’s past and present life. An account of childhood and early relationships within the family (and with significant others) is at the heart of the inquiry, as are methods typically used to deal with the challenges and demands of life.

This emphasis on gathering information about multiple aspects of the person’s life contrasts with a more symptom-focused approach typical of a discrete mental illness. Personality difficulties are not like such illnesses: they are woven into a way of being in the world, encompassing how the person relates to others, and their self-esteem, coping styles, motivations and aspirations.

Pinpointing a specific personality disturbance requires learning about a person’s lifelong patterns of thinking, feeling and behaviour. For example, a severely depressed executive in charge of a large company may be utterly reliant on his family and on professional staff for even the most trivial decision. Such reliance could well constitute a longstanding feature of a dependent personality disorder, but could also be a feature of a troubled mood state, or could be a combination of both. Talking with him alone may not yield enough information to reach the correct diagnosis. Since we are not always the most objective observers of ourselves, the views of parents, siblings and friends are invaluable.

As you can see, assessing personality to ascertain if it is disordered is challenging to say the least. A thorough evaluation by highly skilled mental health professionals is essential, and even then, doubts may remain. Questionnaires to identify personality disorders have been available for many years but are of limited utility, in part because respondents tend to have limited and often distorted insight into the nature of their personality and its problems. •

This is an edited extract from Troubled Minds: Understanding and Treating Mental Illness, published by Scribe ($35).

(Don’t) always look on the bright side of life
Tue, 25 Jul 2023

How best to deal with dark moods?

Think of an existentialist and you might picture a Frenchman in a trench coat, a severe German or a lugubrious Russian. Their writings expose the human condition in the cold light of grand abstractions: being-in-itself, authenticity, freedom and responsibility, the absurd, angst, meaninglessness. There is a musty mid-twentieth-century aura around these ideas, like bebop and monochrome photographs.

Mariana Alessandri’s new book Night Vision: Seeing Ourselves through Dark Moods presents a revived and colourised existentialism. Her philosophical authorities are gender balanced and come, with a few exceptions, from the New World. Alessandri hopes to convince her readers that the tragic vision of the existentialists still has value, and that their embrace of the darker sides of human experience is all the more urgent now.

A Latina philosophy professor and self-described Eeyore, Alessandri tells the reader she feels "pelted by positivity." She sees a deeply problematic emphasis in American culture on being happy and optimistic at all costs. Her concern is widely shared, as many takedowns of "toxic positivity" attest, Barbara Ehrenreich's Smile or Die, published in 2009, being the best known.

Ehrenreich urged her fellow citizens “to recover from the mass delusion that is positive thinking,” a delusion “that encourages us to deny reality, submit cheerfully to misfortune, and blame only ourselves for our fate.” This obligatory uplift turns people inward when they should be changing the world. It removes the safety net of compassionate support when people falter because it makes happiness and success a purely personal responsibility.

Alessandri shares this dim view of a culture that obliges us to be shiny, happy people — a culture that tells those who are miserable to stop moaning, turn their frowns upside down, and try harder to lift themselves up by their hedonic bootstraps. Self-help writers lead the way by promoting the idea that we should strive for happiness above all else and illuminating the true path to self-improvement.

For too long, Alessandri writes, we have associated light with goodness and disparaged and dreaded the dark, including dark moods and dark skins. We should therefore embrace "the truth of darkness" and "turn down the lights and stop smiling." The central chapters of Night Vision each take a form of emotional darkness and explore it with existential company.

Sadness and depression, for example, are not negative states to be avoided, stoically endured or reasoned away but elements of what Spanish philosopher Miguel de Unamuno referred to as the tragic sense of life. Mental pain humanises us and enables compassion and deep connection. Alessandri cites feminist scholar Gloria Anzaldúa’s theoretical and autobiographical writings as rejecting the idea that depression is weakness, brokenness or disease, and offering a pathway to self-knowledge by “seeing depression by the light of the moon.”

Grief receives similar treatment. By Alessandri’s account, our obsession with brightness leads us to expect the bereaved to take action to overcome their loss and to be intolerant when they seem to succumb to it. Grief is often misrepresented as irrational and shameful, especially if it endures. Taking the novelist C.S. Lewis’s stance, detailed in his powerful A Grief Observed, we should “grieve hard” rather than privately stifle the pain of loss.

Kierkegaard is the existentialist guide through anxiety and dread, as he was for the protagonist of David Lodge’s comic novel Therapy. Alessandri criticises mainstream psychiatry for seeing anxiety as evidence of a chemical imbalance or a broken brain, and mainstream psychotherapy for piling shame on top of it by blaming patients who fail to recover. Both critiques attack straw men, but they set up a sharp contrast to a Kierkegaardian view of anxiety as “the dizziness of freedom” and a way of being “excruciatingly alive to the world.”

Anger, perhaps an odd choice for a study of dark moods given its fiery connotations and the undeniable pleasure to be found in righteous indignation, receives its own existential reframing. African-American writer Audre Lorde teaches that anger, unlike hate, enables social progress, and Argentine philosopher Maria Lugones instructs that it preserves our dignity in the face of injustice and grounds us in the reality that something is wrong with our situations, not our selves.

Both thinkers counsel that we should listen to and learn from anger rather than repress it, a conclusion hard to deny. The awkward reality that people tend to see their own anger as morally justified and others’ anger as irrational hatred is overlooked, as is the dissonance between anger’s progressive credentials and its role in fuelling violence.


Night Vision is a useful counterweight to some contemporary styles of thinking that equate mental health with happiness, but it has its own weaknesses. Often the radicalism of its claims is overstated by distorting those it opposes. The book caricatures orthodox approaches to mental ill health as if they amounted to nothing but disease-mongering and patient-blaming. Occasionally it veers into misrepresentation, as when Alessandri claims that psychiatry now defines grief lasting more than two weeks as mental illness and sees any amount of anxiety as dysfunctional.

The idea that mental health workers, avid consumers of popular psychology or everyday folk would find it a revelation to be asked "what if truth, beauty and goodness reside not only in light but also in darkness?" is absurd. The contemporary mental health landscape is arguably as open to emotional darkness as it has ever been. Lived experience is amplified; destigmatisation has progressed to the point where few people think depression reflects weak character, laziness or neurotic wallowing; sympathy and support for people disclosing mental health problems are high and rising; and people enthusiastically and sometimes promiscuously embrace diagnoses that would once have been shameful secrets.

Some writers now worry that the pendulum has swung too far towards valorising mental ill health and raise concerns about the spread of illness-based identities. It is hard to square this view with Alessandri’s analysis of a “nyctophobic” society populated by tough-minded stoics. It is one thing to avoid toxic positivity and seek meaning in suffering, quite another to tell those who suffer that their depression reveals their superior sensitivity and depth of feeling, that their anxiety “is a sign of emotional intelligence,” that attempts to motivate them to recover constitute shaming, or that their troubles stem only from a broken society.

These prescriptions point to one of the ironies of Night Vision. It overtly rejects the self-help genre while co-opting its conventions. Highlighting the author’s relatability through folksy self-disclosure and popular culture references (The Real Housewives of New Jersey, Entenmann’s donuts), repetitively invoking a few Big Capitalised Ideas (Light Metaphor, The Brokenness Story), and cosying up to a few admired self-help writers who supposedly resist the dominant approach while somehow managing to be among its biggest stars (Brené Brown, Susan David), are moves from the self-help author’s playbook.

Some of Alessandri’s meme-able sayings fall a little flat — “diagnosis doesn’t rhyme with dignity” (but neither does “darkness”) — but their mere existence points to her proximity to the self-help genre. This effort to appropriate a successful rhetorical style does at least have the advantage of softening some of the prescriptions a less compromising approach to existential insight might make. Alessandri defends the value of misery and attacks psychiatric reductionism, but she doesn’t want you to stop taking your antidepressants when you start reading Kierkegaard.

As therapy-speak becomes increasingly popular, Night Vision could be seen as offering another dialect with more intellectually elevated influences. That may not be such a bad thing. Philosophical ideas about emotion and suffering offer a perspective on mental health and illness that can complement, challenge and complicate the familiar chatter about diagnostic labels and wellbeing.

Alessandri makes a good case that existentialist writers should have a seat at the table when mental health is discussed rather than continuing to sit at another, smaller table feeling superior and overlooked, as theorists tend to do. If sometimes she engages in caricature and turns her own insights into self-help nostrums, that will at least reach a larger audience than philosophy texts from university presses ordinarily manage. Night Vision is a stimulating and sometimes maddening contribution to broadening the scope of mental health discourse. More please! •

Night Vision: Seeing Ourselves through Dark Moods
By Mariana Alessandri | Princeton University Press | $37.99 | 216 pages

The ambiguity of hope
Thu, 15 Jun 2023

Do positive expectations and a sense of personal control add up to a unique predictor of wellbeing?

“The juvenile sea squirt,” writes philosopher Daniel Dennett, “wanders through the sea searching for a suitable rock or hunk of coral to cling to and make its home for life… When it finds its spot and takes root, it doesn’t need its brain anymore, so it eats it!” “It’s rather like getting tenure,” Dennett adds unkindly. The older sea squirt (or academic) may enjoy a form of mindless happiness, but it is the younger one, adrift and seeking a secure future, who needs hope.

In her new book, The Power of Hope, Carol Graham, a noted economist and senior fellow at the Brookings Institution, argues that hope is a dimension of wellbeing that is too often neglected. The economics of wellbeing — a growing specialisation that aims to expand the scope of the discipline and make its science less dismal — usually examines feelings of life satisfaction or levels of positive and negative emotion. But whereas both those metrics evaluate past happiness from the standpoint of the present, hope looks forward. For Graham, believing we can realise a better future is crucial to thriving.

Hope is a staple of American political rhetoric and has been a reliable feature of its presidential campaigns. It runs from Jesse Jackson’s “Keep Hope Alive” to Bill Clinton’s The Man from Hope biopic, from George W. Bush’s “A Safer World and a More Hopeful America,” through Obama’s The Audacity of Hope and iconic campaign poster, Bernie Sanders’s “A Future to Believe In” and Joe Biden’s “Our Best Days Still Lie Ahead.” As optimistic slogans go, these examples certainly outshine “Make Your Wet Dreams Come True” (a reference to ending Prohibition) from Al Smith’s 1928 run.

Campaigners everywhere are selling a future, of course, but the prominence of the American hope trope can be striking to foreigners. In recent times it is almost as if hope is proclaimed so loudly precisely because the need to restore it seems so desperate.

Graham wouldn’t disagree with that. “Deaths of despair” in the United States — suicides, overdoses, alcohol poisonings — have risen to astonishing levels and life expectancy is tracking backwards, unlike in any other rich country: American exceptionalism in reverse. To Graham, these grim trends reflect a growing wellbeing inequality that is every bit as troubling and socially toxic as more familiar inequalities of income and wealth.

Loss of hope is regionalised and racialised, disproportionately affecting the white working class in the heartland, fuelling disengagement from work and education, and promoting political radicalisation and resentment of coastal elites. Restoring hope is urgent not only to stem general misery and the opioid epidemic, but also to overcome threats to civil society, national security and liberal democracy itself.

The Power of Hope reviews some of the accumulating evidence on the economics of happiness and despair, and presents two new research studies in separate chapters. These studies exemplify Graham’s focus on the racial and cross-national dimensions of hope, and her interest in the fate of young people who, like larval sea squirts, must navigate the currents of life in search of a solid footing.

The first follows late adolescents from a poor district in Lima over a three-year period, assessing their aspirations for education, occupation and migration. These aspirations are strikingly high and stable over time, and they predict what Graham calls “human capital outcomes.” Higher educational aspirations at eighteen, for example, were associated with greater educational involvement and less risk-taking at twenty-one.

The second study was intended to replicate the first with adolescents in St Louis, Missouri, but was disrupted by Covid. Graham finds a stark racial difference: white participants had lower and less parentally supported educational aspirations and less social trust than their African-American peers, despite being materially better off. These findings chime with other research Graham reviews on the greater resilience of African Americans, who are buffered by “communities of empathy” more than low-income Whites, who retain a culture of “stubborn individualism” but have lost the belief that hard work pays off. Unfortunately, the study’s tiny sample of thirty-two provides a flimsy platform for generalisations and makes it a questionable inclusion in the book, although vignette descriptions of individual participants help to personalise the findings.

Before and after these empirical studies, Graham makes a compelling theoretical case for why economists and psychologists should view hope as a unique explanatory factor rather than submerge it within other concepts of wellbeing. Its definition, and how it differs from optimism, aspirations and goals, is left somewhat open, but Graham presents it as a combination of positive expectations for the future and a sense of personal agency in bringing them about.

It is entirely credible that a specific lack of hope in this sense, more than present-focused unhappiness, should undermine motivation to act and emotional investment in the future. It has been well established in clinical psychology that hopelessness is more strongly associated with suicide than is depression, for example.

Despite hope’s plausibility as a driver of positive life outcomes and resilience, though, the evidence mustered by Graham’s studies falls short of demonstrating that it plays a causal role. Having high aspirations may be associated with doing better in life, but other factors, such as realistic assessment of one’s prospects, may give rise to both hope and positive life outcomes without the former influencing the latter. It is important to remember cautionary tales of oversold psychological concepts like self-esteem, which was once seen by advocates as a panacea for all manner of social and psychological pathologies and later shown to be primarily an effect rather than a cause of doing well in life.

A similar point can be made about proposed interventions to restore hope. Graham reviews a range of options, including community-based wellbeing initiatives, scaled-up mental health programs and private–public partnerships. But none of these directly target hope — many focus on building social connection and belonging rather than cultivating optimism — and there are few grounds to infer that a revival of hope is the main active ingredient in any benefits they may have.

Graham makes a strong theoretical case that hope matters and a strong methodological case that it should be measured, but current evidence does not yet justify the claim that hope is the key to relieving social misery. Hope remains a promissory concept. This is not to criticise Graham’s championing of hope but simply to recognise the need for more definitive research. I suspect researchers who take up that challenge are backing a winner.

Although the book is generally accessible and engagingly written, it has a few imperfections. Statistical jargon (“endogeneity,” “fixed effects”) creeps in occasionally and tables of regression coefficients will glaze many readers’ eyes. Some points become repetitive, and it was an editorial oversight to allow four paragraphs to be repeated with minimal alteration in consecutive chapters.

Even so, The Power of Hope delivers its hopeful message with the passion and gravity the topic deserves. The book will interest academic readers from across the behavioural and social sciences, and anyone curious about the wider social and political relevance of the science of wellbeing. •

The Power of Hope: How the Science of Well-Being Can Save Us from Despair
By Carol Graham | Princeton University Press | US$35 | 200 pages

The post The ambiguity of hope appeared first on Inside Story.

President Wilson on the couch • https://insidestory.org.au/president-wilson-on-the-couch/ • Tue, 16 May 2023

What happened when a diplomat teamed up with Sigmund Freud to analyse the president?

Sigmund Freud’s first venture into biographical writing is a cautionary tale for anyone tempted to apply psychoanalytic ideas to historical figures. His essay on Leonardo da Vinci, first published in 1910, fixed on a memory Leonardo reported from his early childhood of a vulture descending on his cradle and repeatedly thrusting its tail in his mouth. Freud surmised that this “memory” was in fact a fantasy that revealed Leonardo’s homosexuality and his conflicted feelings about his mother.

Freud’s interpretation hinged on the mythology of vultures — including the ancient belief that they were exclusively female and impregnated by the wind — and the frequent depiction of the Egyptian goddess Mut with a vulture’s head and an androgynous body. He argued that Leonardo was preoccupied with vultures and had concealed one in the blue drapery of The Virgin and Child with Saint Anne, which hangs in the Louvre.

There was one small problem. The bird Leonardo recalled was not a vulture but a kite, a creature with no special mythic significance or any hint of sexual ambiguity. The error, made by a German translator of Leonardo’s writings, undermined Freud’s thesis and demonstrated the challenges of doing psychoanalytic interpretation at a distance. When the subject cannot be put on the couch, the already dangerous work of psychic excavation becomes even more hazardous.

This embarrassment might have led Freud to abandon psychobiography altogether, and indeed the general view has been that he did. In the monumental, twenty-four-volume Standard Edition of his work, his English editor and translator James Strachey wrote that “this monograph on Leonardo was not only the first but the last of Freud’s large-scale excursions into the field of biography.”

But that claim only stands if a notorious study of US president Woodrow Wilson written by Freud with American diplomat William Bullitt is brushed aside. This act of repression has been sustained for more than half a century by charges that Bullitt was a reductive amateur who was driven by personal animus towards Wilson and exaggerated Freud’s involvement.

Patrick Weil’s new book, The Madman in the White House, overturns this received view. Weil, a distinguished French political scientist, has written a captivating analysis of the history of the Wilson psychobiography that doubles as a biography of Bullitt. Along the way it vividly documents the shifts in American engagement with Europe from the first world war through the cold war from the standpoint of high-level diplomacy.

The book combines a masterful grasp of political history with original archival research and a humanising appreciation of the quirks and foibles of the dramatis personae. It does much more than resolve the status of a largely forgotten book about Wilson, also making a case that prevailing beliefs about responsibility for the failure of the post–first world war peace are mistaken. More broadly, Weil demonstrates how much personality matters in politics. “Democratic leaders,” he writes, “can be just as unbalanced as dictators and can play a truly destructive role in our history.”

William Bullitt emerges as a kaleidoscopically colourful and complex personality who witnessed the defining events of the first half of the twentieth century up close. After a brief period as a journalist, he was recruited in his twenties to work under Wilson during the negotiations for the Treaty of Versailles. He served as the first American ambassador to Moscow and as ambassador to Paris, helped to negotiate the Korean armistice and advised Chiang Kai-shek in Taiwan. He played major diplomatic and policy roles in both world wars and mingled with the political and cultural A-list: Wilson, Roosevelt and Nixon; Churchill and Lloyd George; Clemenceau and de Gaulle; Hemingway and Picasso; Lenin and Stalin (or “Stalin-Khan,” as he referred to him).

Bullitt’s life wasn’t all memos, starched collars and negotiation tables, and it had many Gatsbyesque elements: tumultuous marriages, hosting a Moscow soirée with performing seals and a champagne-drinking bear, enlisting in his fifties in the French army, landing upside down in a plane in a Leningrad swamp, and being shipped home to the United States from Taiwan in a coffin following a spinal injury.

Woodrow Wilson (standing) in New York after returning from the signing of the Treaty of Versailles. Historica Graphica Collection/Heritage Images/Alamy

There was also a dark side, with depressions, impulsive actions and a tendency to self-destruction, including a fall from a horse that he attributed to an unconscious wish. These symptoms led him to meet with Freud in Vienna for personal psychoanalysis in 1926, beginning a long association that saw the two become unusually close and Bullitt playing a role in helping Freud escape the Nazis via the Orient Express.

Their book project grew out of Bullitt’s plan to write a study on diplomacy that would include psychological analyses of world leaders, with Woodrow Wilson as one case. Bullitt had fallen out with Wilson over his failure to have the Treaty of Versailles ratified by the US Senate in 1919, despite Wilson having been a visionary architect of the treaty and its proposal of a League of Nations to secure global peace. He saw Wilson’s apparent inability to make concessions to Republican senators at critical moments as a colossal sabotage of what Wilson himself had created, an exercise in “strangling his own child,” and he ascribed it to Wilson’s character flaws.

This was a widespread view at the time: Keynes described Wilson as a “blind and deaf Don Quixote.” Freud agreed with his general assessment, once describing Wilson as “the silliest fool of the century if not all centuries” and Bullitt as “the only American who understands Europe.” The two men hatched a plan to collaborate on a study that would focus on Wilson alone.


Psychobiography is often viewed — and sometimes practised — as an exercise in armchair speculation and hatchet work unencumbered by evidence, but the dissection of Wilson’s character was anything but. Freud, perhaps stung by the Leonardo fiasco, insisted on collecting and analysing a substantial body of information on Wilson; Bullitt obliged with not only his extensive first-hand working experience but also interviews with several of Wilson’s high-ranking colleagues, hundreds of pages of notes, and countless diary entries from Wilson’s personal secretary. Then, at least on Bullitt’s telling, he and Freud met and communicated frequently over a period of years to formulate a shared understanding of Wilson’s psychodynamics and edit drafts of one another’s chapters.

The essence of their formulation was that Wilson lived in the shadow of his idealised father, a Christian minister, whom he believed he could never satisfy or equal. This father complex was shown in his driven approach to work, his tendency to present a Christ-like persona when defending his views, his moralising streak, his unwillingness to brook criticism or compromise when he took principled stands on issues, and his passivity towards paternal figures — a stance that led to bitter fallings-out with erstwhile good friends that haunted him for decades.

Bullitt and Freud attributed Wilson’s failures in delivering on Versailles and the League of Nations to this incapacity to make necessary accommodations at the last hurdle. They also drew attention to his tendency to defer to some national leaders during the earlier negotiations to the detriment of the treaty, including allowing Britain to make the excessive demands for postwar reparations that contributed to German resentment.

After extensive reworking over a period of years, the Wilson manuscript was completed in 1932, each chapter signed off by both authors. And yet it was not to be published until 1967. The reasons for the delay initially included Bullitt’s desire not to endanger his employment prospects in future Democratic administrations, a wish not to hurt Wilson’s widow, and an awareness that Wilson’s once tarnished reputation had been restored by mid-century, making a critical study unwelcome. Later, in the 1960s, Bullitt found it difficult to find a publisher and to obtain permission to publish from Freud’s estate. Freud’s daughter Anna, whom Bullitt had helped to rescue from Vienna in 1939, was deeply concerned to protect her father’s legacy and sceptical of Freud’s involvement in the book; she insisted on making numerous revisions, which Bullitt refused.

In the end, the book appeared to a chorus of critical reviews. Erik Erikson, the leading light of psychobiography at the time, attempted to block publication on receiving an advance copy. The book was criticised for being spiteful towards Wilson, repetitive, and clumsy in its psychoanalytic formulations and therefore unlikely to have been genuinely authored by Freud. Bullitt, who died only six weeks after publication day, must have felt crushed.


With the reputation of Thomas Woodrow Wilson: A Psychological Study going down in flames, the question of Freud’s co-authorship might have gradually lost what little intellectual interest it still held, especially as the original manuscript appeared lost or destroyed. Enter Weil, who rediscovered it in the archives at Yale University in 2014.

The Madman in the White House reports two significant findings. First, Freud’s heavy involvement in writing the book is now undeniable, established by his signature on all chapters and evidence of extensive revisions and annotations. Weil backs up this textual evidence with other quotes from Freud that express an unambiguous sense of personal ownership of “our book.” Critics who charged that Bullitt had deceptively Freud-washed his own work are mistaken.

Second, and perhaps just as importantly, Weil shows that Bullitt made several hundred revisions to the “final” manuscript prior to publication. Some of these edits are largely cosmetic: omitting one section on a political conflict that no longer seemed topical, updating some psychoanalytic terminology, and removing some very dated ideas about masturbation and castration anxiety. But many edits were substantive, involving removal of references to Wilson’s supposedly homosexual orientation. This inference didn’t imply conscious awareness or overt behaviour on Wilson’s part, and Freud believed everyone was to some degree bisexual, but Bullitt must have judged the claim too contentious to put in print.

Weil presents these discoveries with scholarly thoroughness but also with a light touch that makes the book a delight to read. Despite his implied criticism of the psychoanalytic establishment’s reception of the Wilson psychobiography, he defends the relevance of psychological insight to the understanding of political leadership. He accepts some of the contours of Bullitt and Freud’s analysis but disagrees about the nature of Wilson’s father dynamic. Joseph Wilson was a less perfect father than his son imagined and had a cruel tendency to humiliate him, Weil suggests. In his view, Woodrow’s political and interpersonal conflicts stemmed from his sensitivity to public humiliation more than anything else. Such an interpretation, invoking wounded narcissism and pathological autonomy rather than father or Christ complexes and latent homosexuality, certainly has a more twenty-first-century feel to it.

Whether or not readers are open to this kind of analysis, Weil makes a powerful case for the role of personality in politics. He closes with a counterfactual history of a Europe in which Wilson had not failed to deliver on his idealistic vision. British and French financial and territorial demands on the Germans following the first world war would have been moderated and less punitive, diminishing German bitterness. Squabbling nations would have been dissuaded from armed conflict. American intervention in the second world war would have been triggered earlier by security guarantees to France. So much carnage might have been averted had the men in charge been less damaged and better able to understand and regulate themselves at critical times.

Woodrow Wilson was in no real sense a “madman” and Bullitt and Freud were hardly unbiased observers. Even so, their book was a significant historical attempt to demonstrate how the psychology of individual leaders can have collective reverberations. With some caveats, Weil would probably agree with the basic sentiment he attributes to Bullitt, that “the fate of mankind was determined over millions of years by geography, over hundreds of years by demography, over tens of years by economics, and year over year by psychology.” His book is a brilliant historical investigation of an early attempt to reckon with those year-by-year influences. Both as a work of scholarship and as a sweeping, almost novelistic tour of twentieth-century political affairs, it deserves a wide readership. •

The Madman in the White House: Sigmund Freud, Ambassador Bullitt, and the Lost Psychobiography of Woodrow Wilson
By Patrick Weil | Harvard University Press | US$35 | 400 pages

Social fitness • https://insidestory.org.au/social-fitness/ • Wed, 22 Mar 2023

A tight network of interpersonal connections is both a buffer and a blanket

Self-help books inundate our bookshelves in a fattening flood that shows no signs of receding. Feeding this cultural La Niña is the widespread conviction that our lives, loves, minds and bodies could and should be better. That sense is amplified by social media comparisons with those who seem happier, thinner, prettier and more fully alive than we do, and by the belief that we are in the throes of a mental health and well-being crisis.

The self-help genre is diverse, ranging from the high-minded to the profane. Some books explore the latest research, bending the scientific branch so that readers can pluck a few peer-reviewed insights. Some give us simple advice, selling in direct proportion to how closely it counsels us to do what we already secretly want, such as not to give a f*ck about anything. All are supremely confident that they hold the key to improving our lives.

The Good Life and How to Live It falls closer to the first end of the spectrum. The evidence on which it rests is unquestionably the largest and arguably the best of any book of life advice. Authors Robert Waldinger and Marc Schulz are director and associate director of the Harvard Study of Adult Development, a research project that has been running since 1938, and their book reflects on its many decades of findings.

The study is an odd hybrid, composed of 268 sophomores from the male-only Harvard College of the early 1940s; 456 inner-city Boston boys from the same period; and more than 1300 of their descendants. Study participants were followed intensively throughout their lives, with an extraordinarily high retention rate by the standards of longitudinal research.

The focus of assessment reflected the changing preoccupations of the times, from skull measurements, handwriting analysis and questions about being ticklish in the early days, to genotyping and MRI scans today. Throughout the four score and almost seven years, though, psychological assessments have been central.

A study whose original 700-odd participants were all male, all white, a mix of gilded youth — John F. Kennedy was a participant — and tenement dwellers, and deliberately selected for being promising rather than representative, might seem an unlikely source of knowledge about normal human ageing. But one of the compensating virtues of the Harvard Study is that despite its demographic restrictions it revealed the diversity and unpredictability of paths through life. In spite of their advantages, many of the Harvard collegians were failures, especially when considering outcomes beyond conventional accomplishments, and their life trajectories ranged from tragic downturns to hopeful redemptions and everything in between.

The Harvard Study was designed as an inquiry into adult development and ageing, but The Good Life reframes it, not always convincingly but with good market awareness, as an investigation of happiness. Its central message is simple. As George Vaillant, the study’s director from 1972 to 2004, once put it, “the key to healthy aging is relationships, relationships, relationships.” The study consistently found that the quality and quantity of relationships, whether with caregivers, life partners or friends, is at least as strongly associated with health and longevity as well-known risk factors such as high cholesterol, smoking or obesity.

This finding squares with other evidence that loneliness and social isolation kill, that social support cures, and that a tight network of interpersonal connections is both buffer and blanket. Although our societies prize personal achievement, our technologies draw us away from in-person engagement and our lives become cluttered with busyness that takes priority over our social connections (“life is always at risk of slipping by unnoticed”), those connections remain paramount.

People tend to underestimate the benefits of linking up with others, over-value self-sufficiency and misallocate their time to asocial activities at the expense of interaction with loved ones. The happiest and most vital Harvard Study octogenarians have managed to avoid these traps.

Waldinger and Schulz examine the centrality of relationships from several standpoints. They discuss the developmental priorities of different life stages, the challenges and opportunities of intimate, family and work relationships, the special importance of friendships, and the ways of coping with stress that strengthen or weaken these relationships.

Given that the gradual withering away of friendships is a bleak and consequential reality for many people, especially men, avoiding interpersonal challenges or the effort of tending to them is a major risk. The Good Life argues for the importance of “social fitness” as an under-recognised source of good health that must be monitored, worked on and taught in schools.

The take-home message that social connection is the key to health and happiness is now almost common sense. But it was not always so. In the mid twentieth century, when the first wave of the Harvard Study began, the dominant view within the behavioural sciences was solidly individualist. Psychoanalysis, at the zenith of its influence in psychiatry, emphasised conflicts within the person as the source of human misery. Adult relationships were seen more as shadowy re-enactments of childhood dramas than as sources of health and strength. Even the psychosocial turn in psychoanalysis, led by Erik Erikson, saw the main developmental tasks of mid and late adulthood as fundamentally inward: finding a sense of personal contribution and integrity.

The leading psychologist of morality of the time, Harvard’s Lawrence Kohlberg, defined the highest level of moral reasoning as moving beyond social conventions and the social contract to a principled personal ethos. Influential humanistic ideas about motivation, such as Abraham Maslow’s famous hierarchy, placed the realisation of the unique self above social needs for esteem and love. Considering this intellectual matrix, the Harvard Study’s conclusions about the value of attachment, belonging and intimacy were not preordained, and they helped to shift the study of adult development and health in a relational direction.

Readers of The Good Life who are looking for a work of science communication will be disappointed. The authors provide few detailed reports of research findings and very few numbers. Although the book sits on a vast body of empirical results, Waldinger and Schulz rely much more on extended case studies of a few selected Harvard Study participants. Their professional identities as psychotherapists and, in Waldinger’s case, as a Zen priest, prevail over their identities as researchers.

As self-help books must, this one contains exercises and worksheets for those who wish to carry out their own relationship audit. But it is the life narratives that do the persuasive work by illustrating the uplifting message that relationships matter, that they can be cultivated and that it’s never too late to change them.

Bookish people and those who read quality online magazines often disdain works of popular psychology and self-help. The advice seems simplistic, the science flimsy and over-hyped, the tone annoyingly upbeat. We can understand and fix our unique and complex selves without that kind of asinine assistance. But there is little reason to think that sceptics have their relational house in order more than anyone else. The Good Life offers a useful guide to resisting the pull of self-reliance, personal striving and materialism, and instead investing our time and attention in other people. •

The Good Life and How to Live It: Lessons from the World’s Longest Study on Happiness
By Robert Waldinger and Marc Schulz | Penguin | $35 | 352 pages

A frolic of its own • https://insidestory.org.au/a-frolic-of-its-own/ • Wed, 22 Feb 2023

In a remarkable turnaround, the TGA has eased restrictions on the therapeutic use of psilocybin and MDMA. But will the benefits be fairly spread?

Many people, even those working for the legalisation of illicit drugs, were surprised when Australia’s Therapeutic Goods Administration gave approval for psilocybin and MDMA to be used therapeutically in cases of severe and treatment-resistant depression or post-traumatic stress disorder. Psilocybin is the active ingredient in “magic mushrooms” and MDMA is also known as “ecstasy.”

Mind Medicine Australia, an advocacy group, was earlier knocked back when it sought to have the two substances legalised. On my reading, its bid had serious deficiencies: in particular, it cherrypicked studies rather than conducting a systematic review of all the evidence. In light of those problems, the TGA commissioned its own review of the evidence. The findings appear to have convinced the decision-makers that allowing limited access to the two substances is justified. In effect, the TGA took this decision on its own motion, offering an interesting case study of delegated authority.

The decision removes therapeutic uses of psilocybin and MDMA from Schedule 9 of the Poisons Standard, which includes heroin, crystal meth and other illicit drugs, and places them on Schedule 8, alongside medicines — opioid pain relief and ADHD treatments, for instance — to which access is heavily restricted. Australia has an inconsistent patchwork of state and territory laws governing illicit drugs, but they all refer to the Poisons Standard, which is administered by the TGA. This puts the TGA in an unusual position: at a stroke, a committee of clinicians and bureaucrats was able to rewrite drug laws across Australia. But it also means the decision was based on clinical and scientific evidence rather than the feverish law-and-order politics that dominates “the drugs debate.”

Some degree of backlash was inevitable. In an opinion piece for the Age, psychiatrist Patrick McGorry and two academic colleagues shared their concern that the benefits of these treatments haven’t been shown to outweigh the risks. Trials to date, they point out, have excluded participants with psychosis, eating disorders, substance use issues and other medical conditions, limiting them to people with the smallest possible risk profile. This means that we can’t extrapolate the likely effect of these substances in real-world practice. McGorry and his colleagues also note the lack of evidence on the long-term effects of the two substances’ use.

But the TGA’s decision is more limited than many commentators realise. It has approved the therapeutic use of these drugs without approving specific products containing them — and the pathway for accessing unapproved products is tortuous. A psychiatrist must apply to a Human Research Ethics Committee for permission to use an unapproved drug with human patients. Permission is only given for serious or life-threatening illness where no approved medicine can do the job. We are talking about severely unwell patients who have already tried everything in the medicine cabinet without any relief.

The psychiatrist must also provide evidence that the potential benefits outweigh the risks, and outline how they will manage any harms. In short, using psilocybin and MDMA in clinical practice faces exactly the same checks and safeguards as McGorry and colleagues must meet in their own clinical trials on psychedelic-assisted psychotherapy.

Most people with severe illness are going to fall at the first of those hurdles. Finding a psychiatrist is extremely difficult; demand far outstrips supply, and the cost of private care is eye-watering. It is going to be even more difficult to find a psychiatrist willing to apply to become an authorised prescriber of otherwise-illicit drugs. Psychiatry is a conservative profession, and it primarily views recreational drugs as a risk factor for mental illness rather than a potential therapy.

A psychiatrist who does become an authorised prescriber must then find a “sponsor” — a pharmacist willing to synthesise psilocybin or MDMA. Given Australia’s patchwork of drug laws, this is not entirely risk-free for the sponsor, even when the drug is intended for therapeutic use. Medical researcher Stephen Bright estimates the cost of experimental treatment with these substances at $20,000 per participant, accounting for study costs (including expert personnel), the substances themselves, and inpatient treatment. Faced with these barriers, many people with PTSD or severe depressive illness might contemplate accessing psilocybin or MDMA informally — outside of an experimental trial or the psychiatric pathway.

The criticism I’ve seen from psychologists is that the TGA has acted on data from phase two studies, which investigate the safety of a substance in human subjects, without waiting for more data from phase three studies, which estimate its efficacy against a given condition. There are two separate issues here: risk and uncertainty. We know the substances are, broadly speaking, safe to use under controlled conditions, but there remains uncertainty about how effective they might be. Experience teaches us to be cautious on that front: promising treatments like ketamine infusion have turned out to have profound but relatively short-lived effects on severe depression, giving people hope that was yanked away when experimental trials ceased.

That said, the TGA decision is consistent with the principles outlined for the unapproved products pathway. These state that the requirement for evidence is lowest when the need is greatest. In case that sounds counterintuitive, think of a patient facing death from an aggressive illness; beneficence suggests letting them consent to an experimental treatment, even if all it offers is hope. The TGA has recognised that people currently suffering with severe and untreatable mental illness are in a comparable situation. Hope is thin on the ground, and the potential benefit outweighs the relatively well-known risks of harm.

Much of our knowledge of these substances comes from informal use rather than experimental studies and clinical trials. This is knowledge that circulates via the “cultures of care” — a term coined by Australian HIV researcher Michael Hurley — that surround recreational use of these substances. Drug users have developed and actively share information and practices intended to minimise the risk of drug-related harms, including judgements about safe doses and titration, creating a safe environment, guiding a person through a “trip,” watching for overdose, and maintaining a safe pattern of use.

Psychiatrists and counsellors may find themselves working with depression and PTSD patients who are exploring informal use of these substances. Both groups may need to familiarise themselves with harm-reduction principles and strategies.

The TGA decision highlights something important for people interested in the debate over how to legalise or decriminalise the use of an illicit drug. Medicalising their use is one possible strategy. When the TGA proposed banning “poppers” — small bottles containing volatile alkyl nitrites for inhalation — I led an advocacy campaign arguing these products were the only way for many men and other queer people to have safe and comfortable anal intercourse. The TGA listened to our arguments and left the most common ingredient in poppers on Schedule 4, available for prescription by doctors.

In doing so, the TGA demonstrated a willingness to listen to communities speaking from expertise gained from decades of informal use of the substances in question — a willingness that may have played a role in its subsequent decisions on psilocybin and MDMA.

In most cases, though, securing legal permission for therapeutic use only benefits people with the resources to access specialist medical care. We have seen this with medicinal cannabis in Australia: people using approved products are paying hundreds of dollars each month for the privilege. People who can’t afford this treatment continue to use cannabis products from informal suppliers and remain vulnerable to policing and prosecution if they are caught. Medicalising illicit drugs reproduces what medical researcher Julian Tudor Hart called the “inverse care law,” whereby medical care is hardest to access for those who need it the most. •

The post A frolic of its own appeared first on Inside Story.

On not burning out https://insidestory.org.au/on-not-burning-out/ https://insidestory.org.au/on-not-burning-out/#respond Wed, 15 Feb 2023 23:48:31 +0000 https://insidestory.org.au/?p=73055

Is the workplace malaise bigger than two organisational psychologists believe?

The post On not burning out appeared first on Inside Story.

Stan, a young psychologist in a community health centre, changed over three years from an “avid, eager, open-minded caring person” to an “extremely cynical not-giving-a-damn individual.” Reliant on alcohol and tranquilisers, he could only get through each day at work by approaching it “as if I was working at GM, Delco or Frigidaire” because “that’s what it has become here, a mental health factory!”

An unnamed emergency doctor, fuelled by the desire to staunch suffering, worked long days in Afghanistan for six months treating patients with horrific injuries. It was not this experience, however, that led her to quit, but rather her time working in an emergency room of a community hospital on her return to America, where her days felt like a meaningless cycle of “rinse and repeat.”

How are we to make sense of the strange alchemy by which modern workplaces convert some workers’ enthusiasm, commitment and confidence into cynicism, withdrawal and depletion? In their new book The Burnout Challenge, organisational psychologists Christina Maslach and Michael P. Leiter offer two contentions about understanding and tackling these quiet workplace tragedies.

The first is a riff on the historical practice of putting canaries into coalmines to detect toxic air that could kill miners. “Let’s say our hope was to keep more birds singing in the mines. What would be our best approach? Should we try fixing the canary to make it stronger and more resilient — a tough old bird that could take whatever conditions it faced? Or should we fix the mine, clearing the toxic fumes and doing whatever else necessary to make it safe for canaries (and miners) to do their work?”

The second is the claim that an “increasing mismatch between workers and workplaces” is the cause of burnout, and that the mismatch takes six forms: work overload, lack of control, insufficient rewards, breakdown of community, absence of fairness, and values conflicts. The self-described “burnout shops” of Silicon Valley are archetypal mismatched workplaces, the authors say: employers brazenly boast of having an “always on” work culture, and workloads are so obviously unsustainable that a job in a startup is only feasible for a few years.

How or why this particular chamber of the “mine” got so toxic for its canaries, though, or its relationship to the other kinds of workplaces that make up the mine, is not something the authors attempt to explain. Befitting experts who have made major professional contributions in the form of taxonomies and systems for data collection (Maslach was the creator of the “Maslach Burnout Inventory,” Leiter the “Areas of Work Life Survey”), The Burnout Challenge takes a highly schematised approach in guiding managers to incrementally improve levels of worker efficiency, health and happiness in their workplace.

It turns out that decades of peer-reviewed research shows “receiving good vibes or compliments” to be “an uplifting experience” and that involving staff in rostering decisions can increase their sense of control. Evidence also backs the idea that “there should be sensible limits on work hours and a focus on how employees will complete tasks within that time rather than be forced to donate personal or sleep hours to finish their work.” It is particularly dispiriting for employees to face an overfilled plate of tasks when those tasks sit below their capability level. The blunt assumption that “people deliver under pressure” is wrong.

Many employers could genuinely benefit from reading this advice, and not all of them are quiet admirers of the rise-and-grind, move-fast-and-break-things management philosophy of Silicon Valley. Maslach and Leiter provide lots of neatly enumerated lists illustrated by breezy narrative vignettes and even simpler graphics (the section on “insufficient rewards” gets a thumbs-down emoji; the chapter on “values conflicts” shows a broken heart lying in a pool of blood). If you find ticking off a to-do list an appealing mechanism for keeping the complexity of the world at bay, you may find all of this soothing.

Readers drawn to thinking about the deeper causes of burnout may be struck by how eye-wateringly obvious and familiar these statements are. Since F.W. Taylor popularised “hard” scientific management in the early twentieth century, observers have noted the self-defeating and demoralising consequences of attempting to wrest maximum value from employees.

Elton Mayo’s book The Human Problems of an Industrial Civilization, for instance, published in 1933, included chapters on fatigue, monotony and morale, all themes taken up by Maslach and Leiter. Like them, Mayo urged managers to view employment in relational terms. What workers required was recognition, community and respect.

Rather than asking ourselves what is wrong with this kind of advice, we may be better off considering why, ninety years on, books on these themes are still in demand. A simple response might be that, like the wisdom of a kindergarten teacher, basic lessons in how to behave well at work are always relevant and it never hurts to have them repeated: listen carefully, don’t demand impossible things, practise give and take, don’t steal, apologise when you make a mistake, and it is not okay to threaten people to get what you want.

Another, less rosy answer is that books of this kind not only are easy-to-read digests of academic research but also reinforce the longstanding idea that burnout is a consequence of inadequate managerial technique rather than inadequate worker power.

For their work to resonate with their time’s commonsense ideas of wellbeing, every generation of scholars in the human relations school has to tackle their task in a slightly different way. In Mayo’s day, employers were invited to understand themselves as grappling with “problems of human equilibrium,” “social disorganisation” and anomie. Today, Maslach and Leiter suggest, employers play a six-dimensional game of workplace matching with employees, a framing of the problem that resonates well with what theorists Luc Boltanski and Eve Chiapello dubbed the new spirit of capitalism — the emphasis on individual autonomy, initiative, intimacy, pleasure and creative expression that has prevailed for the last four decades.

The mismatch metaphor diverts attention away from the structural determinants of power and towards the agency of individual workers and their managers. Other actors — trade unions and the governments that set the terms of workplace relations — are relegated to the background.

The idea that tackling the burnout “challenge” involves aligning two puzzle pieces also invites us to assume that the parties negotiate with roughly equal bargaining power, and that if a match is not found either party may walk away in search of another, better match. If we view this drama as a mutual desire for a satisfying “click,” we are less likely to notice that the employer’s puzzle piece is usually made of unyielding metal while the employee’s is more often soft and pliable or, worse, brittle and prone to shatter.

Maslach and Leiter don’t wholly ignore unions. They credit the Australian stonemasons with establishing the norm of an eight-hour working day in 1856. But they don’t acknowledge unions as manifestations of the “beneficial social communities” they praise in other contexts. The Burnout Challenge does mention union-like formations on several occasions, but using terms like “cuddle huddle” and “social pod” that implicitly distinguish these “good,” “customised” forms of collectivity from “bad” ones that come “off the shelf.”

Digital technologies also make an appearance in The Burnout Challenge, but primarily as carriers of additional workload. Maslach and Leiter declare that “technology must be managed — and managing it means balancing on and off time.” This perspective ignores the vast potential of digital technology for workplace surveillance. Numerous observers have noted, for instance, that Amazon’s use of technology to monitor employees’ work rates is related to the dangerous intensification of work at the company. In the gig economy, of course, work may be coordinated and managed at scale in the absence of an employment relationship.

Nor do the authors touch on the digital dynamics quietly eroding anti-discrimination law, labour law, anti-trust law and privacy law through what legal scholars Richard Bales and Katherine Stone call “the invisible web at work.”

Maslach and Leiter coyly allude to the relationship between insecure employment and burnout in a discussion of “tentative relationships,” noting that workers may experience “anxious feelings about disposability” and that “it is difficult to exercise control in a relationship without confidence that the relationship will continue.” The logical extension of such observations — that it might be wise for managers to offer secure employment as a burnout-prevention measure — goes unstated.


In airbrushing power from the narrative, The Burnout Challenge also obscures employer powerlessness, even though recognising the limits of what managers can do is surely crucial to understanding the phenomenon and its solutions.

Many of the “values mismatches” the writers refer to are the product of institutional frameworks over which employers exercise little control. Consider, for instance, the clash between market-based mechanisms and professional–client obligations in care work and other highly “emotionally connective” jobs. As sociologist Allison J. Pugh has brilliantly observed, a panoply of workers — therapists, teachers, doctors, carers and more — need to be present and receptive, and bear witness, to the emotional truths of others.

Such acts of recognition are inherently particular, and often experienced as among the more meaningful aspects of work. They are also, thanks to New Public Management approaches, subject to massive pressures from rationalisation, standardisation and accountability, via manuals, checklists, scripts, templates and digital platforms, and are organised within “regimes of scarcity.”

A generation of research has observed that psychosocial injury is linked with high quantitative demands, insecure work conditions, compromised skill development, low control of work tasks, and market-based systems. These conditions are not unfortunate accidents; they are the inevitable concomitants of treating care work as a commodity to be delivered “just in time.”

Such dynamics are very difficult to grasp when burnout is framed as a workplace-level issue, as Maslach and Leiter do when they recount what happened after the takeover of a rural hospital by a private health organisation. Following a “values clarification process” of town hall meetings, focus groups and unit-level discussions, a new hybrid values system emerged, with neither hospital nor new owner “winning.”

The authors suggest that managers struggling with similar values clashes should consider the parable of the janitor at NASA who, in 1961, encountered president John F. Kennedy. When the president asked why he was working so late mopping floors, he responded, “I’m helping to put a man on the moon!” Maslach and Leiter distil a set of “sense-giving” steps for managers from this tale: “narrow the focus on one goal; shift from an abstract description of the goal to a concrete one; set up clear milestones to the goal; give life to the idea by using persuasive language and rhetorical techniques.”

A radical indifference to context is apparent here. Not only does cold war–style talk of national purpose no longer apply, but a cleaner would now, more likely than not, be an outsourced worker with an entirely different manager from the “core” workforce to whom the “purpose talk” applied.

The assumption that healthcare is, in essence, similar to putting a man on the moon is also highly questionable. It would only be exaggerating slightly to suggest that true burnout-minimising leadership in a care context involves flipping each of the proposed “sense-giving” steps on its head: encourage employees to widen the focus from the goal to the person; embrace the abstract and expansive goal of hearing someone; reassure employees that work of this kind is not linear and that hearing someone’s needs in a meaningful way takes time; recognise the overwhelming significance of employees having the security, time, skills and trust to form relationships.

It is unsurprising, perhaps, that The Burnout Challenge also has little to say about how the stories that live in our heads and in our culture contribute to burnout. Maslach and Leiter offer scant recognition of how the idea of “passionate work” saturates our culture to the extent that it forms an “interpretative schema” for the world, long before any individual crosses the threshold of a particular workplace.

Drawing on the work of Raymond Williams, communications researcher Renyi Hong has argued that passion is now a “standard post-Fordist affect.” The idea of “passionate work” fosters a sense of an ideal capitalist self: well managed, resilient and capable of coping with economic fluctuations and precarity. It is an argument that resonates closely with Lauren Berlant’s concept of cruel optimism: relationships made available at work under capitalism offer themselves as objects of desire while also representing profound obstacles to flourishing. The relevance of both ideas to employee burnout is obvious.


As a concept, “burnout” need not be bereft of political potency. It can invite critical thinking about limits, energy and reproduction, and help reframe the “non-productive” as crucial and valued. It can question the logic that intensity is inherently good. In the climate change era, these questions are excellent ones to be asking, both of ourselves as individuals and of societies as a whole.

A politics built on the refusal to be “burned out” recalls the insistence that labour is not a commodity and invites its extension to other things — such as rivers, skies, forests and all the ecosystems of which they are part — that we should not readily accept as simply raw materials for production. It asks that we consider how we will revalue and protect all of the human spaces founded on non-capitalist values — families, schools, welfare states, unions and Indigenous communities, to name a few. It would be a politics, in other words, built on deliberate and defiant mismatches.

The Burnout Challenge, needless to say, is not concerned with such a politics. While its authors may have rejected the project of turning the canary in the coalmine into a “tough old bird” capable of enduring toxicity, their book doesn’t do much to analyse or adjust the underlying sources of the mine chamber’s fumes. Much less do they consider inviting the bird to participate in the endeavour.

Maslach and Leiter offer modest and practical suggestions that may help some bad workplaces become less bad. But the bird they construct as their object of concern is a lonely and flighty one, primed to glide into the next chamber in search of better air, sustained by the hope that their “match” is just around the corner. •

The Burnout Challenge: Managing People’s Relationships with Their Jobs
By Christina Maslach and Michael P. Leiter | Harvard University Press | $51.95 | 272 pages

Appointment with death https://insidestory.org.au/appointment-with-death/ https://insidestory.org.au/appointment-with-death/#respond Mon, 06 Feb 2023 06:33:16 +0000 https://insidestory.org.au/?p=72625

How best should we cope with our awareness of death — and a desire to control when it happens?

The post Appointment with death appeared first on Inside Story.

Even in our darker moments, few of us are likely to agree with philosopher David Benatar that it would be preferable not to have existed. Living brings pain and suffering, Benatar reminds us, which eclipse pleasure and happiness. Non-existence nullifies pain — a good thing — and means no one is around to miss out on pleasure — no bad thing. Hence, as Benatar’s 2006 book title bleakly announces, it’s Better Never to Have Been.

Although they may not have reached these heights of nihilism, many people do wish their lives would end, or at least that they could be cut short if they became unbearable. With assisted dying increasingly in the news, Caitlin Mahar’s new book, The Good Death Through Time, presents an enlightening history of the desires of people suffering from terminal illness or planning for a dignified ending, and of the cultural shifts, religious values and medical advances that have shaped, supported or obstructed them.

Before acquiring its more familiar contemporary meaning about 150 years ago, euthanasia simply meant a good death. Dying was seen as a spiritual ordeal to be endured with Christian patience, and thus a test of courage and character. Much emphasis fell on what came after death — salvation or something much worse — rather than its attendant agonies. “For the faithful,” Mahar writes, “a good death was marked by the embrace or overcoming of suffering rather than its elimination.”

Just as well: doctors at the time had no power to alleviate pain. In fact, they believed it was beneficial to health, and were more apt to cause than cure it with their treatments. In any event, preparing the soul for death was judged more necessary than dulling the mind.

Some of this changed in the mid nineteenth century with the advent of opiates and other anaesthetics, prompting the earliest medicalisation of dying. Euthanasia came to refer to deaths eased by a physician’s care with the aid of narcotics. Pain was increasingly seen to lack redemptive qualities; reducing it might even help the dying to focus on spiritual matters. Mahar argues that this shift in attitudes reflected a more general rise in people’s dread of suffering and sensitivity to discomfort.

That rise, which William James characterised as a “strange moral transformation,” drove campaigns to reduce needless pain by outlawing vivisection, corporal punishment and blood sports. But it also provoked a backlash that foreshadowed present-day sneering at thin-skinned progressive “snowflakes.” A British critic of the voluntary euthanasia movement in 1906 ridiculed it as the home of pain-averse “literary dilettanti” and “neurotic intellectuals,” a charge later echoed by an opponent of euthanasia legislation who worried “we were getting too soft as a nation and too sensitive to pain.”

Mahar offers a compelling account of the rise of British voluntary euthanasia activism in the 1930s, a movement that originated within the medical profession and aimed to give doctors the power to accelerate lingering deaths using morphine and other narcotics in strictly limited circumstances. Despite having eminent supporters such as George Bernard Shaw and H.G. Wells, legislation failed after opponents raised concerns about the potential for abuse by relatives, slippery slopes, medical overreach, and the challenges of regulation.

The revelation that the Nazi regime euthanised well over 100,000 disabled people further damaged the voluntary euthanasia cause, reversing prior support within the medical community and undermining public support for the idea that some lives are “not worthy to be lived.” Mahar shows how eugenics-inspired advocacy for involuntary euthanasia of the intellectually disabled — advanced in Australia by University of Melbourne anatomy professor Richard Berry, whose name was permanently scrubbed from a campus building in 2016 — has tarnished the voluntary euthanasia movement.

The Good Death Through Time provides an authoritative examination of euthanasia debates, court cases and initiatives from the 1950s to the present. Mahar identifies shifts in the groups viewed as suitable for euthanasia, including people on life support or in unrelenting pain not linked to a terminal or incurable condition, as well as in the rationales offered for the practice. Although reducing suffering remains paramount and fear of pain may paradoxically have grown with medicine’s rising capacity to palliate it, voluntary euthanasia has been framed increasingly as a matter of rights, dignity and personal empowerment rather than alleviation of distress.

Australia has been near the forefront of legislative developments. Advocates for voluntary euthanasia argue that overly narrow eligibility requirements have led to unnecessarily bad deaths for those excluded. Disability activists, on the other hand, caution against broadened criteria, citing the Dutch experience of rising euthanasia among people with dementia or mental illness. Mahar concludes with a concise epilogue covering this recent context.

The Good Death Through Time is a lucid and well-documented guide to a challenging topic. Mahar provides a sympathetic but clear-eyed picture of euthanasia’s many protagonists and perspectives without forcing a single view onto the reader. The scholarship is global, but the focus on Australia and Britain adds to the book’s local relevance.

Mahar’s work is especially compelling as an account of the medical profession’s role in euthanasia, in all its meanings. The profession’s views on the desirability and scope of euthanasia have waxed and waned, its pharmacological tools enabling the practice while altering popular attitudes and increasingly pathologising pain. There is no better guide than this one to the wider context of current debates about assisted dying.


Philosopher Dean Rickles’s Life Is Short approaches death from a quite different angle, though he would agree with proponents of voluntary euthanasia that how we fashion our lives and deaths should be a profoundly personal choice. In re-visioning Seneca’s On the Shortness of Life, he wants to persuade us that although we may dread the end of life and entertain fantasies of eternal youth and immortality, it is life’s finitude that gives it significance.

“To have a meaningful life,” he writes, “death is necessary.” Only by having and recognising limits — “the very stuff of meaning” — can we make purposeful choices to create our selves and realise our futures, rather than being tossed around by life.

Life Is Short takes this idea and runs with it through eight brief but somewhat meandering chapters. Rickles suggests that the desire for immortality, or even just for a longer life, is often driven by a reluctance to foreclose future possibilities by making hard choices in the present. He dissects the difficulties we face in dealing with our future, notably temporal myopia — discounting the future relative to the present — and the less familiar but no less destructive favouring of the future at the present’s expense.

The key to overcoming these “diseases of time,” he suggests, is to develop a strong sense of connection with one’s future self rather than seeing it as a stranger. “[O]ur present self just is the future self of our past self! Treat every future time as equally as Now, because it will be Now later, and it will be your Now.”

How we should go about making a more meaningful life comes down to making it a project (“Project Me”), carving out a future by choosing and acting rather than leaving options forever open. Doing this requires us to overcome the sense that life is provisional and not yet quite real, which Rickles dubs “onedayism.” That process of overcoming involves understanding ourselves and our motives better. We must move beyond the childish feeling of being unbounded and invulnerable to a mature commitment to a purposeful life and work, dull as that may sound.

Despite his general breeziness and references to contemporary popular culture, Rickles’s intellectual influences have an oddly mid-twentieth-century flavour. Existentialist writers (Sartre, Camus, Heidegger, early Woody Allen) get guernseys, with their ruling image of solitary individuals creating heroically authentic selves against a backdrop of cosmic meaninglessness.

Carl Jung takes centre stage in the book’s second half; not the kooky, occult Jung of mandalas, the collective unconscious and flying saucers but the wise Jung of personal identity and the process of maturation. Rickles discusses at some length Jung’s ideas about individuation — the development of a coherent self through understanding our unconscious motivations — and how the archetypes of the present-oriented child (Puer) and the prudent elder (Senex) shape how we age.

What is noteworthy about this cluster of ideas is not just how much they have been generationally cast aside, but also how they portray our orientation towards life and death as fundamentally lonely and stoical. To Rickles, the authentic, unprovisional life is one in which individuals exercise their will by making resolute choices, pruning the branches of their tree of possibilities, and committing to a specific future.

There isn’t much room for other people in this vision of autonomous self-creation. They tend to figure primarily as the conformist horde who stand in the way of us becoming authentically ourselves by tying us down with their norms and expectations. Yes, each of us exists as a solo being with a unique beginning and end, but something is missing in an account of life’s meaning when relationships and social life are so apparently incidental.

It is well worth spending one of the last thousand or so Saturday afternoons we have left on Life Is Short, but in some ways it is an odd book. Contrary to its subtitle, it offers few concrete prescriptions for living a more meaningful life, so it is not a self-help book, however highbrow. Despite the amiable, self-disclosing persona of the author, its level of abstraction is too high for it to be accessible in a de Bottonian way, although Rickles sprinkles it with some memorable epigrams (“death anxiety is the ultimate FOMO”). Its intellectual style is too associative and wandering to be a philosophical treatise on the nature of life’s meaning.

All the same, as a meditation on a very big question — perhaps the biggest of them all — Life Is Short achieves its goal of making us think about the unthinkable. •

The Good Death Through Time
By Caitlin Mahar | Melbourne University Press | $35 | 256 pages

Life Is Short: An Appropriately Brief Guide to Making It More Meaningful
By Dean Rickles | Princeton University Press | $34.99 | 136 pages

Captains unpicked https://insidestory.org.au/captains-unpicked/ https://insidestory.org.au/captains-unpicked/#respond Fri, 03 Feb 2023 05:02:54 +0000 https://insidestory.org.au/?p=72894

What impact do biographies of living politicians have on their subjects?

The post Captains unpicked appeared first on Inside Story.

Do biographies of living politicians affect their careers? Sometimes, as with campaign biographies, that’s what they’re intended to do; at other times, the impact might be unintentional.

When Julia Gillard was deputy prime minister, Chris Wallace began work on an unauthorised biography. Kevin Rudd’s government had yet to flounder on the shoals of his inability to manage the business of government and deal with factional ambitions. Gillard was in line to be Australia’s first female prime minister, but not yet, and not in circumstances where she could forever be blamed for stabbing Rudd in the back.

By the time the biography was finished and about to be published by Allen & Unwin, Gillard was prime minister and leading a minority government. Her ascension to power, both the fact and the manner of it, had unleashed an appalling campaign of misogyny, sanctioned if not overtly led by opposition leader Tony Abbott, supported by right-wing journalists and carried in a stream of filth on social media. She was also being undermined from within, by Rudd and his supporters.

As publication date approached, the publisher’s publicist told Wallace to expect a media frenzy. Gillard’s enemies were waiting to pounce, to comb the book for dirt.

Wallace already knew how a biography can be used against its subject. In 1993 she had published a biography of then opposition leader John Hewson, who looked likely to defeat Paul Keating at the forthcoming election. Her motivation was public interest — Hewson was relatively unknown — but the Keating camp scoured the book for scandals and insights into Hewson’s vulnerabilities.

This could happen again, she realised. Gillard is a human being and had her flaws, but Wallace didn’t want to contribute to the destabilisation of her government and the election of the Abbott-led alternative. So she pulled the book and paid back the advance. She felt, she told her disappointed publisher, that she had “no other moral choice.”

“I did not intend the Gillard biography to help or hurt Gillard’s political fortunes,” she writes in her new book, Political Lives: Australian Prime Ministers and Their Biographers. “When I conceived it, I did not foresee its potential exploitation by ‘bad actors’ in the supercharged political climate which developed.”

The experience got her thinking about other political biographers who had written about living subjects. Did any of their books influence their subjects’ careers, for good or ill? So she embarked on a PhD, researching every biography written during the active career of every twentieth-century Australian prime minister and interviewing every living Australian PM and every living biographer of a contemporary PM.

What becomes clear as the book proceeds is that her question really only applies to the period since Whitlam became leader of the Labor Party in 1967. Before John Gorton, only two contemporary biographies of prime ministers — of Billy Hughes and John Curtin — were published.

But Wallace did discover some unpublished gems in the archives: Herbert Campbell-Jones’s “The Cabinet of Captains,” which covers the first six prime ministers, and Alfred Buchanan’s “The Prime Ministers of Australia.” Both men were journalists, as have been most of the writers of contemporary biographies.

Also in the archives was a manuscript biography of Robert Menzies by Allan Dawes, based on interviews with Menzies undertaken in the early 1950s, before his electoral dominance was established. Wallace speculates that as Menzies’s popularity rose he saw less need for a humanising biography.

In the first half of Political Lives, rich in detail, Wallace proves herself to be a consummate archival detective. One example touches on my own work. Part of the Dawes manuscript is in Menzies’s papers at the National Library, and I drew on it for my book, Robert Menzies’ Forgotten People. But Wallace found more in Dawes’s papers, where I hadn’t thought to look. What she found has much historical interest, though the men and the times are now a long way from popular memory.


The second, livelier half of Political Lives begins with Alan Trengove’s book John Grey Gorton: An Informed Biography, which revealed the potentially controversial fact that John Gorton’s mother was unmarried. Published after Gorton succeeded Harold Holt as prime minister in early 1968, and cleared by Gorton himself, the revelation had little impact on his reputation. But it marked a turning point in Australian contemporary political biography. Revealing rather than concealing personal information now became an option, and politicians could consider managing the release of awkward facts.

But the genre really takes off with Gough Whitlam. When he became Labor leader in 1967 he was little known outside political circles, and many of those who did know of him were puzzled as to what this supremely confident, urbane, middle-class man with a penchant for quoting the classics was doing in the Labor Party. In 1970 the young Laurie Oakes began researching a biography, though Whitlam PM didn’t appear until late 1973, a year after Labor won office.

Oakes’s book was a revelation to many, especially in its account of his subject’s childhood and parents. Growing up in Canberra as the son of the Crown solicitor made Whitlam a social democrat with a belief in the capacities of government to improve people’s lives.

Oakes wrote a quartet of books about Whitlam (two of them with David Solomon), but Whitlam’s great biographer was his legendary speechwriter, Graham Freudenberg, who was so close to Whitlam that he could speak on his behalf. Freudenberg’s A Certain Grandeur was published in October 1977, after Whitlam’s government was dismissed and just before the election that would be his last. The book didn’t have enough time to influence the outcome of the election, after which Whitlam retired from active politics, but it did put the travails of his government into the broader context of the economic and political forces that battered it.

The book that most fits Wallace’s question is Blanche d’Alpuget’s biography of Bob Hawke, published in 1982 when he was positioning himself to wrest the leadership of the Labor Party from Bill Hayden and become prime minister. D’Alpuget, a successful novelist, had just completed a well-regarded biography of Sir Richard Kirby, former president of the Conciliation and Arbitration Commission. She knew Hawke well — they were lovers — and he trusted her to write a “warts and all” biography, with his womanising and drinking included alongside his political achievements and talents.

Through Angus McIntyre, whom she had known when she was living in Indonesia with her diplomat husband, d’Alpuget had got to know Melbourne University’s Graham Little and others in the Melbourne Psycho-Social Group, a loose association of academics, psychoanalysts and therapists interested in the application of psychoanalytic ideas outside clinical settings. D’Alpuget’s biography is far more psychologically acute than the usual political biography, with a rich account of Hawke’s childhood and family relationships, especially with his difficult and demanding mother.

The book gave the Labor caucus a more nuanced understanding of the man, winning him votes in the leadership contest with Hayden. More interestingly, it gave Hawke greater insight into himself. D’Alpuget conducted long interviews with him, stirring up memories of his childhood which, she believed, helped him conquer his compulsive drinking. Wallace’s chapter on Hawke, with its intriguing insights from d’Alpuget about how she wrote the book, and its effects, is the highlight of Political Lives, and d’Alpuget and Hawke appear on the cover.

Wallace also discusses Stan Anson’s Hawke: An Emotional Life, published just after Hawke had survived a leadership challenge from Paul Keating. Anson made overt use of psychoanalysis and ideas of narcissism to understand Hawke, and drew heavily on d’Alpuget’s account of Hawke’s childhood. She was furious and threatened legal action under the Trade Practices Act, accusing Anson of passing off her work as his own.

Wallace concludes that the fuss stopped psychoanalytically informed biography dead in its tracks in Australia, at least when it came to contemporary politicians. This is probably true, but it doesn’t mean that psychoanalytic ideas disappeared from writing about Australian politics. Psychoanalysis was one of the intellectual traditions shaping my Robert Menzies’ Forgotten People (1992) and Graham Little drew on psychoanalytic ideas in the profiles of leading politicians he published in the press and in his last book, The Public Emotions (1999).

Wallace ends her book with a plea for more psychologically sophisticated biography writing. Most contemporary biographies of politicians are written by journalists, who often uncover hitherto unknown facts but are generally short on ideas and thin on interpretation. That’s why d’Alpuget’s biography of Hawke stands out, as does, more recently, Sean Kelly’s The Game, a devastating dismantling of Scott Morrison’s smoke and mirrors.

Kelly may not have drawn explicitly on psychoanalytic theories, but by paying close attention to his subject’s language and characteristic rhetorical strategies he showed what can be done within the bounds of the genre. His book probably didn’t damage Morrison’s political fortunes, however. That was something Morrison was already doing for himself. •

Political Lives: Australian Prime Ministers and Their Biographers
By Chris Wallace | UNSW Press | $39.99 | 336 pages

Threshold moments | Inside Story | 16 September 2022 | https://insidestory.org.au/threshold-moments/

Is it any surprise that we cling to old rituals and invent new ones?

There is something special about doorways. Around the world they are accessorised with symbols and customised with ceremonies. In ancient Rome, doors were adorned with wreaths and anointed with oil and wool during weddings, and hung with cypress branches for nine days after a death. Following the birth of a child, three men impersonating deities attacked the threshold with an axe, a pestle and a broom to ward off evil. Less ancient rites, recommended by wellness influencers, instruct modern homeowners to blow cinnamon through their front doors to ensure prosperity, or at least to create the illusion of baking.

The same strange theatre takes more idiosyncratic forms. In his famous Life, James Boswell recounts how Samuel Johnson would approach every door with “anxious care,” counting out a precise number of steps before taking a great lunging stride across the threshold, always leading with the same foot. Others noted how Johnson “whirled and twisted about to perform his gesticulations” before dramatically entering a house, abandoning his blind housemate Mrs Williams to grope around on the front steps.

Welcome to the world of ritual, where apparently odd and excessive actions are saturated with meaning and obligation. Literal thresholds feature in some cases, but more often the thresholds are figurative: portals between stages of life, social positions, and states of mind, health or sacredness. In his new book, Ritual: How Seemingly Senseless Acts Make Life Worth Living, the anthropologist Dimitris Xygalatas provides a compelling account of the cultural and psychological dimensions of these practices. In the process he reveals a new kind of interdisciplinary science, far from the armchair speculations of an earlier generation of scholars.

Having made a lifelong study of ritual, Xygalatas has come to view it as a fundamental part of human nature. Although ritualistic actions can be observed among birds and mammals, including the scrotum-grasping rites of male baboons, humans alone are “the ritual species.” Archaeological evidence suggests that rituals were enacted by early hominids 400 millennia ago. They serve important psychological and social functions, Xygalatas argues, deepening group solidarity, providing a sense of control over uncertainty, and contributing to wellbeing. We have a profound and innate need for ritual; and by some accounts obsessive-compulsive disorder, which troubled Dr Johnson, is a pathological expression of that urge.

Rituals, to Xygalatas, are “traditions that involve highly choreographed, formalised and precisely executed behaviours that mark threshold moments in people’s lives.” Their key elements are rigidity, repetition and redundancy: ritualistic acts must be performed in a strictly prescribed way, repeatedly, and longer or more frequently than seems practically required.

How rituals produce their effects is causally opaque: people who perform them rarely have an explicit understanding of the mechanisms that connect ritualised acts to desired outcomes or even of the goal of those acts. Instead, they answer the “why” of ritual by appealing to tradition. We have always done it this way.

Although the goals and modes of influence of rituals may be obscure, their functions are not. Xygalatas emphasises two main types, one to do with taming uncertainty, the other with fortifying group cohesion as a form of “social glue.” In the former case, rituals attempt to exert control over a capricious world and reduce anxiety by creating a sense of predictability and order. Ritual flourishes in societies that face greater threats and in zones of everyday activity where uncertainty reigns, such as sport and gambling. Studies find that ritualistic behaviour increases when fear is triggered, and that ritual performance is effective in diminishing it.

Xygalatas gives more attention to the social functions of ritual. He argues that collective rituals reinforce solidarity, mark group identities, and generate strong and enduring feelings of belonging. They demonstrate the loyalty and trustworthiness of group members, especially when rituals are onerous and costly, and promote generosity and mutual helping. Research shows that engagement in rituals has an array of benefits for individuals, from enhanced fertility to tighter and more supportive social networks to better mental and physical health.

These benefits may be especially strong for the extreme rituals that generate what Durkheim called “collective effervescence.” Xygalatas shows how participation in high-intensity rituals, which often involve extravagant pain and are more common in societies facing severe threats, helps to expand people’s sense of self and fuse their personal and group identities. The active embrace of shared suffering is uniquely powerful in strengthening social bonds and in healing the sick and downtrodden, who tend to seek out the harshest ordeals and derive the greatest benefit from them.

Xygalatas illustrates his ideas with many vivid descriptions of rituals, including examples from his own research. This research will be eye-opening for readers who have learned about ritual from ethnographies or journalistic sources. Xygalatas advocates for a new “ritual science” that combines the best aspects of anthropology, psychology and cognitive neuroscience and avoids their weaknesses. Anthropology contributes the virtues of thick description and direct observation in the field, but without the tendency towards theoretical obscurity. Psychology contributes the experimental method and quantification, but its tendency to wrench behaviour from its context is jettisoned. Neuroscience offers a systematic way to examine the embodiment of ritual, but without the reductionist conclusion that ritual is, in essence, merely a set of brain processes.

Aspirationally at least, this synthesis offers a rigorous way to investigate ritual in the field. Xygalatas describes at length his studies of fire-walking rituals in Spain and body-piercing kavadi rites in Mauritius, both conducted onsite with full community engagement. Participants in these studies wore monitors that tracked their heart rate and other physiological signs, and were found to experience extreme levels of arousal, stress and pain while performing the rituals.

More surprisingly, perhaps, people who wore the same devices while observing the fire-walkers showed very similar, synchronised patterns of arousal, with greater similarity the more socially close they were to the walker. Related studies have examined shifts in hormonal activity among participants in wedding ceremonies. Not only is ritual embodied in the physiology of individuals, but that embodiment is embedded in social relationships.


Before reading a book like this one, many of us might be inclined to see rituals as pre-modern oddities and superstitions, things we would need a passport to observe firsthand. Xygalatas makes us ponder the persistence of ritual in our secular times and helps readers recognise that it is all around us.

The ceremonial accession of a new British monarch, and the outrage at people who dare to puncture the membrane of solemnity that surrounds it, make this fact salient in a once-in-a-generation way. If, as Xygalatas maintains, “rituals fulfil primal human needs,” it should be no surprise that we cling to old ones and invent new ones, like the recent Japanese divorce ritual of writing marital grievances on pieces of paper and flushing them down the toilet.

But ritual is also expressed in so many more everyday forms. If rigidity, repetition and redundancy are its hallmarks, we can see it in the choreographed way our football seasons approach their crescendo, in the coming wave of graduation ceremonies, and in any number of celebratory events.

Traces of the same elements can be seen in how we write reference letters, format papers, make apologies, order meals, start conversations, run meetings, and so on and on and on. What justifies itself as efficient, rational and modern often retains a whiff of ritual incense. And, as Xygalatas might say, that’s not necessarily a bad thing. We can’t escape ritual, and if we could, we might find ourselves more alone, anxious and alienated than we are already. •

Ritual: How Seemingly Senseless Acts Make Life Worth Living
By Dimitris Xygalatas | Profile Books | $56.50 | 320 pages

Even amoebas | Inside Story | 4 September 2022 | https://insidestory.org.au/even-amoebas/

A prince and a psychologist detect more of the Good Samaritan in humans than we might imagine

Students in a famous social psychology experiment were instructed to prepare a short talk and then go to a nearby building to record it. Some were told they had ample time to get there, others that they were right on time, and still others that they were running late. As they walked along a narrow alleyway to the second building, they encountered a slumped and groaning man in obvious trouble: an actor planted by the experimenters, of course. The rushing students were much less likely to offer help to the faux victim than their less hurried peers.

So what, you might say. Of course time pressure makes us less inclined to help others. But here’s the kicker: the students were seminarians, and the talk was on the parable of the Good Samaritan. It seems that even people called to minister to those in need — people who had just been primed with the virtues of altruism — generally fail to assist someone in distress for the most trivial of reasons.

Studies such as these reinforce a widespread cynicism about human nature. We are a selfish species and there is a canyon-sized gap between our high principles and our unedifying behaviour. We might pretend to be compassionate and concerned for the greater good, but our actions show us to be morally myopic and selfish.

Stephanie D. Preston, a psychology professor at the University of Michigan, wants to undermine that cynicism. In her new book, The Altruistic Urge: Why We’re Driven to Help Others, she makes the case that we have an instinct to help the vulnerable that is rooted in an offspring-care system we share with other mammals. That basic motivation is underpinned by shared brain circuitry and expressed in actions to protect those in urgent need. The hero who plunges into a raging torrent or inferno to save lives replicates the rat who retrieves its pup — an ultrasonically squeaky pink jelly bean — when it strays from the nest.

Preston offers the rat’s pup retrieval as a prime example of the altruistic urge she proposes. Its neurobiology is well understood, it is homologous to some human acts of helping, and it has a prototypically vulnerable target, the helpless infant. She argues that at least some human altruism is closely akin to offspring retrieval. It is instinctive rather than socialised, it is action-oriented rather than simply involving empathic feeling, and it doesn’t depend heavily on higher cognition. Although we may flatter ourselves that our altruistic behaviour springs from moral deliberation, it is sometimes simply automatic.

Calling the altruistic urge an instinct might seem to imply it is a rigid tendency applying only to non-human animals. Preston rejects this anthropocentric assumption and clarifies how the contemporary view sees instincts as flexible and open to being overridden. “Even amoebas,” she writes, “exhibit context-sensitive altruism.”

The altruistic urge involves a neural approach tendency deep within the brain that constantly contends with an avoidant tendency not to help. Helping may only happen when the helper is properly prepared — hormonally or through prior experience, in the case of the parental rat — and when the probable effectiveness of action is judged to be high and the risks low. When it happens, helping of this kind is experienced by the helper as rewarding, enabled by dopamine and oxytocin, the so-called pleasure and love molecules of pop psychology.

Preston reviews other accounts of altruism to highlight the distinctiveness of her own “altruistic response model.” Psychologists often explain helping as driven by empathic concern for others or by the desire to relieve vicarious distress at another’s suffering. Preston argues that empathic emotion is not strongly tied to helping actions, that rapid efforts to help sometimes don’t seem to require it, and that many creatures that lack the capacity to imagine themselves into another’s pelt engage in altruistic helping.

Preston acknowledges that some forms of altruism can be explained ultimately by tendencies to help kin (inclusive fitness) and by tendencies to reciprocate helping among non-kin (reciprocal altruism). But these evolutionary accounts fail to specify the mechanisms that give rise to helping, and in any event altruistic acts — even among rodents — often help strangers and have no credible chance of being reciprocated. Preston offers her offspring-care model as one of several potential mechanisms supporting human and mammalian sociality, and proposes that it complements existing evolutionary explanations. Empathy, reciprocity, strategic virtue-signalling, social norms and reflective morality no doubt also have their place.

The Altruistic Urge takes the reader through the many complexities of altruism. Preston clarifies the characteristics of helpers and beneficiaries that make altruistic actions more or less likely. Ideal saviours are confident and possess relevant expertise rather than necessarily a more compassionate personality, and ideal victims are in grave danger, helpless and cute. She rightly criticises psychological and economic research on the topic for studying monetary donations in artificial games rather than more ecologically real and urgent forms of helping.

Preston’s repetitive insistence on the unique strengths of her model, which alone “carves nature at its joints,” can become wearisome. But as a thorough exploration of one basis for human and animal altruism, grounded in but not restricted to parental care, the book makes a strong and scientifically well-supported case.


Preston’s book makes a delightful contrast with another published 120 years ago and reissued this month by Penguin Classics. Mutual Aid: A Factor of Evolution, the work of the Russian revolutionary anarchist Peter Kropotkin, is a sweeping account of the development of cooperation and reciprocity across the animal kingdom and through human history. Growing out of Kropotkin’s deep interest in natural history, the book argues against the centrality of competition and selfishness in biological and societal evolution.

Referring to his own expeditions into Siberia and northern Manchuria as a younger man, Kropotkin writes that “in all these scenes of animal life which passed before my eyes, I saw Mutual Aid and Mutual Support carried on to an extent which made me suspect in it a feature of the greatest importance for the maintenance of life, the preservation of each species, and its further evolution.”

Friendly to Darwin but antagonistic to social Darwinism, Kropotkin argues, like Preston, that there is something fundamental and deeply rooted about caring for others. Solitude and within-species competition are exceptions in the animal world and individualism is a pathology in that corner of it we humans inhabit. “Sociability,” Kropotkin writes, “is as much a law of nature as mutual struggle.”

Mutual Aid begins with lovingly sketched descriptions of cooperative animal behaviour, showing a special fondness for birds but also admiration for insects and mammals. We encounter walruses and vultures, bees and parrots, “the great tribe of the dogs,” and the “egoist she-goose.” The musk-rat is credited with “a very high degree of intellectual development.” These sketches feed into Kropotkin’s key proposition: that sociality is essential not only in obviously relevant activities such as mating and care of offspring, but also in defence, migration, hunting, healing the sick, and play. Being gregarious is grounded not only in the need to reproduce; it is also a way of communal being running through all aspects of life.

Kropotkin was attempting to challenge an influential line of thought that takes the fitness of the individual organism as the driver of evolution and sees individuals engaged in a battle for selfish advantage. He emphasises the collective fitness of animal groups and how the fitness of the groups’ members is entirely dependent on their capacity to coordinate with one another.

This argument prefigures the idea of group selection in evolutionary theory. As the American biologist E.O. Wilson put it, "In a group, selfish individuals beat altruistic individuals. But groups of altruistic individuals beat groups of selfish individuals." Group selection has been controversial in a field where the unit of fitness is more often understood to be the individual organism or the gene, but it endures as a minority view.

The idea that animals are fundamentally cooperative — that “the war of each against all is not the law of nature” — has political implications for Kropotkin. If humans are cut from the same cloth as other animals, as he emphatically believed, then functional societies will arise spontaneously out of our sociable instincts, and people will not have to be tamed and trampled by government or other authorities to achieve social order.

Kropotkin reserves special scorn for the view that “the so-called ‘state of nature’ was nothing but a permanent fight between individuals, accidentally huddled together by the mere caprice of their bestial existence.” The remainder of Mutual Aid is an extended argument against that pessimistic account of human society associated with the philosopher Thomas Hobbes.

Turning from zoology to ethnology, Kropotkin describes the evidence for cooperation among indigenous people and the tribal groups who invaded Roman Europe (“savages” and “barbarians” respectively, in the language of the day). He finds in their social lives evidence of a primitive form of communism, principled morality, love of peace, and loyalty to ever-broadening social units beyond the family, from clans to villages to confederations to nations.

Extending his historical narrative into medieval Europe, Kropotkin extols the value of social organisation based on guilds, and points to the technological and creative development they made possible. The subsequent rise of centralised power was a backward step, he argues, obstructing the free development of communal self-rule. "[T]he all-absorbing authority of the State" leads to "the development of an unbridled, narrow-minded individualism" and the decay of communal institutions, he writes. Even so, the enduring human need for mutual aid organisations, and the rise of socialism in his time, pointed a way forward.

As biology, Kropotkin’s book is no doubt amateurish and anecdotal, and the contrast with Preston’s social neuroscience demonstrates how far biological science has progressed in the past century. But work such as Preston’s supports Kropotkin’s faith in the innate sociability of animals and humans and fills in some of the detail. Altruism driven by a mammalian offspring-care mechanism will account for some fraction of our social motives and capacities.

What is less clear is whether scientific evidence for inbuilt altruism should influence how we think about human nature and its societal implications. Where Kropotkin draws a grand world-historical drama, Preston resists speculation and stays close to the data, as modern scientists tend to do. In different ways, their work shows how there is more Good Samaritan in us than we might have imagined. •

The Altruistic Urge: Why We’re Driven to Help Others
By Stephanie D. Preston | Columbia University Press | $57.95 | 344 pages

Mutual Aid: A Factor of Evolution
By Peter Kropotkin | Penguin Classics | $22.99 | 336 pages

Stranger danger | Inside Story | 3 September 2021 | https://insidestory.org.au/stranger-danger/

An American take on the benefits of talking to strangers has a message for Australians

Who among us has not experienced that bilious mix of disappointment, irritation and dread that comes when someone shuffles down the aisle of a plane and claims the empty seat beside us? Our bubble of privacy is burst, and we hurriedly blow a new, smaller one. We gaze intently at our phone, bury our faces in the inflight magazine with a show of rapt attention, or feign sleep. In extreme cases, if the intruder offers a friendly greeting, we may pretend to be deaf, or French.

Alternatively, we could do the unthinkable and return our new seatmate’s volley with a greeting of our own. Joe Keohane’s new book argues that our lives, societies and polities would be much improved if we took this path. Talking and listening to strangers helps to humanise them, to broaden us, and to include us and them in a more expansive we. Connecting to unfamiliar people builds social trust, overcomes loneliness, enhances appreciation of diversity, depolarises attitudes and dissolves group boundaries.

This kind of connection is also a profoundly satisfying experience for individuals, Keohane finds, in an era when technology seems to be replacing the rough and tumble of personal contact with frictionless but impersonal efficiency. Once the initial awkwardness lifts, conversing with people who have different views, stories and backgrounds from our own is enriching, enlivening and enjoyable.

Why then do we avoid these conversations, freezing out our companion in seat 34B? Keohane lets us count the ways. There may be a primal fear of strangers. We may view them through the lens of caricatures and group stereotypes. We may see them as lacking mental complexity, in a banal form of dehumanisation that Keohane calls the “lesser minds” problem.

Our reasons for failing to engage may also reflect how we think they see us as much as how we see them. We may worry about being rejected if we reach out, or assume that the other person won’t find us interesting. People may simply not know how to start the conversation, or feel embarrassed and tongue-tied at the prospect. And there may be a social norm against talking to strangers in some settings.

Despite these obstacles, a raft of social psychology research shows that conversations with strangers tend to go better than people expect and boost our mood and sense of belonging. Indeed, people systematically overestimate the risks of interactions with strangers and underestimate the benefits.

Keohane is not offering a one-sidedly psychological account of the importance of social connection, however. He presents a history of our hypersocial species from the deep evolutionary time of our primate ancestors through to the contemporary reality of urban living. Far from being Hobbesian beasts — we’re more akin to bonobos, the so-called “hippie apes,” than to murderous chimpanzees — humans have evolved to cooperate.

In addition to having a neurochemistry that fosters attachments to those close to us, we have strong tendencies to link up with those more distant by reciprocating their behaviour and extending honorary kinship to them. As early humans settled into agrarian communities and eventually congregated in cities, new ways of welcoming and integrating strangers emerged. Ritualised hospitality became a virtue as trade routes drew people from foreign lands. Religions, for all their bad press, created communities of belief out of mutually suspicious tribes and made a virtue of kindness to strangers.

Mutual suspicion is part of modern life, and withdrawing from unwanted contact is increasingly easy, so Keohane goes in search of people offering solutions. He attends a meeting of a group called Conversations New York, where strangers come together for a supervised discussion of topics of the day. In London he takes part in an extended workshop on how to talk to strangers, complete with an assortment of inhibition-busting homework assignments. In Los Angeles he observes urban confessionals where people offer free listening to passers-by. In the American Midwest he witnesses the convention of an organisation that tries to bring together Democrats and Republicans, identified by blue and red lanyards, for respectful conversations with ground rules that forbid grandstanding and pointscoring.

So much for clubs, classes, eccentrics on street corners and tongue-biting discussions with ideological opponents. Could we produce xenophilia on a grander scale? Keohane implies that this aspiration might be challenging. Many cultural trends are pushing in the other direction, deepening our isolation, accelerating the speed of life, and shielding people from direct human contact. Levels of general trust in others have declined in many societies.

We are warned of the threats posed by strangers while the statistically greater threats tend to be closer to home. Increasingly we communicate impersonally, without the empathy-inducing presence of facial expressions, voice and touch. Our cities create sensory overload that fosters “civil inattention” to others and sends us to the self-checkout lane so we can avoid a perfunctory chat with a bored cashier.

Beyond these social forces, Keohane argues that well-functioning and egalitarian societies may be paradoxically less open to strangers than more unstable alternatives. Research finds greater reserve and lesser hospitality in more highly developed societies, perhaps reaching a peak in Scandinavia, where Keohane observes well-meaning efforts to defrost self-contained Finns. Civility fails to promote openness to strangers because there is little felt need to interact when life is safe and secure. Abstract generosity in the form of strong welfare provision and large refugee intakes can coexist with coolness and incuriosity towards people outside our circles of friends and family.

If wealth and security fail to generate the desire to approach and embrace strangers, what does? Better-designed public spaces have a role in encouraging mixing and sharing, Keohane says. Ultimately, though, we need initiatives that bring diverse people together and do it wisely, fully aware that connecting with strangers requires some orchestration.

Without guidance and new rules of engagement, conversations with people who differ from us tend to descend into excesses of argument and deficits of listening, reinforcing our prior belief that they are fools and villains. Mere contact with people who are different from us will not reliably increase mutual trust and understanding in the absence of shared goals and genuine equality.

But besides new forums for conversations with strangers, we need people of goodwill to emerge from their family burrows, their digital echo chambers and their social inhibitions to take part in them. The Power of Strangers concludes with an exhortation to us all to do just that.

Some readers may be sceptical of Keohane’s bottom-up approach to patching the social fabric and his emphasis on grassroots initiatives to connect strangers. In an age when group identities are so salient, it might seem odd to promote solutions that boil down to individuals having more and better one-on-one conversations.

There is certainly something deeply American about this combination of individualism and communitarianism, not to mention the idealistic impulse behind Keohane’s prescriptions. E pluribus unum could be his motto, albeit with an enlightened awareness that the unum is diverse. Even so, Australian readers should find The Power of Strangers an inspiring and illuminating read. It’s not as if loneliness, tribalism and political polarisation are strangers to us. •

The post Stranger danger appeared first on Inside Story.

The art of disagreeing https://insidestory.org.au/the-art-of-disagreeing/ Mon, 23 Aug 2021 06:20:05 +0000

“We should be civil with those we don’t know, and aim to know them well enough that we can be uncivil,” argues a new book

This is a self-help book for the world. Ian Leslie worries that “people with differing views seem to find it increasingly hard to argue productively.” In public debates and private disputes, “our inability to disagree well seems to act as a roadblock to progress.”

He doesn’t want to smooth over conflicts. “The only thing worse than having toxic arguments is not having arguments at all.” And he doesn’t just want people to reach agreement through compromise. He wants “productive disagreements,” the kind that “neither reinforce nor eradicate a difference, but make something new out of it.” Humans cannot aspire only to “put our differences aside”; they must put them to work.

Such work is tough. Productive disagreement, Leslie thinks, is “not a philosophy so much as a discipline, and a skill.” To help, Conflicted offers ten Rules of Productive Argument and a “toolbox” of pithy cues that add up to “a universal grammar of productive disagreement, available for any of us to apply to our lives.” Learn what is in the box and open it “when you next embark on a difficult conversation.”


Conflicted starts by making the case for conflict. It can bring us closer, make us smarter and inspire us. The strongest relationships and workplaces don’t skirt conflict: heat can bring light. Human intelligence is interactive, and disagreements enable individuals and groups to learn from each other and to “think harder about why we believe what we believe.” Artists rebel against prevailing tastes, entrepreneurs against established business models, political leaders against old injustices.

To find out how to disagree well, Leslie talked with people who make their living out of conflict. He found “remarkable similarities” between the challenges faced by professional hostage negotiators, police interviewers, therapists, mediators and diplomats, and those dealt with every day by the tyros of conflict in families, friendships and offices (where the politics is “passive aggression at scale”). The professionals are “masters of the conversation beneath the conversation,” experts at “retrieving something valuable from the most unpromising of circumstances.”

Leslie gives his ten Rules of Productive Argument a chapter each: first, connect; let go of the rope; give face; check your weirdness; get curious; make wrong strong; disrupt the script; share constraints; only get mad on purpose (as Tony Blair is supposed to have said). The final, golden, rule, the one to which all the others are subordinate, is “be real” — that is, “make an honest human connection.”

The first and the last rules draw from the same well: that is, the quality of the relationship where the conflict occurs. “Better relationships lead to better disagreements,” Leslie says. The experts are especially good at this, nudging the substance of a disagreement down the road while they mould a relationship. In a crisis, they may not have much time. Even if they are talking a stranger down from a ledge or administering first aid after an accident, they still don’t omit the first step. “It just means you need to work fast.”

Along the way, Leslie illuminates the argument and the rules with cases. Southwest Airlines in the United States built its competitive advantage by managing a specific kind of internal conflict that bedevilled airlines. Deep-seated differences over status and perceived competence among pilots, cabin crew, baggage handlers and other ground staff produced endemic disagreement that slowed aircraft turnaround times. Fixing it, and keeping it fixed, increased profits because planes spent less time on the ground and more in the air moving paying customers to their destinations.

The Wright Brothers, apparently, were “bare-knuckle debaters,” trained by their father to argue after dinner “as vigorously as possible without being disrespectful.” Wilbur thought Orville “a good scrapper.” Himself? “After I get hold of a truth I hate to lose it again, and I like to sift all the truth out before I give up on an error.”

Nelson Mandela, “a genius of facework,” famously “gave face” to so many who oppressed him and other black South Africans. Leslie describes him answering the door and personally serving tea, with milk and sugar, to Constand Viljoen, “apartheid’s last best hope,” in September 1993.

In February 1993 the FBI negotiators at Mount Carmel, near Waco, Texas, brought a worldview that seemed as weird to the Branch Davidians as the Davidians’ appeared to the FBI. “They [the FBI] did everything by the book,” says Leslie. “But the book turned out to be missing a crucial chapter.” Military hardware was less terrifying to the Davidians than God’s judgement. The transcripts of “negotiation” provide excruciating evidence of mutual weirdness, unchecked, though they can also be seen as an extreme version of most human interactions. Every individual is a “micro-culture,” says Leslie; we are all “a little odd.”

“Letting go of the rope” means resisting the “righting reflex,” the urge to tell someone who is sounding a bit rattled to “calm down,” or a person who is worried by apparent trivia that “there is nothing to worry about.” In the Covid era, which hadn’t begun when Leslie’s text was finalised, it might mean resisting the impulse to tell vaccine-hesitants they are being delusional if they ask any questions about vaccines that have not received final approvals from food and drug administrations.


Leslie is a persuasive advocate for his position, and for his rules and tools, though he worries it is all too reasonable, the whole book “so… polite.” He is “not one of life’s natural warriors; even mild confrontation can make me itch with discomfort.” He knows marginalised people are sick of being told they need to be reasonable, that they need to listen to their oppressors and not just speak loudly and insistently themselves. Mandela had the muscle of international support to add to Viljoen’s tea along with the milk and sugar.

Some people really are impossible to engage with, Leslie acknowledges, but he thinks “we have a persistent, habitual tendency to overestimate the size of this group. Especially when we haven’t had much sleep.” He likes politeness, civility, ground rules for engagement, because he thinks they generally help people to disagree productively. Civility is “the minimum standard of behaviour necessary to encourage your opponent to talk back.”

It is the talking back he is after, not the civility itself. That returns him to the quality of the relationship that underpins the disagreement. “Ultimately all rules are a crutch, or a guide rail, that we can dispense with if the relationship is strong enough. We should be civil with those we don’t know, and aim to know them well enough that we can be uncivil.” •

Friends with benefits https://insidestory.org.au/friends-with-benefits/ Mon, 02 Aug 2021 05:46:39 +0000

When and why did friendship slide down our hierarchy of relationships?

What would Aristotle say about Australia’s Covid arrangements? He’d most likely applaud the respect being given to scientific opinion and lambaste the vaccine rollout as gross mismanagement of the health of the polis. I could go on — his oeuvre is voluminous — but one thing I am certain he would deplore is the impoverished status of friendship in our lockdown laws.

For Aristotle, friendship was not just a pit stop on the way to the real goal of romance or a family, nor was it a few cheeky wines after work in the Agora with his mates Hermias and Pythias. Friendship, he wrote, “is most necessary for our life.” No one, he said, “would choose to live without friends even if he had all the other goods” (such as wealth or health).

It wasn’t just the deep and authentic connection that attracted him, nor the fact that he believed happiness to be impossible without friends. Aristotle thought of friendship as the foundation of all political life. Good legislators pay more attention to friendship than justice, he wrote. Friendship, with its focus on reciprocity, trust, mutual goodwill and a willingness to make sacrifices, “hold[s] cities together.” To this extent I am certain he would rail against any government that holds friendship in contempt.

Here in Sydney, people who live alone, are single parents or have a disability have only just been permitted to nominate a friend to enter their “social bubble.” They had to endure five weeks of isolation before being granted this very basic right. Victorians had to wait two months during their long lockdown last year. Until then, they were permitted to visit an “intimate partner,” possibly a stranger they’d met on Tinder, but were not allowed to visit a lifelong friend.

Before social bubbles, visits to intimate partners fell under the heading of “caring and compassionate visits” but visiting friends didn’t. Intimacy, in other words, was defined according to sex rather than, say, trust, laughter, shared intellectual or cultural interests, or reciprocal care — the values we associate with friendship. During a time of mass low-level anxiety, the laws implied that single people would find more solace in a random person with whom they might want to have sex than their closest friend.

It’s a good thing that the laws have finally changed in New South Wales, but what can we learn from the debate, and how might it inform the way we live post-lockdown?

Above all, I’d argue that the debate about lockdown visits was poorly framed. The commentary overwhelmingly treated the problem as if it were all about the plight of sad singles — pale-faced loners who peer glumly out of windows while smug couples jog by, laughing, with their labradors. Couples and families were the norm against which all other relationships were measured and declared lacking. Singles were seen as grievously incomplete, pathologised and pitied for what are in fact perfectly legitimate and often happier life choices. The logic was circular: they felt alone during Covid because they lived alone, not because they were denied the right to visit a close friend.

But what if we treat this as a debate less about the problems of singledom and more about the diminished status of friendship in our laws? After all, this is clearly a moment when the law has intervened in our emotional lives and sought to privilege one status (romantic) over another (friendship). Justified as a necessary measure to prevent movement between households, the laws failed to acknowledge that movement was already happening, it’s just that it was compelled to involve lovers rather than friends. In a situation that can best be compared to the dystopian film The Lobster, singles were told to couple up or face months of psychic torture.

If people living alone in New South Wales have felt isolated over the past five weeks then it was because they were denied the right granted to every other person in Australia to have someone with whom they could cheer Olympians, order in dinner and generally consume too many negronis. Had friendship been enshrined in law as something to be protected, much like marriage or family, then social bubbles would have been automatically granted and fewer people would have suffered.

There is nothing natural or inevitable about the low status we accord friendship. In other places and at other times friendship has occupied an exalted social and legal position, equal or even superior to the involuntary bonds of family or the caprices of romance. Aristotle’s thinking offers us a window into the reverence accorded to friendship in ancient Greece, which eighteenth-century philosophers and voyagers drew on when encountering cultures that sanctified friendship.

Philosopher Alberto Fortis was one of them. In his bestselling anthropological tract Viaggio in Dalmazia (1774), he observed that the Morlacchi people of Venetian Dalmatia (present-day Croatia) had a “nobler capacity” for friendship than modern, civilised Europeans; in fact, it was almost a “point of religion” among them. Morlacchi friends would “tie the sacred bond at the foot of the altar” in a ritual, much like a wedding, that “contains a particular benediction, for the solemn union of two male or two female friends, in the presence of the congregation.” He claimed to have been present “at the union of two young women, who were made Posestre (sacred friends) in the church” and saw how satisfaction “sparkled in their eyes, when the ceremony was performed.”

Joseph Banks and the French explorer Louis Antoine de Bougainville were among the European voyagers who were delighted to stumble on friendship pacts, or taio, in Tahiti. Formalised between people of the same sex, status and age, these pacts blended intimacy with instrumentalism — taios would offer each other emotional support as well as food, labour, land, and sexual partners. It was also through friendship pacts that cross-cultural exchange occurred. Banks’s journal records how he solemnised his friendship with Cook by being wrapped in cloth and presented with a green bough, after which both men “lay our hand on our breasts” and said taio, “which I imagine signifies friend.”

Eighteenth-century voyagers quickly understood the taio bond because they came from a culture that valued friendship and elevated it into law. Natural law theorists from Cicero to Francisco de Vitoria and Hugo Grotius justified imperial commerce on the basis that God willed (in Grotius’s words) that “human friendships should be fostered by mutual needs and resources.” Jobs were filled, money distributed, intimacies forged, and identities constructed all through the framework of friendship.

Matthew Flinders could write to his friend George Bass that “there was a time when I was so completely wrapped up in you, that no conversation but yours could give me pleasure” without anyone raising an eyebrow. Jane Addams and Mary Rozet Smith, two social reformers in America, could be more devoted to each other than to their husbands without a hint of scandal. When Addams travelled without Smith, she would often lug an enormous portrait of her friend with her, and when they journeyed together, they would demand a double bed.

When and why did friendship slide down our hierarchy of relationships? To my mind, the process began in the early twentieth century with the rise of the nuclear family, and with it, the notion that all our emotional needs could be fulfilled within the four walls of the home. Any society that subscribes to this myth necessarily devalues the succour of friends. Care becomes privatised and limited to blood relations, and as families become smaller so too do their moral visions. Promoting that shift was the rise of homophobic sentiment in late nineteenth-century works of sexology and popular Freudianism, which culminated in the anti-gay witch-hunts of the 1950s.

As someone who experiences unbounded elation through long chats with friends and despondency at their absence, I would love to see friendship protected by law. In the short term, this would mean that social bubbles would be built into the architecture of lockdown laws. In the long term, it would mean changing our laws to enshrine the rights of friends, including (as legal scholar Ethan Leib has argued) broadening paid medical leave to allow friends to take care of one another during sickness; allowing friends to sue on another friend’s behalf; and giving friends a legal right to make medical decisions on our behalf.

And why should romantic couples have the monopoly on lavish weddings? Friends’ registers could be established and maybe we could bring back those friendship necklaces we all loved so much in year nine. And who wouldn’t be happier attending a lavish affair solemnising the commitment of two adoring friends than another boringly ostentatious wedding? Throw in a trip to Tahiti or Croatia and I’d be there with bells on. •

New tricks https://insidestory.org.au/new-tricks/ Fri, 30 Jul 2021 06:26:05 +0000

We might not be able to change who we are, but we can certainly change what we do

According to Erik Erikson, wise but unfashionable psychoanalyst, the key task of middle age is to avoid stagnation. The challenge is to develop what he dubbed generativity, a concern for making the world a better place by contributing to the next generation. How better to do that than become a teacher? The next generation is corralled in your classroom, bright-eyed and half-formed, just waiting to be educated.

Of course, teaching is not as simple as transferring a half century of life lessons into grateful young minds. Even so, the profession exerts a pull on many midlife adults seeking a second career. Studies find that they are often motivated by desires to do more meaningful work, to have more flexible work schedules and to share what they have learned with young people. In the words of Dan, a former winemaker profiled on the website of Teach for Australia, an organisation that helps people make the transition, “It sounds trite, but I genuinely wanted to give back.”

Lucy Kellaway’s new book, Re-educated, is an affecting account of her own fateful decision to leave one outwardly successful life behind and begin another in front of a class at a disadvantaged London school. Working in “an office full of hacks” at the London Financial Times for two decades as the writer of a celebrated column, mother of four children and dutiful daughter of an ageing father (an émigré from 1940s Melbourne’s “cultural desert”), she was inwardly stagnant. Two years later, her father dead, she has quit her job, separated from her husband, bought her dream home and trained as a teacher. Along the way, and before even stepping into a classroom, she co-founded Now Teach, a charity that facilitates the career shift she is navigating.

Kellaway’s description of her pre-revolutionary life is unsentimental. She was more bored than burnt out by her work. “What sustained me all those years was a personality flaw that is common in journalists,” she writes. “I was both insecure and a show-off.” As the status anxiety and fear of failure that sustained her work diminished, she lost interest. At least in the telling, the dissolution of her marriage was equally deflating, more a matter of frustration with her husband’s tendency to clutter up the house than of bitter fighting.

Kellaway recognises herself as someone with a constant need to be busy, and it is her energetic pragmatism that drives her “to do something more useful” as a teacher and to launch Now Teach. Its success in attracting middle-aged recruits is a pleasant surprise to her, although not without wrinkles. One applicant signs up only to withdraw his application the next day after being reminded by his wife that he doesn’t like children.

In her illuminating treatment of the everyday experience of teaching, Kellaway observes that she works harder than she ever did as a journalist, although perhaps not as hard as in her first job as a banker, which she describes as “a lethal combination of stress and boredom.” The challenges of enforcing student discipline and buckling under the institutional discipline of endless forms, guidelines and new educational technologies are made vivid in a chapter that sketches a typical day.

Kellaway is astute on how teaching is both harder and easier for mature-age teachers than for their younger peers, and how the former’s success hinges on feeling comfortable with public failure, being bad at something all over again, and experiencing a loss of status and ambition. Over time, interestingly, she becomes more favourably disposed to aspects of teaching practice that can seem cynical or authoritarian to outsiders, such as teaching to the test and firmly enforcing arbitrary rules of behaviour and dress. Carrots must, at times, be supplemented by (non-literal) sticks.

Re-educated is mainly a story about the concrete realities of becoming a teacher, but much of its charm and heft come from its broader personal narrative. It demonstrates the value of female friendship, the disappointments of midlife dating, and Kellaway’s struggle with abandoning hair dye and going grey, accompanied by some characteristically sharp-witted remarks about “follicle sexism.” Men tend to come off poorly, but the embrace of family is celebrated.

Kellaway’s reflections on educational privilege are telling. “[T]he just-be-happy-darling school of education was a luxury for the middle classes,” who have a safety net in case of underachievement that is unavailable to other students. Her own elevation from indifferent high school student to gilded Oxford undergraduate to overpaid City of London banker is a case in point.


This memoir of reinvention makes an interesting contrast with American journalist David Brooks’s The Second Mountain, which I reviewed for Inside Story in 2019. Like Kellaway, Brooks underwent a midlife career switch and exited a longstanding marriage. His book chronicles a similar experience of stuckness followed by a pivotal decision to follow a new path through the remainder of life. That’s where the similarities end. Where Brooks likens the new path to climbing a second mountain, Kellaway presents it as not so much a grand quest as a process of stumbling through incompetence towards being a good-enough teacher. Her path is not so much up as down or sideways.

Brooks’s second mountain demands a profound moral transformation, in his case accompanied by a spiritual awakening, whereas Kellaway sees herself not so much uplifted as simply more useful. For Brooks, the first mountain represents individualistic achievement, the greying mountaineer leaving it for a second peak of belonging and community. Kellaway’s new life has its social benefits — she enjoys pub nights with her fellow teachers — but part of what motivated it was a desire to escape from the demands of belonging: the decades spent looking after children, husband and ageing parents.

Kellaway scripts her change-of-life direction half as comedy, complete with humiliations and faux pas, and half as factual reportage on the nitty-gritty of daily life in school. Throughout, she avoids generalising her experience to others, let alone pronouncing on the human condition, and she makes few prescriptions aside from recommending we become second-career teachers. She concludes that her life’s sharp turn has not really changed her, but merely changed the settings in which she operates and brought out aspects of herself that had been neglected.

Brooks’s script is more of a romance and a rallying cry. He documents a redemptive personal transformation, theorises it as a moral calling, and advocates it to others. The reader can decide how much these different ways of storying life changes should be attributed to transatlantic cultural differences, and how much to gender or personality.

Re-educated is an enjoyable and edifying read. Without having any of the usual trappings of inspirational writing, it somehow convinces readers of a certain age that life change is an enlivening option, while reminding them that it is neither easy nor reliably fun. Kellaway doesn’t pretend that we can change who we are merely by acts of will, or that all of us can sustain a new calling without disillusionment.

What is refreshing about this story is that it focuses on changing not who we are but what we do. The key is to be “released from the force of habit,” to take the leap into new environments, and to declutter one’s priorities. Her message is heartening: “it is possible, desirable and perfectly natural for people to start again in their 50s.” •

Re-educated: How I Changed My Job, My Home, My Husband and My Hair
By Lucy Kellaway | Ebury Press | $39.50 | 256 pages

On seduction, brainwashing and being converted https://insidestory.org.au/on-seduction-brainwashing-and-being-converted/ Tue, 15 Jun 2021 00:51:49 +0000

A characteristically elliptical new book from the famed British psychotherapist Adam Phillips

If you were to judge this slender book by its cover, your first impression might be that it is a tequila sunrise. Judging from its title and blurb, though, you might decide it is an examination of the urgings and obstacles to personal change. In fact, it is neither. It is an extended meditation on the idea of conversion, whether personal, religious or political.

Adam Phillips is a prolific British essayist on matters literary, cultural and psychoanalytic. As a practising psychotherapist and general editor of the Penguin Modern Classics translations of Sigmund Freud, he has serious credibility as a psychoanalyst, but his style of psychoanalytic writing refreshingly lacks the usual heaviness and homage to the master. In what is a fundamentally interpretive approach to understanding human behaviour, he treats the psychoanalytic tradition as a source of interesting ideas and inversions of common sense. His books have put that approach to work in elegant discussions of everything from tickling to monogamy (“A couple is a conspiracy in search of a crime”).

His new book began as four lectures on conversion, in the sense of a dramatic process of revolutionary change, a rupture or break from the past. It begins with an exploration of our tendency to view conversion as an outcome of seduction or brainwashing. Enlightened folk favour a more gradual and educative approach to change, Phillips argues, rather than what they see as escapes from complexity into submerging group identities. For all the current talk of becoming one’s best self, most of us just want to become better by degrees.

For Phillips, the psychoanalytic notion of conversion, in which the emotion attached to a troubling idea is transformed into a physical symptom while the idea itself remains unchanged, offers an alternative way to think about it. We can then see personal conversions as a kind of substitution rather than a genuine transformation. Apparent conversions may be ways of staying the same.

Phillips’s second chapter continues the theme of our ambivalence towards personal change — wanting but also dreading and resisting it, recalling Freud’s quote that neurotics will defend their neurosis “like a lioness her young” — and our inconsistent beliefs about it. Although we may be sceptical of dramatic positive change, Phillips observes, we increasingly embrace the idea of trauma as a cause of negative transformation. People also wrestle with the competing forces of stability and disruption: biology and fate on the one hand; choice and self-fashioning on the other. They are both changed and un-changed, with a sense of having a true self that is out of reach of the false one they inhabit.

These conflicts and self-doubts sometimes motivate efforts at personal conversion, with results that can be less transformative than advertised: “one of the ways the new makes a name for itself is through the rhetoric of hyperbole; great claims are made, but over time the past begins to show through.” It is difficult to let go of a former identity. Therapy might offer a secular form of conversion, but the change it offers is less than dramatic. Sometimes “people aren’t cured, they just lose interest in their symptoms.” Close studies of famous converts Paul and Augustine round out these explorations.

Conversion in the political sphere comes in for a less satisfying analysis through ruminations on the ideas of French theorist Étienne Balibar and American political scientist Wendy Brown. To Balibar, political transformations involve a conversion of violence into civility. Phillips suggests that believing in such a transformation relies on an optimistic but sometimes naive view of the mutability of human nature. Utopian political ideals depend, he argues, either on the belief that human nature can be reinvented or on the recovery of an idea of human nature that is supposed to have been lost. For people seeking political transformation, what ultimately has to be converted is doubt about the possibility of change.

Phillips’s final essay explores conversion from the standpoint of the outside observer. How much are converts really seeing new truths and how much are they just seeing things as a new group defines them to be? Conversion, he suggests, is often a matter of identifying with someone else, although converts tend to believe they are becoming more truly themselves. Socrates is put forward as an authentic self-fashioner who made himself unique without attempting to become someone else; and the psychoanalytic theory of perversions is examined for clues to what drives some sorts of pathological conversion.


There is something almost hypnotic about Phillips’s writing. He assembles simple words into complex thoughts, casually cites French intellectuals, drops aperçus that demand a second and third reading, and then heads off on new tangents. His writing has a certain cadence as well. The classic Phillipsian paragraph starts with a few unshowy statements, follows them with an arresting remark in parenthesis, builds to a conclusion and ends with an aphorism that may not qualify as a grammatical sentence (“Targets that must be missed in order to be met”). The rise and fall of these paragraphs is transporting, and even if they rarely deliver an obvious message or hammer a lesson home, the reader is left with a sense of having been taken on an intriguing journey.

It would be a mistake to read Phillips seeking guidance on how to change oneself or a systematic theory of the nature of conversion. Any reader hoping for an answer to why, at this historical moment, some things are believed to be convertible (sex and gender) and others not (race and sexual orientation) will be disappointed. Phillips has no advice to give or grand theoretical statements to make, just reflections. Those reflections are food for thought, and readers wishing for a second course can look forward to the publication of its sequel, On Getting Better. •

The post On seduction, brainwashing and being converted appeared first on Inside Story.

Love and fear https://insidestory.org.au/love-and-fear/ Mon, 10 May 2021

With the pandemic under control, Australian researchers have resumed their quest for a psychedelic approach to mental health

March 2020. In a darkened room in a Melbourne hospital, a slight, dark-haired woman sits at the bedside of a lone patient. Outside, Covid-19 webs its way silently through the city; inside, the patient rests back on the day bed, eyes covered with a soft eye mask, ears enclosed in noise-cancelling headphones through which a specially curated playlist will rise and fall over the next six hours or so. The seated woman — watching, listening, close enough to touch — is a clinical psychologist named Margaret Ross. After eighteen months of intense negotiations, preparations, crossing of fingers and dotting of i’s, she and a colleague, psychiatrist Justin Dwyer, have just given their charge a small white capsule containing a substantial dose of the mind-altering compound psilocybin, best known as the active ingredient in “magic mushrooms.”

It is a small but historic moment. And a strange one. Psilocybin and the fungi from which the compound is derived have been used in shamanic and other Indigenous rituals for centuries. The mushrooms are said to be depicted in artworks thousands of years old. They have been consumed by hippies in the Summer of Love, decried by former US president Richard Nixon and eventually outlawed. They have even informed their own theory of human evolution (US ethnobotanist Terence McKenna’s “Stoned Ape Theory”). But today they are being harnessed by Western medicine to try to ease the terror of dying.

The figure on the day bed is the first of about thirty-five desperately ill patients who, with the help of Ross and her team, will each take a legal psilocybin trip (sometimes two) as part of the nation’s first approved randomised controlled trial of psychedelic psychotherapy. In fact, only two patients will go through the treatment early in 2020. And then suddenly it will all stop. Four days later, Ross will be back on the ward, in scrubs, counselling traumatised staff about how to care for patients, families and themselves as Melbourne enters its first lockdown. But what she sees in those extraordinary days before the trial is suspended will change the way she thinks about her work, her patients and the possibilities for treating some of the country’s saddest and sickest people.


So, definitions. Psychedelic therapy (psychedelic: from the Greek roots of “mind” and “manifesting”) first emerged as a subject of clinical research in the US in the late 1940s, with the release of the mind-altering, era-defining hallucinogen lysergic acid diethylamide, or LSD. During the 1950s and ’60s, tens of thousands of people took LSD — marketed initially as a cure for mental disorders from schizophrenia to “sexual perversion” — as part of the first wave of psychedelic research, before the drug, and psychedelics generally, fell victim to the social and political upheaval of the times. These days researchers prefer to focus on psilocybin — which is less potent than LSD and less stigmatised — as well as MDMA (“ecstasy”), which while not technically a psychedelic does some similar things to the mind, seemingly by lowering defences and promoting new ways of thinking. Researchers also prefer to talk about “psychedelic [or psychedelic-assisted] psychotherapy,” to emphasise that the drug treatment is securely corralled in a series of counselling sessions.

For those who give the topic any thought (I get mainly blank looks when I raise the subject with medical acquaintances), attitudes towards psychedelic psychotherapy range from dismissive to evangelical. In one iteration it is an undercooked, overhyped, potentially dangerous fringe treatment that could divert desperately needed funding from other areas of the already stretched mental health budget. In another, it is a paradigm-shifting therapeutic frontier that could reconfigure Australia’s treatment of hitherto intractable mental conditions — such as deep, persistent depression and post-traumatic stress — within a decade and also transform our approaches to other illnesses, including addiction, anorexia and obsessive-compulsive disorder.

There is another option that gets less airplay, but that is worrying psychedelic researchers here and worldwide. More on that later.

For now, and after a slow start — “We are very conservative in the research area; we’re a very conservative country,” says one researcher — Australia is turning towards psychedelic drugs to help treat some of our most entrenched and distressing mental illnesses. In November 2020, researchers at Melbourne’s Monash University announced the university would sponsor two new trials: one a large, world-first study of psilocybin to treat crippling anxiety; the other exploring the role of MDMA in treating severe post-traumatic stress disorder, or PTSD.

Several smaller trials are also ready to go, including a partnership between Melbourne’s Swinburne University and St Vincent’s Hospital that will study the impact of psilocybin-assisted psychotherapy on treatment-resistant depression; another on the role of MDMA in treating PTSD from Perth’s Edith Cowan University; and one from St Vincent’s Hospital Sydney on methamphetamine addiction. Other psychedelics of interest include the South American brew ayahuasca and its potent active ingredient, DMT.

Those involved in the trials hope they will lay the groundwork for psychedelic psychotherapy to one day become an approved, Medicare-funded treatment for thousands, maybe tens of thousands, of Australians whose mental conditions are resistant to conventional treatments, and whose daily lives span a continuum from distressing to unendurable.

But they warn that getting there will be difficult. Nor is it guaranteed. Along with the usual logistical and regulatory juggling over the approval of any new drug are personal, professional and political faultlines that must be navigated if Australia is to achieve a workable model of psychedelic psychotherapy (even now the temptation is to overlay the words in retro colourways). And before and beyond all that is the sheer mind-bending otherness of the experience at the heart of the proposed new therapy — an experience only partially amenable to language, which scientists in respected journals describe as “ineffable,” “mystical” and “transcendent,” and one that is embedded in a process that can involve a profound reckoning with the self and its place in the world.

“The psychedelic experience can open people up to long suppressed feelings both negative and positive,” says a 2020 paper from Rosalind Watts of London’s Imperial College. “It can involve peak experiences, mystical states, and experiences of intense, all-encompassing love and bliss.”

And from this flow great possibilities and great challenges.


In a noisy outdoor cafe opposite the hospital, Margaret Ross is searching for words. “There’s times when I have to really sit and percolate on an idea and kind of really, really feel it in my bones before I give a thoughtful answer.” She listens attentively, laughs frequently, talks in flurries. But she knows that talk only gets you so far.

Ross has spent much of her life among the dying. At seven, she was deeply affected by the death of a beloved grandmother. As a clinical psychologist working with palliative care patients, many from the St Vincent’s cancer wards, she has spent years thinking about how best to help people who can no longer be helped — at least not in the way that most of us entering a hospital want to be helped. She has researched the psychological impact of cancer and other terminal diagnoses. She has seen some remarkable deaths.

And she has seen a lot of fear. The publicity material for the Melbourne trial describes the “depression” and “anxiety” of some terminal care patients — words that seem barely to touch the sides of what Ross encounters in her work. First, she says, is the fear of death itself, the “annihilation of self.” Cascading from this central void is grief at leaving behind a partner, a sister, a child (“especially children”) and fear for their welfare in a world where you will no longer be there to love or protect them. Then there is the physical and mental dismantling that is the dying process (“Will there be pain, indignity, will I lose control?”). And fears about the relentless, incremental losses that define a terminal diagnosis: identity, independence, relevance, control. Even as a therapist, says Ross, it is hard sometimes not to feel helpless. “I see so much terror and distress.”

“We set up a kind of little camp”: Clinical psychologist Margaret Ross and her colleague, psychiatrist Justin Dwyer. Kristoffer Paulsen/Nine

When Ross was sixteen, her mother was diagnosed with a serious respiratory disease. She fought it for a long time and died when Ross, the youngest of five sisters, was twenty-seven. “She never really made peace with it. She loved life too much. She didn’t want to leave her girls… And she was absolutely terrified of death. So I’ve had an interest in death, the way people square with it, for a very long time.”

As a young psychologist, says Ross, she started thinking about the therapeutic potential of altered states such as meditation and yoga. She trained in hypnotherapy. Anything that might help people find a way through. When a new wave of international research into the therapeutic potential of psychedelics started emerging in the early 2000s, she paid attention. More so when, in 2016, two key studies came out of the US on the impact of psilocybin-assisted psychotherapy on the mood and attitudes of patients with life-threatening cancer. The results, she says, were “staggering.” The trials, one from New York University and the other from Johns Hopkins University in Baltimore, found rapid, sizeable reductions in depression and anxiety that for many would go on to last six months or more.

It was what Ross had been looking for. A way in.


Like most of those I have met in researching this article, Paul Liknaitzky confounds lingering stereotypes of the “turn on, tune in, drop out” variety. Neat, articulate and with a sentence structure girdered in careful clauses and subclauses, he could be a rising public servant. He is also determined. As a teenager he navigated solo through a series of distressing psychological episodes — “spontaneous and terrifying altered states of consciousness,” he calls them — eventually breaking the cycle by standing in front of a mirror, staring into his own eyes (“If I die, I die”). Which may help explain his centrality in the emerging framework of an Australian psychedelic therapy.

Margaret Ross describes him as the “mycelial network” of Australia’s psychedelic research world, referring to the underground web of microscopic fungal fibres that link and sustain diverse communities of plants. A research fellow at Monash University, he is co-ordinator of the St Vincent’s trial, and chief principal investigator on the upcoming Monash trials as well as several others now taking shape in the fertile loam of Australia’s nascent psychedelic research landscape.

The possibilities, he says, are astonishing. “There is uncharted territory wherever I look.”

Admittedly, any discussion of the potential benefits of psychedelic drugs in treating mental illness takes as its baseline the shortcomings of existing pharmacological models. The 2020 Productivity Commission Inquiry Report into Mental Health estimates the economic costs of mental illness and suicide at up to $70 billion per year in Australia, plus another $150 billion or so in reduced health and life expectancy. The most common drug treatment for depression, selective serotonin reuptake inhibitors, has been shown to help only about half of patients, with high relapse rates after discontinuing and no major breakthroughs in more than thirty years.

“That indicates to me that we have some fundamental assumptions about mental illness that are wrong,” says Liknaitzky.

Even measured against that relatively low base, the early indications for the potential of psychedelics — across a wide and growing range of mental disorders — have been galvanising.

In the past two decades, studies have shown significant, sometimes startling, improvements in conditions including death anxiety, treatment-resistant depression and PTSD. In recent years, high-profile psychedelic research centres have opened in major universities, including Imperial College and Johns Hopkins, as well as in Basel and Zurich. The US Food and Drug Administration has designated MDMA therapy for PTSD and psilocybin therapy for depression as “breakthrough therapies.” One recent study using psilocybin to help treat long-term depression reported an effect around four times that of traditional antidepressants.

One of the primary claims made by researchers in the field is that unlike drugs such as antidepressants, psychedelics are not treating symptoms but root causes often related to childhood or other trauma. While it may be decades (or longer) before we truly understand the brain mechanisms by which psychedelics exert their disorienting effects, recent research suggests that psilocybin quietens a brain region called the default mode network, which includes the circuitry involved in how we remember our lives, and the stories we tell ourselves about who we are. In classic psychoanalytic terms, it seems to dial down the ego, lowering our psychological defences, helping us to think in more open and flexible ways, and increasing our sense of connection to ourselves and others.

“The opportunity there is for patients to consolidate aspects of themselves or their lives that have been repressed,” says Liknaitzky, “to gain new and helpful perspectives on old problems, and to feel, emotionally, a much stronger alignment with their values and what’s important in life — and to be motivated to act in accordance with [those] values.”

Of course, these sorts of insights, the ability to access and recast buried thoughts, memories and feelings and to change our behaviour accordingly, are the goal of much psychotherapy, as well as a benefit of some meditative practices. But psychedelics seem to turbocharge that process. And unlike some of the hard-won insights that arise through talk therapy, Liknaitzky suspects that the fundamental alteration psychedelics bring about is not a thought or idea, but “an embodied encounter or a revelation.” And that this suggests a greater possibility of long-term change, at least for some.

I speak with “Clare,” a social worker who tried psilocybin last year in a bid to overcome a debilitating, corrosive lack of confidence — she calls it “imposter syndrome” — that was affecting her personal life and interfering with her work. She took the drug illegally, though with the support of a trained therapist with an interest in psychedelic medicine and, she says, a lot of preparation. She describes a series of discrete scenes, each like a small film: an emotional encounter with her estranged mother; a vision of her seven-year-old self; a meeting with a large benign rat; and finally, herself, in a room, surrounded by family and friends, able in this moment to see herself through their eyes. “And it was just — it was full of love. And I felt like my heart was gonna burst with love and pride. For myself.” She says the experience has changed her understanding of childhood trauma and taught her what “self-care” really means — changes that have flowed into all aspects of her life. “It was one [of]…if not the most significant experience in my life.”

“People report a verisimilitude,” says Liknaitzky, “which is that your representation of reality seems to be more reliable than your sober one. And that’s a striking experience. It’s as though you’ve woken up from the dream of your life.”

Admittedly, he says, for the most part the insights that come out of the psychedelic experience sound like platitudes. “I mean you can pop them on a Hallmark card. You know, ‘love is all there is.’ Actually, what matters is that if you can genuinely feel that love is all that matters, then that is absolutely profound. And you can’t talk yourself into feeling that. You can’t talk yourself easily into feeling compassion for an abusive mother, for example, but once you’ve actually had that encounter, then you’ve got a possibility of having quite a different relationship to whatever was disturbing you.”


The trial takes place in the retreat room at St Vincent’s Cancer Centre, fitted out with the daybed, warm furnishings and, on dosing days, pot plants and an Indigenous painting that Ross brings from home. “It’s a lovely space.” The study follows the design and protocols of those two key US studies from 2016.

In the lead-up to this day, Ross and psychiatrist Dwyer, who is co-principal director on the trial, have interviewed eight patients, eager — some desperate — to participate, assessing their physical and mental suitability, as well as excluding anyone with a history of psychosis, bipolar disorder or some types of complex trauma — all of which can be aggravated by psychedelics. Many don’t make it through this initial screening. Others do, but later deteriorate. Some die. “Things can happen, and they have happened,” says Ross. “People can be devastated.”

Crucial to the emerging treatment are the so-called “set and setting” protocols identified in the first wave of modern psychedelic research and refined in the past two decades: the mindset and intentions you bring to the experience, and the setting in which it takes place. One of the striking features of psilocybin is that its benefits are related to the nature of the experience you have while taking it. Feelings of awe and transcendence have both been linked to improved outcomes in trial participants. At the same time, the experience is powerfully contingent on the environment and atmosphere within which people take the drug, the preparation they have had leading up to taking it, and their relationship with the therapists who will be supporting them before, during and as part of the crucial consolidation period afterwards.

The difference between a “bad trip” and a challenging but rewarding trip is mainly preparation and consolidation, says Ross. She recently came across an analogy she likes: “It’s the psilocybin experience that kicks the doors open. What you do with it after that is up to you.”

On the day of the drug treatment, she says, the therapist’s role is largely hands-off. She and Dwyer will sit with their charge and pay close attention to their emotional and other reactions; they will help or comfort if needed, but generally won’t intervene unless invited (a reassuring word, a glass of water, a hand to hold). The music seeps into the room as well as the patient’s headphones, providing an intense, evocative soundtrack. “I can’t articulate how much more raw and stripped back you feel as a therapist.”

Patients arrive at 8.15 am. They are invited to bring with them personal items of significance, talismans that might help them feel safe or remind them of what is important to them. “We set up a kind of little camp,” says Ross, “because we’re there for hours.” Dwyer checks their blood pressure and heart rate, which will be monitored throughout the session. (While psilocybin so far appears reassuringly safe overall, physical side effects can include raised blood pressure, nausea or headaches, although these are generally short-lived.) They chat. Patients revisit their intentions for the trip, what they hope to explore. And Ross might remind them of the poem they talk about during the sessions leading up to “dosing day”: “The Guest House” by Persian poet and mystic Rumi:

This being human is a guest house.
Every morning a new arrival.
A joy, a depression, a meanness…
Welcome and entertain them all!

Central to the emerging model of psychedelic therapy is a confronting and not very fashionable idea: this treatment may at times be difficult; it may be painful; you may be afraid.

From a gently swaying houseboat on London’s River Thames, Rosalind Watts talks about “moving into the darkness.” One of the rising stars of the UK’s psychedelic research sphere, Watts has thought a lot about how to prepare therapists and patients for a paradigm based on the acceptance of psychological pain in a culture that routinely prioritises numbing.

“It’s like seasons,” she says. “It’s darkness and light. And in our culture, it’s very much about — it should always be summer, you know, everyone should be happy, we should be busy, we should be productive, avoid the darkness… keep it light.”

But it is in accepting the darkness, she says, that her patients have a chance to heal.

She understands that some will simply not want to undertake a disorienting, sometimes gruelling, interior journey during which, even under ideal clinical conditions, nearly a third of volunteers can experience “significant fear,” albeit transient, according to a recent US study. “It won’t be for everyone.”

But for those who do, she says, the results can be transformative.

Until recently, Watts was a clinical psychologist at Imperial College, leading a major study comparing the impacts on depression of psilocybin versus a conventional antidepressant. Those results had not been published before this article went to print. But a smaller feasibility study reported striking reductions in treatment-resistant depression and later helped inform Watts’s development of a framework to help therapists prepare patients for the psychedelic journey. (“Without a framework a psychedelic experience can be kind of like a nebulous dream.”) Her “Accept, Connect, Embody” model encourages patients to visualise diving into the ocean to the seabed to find oysters containing hard and difficult things.

Back in Melbourne, the St Vincent’s team use different metaphors, though with the same aim (Margaret Ross is a fan of Watts’s work). While the decision is always for the patient to make, says Ross, her counsel — as they prepare to enter the unknown — is to invite in whatever they most want to turn away. And to surrender.

“You might experience bliss,” she says, “you might experience oneness with the universe. You may feel your body like it’s dissolving or warping. Or, you know, it might feel quite scary. Don’t worry, your heart is still going to beat, your lungs are still going to breathe. It may feel like you’re dying, you may feel like you’re going crazy, and it’s okay.”


But outside the small, still room in Fitzroy, pressures are building.

In December 2020, an online opinion piece in the prestigious journal JAMA Psychiatry issued an unambiguous warning about the future of psychedelic research. Despite — indeed because of — the promising results of recent clinical trials and the “seemingly exceptional potential” of the treatments, the so-called renaissance in psychedelic research was at risk of being derailed. The same sorts of “exuberance, utopian thinking, and uneven clinical approaches” that had led to the banning of these substances in the 1960s, “combined with the contemporary tendency to politicise science,” could do the same again, depriving potentially millions of people of future treatments.

The authors — one of them the world-leading researcher and founder of the Johns Hopkins Center for Psychedelic & Consciousness Research, Roland Griffiths — called for more studies into the drugs’ mechanisms and risks, as well as into how best to administer them in hospitals or other clinical settings. Psychedelics, they warned, were neither a quick fix nor a panacea.

Most of all, they called for restraint.

A similar conversation is now happening in Australia. “Understandably, there’s a lot of pressure to get the treatment out there as quickly as possible,” says Melbourne psychiatrist Nigel Strauss. “But really, we’re not ready for that yet.”

A prominent trauma therapist, Strauss has worked with survivors of Tasmania’s Port Arthur massacre and Victoria’s 2009 Black Saturday bushfires. He also has a longstanding interest in the therapeutic potential of psychedelics. Now in his seventies, he travelled to the UK in 2014 to do the MDMA therapy training program run by the pioneering Multidisciplinary Association for Psychedelic Studies. Last year, MAPS released follow-up results of phase 2 clinical trials of MDMA-assisted psychotherapy for PTSD showing that 56 per cent of participants no longer met the criteria for the disorder two months after treatment, with two-thirds of those still clear after a year or more. Phase 3 trials are now underway, amid building hopes that the drug might be approved for medical use in the US this year.

Strauss has personally funded most of the pending Monash PTSD study and is set to head the upcoming Melbourne trial of psilocybin for intractable depression. But he too urges restraint. “The last thing we want is for mistakes to happen.” Last year he set up a group of like-minded psychiatrists, Australia New Zealand Psychiatrists for Psychedelic-Assisted Psychotherapy, to discuss and prepare for the possibilities and challenges ahead.

There is no shortage of challenges.

Some relate to gaps in the knowledge. How does it work, for instance? How to more accurately gauge who is or isn’t a suitable candidate for psychedelic therapy? Why do some people have life-changing experiences while others don’t respond at all? How to minimise the risk of retraumatising already vulnerable patients?

Some are to do with questions of access and equity: how to ensure that those most in need will get to use the treatments — not just the wealthy, white and connected?

Other questions arise from the curious nature of the relationship between the drug, the person taking it and the therapeutic framework within which it is taken. There are harrowing accounts of patients emerging from deep and chronic depressions after psilocybin treatment, only to relapse months later. Paul Liknaitzky wants to know more about dosing regimens — how often, how much? — and is also keen to explore ways of prolonging the benefits of the “psychedelic encounter” by combining it with other mind-altering practices. “It’s not a drug you can take home. It’s a drug that needs a lot of support,” he says.

This puts particular demands on the therapists who might someday administer these therapies.

It is also one source of a growing rift between sections of Australia’s psychedelic research community and the high-profile, Melbourne-based advocacy and education group Mind Medicine Australia, or MMA. Launched as a charity in 2019 by philanthropist and opera singer Tania de Jong and her investment banker husband Peter Hunt, MMA promotes the use of medically controlled psychedelic-assisted treatments with a focus on psilocybin and MDMA to help counter the rising toll of mental illness. The couple founded the organisation following their own transformative psilocybin experiences (they travelled to the Netherlands, where they could legally take the drugs with a private therapist), and subsequently provided some early funding to Margaret Ross’s study via the charity Psychedelic Research in Science & Medicine (PRISM).

Recent MMA projects include seeking to reschedule psilocybin and MDMA from prohibited substances to controlled drugs under the Poisons Standard — a move that could also make it easier for doctors to seek special access to the drugs for individual patients under a scheme already in use for medicinal cannabis. The MMA website has published moving extracts from dozens of submissions from patients and mental health professionals supporting the change — including one from de Jong herself, who is the daughter of Holocaust survivors. But critics in the psychedelic research community (even some who don’t believe the drugs should have been scheduled in the first place) argue that efforts to expedite the therapy are premature and potentially risky. In February, the Therapeutic Goods Administration delivered interim decisions acknowledging “significant public support” for rescheduling both substances but opposing the moves at this stage, citing “the risks to consumers, the lack of training for physicians, and the current state of research.” A final decision was due in late April.

Meanwhile, MMA is this year rolling out its own Certificate in Psychedelic-Assisted Therapies. According to its website, the $7,500 four-month course — available to experienced mental health practitioners, including social workers and occupational therapists — will take in components from overseas courses, input from local and international trainers and experts, and workshops, one of which will offer participants experience with breathing techniques designed to induce altered states. However, trainees cannot currently legally use or work with psychedelic substances, or administer them to patients.

Nigel Strauss and Paul Liknaitzky contend that, at least initially, all training should take place within clinical trials overseen by experienced international clinicians and be limited to psychiatrists, psychologists and psychotherapists. Liknaitzky has been developing a program to provide on-the-job training initially for around thirty therapists who will work on the Swinburne and Monash trials and who, he says, will later be able to train others. “These trials are an opportunity to train therapists ethically and rigorously through expert supervision, with real patients, using these drugs.”

Five of those I interviewed for this piece, including Liknaitzky, Strauss and PRISM’s Martin Williams, have quit MMA positions in the past eighteen months. Among the latest to go is British psychologist Renee Harvey, a highly qualified clinician who has worked on psychedelic trials at Imperial College, and whom MMA hired to set up and run their training program. Harvey would not discuss her time at MMA or her sudden departure, except to say that she was happy for it to be known that she had resigned.

The broader arguments are at times confusing and/or personal, but seem largely to boil down to disagreements over the pace and process of change.

“More research on psychedelics would be brilliant,” says MMA’s Peter Hunt. “But the question that we all have to face up to is, when do you actually allow these medicines to be used for people that are suffering? In other words, when is the evidence sufficiently good to warrant any residual risk you’re taking by using these medicines with patients?” Given proper screening, support and medical controls, he says the time is about now. He points out too that antidepressants carry their own risks.

The outgoing head of psychiatry at Melbourne University and St Vincent’s Health, Professor David Castle, has his own concerns. Speaking generally, he says: “The worry, I suppose, with the field is that some people have sort of leapt ahead of the evidence… And actually, you know, if you look at the scientific evidence, it’s relatively light on. It’s hugely exciting, but in terms of an evidence base, it’s very small.” Castle, who is leaving Australia to take up a position at Toronto’s Centre for Addiction and Mental Health, is a former board member of MMA. He hopes to pursue his interest in psychedelic research in Canada.

There is fear on both sides. On the one, that research and approvals will come too late for people who are barely hanging on. On the other, that without a rigorous, transparent scientific process that focuses on the local landscape as well as connecting Australia to the international research effort, the risk of mistakes rises, and with it the spectre of the research once more going off the rails.

John Allan, president of the Royal Australian and New Zealand College of Psychiatrists, is a cautious voice. Despite some promising results, he says, the research is in its early stages. “There is a history of controversies involving supposedly game-changing psychiatric treatments that, despite best intentions, have been based more in enthusiasm and hope than in rigorous research, and have ended up doing harm to people and to the future of psychiatric treatments.” He says the college is open to new treatments that improve lives, and that Australia’s research is of high quality. “But we just have to be really careful. And we have to do the science properly.”

There are bigger questions too — that go to the heart, perhaps the soul, of the psychedelic experience. But these are questions for later.


On a mild, sunny Melbourne day in January 2021, ten months after the historic psilocybin trial was suspended, Marg Ross is back at her desk in St Vincent’s Hospital. In front of her is a list of names of patients who had originally expressed interest in the trial. “Inevitably we will have lost some,” she says. But today she will start making calls.

It will be another three or four years before the trial is completed, and at least a year before interim results are available. And even then, she says, she will need to protect the privacy of her patients. There is a lot she can’t say.

However, reports from patients from the 2016 New York University study give a sense of the range and intensity of their experiences. A subsequent qualitative study describes participants’ “exalted feelings of joy, bliss, and love; embodiment; ineffability; alterations to identity; a movement from feelings of separateness to interconnectedness; experiences of transient psychological distress; the appearance of loved ones as guiding spirits…”

“I feel like a whole bunch of crap has been dumped off the surface,” said one woman, who until her psychedelic experience had been just watching “the clock numbers ticking by.” Now, she said, “just watching that tree over there blowing in the breeze, seeing people in the street, and all the different people in vehicles rushing by! I just feel good about being alive.”

What Ross can say is that the experience of sitting with her first two patients was profound — she uses the word repeatedly, apologising for doing so. She uses the words “extraordinary” and “stunning” too. She talks about learning to trust the deep inner wisdom — the “genius” — of the patient. She says that more than once the experience brought tears to her eyes. That seven weeks after one of those early sessions, she came across a researcher who had just done a follow-up interview and was now weeping in the corridor.

“I said, ‘Are you all right?’ and she said, ‘Yep.’ And then she teared up and said, ‘It was beautiful.’”

Ross hopes that in time her study will lead to other, larger trials, involving more patients and more therapists, and helping link Australia to the international effort to rehabilitate psychedelic therapy. Her hopes were boosted in March when Minister for Health Greg Hunt announced a $15 million grants package to research breakthrough therapies including psilocybin, MDMA and the dissociative anaesthetic ketamine for treating debilitating mental illness.

“This is such a fragile re-entry back into medicine,” says Ross more generally. “We’ve got one shot at this and that’s why it will take the time that it takes — because it is so easily demonised. You need one bad experience that’s highly publicised, and then we are back at square one. You’ll see a very, very public and swift political backlash, like we did in the seventies. And then it’s all off.” She picks up the phone. By mid-afternoon she has made her first appointment. All going well, two weeks from now she and Dwyer will sit beside their patient and hand them a small white capsule. After that, who knows? “What they can experience is visceral,” says Ross. “And then that transcendent state which is really intangible. To see someone going through that is just something else. I don’t have —” She pauses, half-laughs. “No words. Enormous privilege is what comes to mind.” •

This essay first appeared in Griffith Review 72: States of Mind, edited by Ashley Hay.

The post Love and fear appeared first on Inside Story.

The self-esteem racket, and other quick fixes • https://insidestory.org.au/the-self-esteem-racket-and-other-quick-fixes/ • Tue, 04 May 2021

How overhyped findings undermined psychology’s authority

The post The self-esteem racket, and other quick fixes appeared first on Inside Story.

Ellen Herman opened The Romance of American Psychology, her 1995 history of the field’s rising influence, with a bold claim: “Psychological insight is the creed of our time.” For its supporters, she wrote, psychology offered “worthwhile answers to our most difficult personal questions and practical solutions for our most intractable social problems.” A quarter of a century later, those solutions arrive at an accelerating pace, amplified by new media and TED talks and shaping our workplaces, schools and intimate lives.

But a sentence from Herman’s second paragraph rings less true. “In the late twentieth-century United States,” she wrote, “we are likely to believe what psychological experts tell us.” In fact, the opposite has happened: well-publicised instances of scientific fraud and growing evidence that many of the discipline’s most celebrated findings rest on flimsy foundations have given the field some self-administered black eyes. The darkest bruises have blotted the reputation of social psychology, a sub-field that aims to understand how humans think, feel and act in relationships, groups and cultures.

American journalist Jesse Singal’s The Quick Fix is a forensic investigation of this loss of trust, and is likely to contribute to a further erosion. Recalling his time as a behavioural science editor at New York magazine, Singal recounts how the “fire hose of overhyped findings” he received via psychologists’ press releases brought home the scale of the field’s problems.

Singal casts a critical eye over a succession of influential psychological ideas and findings, and catalogues the scientific failings, overheated claims and poorly justified applications that entangle them. Along the way, he explores the downside of fad psychology’s success, and in particular the costs of America’s “ever-intensifying focus on the individual” — a focus, he suggests, that often neglects larger political systems and social structures to the detriment of effective solutions.

The Quick Fix has six main targets: the self-esteem movement, the role of “grit” in promoting academic and career success, “power posing” as a means of boosting women’s self-assertion, resilience training in schools and the military, 1990s predictions of a looming demographic wave of teenage super-predators, and the idea of implicit or unconscious bias. Later chapters tackle psychology’s replication crisis and the place of “nudge” interventions in promoting healthier and more prudent choices.

Singal’s account of self-esteem describes a movement (centred, perhaps not surprisingly, in California) that presented mental illness, criminality, relationship breakdown and academic underachievement as manifestations of a lack of self-love. Its evangelical proponents promoted self-esteem interventions on a grand scale in schools, their enthusiasm extending far beyond the available evidence and at times suppressing findings that should have dampened it. Although healthier, happier and more successful people tend to have higher self-esteem, much of the research indicates that self-esteem tends to play little or no causal role in promoting those outcomes. Trying to enhance it has little benefit, and perhaps some cost in a growing culture of narcissism.

A similar critique applies to the more recently feted concept of “grit,” the capacity to persevere with an unwavering sense of purpose in pursuit of long-term goals. Singal shows that the link between grit and desired outcomes has been exaggerated, and that other known factors explain success substantially better. Nor has it been adequately demonstrated that interventions can boost grit, or that it is meaningfully different from a better-studied personality trait called conscientiousness.

Singal views grit as an especially seductive target for improving educational achievement, partly because it doesn’t demand deeper systemic change in schooling. “[E]ven if things out there don’t improve any time soon,” he writes, “there are traits… we can cultivate in ourselves… to hop back on the upward-mobility ladder.” The fix here is quick and atomistic.

Self-esteem and grit are psychological traits whose real-world implications have been oversold. Power posing and resilience training are psychological interventions that face the same charge. The belief that striking confident poses promotes powerful behaviour and even boosts testosterone levels has crumbled under closer examination, reports Singal. Key research findings have proved hard to replicate, and the co-authors of the original work have confessed to questionable research practices.

Attempts to prevent depression and anxiety among schoolchildren by enhancing their resilience have yet to suffer the same public implosion, but studies reviewed by Singal cast doubt on the efficacy of a leading program developed under the banner of positive psychology. More dubious still, he argues, is the use of the same program to prevent post-traumatic stress disorder in the armed services. More than a million members of the US military have passed through this Comprehensive Soldier Fitness program, despite a paucity of good evidence that it is effective and despite the head-spinning extrapolation from children to adults and from preventing depression to preventing post-traumatic reactions.

The idea of super-predators was neither a trait nor an intervention, but rather an errant prediction. The brainchild of political scientists and criminologists in the 1990s, and as such a rather odd inclusion in a book on psychology’s woes, it forecast a surge of remorseless, morally impoverished young criminals — but they simply failed to materialise, and in fact crime rates dropped in the 2000s. Singal discusses the case as an example of a mistaken idea having dire effects — its legacy was tough-on-crime policies that allowed children to be sentenced as adults.

Implicit bias, sometimes called unconscious bias, has been even more consequential. Grounded in the study of implicit attitudes, commonly assessed by the immensely popular Implicit Association Test, or IAT, research indicated that a substantial majority of white Americans held an automatic preference for fellow whites over Blacks, even if they sincerely claimed not to be racist. This bias was heralded as the newly discovered psychological “dark matter” that might explain the persistence of racism despite the public’s steeply declining endorsement of overt bigotry over the past century.

Although he acknowledges that unconscious racial bias exists and is probably responsible for some fraction of current racial disparities, Singal deflates much of the standard case. The IAT, sometimes seen as an X-ray of our racist souls, fails basic requirements of psychological assessment and lacks solid evidence of predictive validity. It remains unclear to what extent apparent bias truly reveals hidden prejudice rather than mere awareness of racial stereotypes or the dread of appearing racist on the test. Singal criticises the idea of unconscious bias for not only its scientific limitations but also its wider implications. As a staple of diversity training programs, it deflects attention from social change efforts to “internal spiritual cleansing,” he argues, but also partially exonerates us of our prejudice and leads us to ignore the degree to which discrimination is conscious and explicit.


How psychology got into this kind of mess and how it might get out are explored in chapters on the replication crisis and “nudging.” The former was sparked by evidence that many published psychological findings — at least half, and more for social psychology in particular — fail to be confirmed in repeat studies. Singal presents the many sources of these failures, including inappropriate statistical analyses, inadequate sample sizes, and questionable research practices that increase the chances of finding something that isn’t there. A research culture of chasing surprising and counterintuitive findings to attract media attention also plays a role.

Singal is searching in his critique but perhaps underplays the degree to which low rates of replication and high rates of dubious research practices afflict many areas of science, and the fact that psychologists have taken the lead in making science as a whole more open and responsible.

The final chapter of The Quick Fix explores “nudges,” those subtle tweaks to “choice architecture” that guide people towards desirable decisions. Making forms simpler is an example, as are telling people how their electricity usage compares to their neighbours’ and offering organ donation as an opt-out rather than opt-in decision. Promoted mainly by behavioural economists and institutionalised over the past decade in many government-supported “behavioural insight units,” nudging appeals to Singal. Its intuitive and often minimalist interventions, and its sober efforts to bring an experimental mindset to reforming everyday practices contrast with the splashiness and over-claiming he documents elsewhere in the book. Even here, though, he acknowledges that nudge techniques don’t invariably work as intended and aren’t entirely free of hype.

My only quibble is with Singal’s frequent description of the products of fad psychology research as “half-baked.” The degree to which the problems he reveals are caused by impatient knowledge-bakers and fix-quickeners is moot. New publish-or-perish imperatives and the old-fashioned lust for recognition can certainly lead researchers to present their work before it has been fully scrutinised and replicated; but time in the scientific oven is only one implicated aspect of the baking process.

The problems of fad psychology are as much about a lack of humility, honesty and concern for truth over novelty as they are about serving up research before it is ready for consumption. And without denying that psychologists bear the primary responsibility, the insatiable appetite of the media and popularisers for counterintuitive and overdrawn findings also plays a role.


Jesse Singal is an exceptional writer on the social and behavioural sciences, and The Quick Fix, his first book, is a showcase of his talents. He has a firm grasp of the technical and quantitative aspects of the research he examines, and he communicates what might have been dry methodological points with unusual lucidity. While he doesn’t shy away from strong claims, he is consistently fair-minded, and the targets of his criticism are invariably given a chance to respond. The book is free of grandstanding, ideological axe-grinding and pointscoring, and a particular strength is its insistence on zooming out from a specific psychological topic or finding to the broader cultural, societal and historical contexts in which it appears. Singal ventures expertly into political science, economics and sociology, repeatedly circling back to remind us of the self-evident but sometimes forgotten fact that individual behaviour is embedded in systems, institutions and implacable economic realities.

This makes him something of a rarity. Many writers expound on psychological ideas while paying the merest lip service to their wider context. Others write about politics and society while either ignoring the place of the human mind and behaviour or treating psychological analyses as reductive and individualistic. Singal offers some scalpel-sharp criticism of psychological research and its popularisation, reminding us, among other things, that the vast US racial wealth gap and current criminal justice policies probably have much more to do with contemporary racial disparities than implicit bias. But he acknowledges that psychology matters and that behavioural science is very hard to do well: human action is multi-determined, deeply contextual, hard to predict and, as the philosopher Ian Hacking observed, a moving target.

Singal’s mission is not to rebuke psychology or chastise it for not being sociology, economics or political science, but to make it better on its own aspirational terms. His prescription is greater humility, more rigour, and a fuller awareness of the limits of psychological interventions in the face of large, constraining systems of power and inequality. Psychology has a part to play in social science–informed responses to social problems, but it must become more modest, reflective and genuinely scientific in spirit.

Psychology doesn’t saturate Australian media to the degree it does in the United States, or have the same cultural cachet, so we might wonder about the local relevance of this book. But the differences are not as great as they might seem. Melbourne has more registered psychologists than New York City, and many psychological concepts and practices are growing in influence. Rising concern with mental health is driving increased interest in resilience training in schools and beyond. Many organisations have embraced the notion of unconscious bias, sometimes uncritically, as a basis for equity and diversity initiatives. New behavioural insights units have sprouted within federal and state governments, and the general public remains hungry for popular psychology’s uplifting messages of self-help.

The Quick Fix provides a compelling perspective on these developments. It will leave the reader with a more questioning attitude about psychology’s latest revelations and interventions, and perhaps also with a more hopeful view of the field’s capacity to reform. •


With royalty at Riven Rock • https://insidestory.org.au/with-royalty-at-riven-rock/ • Tue, 18 Aug 2020

Harry and Meghan’s new home comes with a history of American aristocrats, primate research and the quest for the contraceptive pill

The post With royalty at Riven Rock appeared first on Inside Story.

I am not much of a royalty watcher, but I admit I was curious to see where Harry and Meghan had decided to settle in the United States. I wasn’t surprised to hear late last week that they had chosen Santa Barbara, an enchanting coastal town north of Los Angeles that I happen to know very well. I visited many times while researching my biography of the Australian actress Judith Anderson, who lived there from 1950 until her death in 1992. In fact, Anderson spent her final years in Montecito, the wealthy area where the royals have purchased their future home.

What surprised me, though, was that Harry and Meghan have bought the Chateau of Riven Rock. The words Riven Rock took me back to another life I had written about, that of feminist anthropologist Elsie Clews Parsons, and to the heartbreaking story of one of her closest friends, Katharine Dexter McCormick.

The better-informed of the journalists reporting the royal purchase may tell you that the Chateau of Riven Rock was built on part of the estate purchased by the wealthy McCormick family in the 1890s, and that it was home to Katharine’s invalid husband, Stanley Robert McCormick, from 1908 until his death in 1947. Behind those stark facts is Katharine’s extraordinary story.

Katharine Dexter was born in 1875 in Dexter, Michigan, to prominent lawyer Wirt Dexter and his wife Josephine. Her father died when she was fourteen, and Katharine, her mother and her brother Sam moved to Boston. There, Katharine studied at the Massachusetts Institute of Technology and Sam attended Harvard.

Holidaying in Newport in 1893, the family met Elsie Clews, the daughter of a wealthy New York banker. Elsie, who had just completed her freshman year at Barnard College, began a love affair with Sam that ended tragically with his sudden death the following year. Katharine and Elsie’s shared loss deepened their friendship, and the two rebellious young women encouraged each other in their scholarly ambitions — Elsie to undertake a PhD in education at Columbia University and Katharine to complete her undergraduate science degree at MIT.

Instead of attending medical school as she had planned, though, Katharine married Stanley Robert McCormick, the youngest son of Cyrus McCormick of International Harvester fame. Two years later, in 1906, Stanley was hospitalised with what we now call schizophrenia. In 1908, still a danger to himself and others, he was installed at the McCormicks’ eighty-seven-acre Riven Rock estate.

Stanley and his carers shared the large, two-storey Mission Revival–style house. Over the years a golf course and a theatre were added, a large art collection was established, and a music director appointed.

In addition to physical and cultural comfort, Katharine was determined that Stanley would have the best of medical attention and that everything should be done to try to find a cure for his affliction. In Boston, he had come under the care of a young resident physician, Gilbert Van Tassel Hamilton, who was adding the new subject of psychology to his medical degree. Hamilton had resigned from his hospital position to become Stanley’s personal physician, and he moved with him to Riven Rock.

Royal hideaway: the Chateau of Riven Rock.

At the time, it was increasingly believed that studying the behaviour of primates would help in understanding human psychology. In the hope that this research would help find a cure for Stanley’s problems, Katharine encouraged Hamilton to establish a primate laboratory at Riven Rock. The work of the laboratory, with its ten monkeys and one orangutan, is described by researcher Robert M. Yerkes in his 1916 book, The Mental Life of Monkeys and Apes.

On the basis of his experience at Riven Rock, Yerkes established his own research station, the Anthropoid Breeding and Experiment Station, in Orange Park, Florida, fourteen years later. By then he was professor of psychobiology at Yale University.


It is difficult to get a clear idea of Stanley’s condition, but it manifested itself as an aggressive fear of sexuality and of women. Katharine was not allowed to go near him; until she bought her own home there in the early 1940s, she stayed in a hotel when she visited Santa Barbara. The laboratory’s primate studies focused on the sex lives of primates.

Hamilton published only one paper, “A Study of Sexual Tendencies in Monkeys and Baboons,” which appeared in the Journal of Animal Behavior in 1914. His work apparently shed little light on Stanley’s condition, and nor did it indicate any clear direction for a cure. He left Riven Rock under a cloud in 1917. In 1929, as director of psychobiological research at the Bureau of Social Hygiene in New York, he published the pioneering book, A Research in Marriage.

Katharine never gave up on her search for a cure for Stanley. During the 1920s she became interested in the relationship between hormones and mental disorders. In 1927 she established the Stanley R. McCormick Memorial Foundation for Neuroendocrine Research Corporation (later the Neuroendocrine Research Foundation) at Harvard Medical School.

During these traumatic years Katharine was also throwing her energies into the feminist movement. After she met birth control activist Margaret Sanger in 1917, she began the work that would lead to probably the most important contribution to women’s freedom in the twentieth century. During the 1920s she helped smuggle diaphragms from Europe for Sanger’s Clinical Research Bureau in New York.

When her mother died in 1937, she inherited an estate of more than US$10 million; ten years later, Stanley’s death added another $35 million to her fortune. Much of this she put into birth control research.

In 1953 Katharine met Gregory Goodwin Pincus, who was developing a hormonal birth control method at his research laboratory, the Worcester Foundation for Experimental Biology. She began to fund the foundation, initially at $100,000 a year but increasing to between $150,000 and $180,000 annually until her death in 1967.

In all, she provided $2 million (around $20 million today) of her own money for the development of the oral contraceptive pill. Even after the pill was approved for contraception in 1960, she continued to fund research into ways of improving birth control.

From the time she bought her home at 1600 Santa Barbara Street, in the heart of the small city, Katharine was an avid supporter of the city’s art institutions. She was a founding member of the Santa Barbara Museum of Art, of which she was vice-president with fellow philanthropist and art collector Wright S. Ludington. Serving on the buildings committee, she was responsible for hiring renowned Chicago architect David Adler to convert the city’s old post office into the museum.

Katharine died in Boston, aged ninety-two, on 28 December 1967. She left her home to the museum, which used it to house the Ridley-Tree Education Center, where art classes for children and adults are held. Before her death she had donated funds to build Stanley McCormick Hall, a residential facility for female students at MIT that would make women a visible presence among students of science and engineering. Among many other legacies, she bequeathed $5 million to the Stanford University School of Medicine to support female physicians.

The original Riven Rock estate has since been broken up into smaller but still substantial blocks, of which Meghan and Harry’s Chateau of Riven Rock is one. The couple’s new neighbours will probably tell them that they can read a fictionalised version of the story of Katharine and Stanley in Riven Rock, a novel by local celebrity author T.C. Boyle; that they can see a cinematic version in The Romancing and Reaping of Riven Rock, a documentary funded by a group of Montecito residents; and that they can learn about some of the happier times in Katharine’s life, including a love affair, in my biography of her friend, Elsie Clews Parsons. But they will probably be happy simply enjoying one of the most beautiful views in the world. The rest of us can take a tour of the Chateau on YouTube. •


Decent creatures • https://insidestory.org.au/decent-creatures/ • Wed, 27 May 2020

Books | If we were smarter, would we realise we’re better than we think?

The post Decent creatures appeared first on Inside Story.

For those of you who haven’t heard of him, Rutger Bregman is the author of Utopia for Realists, a book that canvassed the idea, among other things, of a universal basic income. Published in 2017 when Bregman was twenty-eight, it became an international bestseller. During an extensive tour promoting the book he came to Australia, which led in serendipitous ways to the development of this latest book, Humankind — a happy confluence of events to which I will return.

Bregman is a historian who also writes books about philosophy. Given his age and the range of his imagination, he has been called a wunderkind; I am inclined to call him a visionary. But he is humbler than that, content to describe himself as an investigative journalist. He contributes to De Correspondent, a highly regarded Dutch online newspaper that eschews advertising on principle (and now has a US edition). De Correspondent resembles the Intercept, or Inside Story and similar organs here and overseas that play such a vital role in keeping independent, long-form journalism alive.

But it’s as a writer of books that Bregman has made his greatest impact. Humankind is a huge, ambitious work, a critical survey of research in the natural and social sciences, woven together with the skill of a born storyteller. Its basic premise is that, as individuals, we Homo sapiens are fundamentally decent people. Not particularly smart, but good.

It’s an audacious proposition, though Bregman is not the first to assert it. As an undergraduate anthropology student long ago, I learned that it was cooperation rather than the Social Darwinists’ much-vaunted competition that gave Homo sapiens our evolutionary advantage. Bregman, given to snappy loaded phrases, calls it “survival of the friendliest.” And for this he has drawn on a mindboggling amount of research from a range of fields, though chiefly evolutionary biology, a discipline that has grown substantially since I was a student and has come up with intriguing findings about our physical and cognitive characteristics.

Hence Bregman’s renaming us as Homo puppy. This is his term for “self-domestication,” a concept derived from studies of how we tamed animals. Domestication involves breeding for desirable outcomes. As wolves were bred into dogs, for example, they became more juvenile, friendly and what we might even call cute. Their snouts became shorter, their tails curlier, their bodies smaller. Likewise, we Homo sapiens differentiated ourselves from other primates and humans. Our brows and jaws became smaller, our faces flatter, our skulls rounder, our bones thinner.

Natural selection was at work, but what scientists now make of it diverges widely from earlier interpretations of how our species became dominant. From biology to archaeology, advanced techniques have led to the more nuanced understandings of this process that Bregman discusses. The fact that the friendliest among us succeeded most as parents gave us the ultimate advantage over other kinds of humans.

Those other kinds included the Neanderthals, who the latest research indicates were smarter than us. They had larger brains, for one thing, and their technology was arguably as advanced as ours at the same point in time. Compared to us, though, they were loners, fitted best for the harsh conditions of the Ice Age. They lived in smaller communities and didn’t move around as much as we did.

As is his style, Bregman characterises the two species as Geniuses and Copycats. Homo sapiens didn’t necessarily invent things, but they picked up skills and technologies from people they mingled with, Neanderthals included, adapting and improving them. Still essentially hunter-gatherers, their communities could be large and organised enough to allow for construction of impressive monuments. Bregman cites Göbekli Tepe, a Turkish archaeological site discovered to be the world’s oldest temple, built not by slaves for rulers but by collective endeavour. By and large our ancestors were peaceful people as well. The archaeological record yields next to no evidence of murder, sustained intertribal warfare or other repeated violent acts.

All of this flies in the face of what we’ve come to believe about ourselves: what Bregman calls the “veneer theory of civilisation.” To elaborate this he takes us back to the Enlightenment, and the opposing social theories of Thomas Hobbes and Jean-Jacques Rousseau:


In one corner is Hobbes: the pessimist who would have us believe in the wickedness of human nature. The man who asserted that civil society alone could save us from our baser instincts. In the other corner, Rousseau: the man who declared that in our heart of hearts we’re all good. Far from being our salvation, Rousseau believed “civilisation” is what ruins us.

Even if you’ve never heard of them, the opposing views of these two heavyweights are at the root of society’s deepest divides. I know of no other debate with stakes as high, or ramifications as far-reaching. Harsher punishments versus better social services, reform school versus art school, top-down management versus empowered teams, old-fashioned breadwinners versus baby-toting dads — take just about any debate you can think of and it goes back, in some way, to the opposition between Hobbes and Rousseau.


So how did the Hobbesian theory — the one that states we are at bottom selfish beasts apt to revert to barbarism in any crisis, with only the Leviathan state between us and our murderous impulses — get the imprimatur? For Bregman, the defining moment was when we started farming. A group of tribes came to the Fertile Crescent, settled there, and started planting crops — the spring shoots of civilisation. Accompanying this move came the growth of cities, the instigation of hierarchies, and the supplanting of the old disinterested gods with all-knowing omnipotent ones and of tribal leaders with dynastic monarchs.

Of course, this is a gross simplification of developments that countless writers have devoted careers to describing. But the point to be made, and Bregman makes it convincingly, is that settling down wasn't the unqualified advance we have long assumed it to be. Along with Yuval Noah Harari, author of Sapiens: A Brief History of Humankind, another bestselling narrative, Bregman contends that the agricultural revolution ushered in problems that beset us to this day. Failing crops led to periodic famines. Settlement in confined spaces made us prone to disease. Right now, we're experiencing a pandemic, for us a once-in-a-century event, but before the public health and medical advances of the last 200 years or so — a mere blip on our 200,000-year timeline — plagues were common occurrences in human settlements.


I mentioned an Australian connection for Bregman's research, but actually there are two. The first involves William Golding's Lord of the Flies, a novel about a bunch of British schoolboys stranded on an uninhabited island. Bregman first read it in his teens, and it made a profound impression — as it has on just about every one of its readers, who at a guess would number in the tens of millions. Published in 1954, it helped win Golding the Nobel Prize for Literature, and all these years later it remains in print, is still being studied in schools, and has been the inspiration for reality TV.

After beginning to question its premise — that humans left to their own devices will ineluctably revert to their vicious animal nature — Bregman came to see its depiction of the Hobbesian veneer theory as the secret of its success. Golding, he learned — "a man who beat his kids" — had a very poor opinion of human nature.

Then Bregman got wind of a real-life incident in which some boys were shipwrecked on a Pacific island and survived for a year and a half until they were discovered. Nothing like Golding’s characters, they were bored Tongan teenagers who “borrowed” a boat and went sailing for a lark. Finding themselves on their own they created a mini-society that went a long way towards proving that Rousseau, not Hobbes, was right. The island, deserted years before, was ‘Ata, part of the Tongan archipelago.

The Australian connection? Bregman learned from a digitised Age article that a thirty-five-year-old Australian sailor named Peter Warner discovered the boys in 1966. Given their ages at the time, Bregman figured out that some of the boys, and Warner himself, might still be alive. He was right, and took the opportunity of the 2017 book tour to visit Warner and his friend Mano Totau, a survivor, on Warner's property near Lismore.

The ‘Ata discovery was covered in the press back when it happened, a documentary was attempted, and then the incident was forgotten. Unlike Golding's fiction, the real story proved difficult for people to believe.

The truth is, we find stories about goodness rather boring. We’re hooked on darkness, and the way the news is presented has a lot to do with it. Bregman calls it an addiction, one best avoided if we want to keep things in perspective. A century of two world wars and a worldwide depression hasn’t helped. The Holocaust alone exemplified the depth of cruelty our species can descend to, and in its wake psychologists undertook experiments purporting to prove what each of us is capable of in extremity.

On re-examination, though, the findings of those experiments have been shown to be spurious. And here’s where the second Australian connection comes in. The woman whose work has done most to debunk them is Gina Perry, a Melbourne psychologist Bregman met on that fateful 2017 book tour.

If nothing else, Bregman has succeeded in toppling Hobbes from his perch. He doesn’t argue that humanity has no dark side. The friendliness of our species tends not to apply outside specific social units — in other words, outside our tribes. Tribalism has stayed with us down the millennia, accentuated by authoritarian societies of one kind or another, including our own. If our hunter-gatherer ancestors were less noble than Rousseau would have it, they were arguably healthier and happier than their civilised descendants.

Yet time and again, the closer we get to our neighbours, as our ancestors did, the friendlier we can be. Those furthest from people unlike themselves are the ones most likely to fan the tribal flames. Here Bregman points to our leaders, who for various reasons find it in their interest to manipulate us. We have given them power, and power inevitably corrupts.

Humankind is an absorbing, challenging work. Its analysis is both intricate and sweeping. Its end points are the demise of democracy, or at least its withering, and the imminent threat of climate change. We have been let down by our leaders, but our greatest danger lies in the cynicism this has bred in us.

Bregman is weakest, I believe, when it comes to countering these dangers. Some of his suggestions, like the basic income of his previous book or the participatory budgets he advocates here, are not without their problems. But in proving that deep down we are decent creatures who can work together for the common good, he has shown us where to start.

The post Decent creatures appeared first on Inside Story.

Sick of all my kicks https://insidestory.org.au/sick-of-all-my-kicks/ Thu, 30 Apr 2020 01:38:40 +0000

Books | Should we embrace boredom?

The post Sick of all my kicks appeared first on Inside Story.

All over the world, people in lockdown are living in a stew of stale air and uncomfortable emotions. There is worry and a sense of threat about what may come to pass and irritation at the no longer cute foibles of cohabitants. There is sadness and grief over the loss of loved ones and possible futures. There is loneliness and longing. And perhaps most of all, at least for the lucky ones, there is boredom, the sullen tedium of being stuck in unchanging space and dragging time.

At first blush boredom might look to be a simple emotion, merely the mind’s response to an unstimulating environment, something that might trouble an embowled goldfish as much as a quarantined human. Not so, say the authors of this fascinating book. Boredom is as psychologically rich as the minds of those who experience it, and it isn’t even an emotion but a “feeling of thinking.” Academic psychologists James Danckert and John Eastwood have written an extended meditation on boredom as feeling and motive, as source of misery and meaning in life, as social pathology and technological predicament.

Danckert and Eastwood aren’t the first to turn their scholarly attention to boredom. Several bewhiskered German and Scandinavian philosophers have also pondered its dull ache. Arthur Schopenhauer declared it one of the two enemies of happiness, Søren Kierkegaard identified it as the root of all evil, Martin Heidegger distinguishing between its superficial and profound variants, and Friedrich Nietzsche speculated on God’s boredom on the seventh day of creation.

Psychoanalysts have weighed in, too, viewing boredom as a way of avoiding desires the person would rather not recognise. As Adam Phillips, epigram-loving author of the memorably titled On Kissing, Tickling and Being Bored put it, boredom is “the mood of diffuse restlessness which contains that most absurd and paradoxical wish, the wish for a desire.” In analyses such as these, boredom emerges as complex, conflicted and existential, not merely a reaction to imposed monotony.

So what then is boredom? One of the strengths of this book is that rather than plod didactically through a series of experiments or a simple explanatory scheme, the authors try to capture the subjective feel of boredom in an unreductive fashion. One part of that attempt is a careful dissection of the differences between boredom and other experiences that often travel with it, such as frustration, loneliness and sadness. Another is a deliberate attempt to distinguish what boredom is, in itself, from its causes and effects.

Although Danckert and Eastwood paint a nuanced picture, they resist the easy option of proposing that there is not one boredom but many. Boredom is singular, but it is a conglomerate of thoughts, feelings and wishes and has no single opposite to give it meaning. Sadness may be happiness’s opposite pole, but boredom’s many contrasts include curiosity, interest, relaxation and the sense of flow.

To the extent Danckert and Eastwood develop their own account of boredom, it has two mutually reinforcing elements: a “desire conundrum” and “underutilised cognitive potential.” The former refers to a sense of directionless “wanting to do something but not wanting to do anything,” when we are unable to summon a desire to do any of the things currently available to us. The latter involves the pain of not having our mental capacities optimally engaged, either because the environment is unchallenging or because it is too challenging.

This formulation captures how boredom doesn’t reside in the external reality of our world but in its lack of fit with our mental resources and aptitudes. This is not to say that the state of that world is irrelevant, of course. Monotony and situations that thwart or constrain us promote boredom, but so too do personality traits such as extraversion that make some of us more boredom-prone than others. What bores us is highly individualised, the photographic negative of the idiosyncratic pattern of things that fascinate us.

Out of My Skull clearly demonstrates that boredom has consequences. Research shows that it drives some people to over-eating, substance abuse, self-harm and other forms of impulsive behaviour. It is associated with depression and may be one of its early warning signs. People prone to boredom are especially likely to explode in anger and to engage in delinquency, with psychopaths and narcissists two cases in point. People living in extreme conditions that foster boredom — on polar research vessels, for instance, or in solitary confinement — often suffer severe psychological impacts.

According to the authors, the sense of meaninglessness and disconnection so central to boredom also drives destructive forms of engagement with technology. Although they are sceptical of claims that our societies are experiencing an epidemic of boredom, and are not issuing yet another denunciation of the evils of social media or excessive screen time, Danckert and Eastwood argue that our devices often encourage and reward quick and easy escapes from boredom. Consuming this "junk food for the mind" is self-defeating as a response to boredom because it is ultimately unengaging and renders us passive. There is a reason why we tend to be least bored when engaged in face-to-face interaction with others.

Although Out of My Skull devotes space to the more negative implications of boredom, it also acknowledges its positive possibilities. Boredom, as Adam Phillips wrote, is a state “in which hope is being secretly negotiated.” Danckert and Eastwood suggest it is a signal notifying us that we are disengaged and prompting us to act. It motivates change and protects us from apathy, especially if we can avoid escapist responses that short-circuit rather than overcome it. In the end, being bored orients us to our agency and our need for life meaning, teaching us that we need a larger life project or perhaps that we gave one up too soon. In one of its many arresting aperçus, the book claims that “to be bored is to fail to be the author of our own lives.” Boredom tells us to pick ourselves up and write something.

Out of My Skull is an enjoyable and enlightening read that explores the many dimensions of boredom deftly but deeply. As notable researchers themselves, Danckert and Eastwood weave in selected research findings while giving at least equal time to philosophy, literature and educated introspection. The book is not an advice manual but it offers plenty of useful leads for those seeking guidance on matters such as why it is pointless to give a bored and whiny child a list of things to do, or why the teenage years are the apex of ennui.

What might this book have to offer to those of us who have succumbed to cabin fever while trying to avoid catching something worse? The authors recognise that isolation and boredom are “unhappy bedfellows,” two forms of disconnection that can have compounding effects. Even so, boredom offers up some opportunities. Out of My Skull tells us not to fear boredom or attempt to avoid or outrun it, but to embrace it instead. The capacity to be bored is an important achievement in our personal development and it spurs us to build self-reliance and find new projects and purposes: “it is good, on occasion,” say the authors, “to be understimulated by the world.”

Many of us don’t have much choice in the matter right now, but this engaging book offers wisdom and perspective on our predicament, both temporary and permanent. •

Ages of anxiety https://insidestory.org.au/ages-of-anxiety/ Wed, 23 Oct 2019 00:57:58 +0000

Books | There are reasons why Claire Weekes didn’t receive professional recognition, but they don’t take away from her achievement

The post Ages of anxiety appeared first on Inside Story.

Over the past hundred years almost every generation has been said to be living in an “age of anxiety.” The 1920s received the first of these diagnoses, W.H. Auden and Paul Tillich delivered another in the aftermath of the second world war and, after anxiety was knocked off its precarious perch by depression for a decade or so, it has returned with a vengeance as the most culturally prominent form of misery. It is impossible to escape the drumbeat of climate crisis and anxiety epidemic, just as in earlier times fears of nuclear annihilation and existential angst rattled our minds.

Anxiety may have been a constant over the years, but the medical and mental health professions have treated it in radically changing ways. Fin de siècle physicians prescribed rest cures, psychoanalysts favoured lengthy exploration of the childhood roots of neurosis, and the mid-twentieth-century pioneers of psychopharmacology experimented with tranquillisers and sedatives. Later still, an assortment of short-term psychotherapies emerged, some employing the behaviourist language of conditioning, others aiming to reason people out of their dreads and inhibitions using cognitive therapy’s techniques of persuasion.

For the most part, the Antipodes were distant observers of these therapeutic developments. Australia’s nascent psychiatry profession was small, lacking in prestige within medicine, and liable to follow clinical trends dictated from the major centres in Europe and North America. Judith Hoare’s new biography documents an important and under-appreciated exception to this rule. In Dr Claire Weekes we had our own original voice in the treatment of anxiety, one that travelled from the colonial periphery to the centre and had a long and amplified impact around the world.

Contrary to the book’s title, Weekes’s life was in some respects quite ordinary. Hoare offers an intimate portrait of a rather staid and outwardly unremarkable subject who is enmeshed in the usual day-to-day challenges of domesticity and family life. What was extraordinary about Weekes was her circuitous path to expertise in the study of anxiety and the magnitude of her global influence on that field.

Weekes displayed notable intellectual gifts as a child and completed a degree in zoology in Sydney. After overcoming her own severe anxiety disorder, which was initially misdiagnosed as tuberculosis, she proceeded to obtain a Doctor of Science degree, supported by unusually enlightened male mentors. Her work on lizard embryology had an immediate impact, winning her a scholarship to continue her research in England and portending a bright academic future.

Weekes had a change of heart early in her postdoctoral career, however, first hoping to retrain in neurology, then embarking on a lengthy European tour as a singer, accompanied by a female pianist who was to become her long-term companion. She subsequently set up a travel advice bureau, only to see it collapse when war made grand tours of European cultural centres impossible. Finally, Weekes settled into medical training, graduating at the age of forty-two in 1945. Her work as a general practitioner led her to treat vast numbers of people with anxiety conditions and develop the approach to their treatment that would make her internationally famous.

The core of Weekes’s treatment approach was laid out in five self-help books. The first of these, Self Help for Your Nerves, published in 1962, was easily the most influential, and the later books largely ploughed the same furrow. The book, a classic of the bibliotherapy genre, has sold several hundred thousand copies and presents the sufferer with a framework for making sense of their condition — notionally any form of anxiety but especially targeting what is now called panic disorder — and a practical guide to curing it.

The framework is an accessibly written medical explanation based on the bodily basis of fear and on stress-sensitised nerves in particular. The guide provides a set of injunctions, summarised as “face, accept, float, let time pass.” The distinctive gist of Weekes’s approach is that anxiety should be confronted but not fought. The person experiencing intense, debilitating fear should not avoid it but should aim to pass through it in a spirit of present-focused acceptance, confident that it is temporary and that recovery is not only possible but assured.

Hoare has a fierce commitment to giving Weekes her due as a neglected figure in Australian and international medical history. There is no doubt that Weekes has not received the recognition she might have won, a 1979 MBE notwithstanding. Even so, Hoare’s case for the magnitude of the professional neglect she experienced and of the professional contribution she made is occasionally overstated.

Weekes may never have been embraced by the psychiatric or broader medical profession, aside from a few devoted champions, but that cool reception had many sources. Professionals invariably look askance at peers who popularise ideas and simplify them for a general audience in the process, a factor that Hoare emphasises in explaining why Weekes's work was disregarded, while plainly viewing that neglect as illegitimate. But Weekes did her ideas no favours by failing to present them to specialist audiences in scientific or professional publications, making little reference to competing explanations or treatments in her writing, and providing no meaningful evidence for the efficacy of her treatment approach, one scientifically worthless survey of self-selected patients failing that task by a wide margin.

Coupled with never having formally trained as a psychiatrist and rejecting its diagnostic concepts in favour of the rather quaint and over-general “nerves,” this refusal to engage with the psychiatric profession using its preferred methods for establishing truth and value ensured that many within the profession would see Weekes as merely a dispenser of self-help nostrums, however unfair that perception might seem in retrospect. Weekes’s equal reluctance to engage with the psychology and cognitive therapy communities, with whom she could have made common cause against the Freudians and pill-pushing pharmacologists, had the same result.

Hoare is certainly right to see Weekes as deserving greater repute as a mental health pioneer, but at times her biography overlooks how much the neglect between the mental health professions and Weekes was reciprocal. Weekes was a strong and self-reliant thinker, but her conventional success was likely undermined as much by her insistence on autonomy and her refusal to engage with the contemporary science of anxiety — unfortunate in view of her aptitude as a scientist — as by any prejudice among her medical colleagues against popularisers.

It is also possible to quibble with the book's claims about the magnitude of Weekes's contributions to the understanding and treatment of anxiety. Hoare is a passionate advocate for Weekes as a theorist of anxiety and as a therapeutic visionary, but the suggestion that she "cracked the code" of anxiety is hard to credit. Parallels can be drawn between elements of her thinking and later developments in the theory of anxiety, such as the ideas that panic (but not all anxiety) is grounded in fear of fear, that agoraphobia is not in fact fear of the agora but of the panic that tends to strike there, and that distinct neural systems are involved in clinical anxiety. Weekes's ideas of accepting and floating through anxiety also anticipate in some respects the embrace of acceptance and mindfulness by so-called "third-wave" cognitive-behavioural therapies in the early 2000s.

But the suggestion that Weekes was the first thinker to have these ideas, the most farsighted, or the one who grasped anxiety’s essence before others is exaggerated. Similar thoughts and therapeutic approaches and challenges to the Freudian orthodoxy on the nature and treatment of anxiety were being developed around the same time or before by several psychiatrists and psychologists. Many of them presented their ideas in professional forums and with greater scientific support, although they tended to use different vocabulary and lacked Weekes’s enviable capacity for public outreach.

Weekes and her proponents never seemed to grasp that the manifest impact of her books on thousands of readers, communicated in grateful letters by the bagful, did not provide the sort of validation her ideas required for mainstream acclaim. The nomination of Weekes for a Nobel Prize in Physiology or Medicine by her advocates in 1989, despite her having published no scientific work on her theories or on the efficacy of her treatments, has more than a note of pathos to it.

Ultimately, though, Nobel Prizes and scientific firsts are not the point. Weekes deserves our recognition not for making grand discoveries about the nature of anxiety. She deserves it for recognising the vast but often hidden suffering caused by “nerves,” for developing an accessible method for reducing it on a grand scale at a time when most treatment was one-to-one and ineffective, and for having the energy and determination to promote that method around the world.

It is impossible to quantify the human suffering that Weekes’s work has alleviated, but major awards and honours are routinely given for scientific discoveries that have surely had far less benefit. Contributions of this kind — high in influence but low in prestige, because “popular” — are often overlooked. In this fine book, Hoare has rescued the legacy of a great Australian from that fate. •

The second mountaineer https://insidestory.org.au/the-second-mountaineer/ Fri, 07 Jun 2019 04:10:07 +0000

Books | Conservative commentator David Brooks mightn’t be writing for everyone, but he’s traversing important terrain

The post The second mountaineer appeared first on Inside Story.

For almost four decades I have nursed a small grudge against Terry Eagleton. This lingering droplet of bile dates back to my first-year university English class, when I was made to pay an extortionate sum for an assigned text by the renowned Marxist literary critic that ran to a mere eighty pages. It was as if I had shelled out for an import album at my favourite record store and been handed a seven-inch single.

In the years that followed, I learned the hard way that authors don’t set book prices and academic writing is not a pathway to wealth. I came to realise that my dislike for Eagleton was immature and irrational. But rather than abandon it, I found grounds for renewal in one of his book reviews:

The cult of the Individual Life is… ultimately self-defeating. For one thing, most individual existences are routine and unremarkable… Biographies cannot help reminding us, in the very act of distilling the uniqueness of their subjects, of just what tediously generic creatures they are… even the most wayward of geniuses have to get themselves born and educated, fight with their parents, fall in love and die.

Eagleton’s disdain for individuality went against everything I believed as an aspiring psychologist. The ways we tell our life stories disclose our complex selves, and to flatten them into a few generic chapters seemed deeply anti-psychological. Our lives are, of course, moulded by our social and economic positions, but there is surely a degree of narrative uniqueness to the autobiographies we spin in our heads.

I was reminded of my pique when reading David Brooks’s The Second Mountain. On its face the book is a guide to living a morally enriched life, but it is inseparable from its author’s autobiography. Brooks, a conservative columnist at the New York Times and author of numerous widely read books, most recently The Social Animal and The Road to Character, will be known to Australian readers through his regular appearances as a political analyst alongside Mark Shields and Judy Woodruff on PBS NewsHour, broadcast locally by SBS. Five years ago, Brooks divorced his wife of twenty-eight years, began a relationship with a much younger researcher who worked for him, and underwent a spiritual transformation that prompted a partial conversion from his natal Judaism to Christian faith. When Brooks writes about the second mountain, it is clear that he is catching his breath from a recent climb.

For Brooks the second mountain is the one we must ascend after falling off the first. In his telling, the first mountain is the struggle for personal achievement and happiness, an individualistic quest for conventional success. The dark side of this struggle is loneliness, depression, a crisis of meaning, distrust, and the social and political tribalism that they breed. When people realise the bleak ecology of the first mountain or have their faith in it shaken by misfortune, they enter what Brooks describes biblically as a valley or wilderness. From there they can make a decision to follow a different path and climb the second mountain.

According to Brooks, this peak is higher than the first and has an entirely different moral system. Instead of “hyper-individualism” there is an ethos of belonging and community, which Brooks dubs “relationalism.” He concludes his book with a relationalist manifesto that lays out, in sixty-four numbered paragraphs, the tenets of this middle way between individualism and collectivism. On the second mountain people strive not only for social connection but also for permanent “moral joy” rather than mere transitory happiness. They aim to transcend rather than enhance themselves, and to offer unconditional care to those who are less fortunate than themselves. What unites them is a set of commitments — to a vocation, to a marriage, to a faith or belief system, and to a community — and the bulk of the book is dedicated to groups of chapters exploring these four commitments.


Brooks writes amiably, openly and without showiness, aside from a tendency to over-season his prose with quotes from the great and good, with a tilt towards Americans and religious figures. There is Lincoln and C.S. Lewis, King and Kierkegaard, Dillard and Dostoyevsky. Most chapters are relatively short and pithy, the main exception being a chapter within a section on faith that pivots to Brooks’s spiritual awakening and remarriage and is twice the length of any other. Here he describes an extended process of religious doubt and dalliance that ends with him committed to a form of Christianity while remaining culturally Jewish. The centrality of personal faith to Brooks’s journey up the second mountain becomes clear here, although it is perhaps less likely to feature in the journeys of most Australian readers. However, any suspicion that The Second Mountain is just another book for elevating solo souls is mistaken. Brooks writes thoughtfully on the dependence of personal development on positive institutions and communities — “second mountain institutions” — and his own efforts to build them.

F. Scott Fitzgerald famously wrote that there are no second acts in American lives. Indeed, he wrote it twice, once in an essay published in 1932 and once in his posthumously published novel The Last Tycoon. He probably didn't mean it either time, presenting it on the first occasion as something he had once thought. If anything, the idea of second acts is especially prominent in American ways of storying lives, with their frequent themes of relentless optimism, manifest personal destiny and reinvention. Exploring how everyday people compose these personal narratives has been the research focus of the eminent American psychologist Dan McAdams. In some respects an anti-Eagleton, McAdams has devoted his career to understanding how we create personal myths with distinctive character types, themes and trajectories, binding our past, present and imagined future into a coherent sense of personal identity.

In interviews with middle-aged, middle-class Americans, McAdams has found that many life stories conform to what he calls a “redemption narrative.” These narratives contain five common elements. The protagonist experiences early advantage in life, has a precocious awareness of suffering or injustice in the world, develops a strong moral compass, is driven to act selflessly for others, and passes through a period of personal adversity into a redemptive future. Although some might be inclined to view this formula cynically, McAdams has found that adults in midlife whose personal stories match its contours are especially committed to making a positive contribution to future generations, an aspiration the great but unfashionable psychoanalyst Erik Erikson referred to as “generativity.”

Brooks does not cite McAdams, but his first mountain–second mountain scheme has more than a whiff of redemption narrative about it. The second mountain of joyful belonging redeems the first mountain of individualistic striving. In this regard, The Second Mountain is an occasionally inspiring tale of personal transformation and hope that flows along a deeply carved narrative channel in American culture, saturated as it is with religiosity and optimism.

How well this narrative resonates with readers with different backgrounds is open to question. For some, perhaps especially women, personal transformation is less a movement from individualism to relationalism and more the reverse movement from social obligation and embeddedness towards liberation. For people outside the American milieu, this romantic narrative of onward-and-upward overcoming will feel foreign and the constant uplift hard to stomach. I suspect many Australians will relate better to a comic narrative in which the protagonist bounces through a series of ups and downs without an unironic sense of personal mission or heroism.

Most of all, I wonder how distant or illusory the second mountain might seem to young people and others whose economic vistas are less rosy than those of fifty-something professionals. Creeping casualisation and the gig economy are making it more difficult for people to reach the foothills of a first mountain of conventional career success, let alone ascend it. Brooks is correct in identifying individualism as one element of what is wrong with contemporary American life — the untrammelled rise of the market is a correlated element that he tends to neglect — but we might hope for ways to find social connection and community that don't require people to pursue self-seeking ambition for the first decades of adult life only to find it wanting in midlife.

It is no criticism of Brooks to observe that his book’s main target audience has a particular age, class and cultural profile. The average second mountaineer may be a well-off late boomer or an early gen Xer, but that is no reason to scoff at their aspirations to joy, community or selfless commitment to others. The world would not be worse off if more people embraced lives of service and altruism. But it is important to remember that some of us live socioeconomically closer to the mountains than others. Terry Eagleton would surely agree. •

The post The second mountaineer appeared first on Inside Story.
Fighting for face https://insidestory.org.au/fighting-for-face/ Wed, 13 Mar 2019 23:02:35 +0000 http://staging.insidestory.org.au/?p=53966

Books | What makes political leaders take their country to war?

The post Fighting for face appeared first on Inside Story.
A psychobiography published in 1997 diagnosed Richard Nixon as an “anal” personality. Vamik Volkan and his co-authors divined that the thirty-seventh president symbolically associated leaks from the White House with a loss of bowel control, and that he made the fateful decision not to destroy the incriminating Watergate tapes because he saw them as “anal gems that could not be given away.” Nixon would not have been impressed, having once dismissed psychobiography as “pure baloney.” To the convinced Freudian, Nixon’s choice of words might support Volkan’s hypothesis, sausage having been unmasked as a symbolically faecal food by the psychoanalytic writer Alan Dundes.

Politicians don’t lose their personalities when they are elevated to leadership positions, and it would be surprising if their idiosyncratic ways of thinking, feeling and acting did not influence their behaviour and decisions. The challenge is to make sense of that influence in a way that is reliable, fair-minded and nuanced. It is easy to carry out psychobiographical hitjobs, caricaturing political enemies with armchair diagnoses and abusing the interpretive licence of psychoanalytic ideas. It is equally easy to pretend that the psychology of political leaders is irrelevant, and that only policy, ideology and economic system matter. The former approach is reductive; the latter ignores the human element.

One way out of this dilemma is to understand the personality of political leaders in a more sober and rigorous manner than most psychobiographers have done. In this fascinating book, Princeton University political scientist Keren Yarhi-Milo shows how this can be achieved. Yarhi-Milo turns to the mainstream psychology of personality traits rather than to Freud, and the result is a compelling investigation of one trait that appears to have very real implications for the fortunes of American presidents from the cold war to the present. Who Fights for Reputation is a substantial but accessible work of scholarship that sets a very high standard for future studies of personality and political leadership.

The trait Yarhi-Milo puts front and centre is “self-monitoring,” a personality characteristic that was introduced to academic psychology two months after Nixon left office in disgrace. The trait captures differences in how people adapt their expressive behaviour to its context. People who are low in self-monitoring are disinclined to present themselves in different ways to different audiences, instead seeking congruence between their beliefs and their behaviour. These low self-monitors value consistency, steadfastness and being true to their (static) selves.

High self-monitors, in contrast, adjust their performances of self to the changing demands and expectations of their situation. Strategic impression managers, they value flexibility and fitting in. High self-monitors see low self-monitors as stolid stick-in-the-muds. Low self-monitors see high self-monitors as inauthentic, status-seeking chameleons. As the psychologist Bruno Bettelheim opined during a cameo in Zelig, Woody Allen’s mockumentary about an extreme human chameleon, “one could really think of him as the ultimate conformist.”

Yarhi-Milo uses the concept of self-monitoring to explain how political leaders, and American presidents in particular, respond to external threats. She argues for the pivotal importance of "reputation for resolve" in militarised crisis situations, where leaders must signal a willingness to fight if a red line is crossed and, if it is, must be prepared to fight so that their future signals will be taken seriously. Fighting for reputation in this manner might appear irrational in the immediate circumstances, but on a longer view it could be a prudent way of deterring future aggression. Similarly, deciding that preserving face is not a good reason to fight may be sensible in the short term but create a reputation for weakness that emboldens opponents.

Yarhi-Milo proposes that two factors interact to determine a leader’s willingness to use military force. One is the leader’s hawkishness: hawks who believe in the efficacy of military force are more likely to deploy it than sceptical doves. The other is the leader’s level of self-monitoring. Leaders who are high self-monitors — oriented to how they are perceived by others and concerned about image, face and status — are more likely to act militarily because they have a stronger desire to create a reputation for resolve. (There is an irony here that is unremarked by Yarhi-Milo: the leaders who are most exercised by the desire to appear steadfast are those who are the most mercurial.) The two factors interact because self-monitoring is especially influential among dovish leaders, who are more likely to be Democrats in the American political environment. As a result, high self-monitor doves may be more prone to fight than low self-monitor hawks. This argument has obvious relevance to the military adventures of charismatic liberal or social democratic leaders in recent history.


Yarhi-Milo proceeds to support her predictions using the methodological toolkit of the social sciences, notably a survey of the public, a statistical comparison of the eleven US presidents from Harry Truman to George W. Bush, and intensive historical case studies of how three of them — Jimmy Carter, Ronald Reagan and Bill Clinton — responded to foreign affairs crises. The extended statistical study is remarkable, and even readers allergic to p-values and negative binomial regression will find it compelling.

Yarhi-Milo recruited sixty-eight presidential historians to rate the self-monitoring levels of presidents they had studied closely, using the questionnaire normally used to assess the trait in psychological research. This is not putting historical figures on the couch so much as putting them to the (personality) test. Most presidents scored relatively high on the trait, with Carter, Ford and G.H.W. Bush relatively low and Kennedy, Johnson, Reagan and most of all Clinton especially high. The presidents’ hawkishness was assessed by a systematic analysis of the content of their foreign policy speeches, with Truman, Eisenhower and Ford adjudged high and Kennedy, Carter and G.H.W. Bush low. Nixon, perverse and pathological in the eyes of Volkan and colleagues, was rather average on both measures.

Strikingly, Yarhi-Milo's analysis reveals that high self-monitor presidents were twice as likely as their low self-monitor peers to initiate "militarised interstate disputes," or MIDs. This finding held equally for Republicans and Democrats: high self-monitors Reagan and Kennedy embarked on the most MIDs, and low self-monitors Truman and Nixon the fewest. These large effects persisted even after assorted historical, geopolitical and demographic factors were statistically controlled. In addition to initiating more MIDs, high self-monitor presidents were more likely to have them end favourably to the United States, implying that their aggressive resolve may have paid off.

The exhaustively researched case studies that close out Yarhi-Milo’s book put flesh on the statistical bones of her key findings. Low self-monitor dove Jimmy Carter’s travails in response to the Soviet invasion of Afghanistan are placed in the context of his stubborn reliance on principle over persuasion. The battling influences of his advisers Cyrus Vance and Zbigniew Brzezinski are also brought to life, the latter a high self-monitor who urged Carter to pick fights with foreign leaders to project an image of toughness. The military assertiveness of high self-monitor hawk Ronald Reagan, who announced that “our days of weakness are over,” is examined through the escalating conflict in Afghanistan and interventions in Lebanon and Grenada. High self-monitor dove Bill Clinton, of the “dangerous charisma” and the “you-are-the-only-person-in-the-world gaze,” is tracked through actions in Haiti, the Taiwan Strait and Somalia, where his concern for maintaining the American reputation for resolve motivated a continuing presence that became deeply unpopular on the home front.

Throughout this intriguing work, Yarhi-Milo rehabilitates personality as a respectable focus for the study of political leaders, showing that presidential personality matters to an unexpected and highly consequential degree. She not only demonstrates that the personality traits of powerful people are themselves powerful, but also clarifies the image-burnishing concerns through which one such trait has its effects. Nations are led into armed conflict not only by pragmatic calculations of cost and benefit but also by considerations of reputation, and presidents who are more inclined to “fight for face” are more likely to weigh these considerations heavily.

Yarhi-Milo’s dispositional account of presidential behaviour can seem one-note at times, perhaps over-reliant on a single personality trait that has lost some favour in its academic home of personality psychology. But it offers a model for systematic future studies of personality and leadership on the world stage. Whether traits such as self-monitoring have similar effects in political systems where the power of leaders is more trammelled, and whether they predict leaders’ behaviour in domestic contexts as well as in foreign military conflicts remain to be examined. All the same, this illuminating book shows the wisdom of the simple but sometimes forgotten fact that, as Yarhi-Milo reminds us, “leaders are, at the end of the day, humans.” •

University challenge https://insidestory.org.au/university-challenge/ Sat, 20 Oct 2018 22:40:29 +0000 http://staging.insidestory.org.au/?p=51443

Books | Is the heightened tension on American campuses evidence of more psychologically vulnerable students?

The post University challenge appeared first on Inside Story.
The authors of this important book are apologetic about its title. When they wrote the 2015 Atlantic article from which it grew, they proposed “Arguing Towards Misery: How Campuses Teach Cognitive Distortions,” but an editor with greater market awareness fastened on “coddling.” It’s a word that originally evoked warmth and comfort but whose current connotations are sure to stir heated argument. The article was widely read and highly controversial; Barack Obama referenced it approvingly in a speech. Three years later, “coddling” is still there in the title.

You can see why Greg Lukianoff, a free-speech lawyer, and Jon Haidt, a social psychologist, would be uneasy about the word. To refer to coddling when writing about today’s college students might seem to imply criticism of the students themselves — another instance of the intergenerational sniping about spoiled youth that has been with us since Socrates complained that “the children now love luxury… they contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannise their teachers.” But Lukianoff and Haidt primarily target academics, parents and the institutions they have prepared for the current generation of students. Theirs is not another attack on millennial “snowflakes”; it is more an attempt to tease apart the societal and cultural changes that have created young people who believe they are fragile and who have decided that a particular form of “vindictive protectiveness” is the armour they need.

Lukianoff and Haidt may have been queasy about their book's title for another reason. It deliberately echoes an earlier critique of campus culture, Allan Bloom's much-debated but little-read The Closing of the American Mind, published in 1987. Bloom's jeremiad took aim at relativist professors, political correctness, the coarseness and shallowness of youth culture and the decline of great books and classical music. A stern, moralising text, it was full of declinism and affirmations of the besieged Western tradition. Although some readers will reflexively slot this new book into Bloom's mould, it is not fundamentally conservative, though it is certainly critical of some contemporary college mores.

Lukianoff and Haidt are both avowed Democrats, one a self-declared liberal and the other a centrist, and their target is not liberalism so much as a new illiberalism they identify in the campus left. On the surface, not much has changed since Closing became Coddling: we still have humanities departments thick with critical social theorists — fewer "tenured radicals" only because there is less tenure — and we still have the bitter arguments about speech codes and activist teaching. What has changed, Lukianoff and Haidt argue, are the grounds that students appeal to when expressing disapproval of offending views, and their preferred means of dealing with the offenders.

In the interests of full disclosure, I should mention that I first met Jon Haidt in 1987, not long after Bloom’s book thundered onto the shelves, when we both entered graduate school in psychology at the University of Pennsylvania. At the time Jon was an avid liberal who swiftly enlisted me to help out on Michael Dukakis’s doomed presidential campaign against Bush the first. Jon and I shared an office for several years while he conducted his soon-to-be-famous work on the psychology of disgust and the emotional grounding of moral judgement. Since that time, he has written two bestselling books, The Happiness Hypothesis and The Righteous Mind, shaped the fields of moral and political psychology, and founded Heterodox Academy, an organisation that promotes political diversity on campus. Little did I know back then that I would be name-dropping him three decades later.

The problems that Lukianoff and Haidt wish to explain will be well known to anyone who has followed American universities for the past half-decade. At the extreme end are violent protests in response to challenging ideas. These include a student takeover of Evergreen State College in reaction to a biology academic who refused to leave the campus on a day white students and staff were asked to leave; riots in Berkeley occasioned by a speech by troll-provocateur Milo Yiannopoulos; a physical assault on an academic escorting The Bell Curve co-author Charles Murray from a speaking engagement at Middlebury College; and shocking denunciations of other academics at Yale and Claremont McKenna College whose well-meaning speech gave offence.

Often these events occurred during attempts to prevent invited speakers from speaking, a phenomenon that Lukianoff and Haidt show to have been ideologically balanced in previous decades but is now predominantly carried out by the left. Less extreme but much more common phenomena are the spread of what Lukianoff and Haidt take to be dangerous ways of accommodating student sensitivities, such as the designation of “safe spaces” where they can go to be away from undesired experiences or people, the use of “trigger warnings” to announce that potentially discomfiting material is about to be presented in class, and the rise of training programs to identify “microaggressions,” subtle expressions of supposed prejudice — such as asking an Asian student where she is from, or declaring America to be the land of opportunity — and call out the perpetrators.


Such are the symptoms of the new campus disorder, but what is the underlying pathology? According to Lukianoff and Haidt it is the rising prominence of three “Great Untruths.” There is the Untruth of Fragility (“what doesn’t kill you makes you weaker”), the Untruth of Emotional Reasoning (“always trust your feelings”) and the Untruth of Us versus Them (“life is a battle between good people and evil people”).

Before turning to its analysis of the social trends that have generated these damaging beliefs, the book unpicks each of them. The widespread belief in fragility is ascribed to a rising culture of “safetyism” that overvalues comfort, inflates risk and demands protection from bothersome ideas. A better belief, say the authors, is anti-fragility: just as muscles need resistance to grow, personal development requires challenge and difficulty rather than softness and enablement. The excessive reliance on subjective feelings reflects a new focus on emotional impact rather than intention when apportioning blame: it is enough that someone feels offended or traumatised to punish the perpetrator, whether or not they meant harm. The belief in a world divided between good and evil people is driven by theories of power and privilege that license dichotomous thinking about victims and oppressors; it is also associated with a tribal form of identity politics and the apparent embrace of virtuous victimhood that sometimes accompanies it. Lukianoff and Haidt are not opposed to identity-based politics in principle but take issue with forms that undermine a sense of common ground and humanity across group boundaries.

Running through these three untruths is a conviction that today’s students are thinking about familiar concepts in unfamiliar ways. Safety was once understood as protection from physical harm but is now invoked in relation to harmful ideas or emotions. Trauma used to refer to life-threatening adversities but is now used to describe encounters with offending words. Speech can now be violence and right-wing ideologies that were once seen as extreme are now redefined as permeating the political spectrum. In the words of one student chant reported in the book, “liberalism is white supremacy.”

Here the authors cite my own work on “concept creep,” which documents how definitions of harm-related ideas in the social sciences — for example, bullying, prejudice and mental illness — have steadily expanded to include a progressively wider range of experiences and actions. As the concepts inflate, they identify more and more experiences as harmful and more and more people as harmed or harming. Lukianoff and Haidt suggest that emotional fragility, efforts to exclude controversial guest speakers and a readiness to take fierce offence at clumsy turns of phrase might all ultimately result from creeping concepts. Expansive definitions of harm may undermine not only personal resilience but also interpersonal civility.

The bulk of the book examines social trends that may have contributed to the current fractious state of American colleges. Any satisfactory explanation of the coddling phenomenon must reckon with its relatively sudden appearance, but most of the six contributing factors that Lukianoff and Haidt identify cannot account for that timing. One is rising political polarisation in the United States, another is the rise of student-centred college bureaucracies with their well-intended behaviour codes and awareness-raising social programs, and a third is the growing tendency for an increasingly liberal professoriate to present unequal social outcomes as direct evidence of injustice and prejudice.

A more surprising pair of factors addressed at length are “paranoid parenting” and the decline of free play among school-aged children. The former is driven by exaggerated parental concerns about threats, the resulting over-protection and “helicoptering” impeding the development of independence. The latter partly reflects a parenting philosophy of talent cultivation, supported by an excessive focus on skill development in schools. Students become over-scheduled, often in the service of a “résumé arms race” to enhance applications to the best colleges.

But the factor that gets the most airplay, and the only one that can perhaps account for the timing of the changes on campus, is the use of smartphones. Relying on the work of psychologist Jean Twenge, Lukianoff and Haidt suggest that social media immersion and excessive screen time from a young age amplify common adolescent concerns surrounding peer exclusion and body image. One outcome is increasing rates of anxiety, depression, self-harm and suicide among young people, especially, the authors claim contentiously, since about 2010. The heightened tensions on campus from around 2013 might be manifestations of a more psychologically vulnerable student body, especially one jostled by the political turbulence of Trump, Black Lives Matter, Charlottesville and now #MeToo.

Lukianoff and Haidt offer several prescriptions to treat the pathology of coddling. Some are directed to parents and school systems. Children should become more free-range, with less adult supervision. Parents should encourage greater autonomy (but limit screen time), show how to engage in respectful disagreement and how to be charitable in dealing with opponents, and challenge distorted emotional reasoning. Schools should give more time to play and less to homework and the single-minded pursuit of academic success. Universities should demonstrate a genuine commitment to freedom of speech and inquiry and allow protest only when it does not prevent unpopular views from being heard. They should reject the great untruths and the encroachment of safetyism, and actively support viewpoint diversity and civil debate.

Some of these proposals are idealistic, swimming against the rip-tide of ongoing societal changes, but many are refreshingly concrete and actionable.


A key question for Australian readers is whether the book’s arguments are germane to our young people, our polity and our universities. The answer is mixed, but perhaps more yes for our youth and no for our institutions of government and higher learning. It is unquestionably true that many of the cultural trends that Lukianoff and Haidt observe are global rather than uniquely North American. Our children are less physically active than their parents were, spend much more time transfixed by screens, and seem to be afflicted with higher rates of depression and anxiety, although some of the alarmist figures drawn from local surveys are unreliable. Social media is just as much a preoccupation here as in the United States, and online mobbing is no stranger to our digital shores. Indeed, some of the societal changes that feed the generational predicament that Lukianoff and Haidt document are deeply familiar.

In other respects, however, their work resonates less powerfully. Our politics may be rather dire, but they rarely reach the Manichean levels of polarisation that have become entrenched in Trump's America. Our racial divides are not as inflamed. Our campuses witness the occasional protest when a speaker presents a contrary view on a topic du jour, but these events have yet to spark the anarchic violence of Berkeley or Evergreen. Political diversity among Australian academics is probably similar to that among our American peers, but our ideological differences rarely become a focus of public conflict. Trigger warnings have not caught on widely among lecturers and "microaggression" has not entered most students' vocabularies.

Meanwhile, the one-dimensionally academic basis for selection into most Australian university courses arguably works against over-involved parenting. Many American parents engage in fevered curation of extracurricular activities to give their children the best shot at the most prestigious colleges, which want evidence of sporting prowess and civic-mindedness as well as sky-high grades. And for Australian students, who generally commute to university from the suburbs and work off campus, higher education is a less total and encompassing experience than it is for students living at America’s elite colleges, where most of the well-publicised campus conflicts have taken place. In a less hothouse environment, the more disturbing dynamics that Lukianoff and Haidt document are perhaps less likely to flower.

And yet we shouldn’t be surprised if the sorts of campus conflict that motivated this book emerge here, perhaps as suddenly as they did in the United States. American trends have a way of becoming ours at a lag. British universities are experiencing their own version of safetyism and the no-platforming of unpopular speech. Campus culture is showing signs of change, especially around the salutary goal of increasing respect and reducing harassment and other forms of maltreatment. Lukianoff and Haidt would not object to that goal, but their book reminds us that some of the means taken to achieve it have a way of transforming into something darker. •

Not my type https://insidestory.org.au/not-my-type/ Mon, 08 Oct 2018 00:47:27 +0000 http://staging.insidestory.org.au/?p=51239

What explains the curious persistence of the Myers–Briggs personality test?

The post Not my type appeared first on Inside Story.
Standing at the end of a line, pressed up against the glass wall of a well-appointed meeting room, I asked myself the rueful question that all personality psychologists have posed at least once: why is the Myers–Briggs Type Indicator so damned popular? The smart, charismatic consultant facilitating this leadership course had given the questionnaire to his class and instructed us to line up according to our scores on extraversion–introversion. Far to my right on this spectrum of perkiness stood a colleague with a double-espresso personality; down this end, with no one to my left, I was decidedly decaf.

Let me get off my chest what’s wrong with the Myers–Briggs, or MBTI as it is known in the acronymphomaniac world of personality testing. The MBTI classifies people according to four binary distinctions: whether they are extraverts or introverts, intuitive or sensing types, thinkers or feelers, and judges or perceivers. Three of these distinctions rest on an archaic theory of personality typing proposed by Carl Jung, and the fourth was invented and grafted on by the test’s developers.

The four distinctions bear little relation to what decades of systematic research have taught us about the structure of personality. They are smeared unevenly over four of the five dimensions that most contemporary personality psychologists accept as fundamental, and completely ignore a fifth, which is associated with the tendency to experience negative emotions. The same effort to erase the dark side of personality is evident in the MBTI’s use of sanitising labels to obscure the negative aspects of its four distinctions. In large measure, being a thinking type amounts to being interpersonally disagreeable, and being a perceiving type to being impulsive and lacking in persistence. But in MBTI-world, all personality types are sunnily positive, a catalogue of our “differing gifts.”

The MBTI doesn’t only misrepresent the content of personality. It also gets the nature of personality fundamentally wrong. Despite masses of scientific evidence that human personality is not composed of types, its four distinctions are understood as crisp dichotomies that combine to yield sixteen discrete personality “types,” each with a four-letter acronym such as INTJ or ESFP. In reality, personality varies by degrees along a set of continuous dimensions, just like height, weight or blood pressure. In the face of mountains of research demonstrating that personality is malleable throughout the lifespan, proponents of the MBTI also argue that one’s type is inborn and unchanging. In short, the MBTI presents personality as a fixed essence whereas the science of personality shows it to be a continuous flux.

The MBTI also fails to meet the standard statistical requirements of psychological tests. Its items employ a problematic forced-choice format that requires people to decide which of two statements describes them better. Its scales lack coherence. The typology lacks test–retest reliability, which means that people are commonly scored as having different types when they complete the measure on two separate occasions. Evidence that MBTI type correlates with real-world behaviour — known as predictive validity in the trade — is scant.

So why is a test with weak psychometric credentials, based on a musty theory of personality that gets the structure of human personality wrong, so enduringly popular? Arguably its weaknesses from a scientific standpoint are precisely what give it its appeal. Personality may not really form discrete types, but people relish the clarity of noun categories and binary oppositions. Personality may not really come in sixteen flavours, but MBTI types are sweet simplifications. Personality may be mutable, but people find reassurance in the idea that they have an unchanging true self. And the average person could not give two hoots about the statistical considerations that trouble test developers.

What matters to most people, at least those who complete the MBTI as an exercise in self-understanding rather than a compulsory workplace activity, is whether it offers accessible and palatable insight. And the MBTI undoubtedly provides that in spades. Its four-letter codes are readily grasped, its descriptions flatter our strengths, and the fact that its four distinctions bear some relationship to fundamental personality traits ensures that it offers a certain truthiness.


Although the shortcomings of the MBTI have been discussed within academic psychology for decades, a historical analysis has been lacking. Merve Emre’s fascinating new book, What’s Your Type? The Strange History of Myers–Briggs and the Birth of Personality Testing, fills that gap stylishly. Emre, a literature academic at Oxford, documents the genesis of the MBTI in the Jungian enthusiasms of Katharine Briggs and the more worldly ambitions of her daughter, Isabel Briggs Myers. Despite the subtitle’s questionable reference to the “birth” of personality testing — the first test dates back almost another thirty years to the first world war — the book’s recounting of the origins of the instrument is colourful and revealing.

Katharine Briggs emerges as someone single-mindedly devoted to making sense of human individuality and using that sense to guide people in directions to which she believed them suited. As a young mother without training in psychology, she developed a system of personality typing that she used in an informal child guidance, or “baby training,” enterprise, later finding a resonance between her ideas and those expressed in Carl Jung’s Psychological Types, which was published in 1921. Jung became Katharine’s “personal God”: at one point she wrote a hymn to him (“Upward, upward, from primal scum / Individuation / Is our destination / Hoch, Heil, Hail to Dr Jung!”). Encouraged by her correspondence with the great man, and armed with 3ʺ x 5ʺ index cards, Katharine refined her classification system and compulsively typed everyone she encountered, from neighbourhood children to Adolf Hitler.

Katharine’s daughter Isabel Briggs Myers had a more pragmatic cast of mind but inherited her mother’s absorption in types. After writing two mystery novels, she developed an early version of the MBTI while working for America’s first corporate personality consultant in 1943. Soon after, she launched it as a small commercial proposition. In the late 1950s the questionnaire was picked up by the Educational Testing Service, an eminent test developer and publisher in Princeton, New Jersey, giving it a chance at mainstream success and respectability. ETS staff, who saw the instrument as “little better than a horoscope,” insisted on subjecting it to the same validation research as any other test, but Isabel remained resistant and possessive; after endless wrangling between her and the staff psychometricians, the ETS lost interest and cut its losses. Eventually a new publisher released the MBTI as a self-scored test and it quickly became a staple of the US$2 billion personality assessment industry, especially beloved by personnel consultants.

As history goes, Emre’s book is compelling and well paced. It presents Katharine and Isabel as rounded characters and places them in a richly drawn cultural and historical context. But as an account of personality testing more generally, the book is flawed. Despite having chronicled the many ways in which the MBTI was a cuckoo in the nest of personality psychology — the product of obsessed amateurs, disparaged by the psychometric orthodoxy at the ETS, popularised rather than professionalised — Emre sees it as emblematic. An emblem it is not. Unlike most other major tests, its use is not restricted to trained professionals and its legacy is protected by an almost cultish organisation that forbade Emre access to most of the Briggs–Myers papers, despite their officially being open to the public. Unlike other tests, the MBTI doesn’t promote itself by appeal to a validating body of scientific evidence. To treat the MBTI as representative of contemporary personality testing is like presenting the primal scream as representative of modern psychotherapy.

Emre is on more solid ground when she describes the functions of workforce personality testing, using the MBTI as an example. Its key purpose in that domain — only one of several in which it is used, it must be said — is indeed to select people who are likely to perform better than others in particular lines of work. Ideally that rationale is backed by evidence that the tests are valid predictors of workplace performance. Whether this purpose is benign or sinister is open to debate. It can be viewed positively as the legitimate application of behavioural science to enhance the wellbeing of workers and the success of organisations, or negatively as a dystopian tool for creating human cogs for the corporate machine.

Emre favours the darker interpretation, writing that personality typing “conscripts people into bureaucratic hierarchies.” This charge is hyperbolic: even if one is critical of the use of the MBTI or other testing, it does not force people into any position against their will, it is not employed exclusively in bureaucratic organisations, and it is used at least as much to differentiate people horizontally according to their strengths as it is to stratify them in hierarchies. The very same charge could be made against any other approach to selecting or assigning people to organisational roles, including interviews, hiring quotas or old boy networks.

The key question has to be whether personality testing selects and assigns people to work roles in ways that are better or worse than its alternatives: whether it is fairer and more valid, efficient or desirable than some other preferred metric. Unless there are grounds for believing that personality tests are worse than these alternatives, to criticise them for conscripting people into bureaucratic hierarchies is merely to express hostility to bureaucratic hierarchies.

Emre also struggles to form a consistent view when she discusses personality testing’s relationship to individuality. At times she presents the MBTI as a tool that promotes individualism by claiming to clarify each person’s specialised strengths and aid in their quest for self-discovery. At others she describes it in over-heated terms as “liquidating” or “annihilating” the self, as if a questionnaire had the capacity to destroy the person’s uniqueness. Here she cites the work of German social theorist Theodor Adorno, fierce critic of commodification (and jazz), who proclaimed that personality tests undermine human individuality.

Emre never quite resolves these antithetical views, but the paradox is only apparent. Receiving a score on a personality test, or even being assigned to an MBTI “type,” does not submerge individuality. It simply provides it with a partial description that other people may share. Being described as brunette, overweight, liberal or a typical Taurus does not undermine a person’s selfhood but merely qualifies it, and the same is true when someone is described as being an ENTP. MBTI types, for all their conceptual failings, don’t reduce personal identity to one of sixteen psychological clones. They simply offer people a language for capturing some aspects of their personal distinctiveness.

In passing, Adorno’s critique of the “reified consciousness” involved in personality testing has a certain irony to it. In one of his books he recalled being asked by an American colleague whether he was an extravert or an introvert, writing contemptuously that “it was as if she, as a living being, already thought according to the model of multiple-choice questionnaires.” A few years later, while conducting his influential studies of authoritarianism, Adorno proceeded to create his own multiple-choice personality questionnaire.

Another confusion arises in Emre’s discussion of personality typology. Remembering the horrors of the Holocaust, Adorno rightly condemned the practice of assigning people to categorical types. This is a legitimate criticism of the MBTI, whose proponents view personality types as discrete and unchanging facts of nature. (Emre writes that Isabel Briggs Myers was astonished to find that scores on the MBTI’s scales were distributed in a bell curve, not in the camel-humped way that type theory supposed.) Emre notes this criticism of typology but then mistakenly applies it to personality testing in general. In contrast to the MBTI, almost all personality tests are explicitly anti-typological. These tests assess differences between people along a continuum without invoking bogus categories, and they do not make ill-founded claims that their scores correspond to unchanging personal essences. By failing to recognise that typological thinking is a specific failing of the MBTI, Emre misses the extent to which major criticisms of that instrument do not tarnish personality testing as a whole.

To serious students of personality, the continuing success of the MBTI within the testing industry is a source of bafflement. Emre’s book does not diminish that dismay, but it helps to clarify why the instrument is the way it is. Despite its unpromising beginnings, she demonstrates that it has a powerful appeal, offering an intuitively attractive way to apprehend ourselves as a pattern of distinctive strengths. In Emre’s preferred Foucauldian terminology, the MBTI is an effective “technology of the self.” The fact that it is a rather Bronze Age technology is almost immaterial. •

What’s Your Type? The Strange History of Myers–Briggs and the Birth of Personality Testing
By Merve Emre | HarperCollins | $32.99 | 336 pages

The post Not my type appeared first on Inside Story.

Love thine enemy https://insidestory.org.au/love-thine-enemy/ Thu, 16 Aug 2018 02:44:30 +0000

What happens when you meet the person you’ve done battle with online?

When I told friends whom I was meeting for a drink, they all asked the same question: “Why the fuck would you do that?” They were right to be sceptical. I was about to break bread with a woman who had publicly described me as a “social justice warrior straight out of central casting.” She routinely dismisses people with views like mine as “frightbats,” “feminazis,” and “comically deluded, fringe-dwelling, virtue-signalling lefties.” I was nervous.

Rita Panahi is an outspoken right-wing columnist for Melbourne’s Herald Sun newspaper and a social commentator on Sky News, 3AW and Channel 7’s Sunrise. She has created a strong brand as a conservative provocateur, with some tipping her as heir apparent to Andrew Bolt, her hard-right tabloid colleague. With more than 150,000 followers on Twitter and Facebook, she is fast becoming one of Australia’s most vocal media personalities. She prides herself on “telling it like it is.” Over the years, she and I have had many frank and robust Twitter debates. Her scorched-earth approach to public discourse has infuriated me. There is almost nothing we have agreed on.

Her background could not be more different from mine: born in Arkansas in the United States, where her Iranian father was studying to become an engineer, she moved back to Tehran with her family at the age of three. In 1984, after the revolution that led to the installation of Ayatollah Khomeini’s repressive regime, the family was accepted into Australia as refugees and moved to Melbourne when she was eight years old.

It is her memories of life under the Ayatollah — when she and her schoolmates were made to chant “Death to America” before each class — that helped inform a deep contempt for people who apologise for radical Islam, a topic she writes about with great frequency and fervour. She believes that political correctness is largely to blame for the Western world’s failure to combat Islamic terrorism.

Another passion is her intense displeasure with the latest generation of feminists, who she says obsess about trivial or imagined offences while “ignoring the persecution of their sisters in the name of Islam.” In a column to mark International Women’s Day, she wrote, “It’s clear the feminist movement has been hijacked by the regressive left with a level of hypocrisy and inanity unmatched in public discourse; just when you think these dolts have hit rock bottom, they find shovels and start digging.”

When I hit my own rock bottom — a breakdown that saw me crippled by anxiety and depression, leaving me unable to work for almost five months — I began to realise Rita was one of my red flags. Arguing with her was not productive; it was a unique form of self-harm. I’d see her tweeting about “SJWs” (social justice warriors) and “regressive left-wing flogs” and would be compelled to respond. It felt personal. I’d take to my keyboard and outline how wrong she was, mocking her arguments to the cheers of my followers. I could lose hours locked in combat with this woman I’d never met.

As I furiously tried to make her see reason, my stress levels skyrocketed. Our followers would join in, taking sides, which only added to the sense of gladiatorial theatre. I had to have the last word. I had to land the killer blow. I simply had to be right. Sometimes the exchanges would stay with me for days.

When I took a break from social media to rebuild myself, I could see that in these moments I was not in control. The ship was being steered by the child part of me that was desperate to be applauded by the crowd. The more anxious I became, the more I took to Twitter for reassurance that I had value. Perhaps I was even a bit jealous of Rita’s growing profile, rocketing me all the way back to the popularity contest of high school. Being seen and heard as I took her on would somehow give me autonomy in a situation where I felt powerless. But the anger was toxic. When you pick up a hot coal to throw at your enemy, you’re left only with burnt hands. So often, anger is externalised shame.

When I returned to Twitter, I thought of what Professor James Doty — founder of Stanford University’s Center for Compassion and Altruism Research and Education — had said about the benefits of showing compassion for the “other” and trying to focus on commonalities rather than differences. I still disagreed with most of what Rita said, but I could see our shared interests. We are the same age. We both love cats; we have a passion for the Hawthorn Football Club; and we share an enduring romance with the Italian Riviera. She has a wry sense of humour that is similar to mine, and on some issues, including abortion and gay rights, she is quite progressive. She is also a single mother, and I admired her independence.

The more I practised compassion, for myself and others, the less I felt like fighting. I didn’t imagine we would ever be firm friends, but I wondered if I could learn more about the world, and the people who view it differently from me, by getting to know Rita. And maybe I’d discover that, like me, her online persona was being driven by invisible battles.

I approached her on Twitter, and she accepted my invitation to catch up. When the moment came to finally meet, I was apprehensive. What if she didn’t show? What if she wrote about the encounter in one of her columns? I could just imagine the headline: “My Secret Meeting with a Feral Flog from the Loony Left.”


We met at a bar in Southbank, next to the Herald Sun offices, one evening after work, just as the sun was setting behind the Yarra. I was surprised how quickly I felt comfortable in her company. The conversation flowed easily. She is petite and energetic and has a warm, infectious giggle that makes her seem younger than her years.

I felt secure enough to tell her that in an angrier moment during one of our many Twitter battles I had fantasised about the two of us presenting a left vs right political podcast. I even had a working title: Educating Rita. She laughed, but seemed genuinely surprised that I had given so much thought to our online interactions. In her view, they had never been fights. The anger I felt was not an emotion she shared, although it is something people frequently bring up when they encounter her online.

“I’m often surprised when people say, ‘You’re so angry,’ because I’m never angry on Twitter. Maybe it comes off that way because unless you’re using a smiley face people can misread things,” she said. “That anger accusation always comes from the left, who, to be honest, I perceive as angry, and I’m thinking, are you projecting? Or is it just because my tone doesn’t convey the mocking or the sarcasm?”

As we sipped our drinks — pinot grigio for me, port for her — I suggested that people may perceive her as angry because of the language she uses. Among her favoured terms for those with whom she disagrees are “irrelevant bile-filled dolt,” “moronic nutbag,” “self-loathing broken,” and “deranged leftie loon.”

She said I was mistaking mockery for anger. “I just think some people deserve to be mocked. They are ridiculous, and trying to engage them… One, Twitter isn’t the forum for serious discussion, and two, they’re not worthy of that sort of discussion, they’re worthy of being mocked and dismissed. They’ve said something that is insane or absurd and you’re mocking that particular viewpoint.”

Mockery is an effective way to silence critics. In the kneejerk world of Twitter, it’s a weapon used with blunt force. And it can be vicious. Early last year, I found myself in the unfamiliar position of defending a right-wing millennial commentator. I didn’t know much about Daisy Cousens, other than that she was a writer for conservative publications including Quadrant and the Spectator, and seemed to be a self-styled alt-right flamethrower who prided herself on “triggering snowflakes.”

She had recently praised Donald Trump on Q&A, saying the president was like the “weird relative” who comes over once a year at Christmas and occasionally says embarrassing things but “ultimately you forgive him because he is nice and gives you the best presents.” So yeah, she was an easy target for parody. Her crime on this particular evening was to publish a breathless, first-person account in the Spectator of her one and only meeting with the controversial and recently deceased News Corp cartoonist Bill Leak, in a piece that read as part eulogy, part erotic fan fiction. She described “purring with satisfaction” after appearing on Sky News’s The Bolt Report, and praised Leak as a “gentleman, whose handsome face and unstudied smile left me strangely weak.”

I read it open-mouthed and cringing, wondering if it was satire. The late ABC veteran journalist Mark Colvin described it on Twitter as an “astonishingly bad piece of writing.” That was about as polite as it got. Within minutes, Cousens was a national laughing-stock. High-profile figures played to the gallery of their large followings, dredging up everything she’d ever written and eviscerating it word by word. She was ridiculed in cruel and inventive ways, described as “a cyst that had been left out in the sun,” a “thing,” and a “puffed-up nobody.” Her article was derided as a “bug-eyed, slack-jawed piece of teen romance tripe,” “thigh-slappingly, teeth-grindingly awful,” and “Enid Blyton meets 50 Shades of something that should have been burned on a pyre.”

Women on the internet have copped far worse. But there was something about the gleeful, almost tribal way people rounded on Cousens that made me wonder what the hell we’re all doing in these moments. It was Lord of the Flies meets Mean Girls. People went trawling for old pictures of her, critiquing her looks, her youth and her apparently vapid personality. Memes were created, satirical poems written in her honour. Someone dug up a column she’d written for an obscure overseas website in which she claimed to have turned down sex with a famous athlete. A frenzy ensued as a bunch of women — many of them staunch feminists, who themselves have been the target of vicious online trolling — tore apart the piece, joked about who the sports star could be, or declared that the incident simply never happened.

As I watched it unfold I tweeted: “The Daisy Cousens pile-on is one of the harshest pile-ons I’ve seen in a while. It was a weird piece, but mob humiliation is pretty mean.” And so ensued a furious Twitter debate among my followers on whether Cousens should be defended or if in fact she deserved the ridicule because she’d set herself up as a right-wing provocateur and was therefore asking for it. People told me I’d backed the wrong side, implying I was a traitor to the left for not joining in the public shaming. I added, “Eviscerating a young conservative columnist who wrote something a bit silly is not witty or progressive. It’s just bullying. Maybe don’t.” After several hours I went to bed, realising I’d only thrown kerosene on the flames. In the morning, I logged on and it was still going. People were incensed that I didn’t share their fury.

Rita, who is herself a regular target for trolling and abuse, told me she “couldn’t give a fuck” about the flak she cops, and has grown a thick skin. But I wondered if she ever felt a degree of empathy for the people she disagrees with online?

“Sometimes I think they’re genuinely not well,” she said with a laugh. “I’ll think, there is something much greater wrong with you than your opinion on this particular topic, and your reactions aren’t normal. Quite often then you worry about how they’re reacting. You might be thinking this is a hilarious little exchange, but they might actually be in some sort of a shame spiral. You just don’t want to be adding to someone’s pain or discomfort.”

She has, at times, felt guilty for retweeting ridicule or getting involved in a Twitter pile-on. When I told her about my mental-health battles, she was sympathetic and kind. It is not an attitude she extends to everyone. “It’s hard to have sympathy for people who are genuinely unpleasant. Or their persona seems to be. I’m sure if you met them you’d probably get a very different impression from what you see.”

Which brought me to why we were there. I was curious as to how she viewed me. Online, she uses a broad brush to dismiss everyone on the left side of politics as “idiotic.” What did she hope to get out of meeting with this particular idiot?

“I’ve always thought you seemed like someone I’d be very happy to have a drink with or go to a footy game with or whatever. I’ve never seen you as the enemy,” she said. “It’s unhealthy only to have people around you who agree with you because then there’s not much boisterous discussion or diversity of thought. We’re not at war — this is Australia. It’s not like parts of the world where you literally can’t talk to people you disagree with because it ends in gunfire.”

She explained that she differentiates between the “ordinary left” and the “lunatic left,” and went on to describe my place on this complex continuum. “You’re somewhere between ordinary left and social-justice-warrior left. I think lunatic left is where you don’t want to be. Regressive left is very close to it, then you’ve got your social justice warriors and your ordinary left. So you’re in dangerous territory, you’re getting close to the regressives.”

When someone is essentially calling you intellectually backward, it’s hard to keep an open mind to their broader point. But I listened intently and found that despite the slightly obtuse way her points were being delivered, there remained in her views some areas of common ground with mine. I couldn’t disagree with her belief that the mainstream media is out of touch with the average punter, given Brexit, Trump and Hanson came as a complete shock to almost every journalist who covered them.

Whether it’s coming from the right or the left, when we shut down debate, try to ban voices we don’t like, and label people with opposing views as stupid or inherently bad, we only risk further alienating communities that already feel ignored. “If you actually think that someone is evil then it gives you licence to not only dismiss their opinions but dismiss them as human beings,” Rita said.

This is the heart of the problem in our divided, digital age. When we separate ourselves into opposing camps and condemn the “enemy” as fundamentally bad, we dehumanise the target, making it easier to join the social media pile-on.

Nowhere was this more apparent than when Yassmin Abdel-Magied, a Sudanese-born, twenty-six-year-old Australian mechanical engineer, author and ABC radio presenter, found herself front-page news in 2017 after a seven-word Facebook post on Anzac Day, urging people to remember those in Australia’s offshore detention centres who had fled conflict, went viral. Social media, Liberal MPs and the News Corp papers went into overdrive, calling for her to be sacked, punished or deported, in a frenzied orgy of bloodletting that was revolting in its ugliness.

In the following three months, some 90,000 words would be written about Abdel-Magied in the Murdoch press, with the febrile reaction prompting daily death threats, rape threats, and harassment against a young Muslim woman whose greatest crime was to ask that people pause on a day of remembrance for our war dead to reflect on the plight of refugees fleeing war-torn countries. The abuse proved so extreme that she later announced she was moving to London, writing in the Guardian that she had become the most hated Muslim in Australia. In response, conservative commentator Prue MacSween announced on Sydney’s 2GB radio that she was “tempted to run her over.”

Leaving aside the transparent racism and Islamophobia that underlies the whole sorry incident, there is something deeply troubling about the anger running through these public shamings. As Abdel-Magied pointed out, the visceral nature of the fury weakens us all. It’s a raw, naked rage that divides us into warring tribes, constantly scanning the surrounding landscape for someone to blame.

Rita upset some of her fellow conservatives during the Abdel-Magied furore by not calling for her to be sacked. She told me that in a democracy it is fundamentally unhealthy to silence opinions we don’t like. In that regard, I share her concerns at the way some on the left, in a bid to be inclusive and intersectional, have driven a burgeoning “no platform” movement, banning speakers they don’t agree with from university campuses or glossing over the tensions between radical Islam and women’s or LGBTIQ rights. In some Muslim countries, women are being stoned to death for having sex outside of marriage and gay men are thrown from buildings.

Rita’s opinions might make some on the left uncomfortable, but as a woman of colour and a former refugee who has lived under an oppressive Islamic regime, she has a perspective that can’t be ignored just because it doesn’t fit the ideological narrative of those who view themselves as progressive. You don’t have to agree with her to listen.

“I think if you’re a powerful woman and lucky to be living in the Western world then you’ve got an obligation to at least focus some attention on women who are genuinely oppressed and don’t have a way out,” she said. “I find the abandonment of feminists of that group really quite ugly because that’s to me the biggest battlefront for women, instead of obsessing about sexist air-conditioning. I think it’s a responsibility we have, to focus on those women who are voiceless. It could have been me. It affects family members of mine, and it was just luck that I’m not still there.”

Despite what her critics have claimed, Rita is adamant she’s never written anything she doesn’t believe in. This is not part of a brand she has built to feather her nest. Indeed, she is independently wealthy, with an expansive property portfolio, and could have retired at thirty had she chosen to, she told me. She writes on issues she feels passionately about. During our time together I tried to get her to open up, looking for those invisible battles — that soft underbelly we all have. She was guarded and didn’t give much away, but when I asked about her ten-year-old son, it was clear that this boy — who on Instagram she affectionately dubs #TGC (The Golden Child) — is the centre of her world. His future informs much of her writing.

“As much as I joke about and mock certain things, and certain trends that are occurring, I do worry about him going off to university and being faced with that oppressive campus culture where you’re expected to think a certain way and you don’t have that diversity of thought or critical thinking being celebrated,” she said. “I would hate for him to go off and be exposed to that or be made to feel guilty because he’s male or to be blamed for things he’s not guilty of just because of the sex he was born into.”


After a couple of hours, we had to bring our catch-up to an end. She was heading to the studio to appear on Sky News’s The Bolt Report. As a condition of our interview I had promised to let her see the chapter before publication. “I’m trusting you,” she said, before adding, “I screenshotted the message. The one where you said you’d give me copy approval.”

I didn’t blame her. We were from different worlds. Over a drink we had narrowed the gulf, but the apprehension remained. No doubt I will continue to disagree with Rita on many issues. When she talks about the “broken dullards of the regressive left,” it will still make my teeth itch. And I’m sure my social justice warrior-ing will keep her in eye rolls for years to come. But I’m tired of being angry all the time. I remain a passionate advocate for equality, inclusion and the progressive causes that Rita might view as “SJW virtue signalling.” But I’m no longer convinced that righteous fury is the best vehicle for change.

The anger is corrosive. It diminishes me and my argument. I know it’s not easy to remain calm and dignified when the public rhetoric is so poisonous — especially for those in marginalised communities who face constant attacks on their humanity and are still fighting for basic rights. How do you remain stoic and gracious when you’re being told you’re an abomination? A terrorist? A dole-bludging waste of space? And yet, history consistently shows that the antidote to hate is not more hate. We need dialogue and open hearts. As the American writer Van Jones says, we have to see the dignity and humanity in all people. And at the end of the day, branding someone a moron has rarely changed anyone’s mind.

Meeting Rita won’t heal the world, but it has helped change me. She’s no longer just an avatar behind a series of tweets. I know I won’t find fulfilment in telling her she’s wrong. I can’t imagine ever ridiculing her again to win the admiration of my followers. My happiness levels do not improve by fighting with people on the internet. The anger I once felt when battling with Rita online — an out-of-control fury that would send my anxiety soaring and leave me empty — has dissipated. It’s easy to throw stones at an ideology. It’s much harder to hurl them at flesh and blood. •

This is an extract from Jill Stark’s new book Happy Never After, published this month by Scribe.

Going under https://insidestory.org.au/going-under/ Mon, 03 Jul 2017 00:05:00 +0000

Books | When does consciousness end and unconsciousness begin?

Imagine you are lying in a hospital bed, facing an operation that you must endure without anaesthetic for medical reasons. The surgery is so painful that a drug is administered to erase its memory. Waking from sleep, you ask a nurse whether you have had the operation or whether it is still to occur. The nurse is unsure whether you are the patient who had a ten-hour operation yesterday or the one who will have a one-hour operation later today. Which patient would you rather be?

This thought experiment was posed by the great moral philosopher Derek Parfit, who died last New Year’s Day. Parfit, whose ideas on personal identity have a Buddhist flavour – passages of his work are intoned by monks in a Nepalese monastery – queried the rationality of preferring a longer period of concluded pain to a shorter period that is still to come. The two experiences can both be vividly imagined but not recalled, and they differ only in their tense and quantity. It seems odd to favour the option that involves the larger quantum of suffering, as most of us do intuitively. More basically, is it rational to care at all about future pain that will not be recalled, suffering that does not become part of a continuous experience of self?

Kate Cole-Adams’s fascinating Anaesthesia: The Gift of Oblivion and the Mystery of Consciousness reveals that Parfit’s tale is not just a philosopher’s thought-bauble but also an insight into common experiences of surgery. Surgical patients are intensely concerned about the pain they might feel, often more than they care about the possibility of surgical error, infection or the gruesome fact that their bodies will be sliced and perforated. They submit to a chemically induced discontinuity in their consciousness to avoid that suffering in the present and wipe it from future memory. It is even possible that anaesthesia is effective precisely because it induces amnesia: patients under the influence of anaesthetics may experience excruciating pain during surgery but have their memory of it blocked when they awaken.

Cole-Adams’s main preoccupation is not, however, forgetting pain after anaesthesia but becoming aware during it. Some proportion of patients, its magnitude much in dispute, reports emerging into wakefulness during surgery, an experience that is often profoundly traumatic. Cole-Adams describes the experience as one of terror but it is better understood as horror. It is not so much that an awful but uncertain possibility is dreaded; instead, an awful reality has come to pass. In the words of the American scholar Barton Levi St Armand:

Horror overtakes the soul from the inside; consciousness shrinks or withers from within, and the self is not flung into the exterior ocean of awe but sinks in its own bloodstream, choked by the alien salts of its inescapable prevertebrate heritage.

Why becoming aware during anaesthesia should generate this existential horror is easy to comprehend. Searing pain is experienced in a body that is paralysed and unable to communicate, a predicament of deep, invertebrate helplessness. It is no surprise that people who endure it often suffer lasting psychological effects. As Cole-Adams speculates, some reports of alien abduction – being probed by spectral figures while lying immobile under intense light – may represent distorted memories of imperfectly anaesthetised surgery. This horror also explains why preventing awareness during surgery – and denying the inconvenient truth that it sometimes occurs – is a preoccupation among anaesthetists.

Anaesthesia is a topic that most readers will not have explored in any detail. But it is an important and spacious one, despite being almost invisible, just as anaesthetists themselves hover in the background while surgeons hog the limelight. (Anaesthetists’ joke: “The adjustment of an operating light is an immediate signal for the surgeon to place his head at the focal point.”) We should be very grateful to Cole-Adams for bringing the field into the wider consciousness it deserves. Without effective anaesthetics, the surgical revolution would not have been possible, medieval patients having to make do with prayer, herbs and violent restraint. Beyond its enormous practical importance, anaesthesia is also intellectually rich, helping us to understand the complexities of consciousness and unconsciousness.

Take consciousness, for a start. We are accustomed to having our subjectivity, our sentience and our capacity for voluntary movement all marching in lockstep. While awake we are aware, feel pain and emotion, and act on the world, and when we are asleep we are unaware, numb and inert. Anaesthesia can disrupt this alignment. There can be awareness without feeling, thanks to local analgesia; feeling without awareness, in the sense of pain suffered while unconscious; and awareness and feeling without movement, due to muscle-blocking drugs that cause paralysis. By producing these unusual phenomena, anaesthesia opens a revealing window on how the mind reacts to its temporary unravelling.

The nature of unconsciousness is equally complex. The term “unconscious” can refer to many levels of unawareness, which anaesthetists call the “planes of anaesthesia.” Cole-Adams demonstrates how enormously difficult it is, despite great advances in medical technology, to assess in the operating theatre when consciousness ends and unconsciousness begins. “Unconscious” can also refer to knowledge that exists outside awareness but may nevertheless exert an influence on behaviour. Cole-Adams explores at length how events occurring while patients are apparently out cold, and overheard speech in particular, may affect patients after they have come to. “Unconscious” can also have a Freudian meaning, referring not to what merely happens to be outside awareness but to what is forced out of awareness by repression or other psychological mechanisms of defence. The link to anaesthesia may seem tenuous, but Cole-Adams makes a case for some intriguing connections that also illuminate the neural and psychological processes responsible for the unity of normal consciousness.

This is a book of ideas, and Cole-Adams has spent its long gestation talking with a remarkable assortment of practitioners and researchers, and critically observing and mulling over their work. But it is also a deeply personal story. Interspersed with her never-dry explorations of concepts, theories and clinical practice, she relates her own experience of spinal surgery, her neuroses, her troubled relationships with partners, the illnesses and frailties of family members, and her deep resonance with people who have experienced “accidental intraoperative awareness.” Cole-Adams’s sensitive and slightly bruised persona is always present, only occasionally becoming intrusive or distracting. Throughout she writes vividly. During a period of recovering from a nameless fatigue, her voice is “slow and flat, like a mop being dragged across a floor.” As an anaesthetised patient has his false teeth removed, “his lips wilted inwards.”

It has been said that anaesthesia is the half-asleep watching the half-awake being half-murdered by the half-witted. Like surgery it is a troubling, anxious subject that most of us would rather avoid or deflect with dark humour. Cole-Adams has illuminated it in a memorable way. The book is a gift not of oblivion but of awareness. •

The post Going under appeared first on Inside Story.

No time like the present | https://insidestory.org.au/no-time-like-the-present/ | Sun, 26 Feb 2017

Books | Our experience of time has a lot to do with how we balance past, present and future

For the great American poet Walt Whitman, time was a constant preoccupation. He celebrated the vividness of the present moment – “To me, every hour of the day is an unspeakably perfect miracle” – and also its ambiguity: “The future is no more uncertain than the present.” He made his peace with time’s passage – “Here or henceforward it is all the same to me, I accept Time absolutely” – and understood it not as a physical abstraction but as a psychological reality: “Nothing endures but personal qualities.”

Whitman might have had much to talk about with a near namesake, the contemporary German psychologist Marc Wittmann, who has written a short meditation on the subjective experience of time and its neural foundations. Wittmann’s book is an engaging overview of current thinking about how phenomenal awareness of time arises, how it is embedded in our experience of our bodies, and how mental illness and brain damage are associated with alterations in these processes.

Felt Time is made up of seven essay-length chapters. Each chapter stands on its own as an exploration of one aspect of subjective time, and there is no obvious logical progression from one to the next. The first chapter launches an investigation of temporal myopia, the way in which people neglect or irrationally discount the future, and how that short-sightedness is linked to impulsiveness in conditions such as attention deficit hyperactivity disorder. Wittmann contends that overvaluing our present can lead us to damage our future, but also that overvaluing the future can cheat us of a satisfying present. Being oriented too much towards future events and achievements can deprive us of the chance to live spontaneously and to be fully in the presentness of the moment. Compulsiveness is no more healthy than impulsiveness, in other words, although Wittmann offers no guide to finding the sweet spot between them.

The following chapters explore how time itself is experienced, suggesting that this perception occurs at four levels of granularity. At the finest grain, the present appears to last around thirty milliseconds, the span below which people can’t reliably detect the order of sensations. This level of temporal resolution offers a clue to the sense of time accelerating or decelerating under conditions of normal emotional arousal as well as abnormal neurological conditions. At the second level, the present moment seems to last between two and three seconds, the duration of many conversational turns and musical phrases and also, perhaps not coincidentally, of a human breath. Each short segment of experience represents an integration of perceptual experience into a single unit, and cultivating mindfulness allows us to participate fully in it. The third level, enabled by working memory, knits together moments into more lasting periods, and at the highest level these are assimilated into a narrative sense of self and personal history within long-term memory. Wittmann’s emphasis here and elsewhere is on the lower, shorter levels of time perception rather than the longer frame in which the sense of identity resides.

These chapters on the perception of time lead into chapters on its functions and our fears. Wittmann explains that we still don’t know for certain whether an internal clock or “cerebral pacemaker” exists, although humans and many other creatures can be quite skilled at estimating duration. This skill alerts us to when events are not occurring at the expected pace, as in the aversive experience of waiting. Although there may be no inner metronome, there is a daily (“circadian”) biological rhythm, and individual differences in how it operates have very real consequences. Wittmann presents the intriguing idea that the owl chronotype – people whose physiology impels them to late bedtimes – can experience “social jetlag,” landing blearily each morning in the foreign nation of early-rising larks. All of us, however, share a sense of time as a finite quantity that concludes in death, and the awareness of that regrettable fact may account for the experience of time passing more quickly with age. Wittmann recommends that we seek out novelty and variety as a way to stretch time, and observes that monotony makes time move slowly in prospect but seem to have raced in retrospect.

To conclude the book, Wittmann examines the self-consciousness of time, arguing that subjectivity arises from our sense of embodied persistence. It is not the succession of external events that gives us a sense of self extended through time, but rather the continuous internal awareness of our bodies. Time itself is not experienced directly but is apprehended through bodily rhythms such as the heartbeat, integrated by the brain’s insular cortex.

Felt Time makes for enjoyable and illuminating reading without resorting to gee-whiz sensation or didacticism. Wittmann’s writing, as filtered through Erik Butler’s unobtrusive translation, is neither showy nor self-consciously academic. He accessibly presents a variety of empirical studies in neuroscience and psychology, as well as distillations of philosophers not known for their accessibility, particularly the noted obscurantists Martin Heidegger and Edmund Husserl. One refreshing element in a book that aspires to present the latest ideas on mind and brain is the attention paid to German research and the broader German intellectual tradition. In a scientific domain where there is a significant bias towards the Anglophone countries, and especially the United States, this book stands as a useful corrective and should give wider exposure to some neglected European ideas and research findings.

For all its virtues, though, this book also stirs a few frustrations. Inevitably, one of these has to do with its omissions. The study of time in the human sciences is a vast and complex territory, and a slender text can’t hope to map it comprehensively. Even so, Wittmann has much more to say about the experience of the present than about how we imagine the future or remember the past. Our subjectivity includes a vivid sense of the moment but it is also wrapped up in forecasting what is possible and making sense of what has come to pass. Felt Time briefly explores how we discount the future or simply ignore it because it lies beyond our time horizon, but it is mute on the important question of how we make predictions about the future and how different imaginings affect us. Our subjective orientation to the past is also neglected almost completely. A sense of narrative continuity through time is essential to personal identity, and memory provides a foundation for it, but the focus of the book remains firmly on the experience of the present moment.

Other limitations of the book are structural. Time offers such a range of topics that the author tends to flit from one to another, sampling each briefly rather than settling in for a period of patient exploration. This associative style is well illustrated by a chapter on the self and temporality, which bounces from an examination of the role of the insular cortex in bodily perception to a somewhat immodest solution to the mind–body problem, then on to a short dissection of boredom, and finally to a sociological discussion of the accelerating pace of technological development and the modern crisis of busyness. The pace of this veering intellectual tour will appeal to some readers, but others will find that it induces conceptual whiplash and an intellectual appetite stimulated but not sated.

The book’s occasionally disjointed structure is symptomatic of a more general uncertainty about its aims. It is in some respects a genre-bending work, part science journalism, part popular philosophy and part advice manual. This hybridity is a departure for MIT Press, which has a proud record of publishing cerebral texts in the cognitive and brain sciences. The mixture can be engrossing. Wittmann moves smoothly between reporting on recent laboratory studies and citing long-dead phenomenologists, rarely falling into the trap of faux profundity that often snares writers of popular neuroscience. At other times, though, the switch from academic language to the language of self-improvement can be jarring. Wittmann’s injunctions to live mindfully in the moment and to exert control over the tempo of one’s life add a sermonising tone that the book could do without. To this reviewer, they called to mind Nicholson Baker’s delicious takedown of advice to “seize the day.” Carpe diem, he observes in The Anthologist, means to “pluck” the day, not seize it. We should “gently pull on the day’s stem” so as not to crush its delicate fruit. “Don’t freaking grab the day in your fist like a burger at a fairground and take a big chomping bite out of it.”

For all its limitations, Felt Time is a mind-broadening book and one that can be read profitably by humanists, scientists and laypeople alike. Wittmann’s amiable curiosity will inspire readers to continue their own exploration into time’s mysteries, perhaps taking one of the book’s almost 200 notes as a stepping-off point. Time is a limitless topic, but as ours is so limited it is almost a blessing that this fascinating book can be read in an afternoon. •

Crisis talk | https://insidestory.org.au/crisis-talk/ | Fri, 09 Jan 2015

Books | We need to change, yet we resist. Sara Dowse reviews Vincent Deary’s compelling account of the psychological how and why

Vincent Deary comes late in a long tradition of grappling with this tricky business of being alive. Almost two thousand years ago the Roman philosopher Seneca noted that “a gem cannot be polished without friction, nor a man perfected without trials.” Philosophers, theologians, psychotherapists, novelists, poets, columnists, economists – you name it, they’ve all been more than ready to jump in and tell us how to navigate life’s joys and sorrows. It’s part of the process of figuring it out for themselves. But Seneca was a statesman as well as a philosopher who also wrote plays, while Deary, until this first publication, was a little-known health psychologist lecturing at Northumbria University, researching the fear of falling among the elderly, the persistence of medically undiagnosable symptoms, and various other maladies the acronyms of which I’ve been unable to decipher.

So here, on the face of it, was an unlikely candidate for a multimillion-dollar bidding war among publishers that has left Deary substantially better off than he could ever have dreamed. To be fair, the war was for two other books besides this one, all on the tired yet endlessly fascinating theme of how best to cope with the difficult things we’ll encounter simply by being born. Readers take note: volumes two and three are on their way. How We Are will be followed by How We Break and then How We Mend.

As is often the case, the catalyst for conceiving all this was a crisis in Deary’s personal life. What exactly happened he leaves unclear, but it necessitated a radical rejigging of his living arrangements and goaded him to embark on his project. Starting out, he had little idea where it would take him, and after the first year he had not much to show for it but a bundle of post-it notes. Five years later he had finally finished, but it took him another five to show it to anybody. Once he did, that was that. His reader was excited, as were the publishers who bid for the book. Penguin came up with the highest price and that’s how, fully formed, with its prestigious Allen Lane imprint, this first volume has landed on the bookshop shelves.

Yet you’d be forgiven for asking just what it is that Deary has to say that hasn’t been said so many times before. The catastrophic-life-event-that-doesn’t-break-you-will-make-you theme got a boost in the late 1970s with M. Scott Peck’s The Road Less Traveled, and countless other books went on to tread the same path. A genre, if you will, was boiled down to the reductionist “self-help,” but it can be much more. Elizabeth Gilbert’s bestselling memoirs ease into the category, as do the writings of Alain de Botton. So what makes Deary different, or perhaps better suited to our time?

The challenge in writing such a book is to drag your expertise away from the fusty abstruseness of your profession out into the open air where your readers have some hope of grasping what you’re on about. Deary takes many of his insights from the modern cognitive behaviour therapy he’s practised, but this is precisely the theoretical straitjacket he has struggled to break out of, at least for the purpose of this exercise. And so, because crisis lies at the core of any dramatic work, it’s to drama he turns. How We Are is structured accordingly. Act One deals with what he calls “Saming,” that is, the comfortable carapaces of habit that leave us on automatic pilot too much of the time; and Act Two is about “Changing,” or how we respond to the cracks in those carapaces that most of us, sooner or later, will experience, if to varying degrees.

The kind of drama he refers to is not for the most part the kind that Seneca penned but the more homely variety we watch every day on television or whenever we go to the movies. Television, Deary points out, is all about seasons. Most popular series are plotted on life’s stages: their very popularity rests on the way they mirror our own trajectories, reflecting a more or less steady progression from childhood to death with the inevitable bumps and hollows in between. Movies are different. “Movies are all about the change,” he writes, “the graceless dance” of passing through a transition. He illustrates this with a diagram of the narrative arc of a standard movie script, and then elaborates:

The little line to the left is Act One, representing the last stage of the stable state, just before disaster strikes, before something happens, some news from elsewhere: the leaving, the loss, the imperilling, the new arrival. The Event. The bulk of the movie is taken up by Act Two, the desperate dance of adjustment. That’s what we pay to watch. The struggle to re-establish equilibrium, not the old one, that won’t happen, and that’s the real struggle at first. The new reality clashes with the habits of the old one, the inbuilt, ingrained physiology of denial that continues to perform the old gestures in response to new demands, continues to reach for what’s gone.

First he tells us what has gone – that “small world” where everything seems to work for us. So much so we don’t even have to think about it; and so much so that, in such a mode, our consciousness is less a matter of will than of unconscious habit, those “beaten paths,” many of which can be retraced back to the beginnings of our species and are finessed in our individual histories. In other words, we’re creatures not only of our own sets of routine but also of the ready-made cultures we arrive in and the physical surroundings we inhabit. But that’s only the half of it. We in our turn inscribe ourselves onto these environments, from the way we order our intimate living spaces to the physical paths we create through acting out our desires.

Deary begins this first section of Act One with a disquisition on just what it is that’s “automatic” about us and how it has come to be so. One example he gives is the construction of a new supermarket in the neighbourhood where he once lived. To give the supermarket a congenial setting, the planners built a park in front of it and in the park a series of curving concrete paths to encourage pedestrians to amble through on their way to do their shopping. It was a noble intention, designed to make them stop and smell the roses. But habit had other ideas. The shoppers were concerned to pick up their groceries and other essentials in the quickest time possible and so ignored the curving paths and beat one of their own, straight from the road to the supermarket. A lesson was thus learned. The next time the planners undertook such a project they filled the open space in front of the supermarket with lawn and let the shoppers walk where they would. In a very short time distinct paths were worn, and these were the ones that were concreted.

That there’s a metaphor lurking here is obvious. But it’s when Deary expands on it, supported by the depth of his cognitive theory, that he comes into his own, demonstrating with his simple conceit of the beaten path how we are shaped by our world and how we come to operate in it. From here he is able to branch out into examining such diverse matters as how killers are made, not born, and how habits, good and bad, have both bodily and perceptual effects. A dancer’s constant practice will “physically sculpt” him and, in addition to staining teeth and damaging lungs, the tobacco habit will make a smoker enter a room in a wholly different manner from a non-smoker, causing her to make a quick judgement about whether it’s safe to light up. These are but two instances of what Deary posits as the interplay between force and form.

Habit creates unconscious agency, which comprises the greater proportion of our consciousness in general. We act out of habit even before our conscious selves are aware of it. In such a situation memory, for one thing, becomes unreliable. We have all had experiences of remembering things that others recall differently, or discovering that a remembered incident either didn’t happen or didn’t happen as we had recalled it. Yet it’s through memory, this most unreliable of narrators, that we create some sense of ourselves as living human beings. We need a continuous narrative, a vital part of which is “back story,” in order to maintain a healthy presence in the moment and, more, to plan ahead for a future. As far as we know, we are the only animals to have this capacity. Still, this essential component of our humanity is a far from perfect mechanism. “Attention needs to stay fresh to every moment, so the memory gets shifted into temporary storage, still relatively intact, for a few days at most,” Deary warns. All we can remember after that is “the gist” of things. The rest is imagination.


Force and form: the impact of these two concepts, and the all-important, never-ending interaction between them, constitute the basis of Deary’s thinking and much of the cognitive theory that has shaped his perception of the workings of human consciousness. But what happens to us when the form of our lives is suddenly, or ineluctably, dismantled? This, of course, is the essence of change. Yet however difficult or unexpected it is, change may be beneficial, may be, in fact, the means by which a healthier being is grown. That this isn’t always the case, and is sometimes far from it, is yet another riff on Deary’s theme:

There is always a period of adjustment, of resistance to incoming change. And not just incoming: in our anticipations, dreads and desires, in our stories about how we should be, we create the gap between where we are and where we want to be or think we want to be. We make ourselves uncomfortable. Our longings to be elsewhere produce arousal and fatigue at the thought of all the work that we’ll have to do to get there. Our aversions for our present state lead to an inner shrinking, a closing down; a kind of arousal still, but this time fuelling the desire to flee or fight against it.

These feelings are, put plainly, our “resistance to the new.” Deary uses examples drawn from people in his life, friends or acquaintances, not to mention his own experiences. (They have been consulted about inclusion and given different names, and they have never been his patients.) I’ve come across a slew of examples of this sort of resistance in my own life, as most readers would have. A feminist activist, I learned early on that women who protested most loudly against feminism were often the very ones likely to end up embracing it fervently, and I myself spent years shoring up what I thought were my defences in a relationship I now see was doomed. But what Deary also makes clear is that these periods of change can be dangerous, “when the Automatic is for the moment suspended and Deliberation makes a rare, sustained appearance.” He finds such transitions “uniquely important and indeed formative of our character,” because this is when we can make the most character de-forming mistakes. We can be caught up in cults, for instance, because they offer ready-made, routinised frameworks as rigid as the shells we’ve broken out of. Or we may take cover in our own defensiveness and denial, never to move on to the next stage of equilibrium.

The force of our desires, the forms they take, then break. This is a process repeated throughout our lives until death, that final break, comes to carry us away. But part of Deary’s thesis – a fundamental part – is that our body’s death doesn’t actually remove our presence, which is embedded in the memories of other people, whether they be family members or members of the wider community. We are embedded as well in the physical environment, the buildings we’ve lived in, the records we leave behind, the paths we have taken, and in countless other ways. A good friend of mine – another writer – was the first to draw my attention in her fiction to the way a stone step can be worn away in the middle, evidence of the generations of feet that have trod there. And as it is with individual people, so it is with entire societies. This is especially significant when societies are subjected to new challenges. Their capacity to re-form, to channel their energies into newly productive ways of living – or not – is the very substance of history.

Back to the beginning, then. Does Vincent Deary tell us anything new, anything we haven’t yet heard from one discipline or another, from another expert or another of trauma’s survivors? Well, perhaps not, although he does direct us to recent findings about the way our minds work and informs us that mere conscious will is not the efficient driver for action we once believed it was, and persist in hoping it will be. He is able to extrapolate from psychology to insights into social grouping and the dynamics of history. Moreover, he’s presented all this in a highly readable, quickly assimilable book. And there, I suggest, lies the problem – at least it’s been a problem for me, its reviewer. Deary’s writing is so engaging, the insights at his command so extensive and varied, that oddly enough the central message can be hard to grab hold of. But I’ll give it a go.

Cognitive behaviour therapy is a quicker, less expensive method of resolving emotional difficulties than other forms of psychotherapy. But it shares a fundamental tenet with deep analysis, both Freudian and Jungian, and many other philosophies. “Know thyself,” in other words, comes in many packages, and can be reached through many well-beaten paths. Vincent Deary has shown us to be essentially creatures of habit, rather dull, and depressingly conformist at that, but every once in a while something will happen to change this, and if we see our way through it, a new way of being will emerge. It’s one of the very few things we can count on in this frustrating, beautiful, tragic, ultimately mysterious business of living. •

Edging through the fog | https://insidestory.org.au/edging-through-the-fog/ | Thu, 13 Nov 2014

A diplomat and a psychologist have produced a remarkable guide to dealing with intransigent conflicts, writes Graeme Dobell

Blessed are the peace-makers, for they must look deep into brutality, bastardry and blood lust. And pity the peace-makers as they grapple with nationalism, religious fervour and spiralling cycles of retribution and revenge.

The horrors of the twentieth century demanded new ways of peace-making, and The Fog of Peace follows the craft into some of the toughest places. Echoing Clausewitz’s famous reference to “the fog of war,” this is a jargon-free meditation on how “psycho-politics” can be applied to geopolitics. Mercifully stripped of multilateral diplobabble, it is partly a how-to-do-it guide built on experience and wonderful stories.

The two authors are veterans of the craft they describe. Gabrielle Rifkind, a director of the Oxford Research Group, is a psychotherapist and specialist in conflict resolution who has long been immersed in the politics of the Middle East. For two decades, Giandomenico Picco was a UN negotiator, working on the Iran–Iraq war and spending eight years seeking to end the Afghan–Soviet war. As a UN undersecretary, he led the initiative known as the Dialogue Among Civilisations.

Offering himself as both captive and negotiator, Picco helped release eleven hostages in Lebanon in 1991. Taken by masked men and bundled into a car at 2 am in Beirut, he felt the sweaty hands of one of the men who’d grabbed him: “I could see the kidnappers were frightened, so I took his hand and helped him feel safe, as he was looking at me through the mask he had always worn in my presence. No words were uttered but much was said.”


Rifkind and Picco’s central argument is that simple humanity – the effort to seek the humans hidden inside rigid hierarchies of power and bureaucracy – is the strongest way through the quagmire of conflict. This is idealistic but it is not naive. The proof is in the scars of failure they carry as well as the wins they’ve achieved.

Picco describes this book as a collaboration between “a manual worker of war and peace” and a professor of the human mind. To edge through the fog is to have faith in the ability of humans to surprise even themselves, to seek to unlock the hardest heart: “We believe we need to understand our own minds and our own potential for arrogance, vanity or puffed-up pride, and how for all of us our own ego may sit in the way of progress if not properly managed.”

While realpolitik and power are usually seen as the drivers of international conflict, Rifkind and Picco bring an understanding of human relationships and emotions into the elite world of economic and military calculations, strategic options and alliances:

Politics is not about therapy and politicians and states cannot be placed on the couch; nevertheless, human motivation and psychology need to be part of the strategic calculations of decision makers. For it is man who both creates and ends wars, and destroys his environment. Institutions do not decide to destroy or kill, or make peace or war; those actions are the responsibility of individuals. So to try and understand the root causes of conflict only in terms of power politics and resources, without also understanding human behaviour and what exacerbates the fight over resources, undermines our effectiveness in preventing war and making peace.

The first section of The Fog of Peace is an anecdote-rich discussion of Picco’s and Rifkind’s experiences. Instead of war stories, these are peace stories, and they are fine yarns.

Picco starts as a UN political officer arriving in Cyprus two years after the 1974 war. He lands on Sunday and by Monday, before he has spoken to anyone, he is being accused of bias by both sides. The distraught young man from the United Nations learns his first lesson: in a conflict, no side ever buys the myth of the “impartial” outsider.

That lesson hardened into a permanent truth during the long negotiations to end the Iran–Iraq war. Iraq stormed out of the UN talks in 1988, believing that after eight years of war it was at last strong enough to make battlefield gains. Then Picco showed the traits the book prizes in peace-making – the ability to spring a surprise and the capacity to try something new, based on a deep understanding of all the players. If Iraq would not stay at the table, the United Nations would invite Iraq’s banker, Saudi Arabia, to negotiate in its place. The Saudi willingness to stop the flow of weapons and cash to Iraq, and the Saudi king’s pressure on Saddam Hussein, halted the rush back to battle and unlocked a surprise ceasefire that became permanent.

When the Soviet Union invaded Afghanistan, it was widely predicted that the war “would never end,” that it would be fought “to the last Afghan.” Instead, Picco describes a seven-year UN gamble – initially with “no political support or encouragement from any quarter” – that wove together an extensive agreement for Soviet withdrawal. Institutions always matter, he writes, but they can be pushed and shifted by individuals at key moments.

Rifkind worked as a political therapist in the Middle East, applying the lesson of Northern Ireland: peace is achieved not by moderates but by getting hardliners involved in a process that moves the language from violence to politics. That meant looking with fresh eyes at Hamas and Hezbollah, and treating them as religiously inspired nationalist groups even if they were deemed terrorists by the European Union. “It is very important to differentiate Hamas from groups like al Qaeda, who have a nihilistic philosophy and no clear political agenda beyond the destruction of the West,” she writes.

Engagement with Hamas offered the chance of a slow, IRA-style evolution of its political and ideological position, tapping into “a degree of pragmatism about what they want for their people and how to achieve it.” Most Israelis, Rifkind notes, see the attempt to engage with Hamas as a naive risk to national security rather than a potential opportunity. She recounts her experience as a Jewish woman travelling to secret destinations for discussions with Hamas leadership:

Hamas showed respect and some appreciation that we had taken the trouble to try to understand the world from where they stood. I do not want to sound like the foreign minister for Hamas when telling their story, nor do I want to be seen as having gone “native.” When telling a human story, it can easily sound as if you have been taken in. One does not need to agree with how a group thinks, but it matters to understand why they think as they do.

Rifkind repeatedly returns to the point that empathy and understanding are vital to negotiations – just as they are in most relationships – but they don’t equate to support or endorsement.


As Australia’s military heads off again to fight evil and extremists, Canberra policy-makers could ponder Rifkind and Picco’s extensive treatment of the psychology of conflict in the Middle East in chapters on political Islam and personal accounts of thirteen successful negotiations with Iran.

They argue that the West has “been insensitive to the growing Shi’i–Sunni divide that had begun to envelop the whole region from the late 1970s.” All the region’s conflicts since then have had “a sectarian dimension with historic roots but with a modern vision,” they write, yet the competition between the Saudis and the Iranians has been “well outside the deeper understanding of the West.”

Beneath the surface of extreme and rigid ideologies they see “fear, humiliation and profound anxieties.” Airstrikes won’t resolve the situation:

[I]f the senior leadership of any insurgent group is wiped out, those left in charge are often politically immature and inexperienced, and do not have the skills to succeed when dealing with “the other,” including the ability to look at options. It is often the youth of the leadership that is hardened, inexperienced and lacks the maturity to negotiate around the table.

Discussing the toxic relationship between Iran and the United States, the negotiator and the therapist lament how the “ghost of history” means any new behaviour is met with suspicion and mistrust. As much as Iran, the United States has “been driven more by ideology than any serious commitment to how you shift relationships between enemies.” To illustrate the mindset, they quote the sardonic Five Rules of Negotiating with Iran, penned by former US ambassador John Limbert in 2009:

1. Never walk through an open door – instead, bang your head against it.
2. Never say yes, or else you look weak.
3. The other side must be seen as infinitely hostile, devious and domineering.
4. Anything the other side proposes contains a trick; the only purpose for the other side is to cheat.
5. If any progress is made, someone will come along and mess it up.

Little wonder that Barack Obama’s early overtures to Iran were as strongly attacked in Washington as they were in Tehran.

Rifkind and Picco offer fine vignettes of specific peace efforts: the intimate secrecy of Oslo, the long process of building basic trust that inched to peace in Northern Ireland, the missed chances and noisy choices of the Clinton administration’s rushed effort in the frantic media spotlight at Camp David. These tales from the diplomatic trenches feed the therapist’s argument that to understand people is to understand geopolitics. States as well as people must be open to new identities and new understandings of the history that drives their actions and their anger. Realpolitik doesn’t hold all the answers for what happens in conflict, nor in the human heart.


Much is packed into this 266-page version of War and Peace, and not all of the threads tie neatly. The effort to give sense and shape to the analysis is unbalanced by the noise and sensations crashing through the Middle East. The headlines as the book was being written were about Syria, just as today they are about Iraq. The authors offer thoughts about what went wrong in the West’s approaches to Afghanistan, Iraq, Syria and Iran. Several times they reach for the idea that the world has been wrong to focus on what the United States has been doing in the Middle East – that the real story has been the unleashing of that struggle for dominance between Saudi Arabia and Iran. The region running from Lebanon to Pakistan, they write, seems headed for a long period of huge instability and violent political awakening. Even a modern-day Tolstoy would struggle to get all that into one small volume.

Based on the experience of the last decade, the book suggests the West needs to recognise that it doesn’t have all the answers for what ails the Middle East. The plea is for fewer drones and more diplomats – but not cumbersome forms of international negotiation based on “complex bureaucracies, circuses of diplomats, frequent flyers around the global terrain with insufficient evidence of success in the resolution of conflict.” Big institutions are central to war and peace, yet bureaucracies become stuck in the fog. To seek a way out of conflict is to try for something new, to embrace new thoughts. That is not what the big beasts do.

Picco and Rifkind say negotiators need the trust and backing of leaders and institutions, because it is the politicians and bureaucrats who must try the fresh ideas, seal the deals and make settlements or ceasefires work. In seeking that new path, though, peace-makers must have some freedom from the constraints of power and the demands of politics – they must be “more nimble and agile” with the “flexibility to act with both heft and speed.” The necessary qualities are of the highest order because the task is so exquisitely difficult. Peace-makers may find that when the fog lifts they are in the middle of no-man’s-land being shelled by all sides.

The penultimate paragraph of the book goes back to the basics of what humans are and what we do:

War as a solution, with its pathology, creates a madness of irrationality and paranoia, in which the worst aspects of human behaviour are stimulated. Such fights for survival stimulate the kind of aggression that magnifies hatred and perpetuates the most destructive aspects of mankind.

To grapple with this reality is to ask governments and leaders to understand themselves as well as their people. It is a counsel of maturity, seeking emotional as well as political intelligence. The diplomat and the therapist end with the thought that peace-making is not a technique, but the triumph of those with the will to do better and the courage to try – “a vision that understands both geopolitical power struggles and the complexity of the human mind.” •

The post Edging through the fog appeared first on Inside Story.
Perfect storms https://insidestory.org.au/perfect-storms/ Mon, 18 Feb 2013 00:20:00 +0000 http://staging.insidestory.org.au/perfect-storms/

A new book explores why wars can continue well beyond the point where they seem to have served any purpose, writes Tom Bamforth

“THE snow’s glorious up here,” said my colleague over the phone from Geneva. “The best day of the season.” I had called on the weekend, hoping to find out a bit more about what I was going to be doing when I arrived in the Philippines to assist with the humanitarian response to Typhoon Bopha. Known locally as Typhoon Pablo, Bopha ripped through the eastern provinces of Mindanao at 230 kilometres an hour late last year, causing flash flooding, uprooting trees and triggering immense landslides. It was a long way from the snowfields of Switzerland.

As I drove through the wreckage shortly after my arrival, palm trees littered the landscape like giant toothpicks, blocking roads, ripping up gardens and smashing houses. “Only my grandfather can remember the last typhoon,” one community leader told me, “and that was in 1912.” Totally unprepared for “super-typhoon” winds, over six million people were affected in some way, with 2000 killed and a million homeless. In one area, Andaap, severe flooding caused giant boulders to wash downstream, diverting the course of the local river and sweeping away villages. And as if the typhoon wasn’t enough, for two months it rained solidly. As I spoke to my colleagues on the ski slopes of the Swiss Alps, a further 40,000 people, displaced by ongoing flooding, were camping by the sides of roads or sheltering in government-provided “bunkhouses” – shabby plywood constructions with ten people to a room and no connection to water, sanitation or drainage.

In the words of the UN coordinator overseeing the work of humanitarian organisations, the typhoon was part of a “perfect storm.” Not only had Pablo been hugely destructive in itself, but with a general election campaign getting under way there was a fear that aid would be manipulated to serve electoral purposes. More damagingly, though, the disaster came as a ceasefire ended between government forces and the New People’s Army, the communist insurgency that brought together Maoist socialism, peasant revolt and opposition to the country’s repressive regimes in the 1970s and 80s, fuelled and fractured by competing mining and natural resource interests. Stretching back nearly forty years, this conflict had become a seemingly permanent part of the Philippines landscape.

When I visited Manila to try to raise funds and publicise the disaster, some donors encouraged me to see Typhoon Pablo as an “opportunity” – to rebuild, encourage tourism and replace aid funding with private sector investment. Returning to Mindanao, I learned that labour recruiters, quick to take up such “opportunities,” were already offering women whose livelihoods had been destroyed contracts to work as domestic help in the Middle East. At $400 a month, these contracts were considered attractive, despite the significant risk of physical, sexual and economic abuse they carried with them. Once again, against a backdrop of conflict and natural disaster, aid agencies and business and political interests were creating a web of interdependent and counterproductive interests. Attempting to get the humanitarian response right was, for many, a subsidiary concern to exploiting the “opportunities” that arise from a fresh natural disaster and the impact of a decades-old war.


WHY wars like this go on for so long is the subject of David Keen’s Useful Enemies. This is a timely, if not altogether new, discussion of how the interests of conflicting parties and aid/development agencies mean that some conflicts can be too valuable to be allowed to end. Economic, political and psychological imperatives can ensure that these conflicts become “permanent emergencies” and mutual profiteering can continue. As Keen puts it, wars are neither an exceptional phenomenon nor simply a breakdown of peaceful means of resolving differences. Instead – and here he draws on the French philosopher Michel Foucault’s description of prisons – they are “the detestable solution, which one seems unable to do without.” The term “war systems,” he argues, may better reflect reality than the word “war,” with its implications of winners and losers and defined political and military objectives.

Useful Enemies is structured around the central insight that the conventional approaches to understanding conflict in the author’s academic disciplines of sociology and political science have failed to explain the brutality and longevity of certain conflicts. These approaches reflect a simplistic worldview in which war is rule-bound activity fought between two identifiable sides, each with the ostensible aim of defeating the other. Keen’s aim is to understand the “real factors” that underpin and sustain brutal, long-term conflicts such as those in Sudan, Sierra Leone, Guatemala, Cambodia, Afghanistan, Iraq, Sri Lanka and Pakistan.

Importantly, Keen doesn’t see protracted wars as solely a feature of far-off places. He also turns his critical gaze on the function of war and the “permanent emergency” in the West, particularly through an analysis of the nebulous Global War on Terror – or GWOT, as it became known to jaded State Department employees – and the political influence of the US military–industrial complex. Significant in this analysis is a conception of intractable conflicts as part of an internationalised system linking the theatre of conflict with the political, economic and even mass-psychological imperatives of the wars’ various prosecutors – including governments, rebel groups and, at times, Western aid institutions. In Keen’s words, “when the expressed goals in a war are not being achieved, a number of unexpressed goals are nevertheless being fulfilled.” It is these hidden or unexpressed motivations that Keen seeks to uncover.

While Keen has done valuable service in bringing together a wide-ranging analysis of global conflicts, much of what he describes has been explored before. To take a few examples: the war-mongering and increasing influence of the US military–industrial complex, and the ideological uses to which it was put by neo-con members of the second Bush administration, have been recognised for decades. In fact, it was Dwight Eisenhower – Republican president and former supreme allied commander in Europe – who coined the term “military–industrial complex” and identified the phenomenon as a future threat to US democracy. Similarly, the self-serving (if still undeclared) anti-Indian and pro-Taliban policies of the Pakistan army, which lead it to aid parties against which it also fights, are an acknowledged source of frustration among the US officials who continue to fund the army because Pakistan is “too important to fail.” The role of natural resources and blood diamonds in the Congo, while horrific, became so much of an international cause célèbre that it even has its own Hollywood dramatisation. And the growing politicisation of the aid industry and the cooption of aid agencies as “force multipliers” – part of military “hearts and minds” campaigns in places like Afghanistan – is concerning but hardly news.

It has also been some time since anyone thought wars were between two sides. That kind of thinking was abandoned by many aid agencies, for instance, in the 1980s, with the growth of complex aid operations and the humanitarian response to famine and war in Sudan and Ethiopia. Mary Kaldor’s “new wars” thesis appeared in 1999 and Mark Duffield’s seminal work on the merging of security and development practices and discourses, Global Governance and the New Wars, in 2001. These analyses recognised that in the post–Cold War world, state-based conflicts had been replaced by fragmented, multifaceted subnational conflicts that merged ethnic separatism, ideology, criminality, and proxy regional and great-power interests, and erased the distinction between civilians and combatants.


UNUSUALLY for a book dealing mainly with the political and economic drivers of conflict, Useful Enemies also considers the role of individual psychology. In what is perhaps the book’s most interesting chapter, Keen draws on often-disturbing interviews with rebels and soldiers in Sierra Leone, US conscientious objectors and military recruiters to show how shame, violence, power and prestige play interlinked roles in entrenching conflict. Military recruiters play down the infantilising discipline and hierarchy of military life and emphasise the promise of respectability and honour when they seek to attract young men and women from impoverished regions. Similarly, the power of the gun and the gang often stems from self-perceptions of low status, powerlessness and victimisation that can suddenly be reversed through violence, which helps to explain the brutality of conflicts such as the civil war in Sierra Leone. In the words of one aid worker, “when unappeased, violence seeks and always finds a surrogate victim. The creature that excited its fury is abruptly replaced by another, chosen only because it is vulnerable and close at hand.” By extension, Keen argues that the response of the United States to the 9/11 attacks in New York fell into this narrative of shame on a grander scale. The terrorists, according to George W. Bush, saw the United States as “weak,” “flaccid” and “impotent” – a shameful affront to American nationalistic masculinity that could only be rectified by war.

Stylistically, Useful Enemies is idiosyncratic. In attempting to be accessible to a non-specialist readership, Keen inserts personal reflections and experiences that don’t really serve to illustrate his points or advance the argument. Coming in the middle of a discussion of Darfur, for example, an anecdote about a friend who developed “green” vacuuming products seems injudicious. In distilling the lessons of a dozen protracted conflicts extending back to the English Civil War, the chapters sometimes read like a peripatetic and somewhat ad hoc tour of complex emergencies. This approach also means that Keen is unable to develop some of the local complexities of his arguments. The events, described in summary form, are not presented as interesting in themselves but rather serve as an array of evidence for his broader themes about the economic, political and psychological interests that prolong conflicts. This piecemeal approach also means that some important historical trends in understanding conflict are missed. There is only limited discussion of how changing environmental conditions and sources of livelihoods – so much a component of Saharan conflicts – often underpin complex emergencies. Nor is there much discussion of the roles of men and women in conflict, even in the section on the psychology of violence, despite the gender basis of much of the extreme violence of ongoing conflicts.

Here in Mindanao, meanwhile, the rain continues and international donors are preoccupied by other priorities as the development gains of the past decade are destroyed. It takes ten years for a coconut tree – one of the main sources of local income – to reach maturity. In the meantime, amid the destruction of lives and livelihoods, there is fertile ground for new “opportunities” to arise: recruitment into the military–criminal nexus of the contending armed forces, “opportunities” to take indentured work overseas, with few if any rights or protections, and the economic “potential” for rezoning of mining areas or tourist resorts in what were once villages. Fundamentally, Keen is right. The problem is that this analysis is already well understood – not least by those who bear the greatest burden of war’s brutality. •

Perchance to dream https://insidestory.org.au/perchance-to-dream/ Fri, 30 Nov 2012 19:45:00 +0000 http://staging.insidestory.org.au/perchance-to-dream/

There’s still a lot we don’t know about sleep, writes Sally Ferguson

A few years ago David K. Randall, a senior reporter for Reuters, experienced some disturbing sleepwalking episodes. On one occasion, he woke to find himself injured outside his bedroom. He had no idea what had happened to him, nor what had caused his night-time wanderings. He sought help from various doctors, including specialist sleep physicians, hoping for a diagnosis and, ideally, a treatment program.

Surprisingly for him, but not for those of us who work in the field, Randall was told that doctors can’t explain the cause of sleepwalking and, more importantly, they can’t guarantee that anything they suggest will stop people like him from somnambulating in future. Intrigued, Randall set off on a quest to find out more about what we do and don’t know about sleep. Dreamland is the result of that search; entertaining and well researched, it looks at the time we spend between the sheets and what happens when it doesn’t include enough sleep.

Where Randall’s book is very much an introductory overview of sleep research, Till Roenneberg’s Internal Time presents the science of circadian rhythms in rich detail. Part textbook, part handbook, part autobiography, it couples scientific information with largely fictional (and in some cases rather odd) case studies that serve to illustrate the application of much of the detailed biological science. This is the story of biological rhythm research over the past fifty years told by someone who was a part of it.


Although it’s true that we still have a lot to learn about sleep, David Randall covers much of what we do know in a very easy style. His 260 pages provide a nice summary of what we know about how sleep works, what it looks like (when we record it using electrodes on the head), what goes on when we have our eyes shut (mostly at night) and what happens when we don’t sleep. Randall discusses theories about the evolution of sleep, the impact of electricity and lighting on sleep in recent centuries, and the interaction of work and sleep in today’s open-for-business-24/7 world. He also delves into the believe-it-or-not part of the sleep world, describing the bizarre and in some cases rather horrific behaviour that can occur. Despite having spent the past twenty years studying circadian rhythms and sleep, I have to admit to struggling through some of the more disturbing cases.

The changing pattern of sleep in the Western world has implications for human health – from metabolic health to mental health – that we are just now discovering. Sleep is often last on our list of priorities. Many of us only go to bed when we have done everything else we need to do, and this can mean that we don’t spend the ideal time between the sheets. Information like this, which helps people understand the importance of sleep for health, well-being, quality of life, and safety on the roads or in the workplace, makes Randall’s book very valuable.

But it is not just a matter of prescribing eight hours of sleep for all. If we understand the role that sleep plays in healthy, happy lives then we are better placed to make informed choices about how we spend our time and how we prioritise activities. Sleep isn’t the be-all and end-all of life – although, as Randall describes, rats will eventually die if you keep them awake for long enough (and not as long as you might think). Rather, sleep is one element of our well-being, and how we fit it in among all the other parts of a healthy life – work, leisure, family time, socialising and being on our own – is the important part.

Randall certainly doesn’t claim to have found all the answers. Nor does he claim to have written a textbook or self-help book. What he does is provide a very entertaining and informative insight into that large chunk of our lives that many of us don’t spend much time thinking about.


Where Dreamland might be described as a mile wide and an inch deep – in a good way – Internal Time is an inch wide and a mile deep, and that’s not a bad thing either. Till Roenneberg, a professor at the Institute of Medical Psychology at Munich’s Ludwig Maximilians University, trained with one of the true pioneers of circadian rhythm research, Jürgen Aschoff. Although Aschoff’s name is not widely known outside the research community, postgraduate students and academics who work in the field began their careers reading his rules and learning them by heart. Essentially, he explained why our internal clocks synchronise to the world, why they get out of whack and what to do if you want to realign them. Having worked with Aschoff, Roenneberg is thus well placed to explain the science of circadian rhythms, which he has been a part of discovering, to a broader audience.

Circadian (circa – about, diem – a day) rhythms are biological rhythms controlled by our internal body clock. The exact timing of our individual clock is determined by our genes, but even without the daily cues of sunrise and sunset we will keep ticking over on a cycle of about twenty-four hours. (Roenneberg says that the fact his book has twenty-four chapters is a happy coincidence, but it does give a nice balance to the book.) Although the most obvious example of human circadian rhythms is our sleep–wake rhythm, physiological characteristics like body temperature, hormone levels, blood pressure and performance also fluctuate in a pattern across that period.

Internal Time contains very thorough descriptions of how our biological clock, located in a specific part of our brain, works. The detail might be too great for many readers, but the writing is relatively easy to follow and the technical descriptions are accompanied by graphs and images. An unavoidable drawback is that the case studies and examples are very eurocentric, which means most are out of sync with the southern hemisphere and Australian readers. Where Roenneberg discusses the perils and challenges of long nights in December, Australian readers can make the conversion to our winter relatively easily. But when a case study takes an individual from Frankfurt to Morocco, the conversion is a little trickier and may require a map or timezone converter.

Roenneberg also discusses how researchers in this field assess the internal time of individuals. A number of questionnaires have been developed to find out the times of the day at which different individuals prefer to perform certain activities. The premise is that individuals are largely switched on to their internal drivers and can state a preference for cognitive activity in the morning and for sleep between 10 pm and 6 am – as larks might – or a preference for cognitive activity in the evening and for sleep between 1 am and 8 am, like owls or adolescents. Such surveys have been validated against physiological markers.

Roenneberg and his team’s survey, using the Munich Chronotype Questionnaire, is based purely on the timing of one’s sleep on free days – those days when we don’t use an alarm clock. (For most of the population, this probably means weekends.) On this basis, he argues that we can fairly readily judge which “type” we are and attempt to adhere to our natural cycle – with benefits including greater alertness during waking hours and more positive moods.

While the questionnaire does give some indication of the internal timing of an individual’s body clock, a range of other factors influence the timing of our sleep. If we were in a bunker and free to do whatever we wanted, whenever we wanted, then yes, our internal clock would probably be the strongest driver of our behaviour. But in our homes and in our cities, the factors influencing our choice of bedtime are innumerable, even on free days. Social activities, television and other technologies are just a few of the factors that keep us out of bed on those nights when we know we don’t have to get up for anything in particular the next morning. Nevertheless, it is certainly the case that understanding our own fluctuations in performance, alertness, mood and hunger can allow us to use the time in our days well.

Dreamland and Internal Time are a perfect pair. The questions that Dreamland didn’t answer (and didn’t aim to) are largely addressed by Internal Time. But while Dreamland is definitely appropriate as bedtime reading owing to the lightness of the writing and the content, Internal Time is better consumed at or near your peak concentration times. Stories of partner-swapping, homicidal wives and space travel aside, the science of Internal Time will take some digesting, which is best done at a time of day when you can concentrate and process information most easily. Having said that, dozing off with stories of sleep crime on your mind may make for some interesting times in dreamland. •

Memories for the future https://insidestory.org.au/memories-for-the-future/ Fri, 27 Apr 2012 00:09:00 +0000 http://staging.insidestory.org.au/memories-for-the-future/

If we are the sum of our memories, then how should we go about creating them, asks Richard Johnstone


OUR pasts, and the pasts we’ve lived in, have never been more accessible. Aerial photographs of the backyard of our childhood home, records of our online shopping that constitute a kind of biography of acquisition, our emails stretching back to the beginning of time (or circa 1992, which amounts to the same thing) can all be found in a matter of moments and all without leaving the terminal. Indeed, so ubiquitous and so accessible is the raft of information that defines us that it has almost become easier to remember than to forget.

That at least is the burden of Viktor Mayer-Schönberger’s Delete: The Virtue of Forgetting in the Digital Age, first published in 2009 and now released in paperback. The subtitle is designed to be arresting: virtue and forgetting have not typically been placed side by side in our cultural landscape. Forgetting indeed has been something to be avoided. Forgetting is the process by which our memories are altered or removed, and that has not generally been seen as a good thing, except perhaps by those – survivors of war, for example – for whom it is the only alternative to the repeated trauma of remembering. To forget, whether through the natural processes of ageing or as a consequence of illness, or to be induced to forget, in the case of such terrifying prospects as mind control or brainwashing, has been seen as tantamount to losing our very selves.

Memories, in fact, are our selves. Although our lives might be shared with others, with families and friends and colleagues, when the time comes for recollection we each of us tend to remember those shared experiences in significantly different ways. We are culturally attuned to seeing the two things – our unique selves and our memories – as mutually dependent and inseparable. Our memories create us, and we find the prospect of one day losing those memories deeply unsettling. We fear how easily we might forget, which is why the new technologies, with their capacity to archive forever the digitised raw material that shows who we are and where we have been, have appealed to us as guarantors that our memories will never fade. The vast electronic archive may not contain all the memories that are in our heads, but it does contain the stuff of memory, the documents and photographs and scraps of information that have the power to revitalise our current memories and even help us to recover ones that would otherwise be lost forever.

It is not surprising, then, that over the past twenty or thirty years of the technological revolution we have proved so relaxed, even cavalier, in assigning more and more of the responsibility for remembering the details of our lives to memory banks and memory discs and memory cards with capacities for accumulating and preserving more and more of the raw material of memory. We can only have been so relaxed about this development because of our belief that more is better – that the more information that is out there and freely available, the more we can dive into it to supplement and refresh our own memories and our own sense of our selves. Against the idea that technology is dehumanising and intrusive has grown the powerful belief that the technology is there to help us – among other things, to help us to remember and even to do our remembering for us.

Yet the ever-accelerating pace of technological development can feel as if it is creating too many choices and too much information – too many memories, in a sense – and it is this feeling of being overwhelmed that has led to a reassessment of the act of forgetting. In considering ways in which the web could be induced to forget at least some of what it currently remembers, Mayer-Schönberger has caught a revisionist moment in our approach to the relationship between remembering and forgetting. How to unburden ourselves of some of the stuff that is holding us back and tying us down to outdated definitions of ourselves is emerging as a new project for our times.

Up until very recently in human history, argues Mayer-Schönberger, “the default was to forget.” Remembering was hard, keeping records was costly. If you wanted to uncover something from your past – an elusive document, a buried trauma – you had to make an effort. To forget, to let go of or otherwise repress those memories that were superfluous to immediate requirements or just plain embarrassing, seemed by far the easier, and in many ways the more sensible option. At the same time, the relative ease of forgetting carried overtones of weakness, particularly when viewed through the lens of psychoanalysis, say, or the “never again” school of history, which painted forgetting rather than remembering as damaging to the self. By these lights, rather than take the easy option, we had a responsibility to ourselves and to others to remember. But now, “for the first time in human history,” remembering and forgetting have swapped places, and remembering has adopted the default position. We now live, says Mayer-Schönberger, in “a world that is set to remember.” This means, in effect, that the injunction to “forget it,” which once seemed like an encouragement to take the path of least resistance, now seems like a recipe for hard work.

In a media-saturated environment of constant and never-ending novelty, in which the week before last is already the remote past, it may seem counterintuitive to argue that it has become easier to remember than to forget. But Mayer-Schönberger, in describing a world that is set to remember, is not saying that everything is remembered by everyone all at once, but that elements from our pasts, forgotten by others and considered by us, if at all, as irrelevant to who we are now, can suddenly and arbitrarily resurface, sometimes with destructive consequences. One of the most influential ways in which memory has been conceptualised in modern times is as a collection of everything we have ever done, sitting there in our heads, waiting to be accessed, if only we could find the key. By happy analogy, the computer provides those keys. They’re right there, between us and the screen, offering access to information and hence to memories that might have been thought of as over and done with. The compromising photograph, for example, injudiciously posted to Flickr or Facebook, can without warning emerge from the archive to cut short a promising career. For Mayer-Schönberger, this kind of recovered memory – recovered by others, not by the individual whose memory it is – is pernicious because it denies us the human right to remake ourselves, to learn from experience, to forget and move on to the next stage.

While the examples that Mayer-Schönberger quotes are extreme cases – for the most part, ill-advised Facebook postings do not in fact lead to career plateaus – his underlying point, that digital memory threatens to overtake human memory, refers to how we increasingly feel overshadowed by an edifice of information about ourselves that we no longer control. We can feel trapped, sometimes by the very material that we ourselves have posted online while under the illusion that we were in charge. (How extraordinary, by the way, that the two terms that we most familiarly use to describe the new reality of electronic interconnectedness, “web” and “net,” should both evoke the notion of entrapment.) It is on this question of control that the implications of Delete are most far-reaching. As control over information slips away from us, it is no accident that the curatorial impulse has assumed such importance in the way we interact with the web. Curating, in fact, is the new creativity. Compilations, assemblies, mash-ups – Mayer-Schönberger’s preferred term is “bricolage” – constitute a kind of fight-back, a way of remaking discrete documentary items, emphasising their malleability rather than their fixedness, and in doing so imposing a personal stamp upon them. In the same way, the fashion for talking up the role of so-called “information curators,” the people who drive pathways through the web in the quest for some kind of order, contributes to the idea that if we are to retain authority over our own biographies, we need to take some kind of action and not just sit there and let ourselves be taken over.

Mayer-Schönberger proposes some solutions for how we might reclaim ourselves and our memories from the clutches of the web – a changed culture of how we use and respond to the burgeoning of information, a built-in use-by date for the currency of that information – but they don’t really convince. We are left wondering whether it isn’t just better to go with the flow rather than worry too much about where it is all heading. But for Mayer-Schönberger, that would be a dangerously complacent line to take. “We are creating,” he says, “a digital memory that vastly exceeds the capacity of our collective human mind,” and it is possible to hear, echoing out from that phrase, the idea that we have built the monster that will devour us. In his suggestive remarks on the popularity of bricolage, he seems to be pointing to the possibilities of counter-revolution, of asserting ourselves against the vast repository of information, and resuming control. And indeed, an optimist could argue that the resumption of control – us owning the information rather than the information owning us – is well on its way, in the increasing sophistication of search functions that, thanks to tags and metadata and the like, not only make information accessible but also can make us feel much more in control of that information, and much more in control of our pasts. “It’s easy,” say the advertisements for ancestry.com.


AS A useful corrective to the idea that the information explosion is a very recent phenomenon, James Gleick’s information-packed history of the way we store, transmit and recover knowledge, The Information: A History, a Theory, a Flood, reminds us that we have always felt overloaded, struggling to cope. Gleick quotes Robert Burton, writing from Oxford in 1621 of being inundated by “new books every day, pamphlets, currantoes, stories, whole catalogues of volumes of all sorts, new paradoxes, opinions, schisms, heresies, controversies in philosophy, religion, &c.” Gleick uncovers many other such examples from what we might call the “before web era,” while noting that the ever-increasing supply of information was not being lamented so much as wondered at. Indeed, by the mid twentieth century, the prospect of linking up all that information, in some unspecified but wired way, held out the promise of a new “shared mind” that would collect and disseminate information to the benefit of all. Gleick refers to Teilhard de Chardin, who asked in 1955 whether it does “not seem as though a great body is in the process of being born – with its limbs, its nervous system, its centres of perception, its memory?” Which is, pretty much, what’s happened, even though it hasn’t turned out entirely for the best.

Gleick points to how this abundance of information has come to seem as much burdensome as liberating, not least because so much is now stored outside our heads that it threatens to undermine our sense of ourselves. What is ours, and what belongs to cyberspace? Gleick is interested in the means by which we store and access information content, more than in the function of memory itself, but there are nevertheless some striking parallels with the conclusions reached by Mayer-Schönberger, particularly around questions of control and ownership. “For a certain time,” Gleick observes, “collectors, scholars or fans possessed their books and their records.” It was “part of who they were.” No longer, when those books and those records are available equally to all, in digitised form, to be called up when required.

The problem with all this material – this infinity of choice – is that it leaves us without our own unique story to tell. Somehow, all this information and documentation must be ordered and winnowed and matched to us as individuals in order that we may possess it as part of our personal biography, rather than having it possess us. Gleick sees this process as entailing a reassertion of the power to forget, which seems to be analogous to a kind of robust editing. “Forgetting used to be a failing,” he says, “a waste, a sign of senility. Now it takes effort. It may be as important as remembering.” In their different ways, Mayer-Schönberger and Gleick both encourage us to consider the virtues of forgetting. Rather than a sign of weakness and decline or, most disturbingly of all, a loss of selfhood, it may be quite the reverse – a way for us to get our lives back in order.


FROM this notion of forgetting as an aid to the recovery of order and control in our lives, it is only a small step to seeing it as having a specifically curative function. In the final chapter of Memory: Fragments of a Modern History – a brilliantly researched and highly readable account of scientific, quasi-scientific and just plain quack approaches to the way we remember – Alison Winter brings us up to date with “the first speculative steps… now being taken in an attempt to develop techniques of what is being called ‘therapeutic forgetting.’” This revisionist approach to forgetting – to memory loss – goes against the orthodoxy, ingrained in most of us, that the best way to deal with traumatic memories is to disinter, confront and control them, to the point where they no longer have the power to harm us. It is all part of a cultural narrative of self-assertion, of the power of individual strength and determination to defeat our demons. But what if we would be better off directing our strength and determination towards getting rid of the memories altogether, or at least damping them down so that they cease to bother us?

In describing recent attempts, whether involving therapy or drugs or a combination of both, to induce selective forgetting in those who have suffered various forms of psychological trauma, Winter is sensitive to the underlying question of what, actually, is being erased. Is it just the traumatic memories, the ones that militate against a full and satisfying life, or is it the life itself, the individual biography that is being damped down and fundamentally altered? She reminds us, just to complicate matters, that the introduction of anaesthesia to the operating theatre in the nineteenth century “was initially upsetting, because it challenged a convention about personal identity.” If it was life’s experience, and the memory of that experience, that formed your personality and character, what did that say about a process – the administering of anaesthetic – that knocked you out so that you were guaranteed not to remember a thing? As Winter says, those concerns now seem quaint and beside the point. But if we accept without question that it is legitimate to anaesthetise the present – that is, for the time that the patient is on the operating table – then perhaps it is equally legitimate to anaesthetise the past, or at least those memories from the past that stand in the way of health and wellbeing.

Winter constructs her study as an analogue of memory itself, offering a set of historical “fragments” that taken together provide a picture of “the deep intellectual and cultural complexities of memory science in practice.” In doing so, she explicitly links the late nineteenth- and twentieth-century fascination with the working of memory to the growth of technology, and particularly to the overwhelming cultural influence of photography, film and other, electronic forms of representation. Time and again photography and film have stimulated ideas of how memory works and Winter has fascinating tales to tell of “flashbulb memories” and “Kodak moments” and “movie-like memories secreted in the subconscious.” There was a sense, too, in which the filmed moment became not just a trigger for memory but the memory itself. “Mid-century advertisements for cameras and films,” she notes, “presented home movies as a way to relive the past with perfect fidelity.” This equation of film with memory assumed even greater prominence in the last decades of the twentieth century with the rise of what came to be called repressed memory syndrome, closely followed by its oppositional twin, false memory syndrome.

Winter moves surely through the complex set of cultural circumstances that contributed to the rise and fall of repressed memory syndrome and in doing so manages to be fair both to the proponents and to the critics. It is difficult now, even though the events took place so recently, to recapture the intensity of emotion and conviction with which both sides argued their case. The repressed (sometimes called “recovered”) memory phenomenon, in which victims, often at the prompting of zealous therapists, claimed to have suffered appalling abuse at the hands of relatives or others close to them, now seems like an extreme playing out of our long-held cultural belief that the truth lies buried within us and that only by confronting that truth can we be free to be our true, unencumbered selves. It is remarkable how often the recovered narrative of “what really happened” was expressed in terms of a film or a set of photographs. “It may seem,” Winter quotes, from a 1989 pamphlet explaining the process of triggering and releasing repressed memories, “that we are seeing a slide projector, with pictures flashing very rapidly before our eyes.” Only by confronting these pictures, these buried memories, it was argued, could the victim assume “control” of the memories and themselves.

The sceptics, those who responded to the emergence of repressed memories by recasting them as false, did not so much question the existence of the memories, or their essentially cinematic nature, as where they came from. In many cases, the sceptics argued, they did not come from lived yet unacknowledged experience, but rather were created unconsciously in response to suggestions from outside or to other, random stimuli. The argument lingers on, with no definitive answer. Are recovered memories real, in the sense that they correspond to events that actually happened, or aren’t they? The answer, frustratingly, is that it rather depends. The remarkable thing about the battle between repressed and false memories, which raged so strongly for much of the eighties and the early nineties, is how quickly the heat went out of it. It is almost as if we grew wary suddenly of looking so deeply into ourselves, of recovering long-buried memories that would explain who we really, really are. The memory project, once so earnest in its determination to get at the truth, has developed into something more focused on placing – rather than resurrecting – ourselves, as we search the registries of births, deaths and marriages, or connect and reconnect with old friends on Facebook.

When Mayer-Schönberger or Gleick speaks of the need to turn away from memory for a while and focus on forgetting, or Winter of the new interest in the therapeutic potential of deleting memories rather than resurrecting and confronting them, they are identifying ways in which we might productively respond to the burden of too much information and too much memory. There is a corollary to this new approach to forgetting. If, as we have long believed, we are the sum of our memories, then rather than rely on memories from the past to define us, memories over which we may struggle to exert effective control, why not concentrate on creating memories for the future? This notion, of creating memory rather than letting it create you, is very much in the air. Camera manufacturers, for example, no longer represent their product as “a way to remember the past with perfect fidelity,” as Alison Winter reminds us they did in the seventies. They are much more likely nowadays to offer us the opportunity to “create memories.” Or, to put it another way, to take control of our future memories, rather than letting our memories control us from the past. •

The post Memories for the future appeared first on Inside Story.

Quiet, please https://insidestory.org.au/quiet-please/ Tue, 10 Apr 2012 02:05:00 +0000

Are we so impressed by the power of collaboration that we’ve come to overvalue working in groups, asks Jock Given

The post Quiet, please appeared first on Inside Story.


IMAGINE a line-up of very young babies, all just four months old. Your job is to predict which ones will turn out to be introverts and which ones extroverts.

Experimentation is allowed, so you decide to test their reactions to sounds, sights and smells. Tape-recorded voices are played and balloons are popped. Colourful mobiles dance in front of their eyes. Cotton swabs are doused in alcohol and placed near their noses.

One in every five babies cries lustily and pumps its arms and legs. These will be the extroverts, right? And the two in five who stay placid, moving their limbs a little but without all the drama, will be the introverts? (The other two in five fall between the extremes.)

Wrong, says Jerome Kagan, a professor at the Laboratory for Child Development at Harvard, whose research and ideas are one of the foundations of Susan Cain’s Quiet. Kagan has been running longitudinal studies of temperament for decades. He conducted these very experiments with young babies, then brought the same children back to his lab for more tests at two, four, seven and eleven years old.

Kagan found that the infant limb-pumpers were more likely to turn into quiet, introverted children, and the quietest babies tended to become relaxed, confident extroverts. This seems counterintuitive, but it was just what Kagan had been expecting. He had a hypothesis about the biology of the very responsive babies, whom he called “high-reactive,” and the unresponsive ones, whom he billed “low-reactive.”

The sensitivity of the nervous systems of the high-reactives seemed to be linked not just to noticing novel or scary things but also to noticing in general. To high-reactive babies and children, the world could be impossibly stimulating. They would see danger where others didn’t, and complexity where others found simplicity or nothing at all. They would be at home with puzzles, the more complicated the better. They would sense rejection where others felt acceptance and jubilation when others were unmoved, be alert and absorbed when others were bored, and feel deeply about things that others hardly noticed. High-reactives brought “an extra degree of nuance to everyday experiences.”

Children with this temperament could find it hard to fit in. Cain quotes one high-reactive child’s solution to the everyday puzzle of how to share toys in a group: “Alphabetise their last names and let the person closest to A go first.” (My prediction: The rest of the kids would instinctively implement a shambolic first-come-first-served scheme, and the extroverts would get the best toys. The high-reactive would despair, feeling deeply that alphabetising surnames was a fairer way to do it. She would be bewildered that others couldn’t see it and seek solace in inner worlds and private games where she could control things better.)

Despite the strength of his conclusions, Kagan insisted that high- and low-reactivity were not the only routes to introversion and extroversion. Indeed, when Cain went to interview him, he got testy about how people over-simplified his results in ways that implied a person’s temperament was necessarily their destiny. A child’s environment and experiences go to work on its temperamental predisposition. High-reactives don’t all end up intense introverts, although a quarter of Kagan’s high-reactives apparently suffer from “social anxiety disorder,” a chronic and disabling form of shyness.

According to one of his “colleagues and protégés,” however, the influence of a high- or low-reactive temperament “never fully disappears in adulthood.” Carl Schwartz, director of the Developmental Neuroimaging and Psychopathology Research Lab at Massachusetts General Hospital, retested Kagan’s kids as adults. In Cain’s words, he found that “we can stretch our personalities but only up to a point. Our inborn temperaments influence us, regardless of the lives we lead.”


SUSAN CAIN says that a third to a half of all Americans are introverts. Through history, the world’s introverts have included people like Moses, Albert Einstein and Charlie Brown.

Moses led the Jews out of Egypt, took care of them in the desert for forty years and turned up the Ten Commandments. How? Not by giving PowerPoint presentations to venture capitalists and setting big, hairy, audacious goals. He climbed a mountain and took careful dictation.

Albert Einstein once said, “I am a horse for a single harness, not cut out for tandem or teamwork,” and on another occasion, “It’s not that I’m so smart. It’s that I stay with problems longer.” And Charlie Brown’s introversion, according to one website, means that “he constantly rehearse[s] what he will say to the little redheaded girl” but also “prevents him from actually delivering the goods.”

Cain is an introvert herself and she’s naturally drawn to these people. But she needed to learn more about extroverts, so she headed off to “the Spiritual Capital of Extroversion,” Harvard Business School. Students go there partly to study but mainly to build a personal network that will last a lifetime. By day, they are consumed in group exercises. By night, socialising is “an extreme sport.” “Isn’t there anyone on the quieter side?” Cain asks one student. “I couldn’t tell you,” he replies.

The essence of a Harvard Business School education is to teach students to “act confidently and make decisions in the face of incomplete information.” Of course, no decision-maker will ever have complete information, even in hindsight. So what should they do? Act now? Or ask more questions, collect more data? “By hesitating, do you risk losing others’ trust and your own momentum?” asks Cain. “If you speak firmly on the basis of bad information, you can lead your people into disaster. But if you exude uncertainty, then morale suffers, funders won’t invest, and your organisation can collapse.”

The choice takes us some way beyond introversion and extroversion, but Cain believes this aspect of our personality is especially important.


HORSES are the most sensitive of creatures, but two dozen of them galloping down the home straight at Flemington make a sound like thunder. The Melbourne Cup is watched and listened to all over Australia and the world, but the Cups King, Bart Cummings, is a quiet man.

“He never said a lot, but he listened a lot,” jockey Roy Higgins told Les Carlyon, a journalist who has watched Cummings close up since he started winning Melbourne Cups in the 1960s and has now written a “personal portrait” of The Master.

“He’d throw you a question relevant to the horse, a short, sharp question, and he’d just stop and listen,” says Higgins. “Most of the time he wouldn’t be looking you in the eye – he’d probably be looking at a horse walking around or having a pick of grass. Everything was dead quiet and he’d take in every bloody word you said. He could absorb.”

First taking Australia’s richest race with Light Fingers in 1965, Cummings has trained a Melbourne Cup winner, on average, every four years since. Twelve Cups is “what elevates him to the highest place in the pantheon,” says Carlyon. It’s “as freakish as Bradman.”

If Susan Cain and Jerome Kagan are right, Cummings could have been a hell of a limb-pumper at four months. He seems a classic introvert and Carlyon’s portrait is full of quietude. The 1950 Cup winner trained by Bart’s father Jim, Comic Court, was “the quietest loveliest” horse. The son’s “exquisite colt” Beau Zam was “the sweetest of horses”; 1996 Cox Plate and Melbourne Cup winner Saintly “the most docile, quietest horse ever,” at least until he stepped onto a racetrack.

Bart’s father taught him “that horses were a puzzle waiting to be worked out and Cummings, who had extraordinary powers of observation (and not just with horses), was always staring and speculating, cross-examining his track riders, picking up hoofs, changing bits, looking in the unlikeliest places for the clue that might unlock the puzzle.” Cummings “lived in his solitary world and answered only to himself. He was patient to the point of being perverse about it,” though that should not obscure his “fanatical intensity” or ruthless competitive spirit.

A couple of times we see and hear Bart let things out, but they are hardly fireworks. In 1988, when Beau Zam won the St Leger at Randwick by ten lengths, Cummings, unusually, led in his own horse. “Too much post-race hysteria tends to frighten the horses and the horses are the real heroes,” Carlyon thinks.

Then in 2010, after perhaps the best horse he’d ever trained, So You Think, was sold by its owner and lost to his stable while Cummings was in hospital, Bart said the owner’s racing manager “talked [the owner] into it while I wasn’t there. He did it while I was in hospital – that was the worst part.” This was not just any owner: it was Cummings’s most successful, Dato’ Tan Chin Nam, for whom the Master won four Cups: Think Big (twice, in 1974 and 1975), Saintly and Viewed (2008). They’d sought and worked horses together for four decades, found one “about as near to perfect as possible,” and suddenly it was gone.

In the recession of the late 1980s, Cummings was almost ruined financially when a scheme to buy yearlings and sell off units to investors collapsed. A fire sale of the horses raised $11 million less than the debts run up to buy them. Cummings had shaken hands with his partners and thought they were working together in a joint venture; the Federal Court found the debts were all Bart’s. The Master was on his own.

A scheme of arrangement was worked out with creditors. Cummings went back to what he was best at. Let’s Elope won the 1991 Caulfield and Melbourne Cups double.


INTROVERSION is not shyness, although they might look the same. Shyness, says Cain, is “the fear of social disapproval or humiliation.” Introversion is just a “preference for environments that are not over-stimulating,” what writer Winifred Gallagher calls “the disposition that stops to consider stimuli rather than rushing to engage with them.”

So, Cain argues, we can have shy extroverts like Barbra Streisand, a larger-than-life personality paralysed by stage fright, and non-shy introverts like Bill Gates, who apparently keeps to himself but is not fussed by the opinions of others.

Introversion and extroversion are not inherently good or bad, as the widely used Myers-Briggs Type Indicator stresses. A favourite tool for those off-site corporate team-building exercises, Myers-Briggs identifies an individual’s preference for introversion or extroversion as one of four personal “dichotomies” specified or implicit in Carl Jung’s theory of psychological types.

Myers-Briggs calls it “Favourite World.” Do you prefer to focus on the outer world (extroversion) or on your own inner world (introversion)? The other three are about information (sensing/intuition), decisions (thinking/feeling) and structure (judging/perceiving). These attributes can be combined in sixteen different ways, corresponding to different personality types. Charlie Brown, for example, was probably an ISFP: “a very low need to lead and control others, and yet driven by a desire to see everything – plants, animals, and people – living harmoniously,” according to one description.

Explains the Myers & Briggs Foundation:

Personality type is what you prefer when you are using your mind or focusing your attention. Studies and experience have shown that there are consistent patterns for each person. For example, one pair of preferences is about whether you choose to spend more time in the outside world or more time in your inner world. We call this a preference for Extroversion or Introversion. Neither is wrong. You can do both. You just prefer one.

Read the publicity and op-eds about Quiet and you get the impression Susan Cain wants an open season on extroversion. Out with brainstorming sessions, groupthink, open-plan offices and the bankers who led us into the GFC without listening to the quiet guys who saw it coming.

She doesn’t. She cites a lot of research studies that simply found brainstorming is not what it’s cracked up to be. Loudly expressed ideas prevail over good ones. Pumped-up participants believe they have performed better than they have. But getting the team together to toss around ideas might still be a worthy goal, so long as a feast of great, fresh ideas is not thought to be the principal benefit.

Cain also cites research studies that show brainstorming does work online. When online groups are properly managed, they do better than individuals, and the larger the group the better it performs. Academic researchers who work together electronically, from different physical locations, have also been found to produce research that goes on to become more influential than the findings of those working alone or collaborating face-to-face.

That’s a huge qualification to the initial idea about the ineffectiveness of brainstorming. The “world that can’t stop talking” has a lot to do with the pervasiveness of information technology, but IT is also bringing productive new ways for introverts to contribute alongside its extroverted demands for people to be noisier.

Cain just worries that we’re now so impressed by the power of online collaboration that we’ve come to overvalue all group work at the expense of solo thought. “Introverts prefer to work independently, and solitude can be a catalyst to innovation… We failed to realise that what makes sense for the asynchronous, relatively anonymous interactions of the internet might not work as well inside the face-to-face, politically charged, acoustically noisy confines of an open-plan office.”

As for the quiet guys who anticipated the GFC, Australians could point out that the developed country that got through it best decided to “Go Early, Go Hard, Go Households.” Extroverts got us into this and now they had a plan to get us out of it.

Cain would probably not demur. She’s not asking us to stop doing things. She married an extrovert herself and wouldn’t have it any other way. Her point is just that, at least in her country, the United States, “we tend to overvalue buzz and discount the risk of rewards-sensitivity: we need to find a balance between action and reflection.”

The final chapters of the book are full of quiet, practical advice for people whose natural disposition is to “stop to consider stimuli rather than rushing to engage with them.”


THE paradox of Bart Cummings, says Les Carlyon, is that “a shy man, a man happy in his own solitary world, a man not given to speeches, is the racing figure everyone always wants to interview. The man who shies away from the media has somehow become a media darling. The world changes and he stays the same; other people look dated and he looks timeless.”

Cummings belongs “to the era of Don Bradman, when it was thought proper for sports heroes to be humble, and when they didn’t use social networking sites and a forest of exclamation marks to tell us about their trip to the supermarket.” Yet “Bart” has not needed a surname since he won his first three Melbourne Cups in a row in the mid 1960s. “Racing [is] smaller and he [is] bigger.”

Susan Cain has learned to front the corporate training sessions and conferences that once terrified her. She knows she wouldn’t have got a contract to write Quiet unless the publisher had thought she could do chat shows and book launches to sell it. “It’s not true that I’m no longer shy; I’ve just learned to talk myself down from the ledge.” She’s even done a TED Talk, one of those Everests of Internet Age extroversion.

At a workshop at the Public Speaking–Social Anxiety Center of New York, the instructor told her: “There are only a few people out there who can completely overcome their fears, and they all live in Tibet.” The secret to life, Cain now thinks, “is to put yourself in the right lighting. For some it’s a Broadway spotlight; for others, a lamplit desk.”

Bart Cummings has spent a long time in the spotlight but a longer time on dark mornings watching those big, sensitive puzzles gallop around Morphettville and Flemington and Randwick.

“You had a tear in your eye?” asked a reporter after Saintly won his Melbourne Cup:

Cummings: Yeah, didn’t have enough on it…

Reporter: Some said Saintly couldn’t stay?

Cummings: He told me to tell you he can now…

Reporter: How good is Saintly?

Cummings: Quite a nice horse. •

The post Quiet, please appeared first on Inside Story.

The madness industry
Wed, 17 Aug 2011

Jon Ronson has chased psychopathology from Gothenburg to Florida. Brett Evans reviews his new book
HAVE you ever met someone a bit glib, and easily bored, with a grandiose sense of self-worth? A charming but chronic liar perhaps – yet cunning, and manipulative, and lacking in empathy? A bit of a pants man, maybe, someone married a few times?

Have you ever met a bloke with these types of character traits who was also impulsive, irresponsible and lacking in realistic long-term goals? Did he also lack control over his behaviour? And when you found out that he had also been a delinquent in his youth and a versatile career criminal, did it come as no real surprise?

And most importantly, did this chap seem cold and detached, particularly at moments of real emotion? Did you ever catch him bunging it on, for example; shedding crocodile tears, perhaps?

In short, have you ever met the type of person who would rather push you out of a fourth-storey window than take responsibility for his own actions?

Well, if you have, you’ve almost certainly met a psychopath. And an encounter with a psychopath is unfortunately not as unlikely as you might hope. According to the people British writer and documentary maker Jon Ronson spoke to, one person in a hundred is a charming but dead-eyed nutter, incapable of feeling remorse.

Ronson has made a successful career reporting on the weird, the bizarre and the alarming. He’s hung out with David Icke, a man who thinks the world is run by shape-shifting, blood-drinking lizards from outer space. He’s gone down south to visit the Ku Klux Klan, despite being a Welsh-born Jew. And most famously, he’s exposed the secret program of the US military that sought to prove that a man could kill a goat by simply staring at it.

Given all this, I suppose it was just a matter of time before he got onto the trail of the most dangerous weirdos of them all: the psychopaths who live among us.


THE RONSON style of reporting is sometimes accused of faux naivety – a bit like watching Woody Allen’s screen persona covering Watergate: a nebbish with a notepad. But it gets results: people open up to him; he finds out stuff just like a good old-fashioned investigative reporter.

He often doesn’t do any research, for example, preferring to pursue his subjects with as few preconceptions as possible. And on these forays into other people’s realities – whether he’s giving an Islamic fundamentalist a lift in his car, or trying on the white pointed hood of a Klan member – Ronson likes to place himself as close to the action as possible, while at the same time remaining endearingly candid about his own confusion, obsessiveness and ignorance.

Ronson is not much interested in policy or passing judgement; his forte is people, in all their jaw-dropping craziness. It’s a style that has served him well for twenty years.

His latest book, The Psychopath Test, starts with a shaggy dog story. Scientists around the world are receiving copies of a strange book from an unknown source, and Ronson is asked by one of them to track down its anonymous author. The beautifully printed book arrives in their pigeonholes out of the blue, with key words cut out and cryptic exhortations added by hand. All the scientists think it must contain a hidden message.

Soon enough, Ronson discovers it was written by an “obsessive crackpot” in Sweden. The book doesn’t actually mean anything. Mystery solved.

When Ronson reveals the mystery’s solution to the scientist who asked him to look into it, she is a little underwhelmed. But Ronson is excited. The actions of a crackpot had set off all this activity: emails had sped around the globe; a little community of gossip had formed among the scientists concerned; and he had even made an uncomfortable trip to Gothenburg on a budget airline to crack the case. And it was all because of one man’s unbalanced mind.

“Suddenly madness was everywhere,” Ronson writes, “and I was determined to learn about the impact it had on the way society evolves. I’ve always believed society to be a fundamentally rational thing, but what if it isn’t? What if it is built on insanity?”

Society is rational? Has Ronson failed to read his own books or view his own documentaries? As Ronson discusses later in the book, all journalists – not just writers like himself who specialise in strangeness – are always seeking out those rare moments of craziness in interviews that will suddenly set a story alight. As his friend, the documentarian Adam Curtis, warns him at one point: “We all do it. We wait for gems. And the gems invariably turn out to be madness.”

So it’s not long before Ronson – rumpled, anxious, eager to please, and amazed at his own audacity – sets off in the same spirit as Mole heading for the River Bank on the first day of spring, looking for some psychopaths who would make good copy.


WHAT follows is a series of classic Ronsonian encounters. Unlike a traditional journalist, Ronson doesn’t do interviews as such; he creates a relationship in which his thoughts, reactions and overreactions are just as important as his subject’s story and opinions. He must have been very excited indeed when he discovered the rich seam of opal that is “Tony.”

Tony was arrested for a violent crime and decided – cleverly, he thought – to fake madness to avoid a stay in prison. He expected to be put in a country hospital with “Sky TV and a PlayStation.” Instead he got sent to the Dangerous and Severe Personality Disorder unit at Broadmoor. For twelve years.

Ronson is introduced to Tony by the Church of Scientology. Scientologists are famously very down on psychiatry and they see Tony’s predicament as just another example of the profession’s perfidious influence. Tony, it turns out, is a charming, friendly man who, when he first meets Ronson, is neatly – and incongruously – dressed in a suit. But as we all know, first impressions often lie. In fact, as Ronson soon discovers, the authorities at Broadmoor have a very good reason for keeping Tony locked up: he’s a psychopath.

After meeting a real live one, Ronson sets out to talk to the people who spend their lives studying and trying to treat psychopaths. And Ronson being Ronson, it’s not long before he’s discovered “the world’s first ever marathon Nude Psychotherapy session for criminal psychopaths.” It’s what might be described as a journalistic diamond mine.

At the Oak Ridge Hospital for the Criminally Insane in Ontario back in – naturally – the sixties, a young psychiatrist named Elliott Barker managed to convince the authorities that the best way to treat psychopaths was to dig under their apparent normality and expose their hidden madness to themselves. Barker proposed achieving this by running LSD-fuelled encounter groups full of naked psychopaths.

So did it work, you ask? No. In fact, the Oak Ridge psychopaths reoffended at a higher rate than those released from other institutions. And most disturbingly, as one of its most infamous alumni, the serial killer Peter Woodcock, once explained on a BBC documentary, Oak Ridge only succeeded in making him, at least, into a more devious psychopath. “All those chats about empathy were like an empathy-faking finishing school for him,” Ronson comments.

Eventually Ronson finds another Canadian to guide his journey up river into the heart of psychopathology. The tough-minded and highly influential Dr Robert Hare developed a diagnostic tool called the Psychopathy Checklist-Revised, or PCL-R. It is now used all around the world to identify psychopaths. The staff at Broadmoor used the PCL-R to identify Tony as a psychopath.

Hare has been studying psychopaths for over thirty years and he doesn’t think getting them to hug each other – clothed or unclothed – will make any difference to their personalities. He believes that their amygdalae – the parts of the brain that process emotion – don’t work properly. When you or I look at a shocking photo – of a murder scene, for example – we are appalled, but a psychopath, Hare says, will be absorbed. Hare also thinks that not all psychopaths are just variations of Charles Manson. Sometimes they end up running large corporations. This idea intrigued Ronson so much that he went and interviewed Al “Chainsaw” Dunlap – the man who tore apart Sunbeam when he was its CEO, and who for a while worked for Kerry Packer in Australia.

When Ronson arrives at Dunlap’s mansion in Florida he is bemused to discover the garden is decorated with sculptures of predatory animals. “They were everywhere,” Ronson writes, “stone lions and panthers and eagles soaring downwards, their teeth bared, and hawks with fish in their talons, and on and on.” And the garden reflected the man and his philosophy.

When Ronson runs him through Hare’s checklist, the former corporate tsar turns many of the questions on their head. Manipulative? “I think you could describe that as leadership,” answers the great man. In the end Ronson decides that Dunlap isn’t a psychopath – which is just as well, given that it’s never a good idea to defame a millionaire.


IN TYPICAL Ronson fashion, by the end of his journey through the world of psychopathology he is as confused as ever; his earlier certainty about tools like the PCL-R is replaced with concerns about what he calls the Madness Industry. Can you really decide to keep someone in Broadmoor on the basis of a checklist, he asks. After all, psychiatrists aren’t infallible, are they? Why, for example, is there an epidemic of childhood bipolar disorder in the United States? Because health professionals use a checklist to diagnose it, and kids who are just moody or disruptive sometimes get labelled bipolar. And then the advocacy groups and drug companies arrive to encourage ever more diagnoses.

Psychopaths are a big problem, Ronson says, but the madness industry as a whole is also a worry. “There are obviously a lot of very ill people out there,” he writes. “But there are also people in the middle, getting over-labelled, becoming nothing more than a big splurge of madness in the minds of the people who benefit from it.” •
