Roaring back

A major new series about the postwar world poses the inevitable question: has the cold war returned?

“History has a way of roaring back into our lives,” warns Brian Knappenberger, whose latest documentary, Turning Point: the Bomb and the Cold War, is screening on Netflix. Tracking through ninety years of geopolitical upheaval from the rise of Stalin and Hitler to Putin’s invasion of Ukraine, the nine episodes give us history as a swirl rather than an arc. We are turning back into another phase of the cold war, it shows us, with equally massive and urgent risks.

An opening montage blends images of an atomic fireball, tanks in the streets, burning villages, crowds tearing down statues and leaders being saluted by military parades. Historian Timothy Naftali speaks through it all: at its peak, he says, the cold war touched every continent, shaping the decolonisation of empires and transforming domestic politics in the great cities of Europe, North America and Asia.

As Knappenberger acknowledges, the series is “insanely audacious.” It features original footage of critical moments, interviews with people who lived through the worst of them, and commentary from around a hundred historians and political insiders, many of whom were directly involved in the crises. Lessons have been learned from documentary-maker Ken Burns, with talking heads presented as dramatis personae. It’s all about managing tone and pacing so that reflections from the present create depths of field for visually evoked scenes from the past.

Knappenberger achieves something of the Burns effect in bringing out an at-times unbearable sense of how these events were experienced by those caught up in them. Rapid montages conveying the scale and density of the upheavals are counterposed with sustained evocations of individual experience.

Hiroshima, considered a purely military target by the US government, had a civilian population of 350,000. Prewar photographs show carts and bicycles in narrow streets spanned by arching lamps, a place of small traders and modest resources. People who were living in the city as small children deliver their testimonies steadily, quietly — though, as one of them says, visibly working to sustain his composure, “I hate to remember those days.”

Howard Kakita, aged seven, was on his way to school with his five-year-old brother when the warnings started. The explosion came as they returned to their grandparents’ house, which was obliterated. They dug themselves out of the rubble and fled the city through the ruins and carnage. Keiko Ogura’s brother told her he had seen something drop from one of the planes flying over, a tiny thing, which did not fall directly, but was caught for a while in the slipstream of the aircraft before arcing down. Then came the flash, the loss of consciousness and the awakening to a world in which “everything was broken.”

The effect of the blast on human bodies creates scars in the memory. Corpses turned to ash on contact. The river was full of them. It’s hard to watch, and to listen to these accounts, as it should be. They are a necessary corrective to Christopher Nolan’s Oppenheimer, with its brief, stylised evocation of the horrors, firmly subordinated to the main story of an American hero and his tribulations.

Is it even possible to see such a disastrous train of events from “both sides”? That, surely, is the question we were left with by the cold war that followed. For the first time in history, two global superpowers were frozen in a deadlock of mutually assured destruction. The rush to catastrophe was paralysed by symmetry.

That, at least, was one version of the narrative. But mutually assured paranoia, the more complex and confusing side of things, was anything but paralysing. The belief in an enemy working in secret on unimaginably evil weaponry provokes an overriding conviction that your own side must secretly work on something equivalent or preferably more lethal. This is the “hot” equation behind the cold war.

With technological escalation seemingly taking on a life of its own, no one could comprehend the scale of what was being created. The American government’s messaging was all about survivability — backyard fallout shelters, “duck and cover” drills for schoolchildren — as if a small wooden desk might be an effective shield.

The language used at the time betrays a pitiful divorce from reality. A military officer flippantly describes a planned thermonuclear test as something that will make Hiroshima and Nagasaki look like firecrackers. The monstrous Bikini Atoll explosion, with 7,000 times the power of the Hiroshima blast, gave its name to a new provocative style of swimwear.

“Institutional Insanity” is the title of the episode that deals with all this. It is as if the human brain simply isn’t coping with the consequences of its own activities. No one really knew what they were doing, comments nuclear historian Alex Wellerstein, and testing became a kind of game for hyperactive experimentalists.

In interviews recorded before his death last year, Daniel Ellsberg recalls joining “the smartest group of people I ever did associate with” at Rand Corporation, men seen in contemporary photographs relaxing with their feet up on their desks, sleeves rolled up, smoking. But it is Stanley Kubrick’s Dr Strangelove, grimacing in close-up as he advises on enemy psychology, who gets the last word in this particular sequence. “That was a documentary,” says Ellsberg.

Following Stalin’s death in 1953, his successor Nikita Khrushchev took a leaf out of the Strangelove manual. With an arsenal that couldn’t catch up with the massive build-up of his opponents, he sought to weaponise American fears by making exaggerated claims and mounting the covert Active Measures program, which spread disinformation through news media and other forms of public communication.

Against this backdrop, the achievement of Khrushchev’s ultimate successor, Mikhail Gorbachev, in defusing the collective psychosis was extraordinary, whatever his political failings from the Russian perspective. Polarised views of Gorbachev’s legacy remain one of the deepest challenges to the West’s comprehension of post-Soviet Russia. Putin’s pronouncement that the break-up of the Soviet Union was the greatest geopolitical disaster of the modern era has driven the new wave of military aggression that now confronts us.


One of Turning Point’s great strengths is its engagement with the complexities of moral arbitration, which are explored in extensive commentary from people in a position to offer genuine insights. Khrushchev’s great-granddaughter, Nina Khrushcheva, now a professor of international affairs in New York, gives an account of the secret speech of 1956, in which Khrushchev made public the scale of the purges of the Stalin era and condemned the cult of personality that had poisoned Soviet politics.

Stephen Kinzer, author of Overthrow: America’s Century of Regime Change from Hawaii to Iraq and other books on American cold war policy, delivers an excoriating analysis of the thinking behind interventions in Guatemala, Chile and Iran. Covert operations like these were one of the defining elements of the cold war; we get insider views of the activities of the CIA and its Soviet counterpart from dissidents now free to tell the tale and bring into focus some of the minor players who shaped events.

The cult of personality accounts for much of the evil in the modern political world, but an excessive focus on these figures is a problem in itself, as we are learning with the media response to Trump in America now. A personality-driven view of history glosses over the influence of those in the supporting cast — the secret service directors, spies, foreign policy advisers, diplomats, propagandists, journalists — and, it must be stressed, the voting public, who allow themselves to be swayed by flagrant manipulation.

Are we returning to the cold war? That question runs through Turning Point, culminating in the final episode on Ukraine. “History is not history,” says journalist Lesley Blume, “but we are in an ongoing tide.” •

Olympic origins

Queensland premier Steven Miles is learning an old lesson about sporting venues: sometimes it is best to love the ones you have

Brisbane’s deputy lord mayor was at the Commonwealth Games in Christchurch in January 1974, lobbying for the Queensland capital to host the 1982 Games, when the Brisbane River broke its banks.

On the night of the opening ceremony, 24 January, Cyclone Wanda crossed the coast at Double Island Point north of Noosa. It didn’t have the devastating winds of cyclones like Ada and Althea that smashed the Whitsundays in 1970 and Townsville in 1971, and it weakened rapidly, but the monsoonal trough it forced south to Brisbane stayed there for days. Small oscillations in its movement and intensity generated many stretches of drenching rain.

Across Brisbane, 600 millimetres fell on the first three days of competition in Christchurch — twenty-four inches, or two feet, in the language of the time. This was three times the city’s average rainfall for January, its wettest month. On 28 January the trough weakened and retreated north. A drier, cooler air mass from the south finally brought some blue sky to the capital of the Sunshine State.

The river peaked in the early hours of 29 January at a height not seen since 1893. Residents woke to find about 13,000 buildings damaged. Children due back at school that morning got an extra week added to their Christmas holidays.

Across the Tasman in Christchurch, Australia had won a bag of gold medals while the river rose. Raelene Boyle retained the 100 metres sprint title she won in Edinburgh, fourteen-year-old Newcastle schoolgirl Sonya Gray won the women’s 100 metres freestyle and Mexico Olympic champion Mike Wenden the men’s. As the waters receded, Boyle and Gray added the 200 metres to their 100-metre golds and Don Wagstaff completed a double in the diving pool.

The deputy lord mayor reported Brisbane’s promotional T-shirts “were without doubt the most sought-after item at the Games.” Its souvenir match boxes and coasters “were widely distributed and caused much interest.” Sandwiched amid coverage of the floods, the full-page advertisement for Brisbane’s bid in Christchurch’s main paper, the Press, caused “some concern,” but it was not fatal because “most people realised that occurrences such as these were not the normal thing.”

Whether or not the 1974 flood was abnormal depended on the time scale. The “River City” had not seen a flood as high in the twentieth century. During the nineteenth century it had seen four as high, including three much higher, and a total of eight floods classed as “major” according to the Bureau of Meteorology’s current classification system (3.5 metres at the City Gauge). Only two other “major” floods occurred in the twentieth century, the last in February 1931. This century is different again. The February 2022 flood was Brisbane’s second major flood after the even higher one in January 2011, and a further “minor” one occurred in January 2013.


The inaugural meeting of Brisbane’s Commonwealth Games Committee was held two months before the Christchurch Games. Chaired by lord mayor and sports fan Clem Jones, the meeting was told an application had already been lodged for Brisbane to host the 1982 Games. Business representatives thought the city council’s report on possible venues was technically excellent but lacked ambition. By 1982, they thought, the city “would deserve a sporting complex of world-wide standard.”

Council representatives baulked at the zeal. They “could not commit the City to structures which could become ‘white elephants,’ or to a financial burden which it might be virtually impossible to meet.” After the floods, the committee’s next meeting was deferred, but not for long. Lord Mayor Jones and his deputy flew over the city in the 4KQ helicopter and were “amazed at the number of places which could be regarded as possible sites for the Games.” A sites sub-committee was whisked around nine possible venues in a council bus just three months after the flood’s peak.

The choice narrowed to the Northside versus the Southside. Deputy Mayor Walsh, representing the Chermside ward on the Northside, wanted Marchant Park redeveloped. Mayor Jones, representing the Southside’s Camp Hill ward, liked a site in the new suburb of Nathan, adjacent to the Mt Gravatt Cemetery and Griffith University, which would accept its first students the following year.

In late July, six months after the flood, a decision was reached: the Southside. It would be closer for visitors staying at the Gold Coast and more convenient for residents of the rapidly expanding southern suburbs.

The campaign for Brisbane to host the 1982 Games succeeded, although the likely “phenomenal” cost was much criticised. At the Montreal Olympics in 1976, where the Commonwealth Games Federation met to decide the venue for the ’82 Games, Brisbane found itself the only bidder. Montreal’s diabolical financial outcome scared others away.

New lord mayor Frank Sleeman assured Brisbane ratepayers they would pay only for the “bare essentials.” A new stadium would be built in the new suburb, but it would have a permanent grandstand seating just 10,000. “Temporary” seating would accommodate another 48,000. Work began immediately and the venue was first used in late 1975. Two years later, in Queen Elizabeth II’s silver jubilee year, it was named the “Queen Elizabeth II Jubilee Sports Centre,” or “QEII.”

There was one big problem with siting the main stadium on the top of a hill. One of the signature events at major games, the marathon, traditionally starts and finishes in the stadium. After the local distance-running community rejected a plan for the runners to complete three laps along the nearby South East Freeway, ending with a sharp climb back up to the stadium, organisers agreed to start and finish the race away from the stadium. (It was men’s only; the first women’s marathon was run at the 1986 Games in Edinburgh.)

A flatter, “city” course was mapped, like those becoming popular in places like New York, Chicago and London. For Brisbane, this meant using the river. The new route started and finished on the south bank, opposite the CBD. It headed out through the city and “The Valley,” across Breakfast Creek to the river at Kingsford Smith Drive, then doubled back to the river bank around the University of Queensland. TV cameras would capture the city at its most picturesque, spectators would get accessible viewing spots, runners would appreciate the cool breeze and flat ground in a city that doesn’t have much of it.

Held the day before the closing ceremony, the marathon did not disappoint. Big crowds lined the route. Australian favourite Robert de Castella found himself well behind two Tanzanians who were close to world record pace at the halfway mark. He set off to chase alone, catching Gidamis Shahanga just before they passed a heaving Regatta Hotel, then running side by side with Juma Ikangaa for a kilometre along Coronation Drive (named in 1937 when George VI was crowned). Morning peak hour traffic on the Sydney Harbour Bridge slowed as commuters tuned car radios to the struggle. Finally, “Deek” made a decisive break and won by twelve seconds.


Building the main stadium for the Commonwealth Games on a hill in the southern suburbs had helped, paradoxically, and indirectly, to re-energise an old conceit. Decades earlier, tourism promotions dubbed Brisbane the “River City.” Soon, the first of several major arts and cultural organisations began setting up on the South Bank. Expo 88 would draw millions of people from the suburbs, the state, the nation and the world to the banks of the big river.

Despite the best intentions, QEII struggled to avoid the fate those Brisbane City Councillors feared: becoming a white elephant. Track and field events take centre stage in Olympic and Commonwealth Games but local athletics events, even the biggest interschool carnivals, attract much smaller crowds at other times.

For a while, in the 1990s and early 2000s, QEII was back in business. On joining the national rugby league competition in the late 1980s, the Brisbane Broncos played at the sport’s traditional home in the city, Lang Park. A few years later, after the temporary seating at QEII was made a little more permanent, they moved there and started drawing Commonwealth Games–like crowds to the renamed “ANZ Stadium.”

Annual State of Origin matches against New South Wales, though, stayed at Lang Park. The regular monster crowds at ANZ declined. Eventually the state government and others decided to revive the old cauldron. The two “Origin” matches played at ANZ in 2001 and 2002 while Lang Park was rebuilt were the last.

In 2003, the Maroons and Broncos returned to the new “Suncorp Stadium.” They have been there ever since, sharing the venue with the Queensland Reds (rugby union) and Brisbane Roar (soccer). Last year, it was at Suncorp that the Matildas played their World Cup quarter-final against France, which ended in that epic, victorious penalty shoot-out.

QEII went back to being a track and field venue, the Queensland Sports and Athletics Centre, “QSAC.” It was used as an evacuation centre during the 2011 floods. After Brisbane won the right to hold the 2032 Olympics, there was a chance it might be revived again as a temporary venue for cricket and AFL while the traditional home of those sports in Queensland, the Gabba, was being remade as the main Olympic stadium at a cost of $2.7 billion.

That was until Monday, when QSAC got an even bigger future. Queensland’s government considered the recommendations of a committee set up to propose further options after the earlier rejection of the Gabba rebuild. The committee recommended that a wholly new stadium be built at Victoria Park, at a cost of over $3 billion, and eventually replace the Gabba as the home of cricket and AFL in Brisbane. Both recommendations were rejected. (Victoria Park was one of the sites rejected by Clem Jones’s 1974 committee.)

The Gabba is going to stay the Gabba, with a modest upgrade. Victoria Park is going to stay Victoria Park.

The winner is… QSAC! The stadium on the hill will rise again to host the track and field events at an Olympic Games fifty years after it staged them for the Commonwealth Games. At a cost of $1.6 billion, permanent seating will be increased to 14,000, and total capacity will touch 40,000 for the period of the Olympics, some way below the 1982 full houses.

The other winner is Suncorp Stadium, with its larger capacity of more than 50,000, which will get the opening and closing ceremonies.

The marathoners? They will surely follow the river again, winding out, back, out and back, sticking to the old, deceptively gentle watercourse that has always drawn people to this place. •

Information about Commonwealth Games planning is taken from Brisbane City Council committee minutes and files, and about the 1974 floods from the Department of Science/Bureau of Meteorology’s “Brisbane Floods January 1974” (AGPS, 1974). Other information drawn from Melissa Lucashenko’s Edenglassie (2023), Margaret Cook’s A River with a City Problem (2019) and Jackie Ryan’s We’ll Show the World: Expo 88 (2018), all published by UQP.

Prescient president

On the Middle East, renewable energy, American power and much else, Jimmy Carter was ahead of his time

Forty-five years ago an American president took a great gamble. He invited the prime minister of Israel and the president of Egypt to the United States to negotiate a Middle East peace agreement.

Ambitious? Yes. Cyrus Vance, president Jimmy Carter’s secretary of state, called it “a daring stroke.” Foolhardy? Many thought so, including members of Carter’s staff.

Failure was a real possibility and would reflect badly on Carter, already struggling with a perception that he lacked authority. Egypt and Israel were sworn enemies who had been fighting wars since the creation of the state of Israel in 1948.

Carter took Menachem Begin and Anwar Sadat to Camp David, the presidential retreat in the Maryland mountains outside Washington, and kept them there for the next thirteen days. A media blackout prevailed until an agreement was reached. Kai Bird, author of The Outlier, a 2021 biography of Carter, described his approach as “sheer relentlessness.”

Sadat and Carter wore down an intransigent Begin until he succumbed, agreeing to a peace treaty with Egypt, including relinquishing control of the Sinai Peninsula, taken from Egypt in the 1967 war, and the dismantling of Israeli settlements there.

The agreement also included the election of a self-governing Palestinian authority in the West Bank within five years, together with (according to Carter’s detailed record) a five-year freeze on Israeli settlements there. Within three months, Israel started on a major expansion of West Bank settlements, with Begin denying the freeze had been part of the official agreement and Carter telling his staff that Begin had lied to him.

The peace treaty with Egypt, the strongest Arab state, stuck, although it cost Sadat his life. He was assassinated in 1981 by members of the Egyptian Islamic Jihad, who condemned him as a traitor for the Camp David accords.

Carter’s hopes for a broader Middle East peace have proved elusive ever since, although he could clearly see the consequences. Near the end of his presidency he wrote in his diary, “I don’t see how they” — the Israeli government — “can continue as an occupying power depriving the Palestinians of basic human rights and I don’t see how they can absorb three million more Arabs in Israel without letting the Jews become a minority in their own country.”

Nevertheless the accords were a notable achievement and unimaginable in the context of the Middle East politics of recent decades. Carter reaped a political dividend but also paid a cost: relations with the enormously powerful pro-Israel lobby in the United States were never the same again. They had not expected an American president to act as an honest broker.

Carter’s single term in the White House is generally rated among the less impressive in the presidential rankings. Yet his presidency has undergone a re-evaluation given his significant achievements in foreign and domestic policy, which look all the more substantial from today’s perspective.

In the tradition of the best political biographies, Bird gained access to volumes of material, including the copious personal diaries Carter kept as president as well as those of important figures in his administration. To learn that senior members were eating sandwiches at an important meeting in the cabinet room may not be vital to our understanding but it does point to a notable attention to detail.

Reading the narrative from the inside confirmed much of what I observed from the outside as a foreign correspondent in Washington during most of the Carter presidency. But it did so in much starker relief.

For example, the tensions between secretary of state Vance, the diplomat, and national security adviser Zbigniew Brzeziński, a cold war warrior, were evident at the time, but not their depth. Bird provides instances of what he called Brzeziński’s “highly manipulative” approach; Vance called him “evil, a liar, dangerous.”


Carter, a peanut farmer from small-town Georgia with a distinctive southern drawl, was an improbable candidate for the White House. He was a practising Baptist for whom, unlike many politicians, his religion was more than a veneer.

In a south where the echoes of the civil war still resonated and segregation continued in practice if not in name, he took a stand against racism. Yet he also was a skilled politician, elected as governor of Georgia despite his reputation as not being a typical white southerner and pragmatic when he thought he needed to be, including by downplaying his anti-racist credentials.

Still, running for president was a huge leap. He wasn’t taken seriously until he won the New Hampshire primary, and even then he was viewed with scepticism by leading members of the east-coast Democratic establishment. “He can’t be president,” said former New York governor Averell Harriman. “I don’t even know him!”

Sceptics dismissed him as self-righteous. His promise to voters that “I’ll never lie to you” prompted his friend and adviser Charles Kirbo to comment, perhaps not completely in jest, “You’re going to lose the liar vote.” But he came across to voters as sincere and authentic. And then, as now, coming from outside Washington was an advantage.

Circumstances played a large part: his Republican opponent was Gerald Ford, the sometimes hapless former vice-president who had served out the balance of Richard Nixon’s term following Nixon’s resignation over Watergate. Even then, Carter won only narrowly.

In elite Washington, Carter’s team of knockabout southerners were often dismissed as hicks. But, like Carter, they were not easily deterred.

Carter brought a luminous intelligence, idealism and diligence to the White House that stands in stark contrast to the era of Trump. He argued that the world was not so easily categorised in traditional American black-and-white terms — that there was more to foreign policy than a contest between the United States and the Soviet Union. He preached against the “inordinate fear of communism” that had led to Washington’s embracing of some of the world’s nastiest right-wing dictators. The Vietnam war, he said of this approach, was “the best example of its intellectual and moral poverty.”

Bird writes that Carter rejected “any reflexive notions of American exceptionalism. He preached that there were limits to American power and limits to what we could inflict on the environment.” America didn’t go to war during Carter’s presidency — an exception up to that time and since.

He elevated human rights in foreign policy. It earned him derision from hardheads but it enhanced America’s reputation abroad, its so-called soft power.

Like any politician, though not as often, he compromised and backtracked when he judged that politics required it. Against his better instincts, he approved development of the MX missile, an expensive boondoggle championed by defence hawks, writing in his diary that he was sickened by “the gross waste of money going into nuclear weapons.”

In the wake of the OPEC oil embargo, when he was trying to persuade Congress to pass legislation to restrict energy consumption and provide funding for alternatives such as wind and solar, he diarised that “the influence of the oil and gas industry is unbelievable.” To set an example, he put solar panels on the White House roof and predicted that within two decades 20 per cent of the nation’s energy would be generated by solar power. He hadn’t counted on his successor, Ronald Reagan, who removed the solar panels as one of his first acts as president, nor on the ideological climate wars that followed.

While those actions were triggered by the energy crisis, he was receptive to the emerging issue of climate change. Just before leaving office, he released a report from his environmental think tank predicting “widespread and pervasive changes in global climatic, economic, social and agricultural patterns” if the world continued to rely on fossil fuels. It was a prescient warning almost half a century ago.

Carter’s domestic reforms included deregulation of sectors of the American economy, including banks and airlines, thereby increasing competition and reducing prices, though also bringing negative consequences. Consumer regulations led to mandatory seatbelts and airbags and fuel efficiency standards — something Australia is finally getting around to introducing almost half a century later. Environmental laws were passed to reduce air and water pollution; highly contested legislation locked up a large part of Alaska as wilderness and national parks, preventing oil and gas exploration.

In foreign policy, the Panama Canal treaties relinquished American control of the canal, returning sovereignty to Panama. Carter completed the normalisation of relations with China started under Nixon and negotiated an arms control agreement with the Soviet Union.

Other reforms proved to be harder sledding. Legislation on health reform that Carter thought could pass Congress was judged inadequate by Democratic liberals such as senator Edward Kennedy, who championed comprehensive national health insurance and used it as a platform to unsuccessfully challenge Carter for the Democratic nomination in 1980. It would take another thirty years for Barack Obama’s administration to enact significant, if still not comprehensive, healthcare reform.

Carter was never completely accepted by the traditional Democrats that people like Kennedy represented. It came down to suspicion about his Southern roots. Too conservative for northern Democrats, he was too much of a liberal for many southern Democrats and Republicans.


By 1979, with Americans waiting in long queues to buy petrol and paying what were then exorbitant prices for the privilege (US$1 a gallon), Carter’s presidency was at risk of sliding into oblivion. Against the almost unanimous advice of his staff, he decided on another Camp David retreat, this time a domestic summit, inviting some of the nation’s leading citizens to come up with ideas for the nation’s future. What was unusual then seems extraordinary now.

Over ten days a parade of “wise men” travelled to Camp David to diagnose the nation’s ailments and remedies. As with the Begin–Sadat summit, the rest of the nation was kept in the dark by a media blackout.

Carter emerged to give an address to the nation like none other. Sounding more preacher than president, he said America faced a fundamental crisis of confidence that no amount of legislation could fix. Americans were losing their faith in the future, worshipping “self-indulgence and consumption.”

Taking the side of the people while lecturing them at the same time, he said he thought no better of the behaviour of a paralysed Congress pulled in every direction by special interests. The immediate test was beating the energy crisis, on which he announced a series of initiatives, including a windfall profits tax on the oil industry to finance the development of domestic sources of energy, among them coal and a national solar energy “bank.” (His focus was on cutting dependence on imported oil, rather than climate change.) He announced plans for rebuilding mass transit systems and a national program for Americans to conserve energy.

Contrary to the fears of his hard-headed advisers, the speech was a great success, reflected in surges in Carter’s approval ratings of 11 per cent in one poll and 17 per cent in another. He was able to convey that most precious of political commodities — sincerity.

But these and other achievements were overwhelmed late in his term by the Iranian hostage crisis. Its origins lay in the Islamic revolution and the toppling of the Shah, whom the CIA had effectively reinstated as ruler of Iran in 1953 following the previous Iranian government’s nationalisation of the oil industry. Concerned by the risk to Americans in Iran, Carter resisted efforts to allow the Shah to seek refuge in the United States; but he eventually succumbed to pressure from David Rockefeller, Henry Kissinger and other establishment figures to allow him in on the pretext of urgent medical treatment.

Two weeks later, Carter’s worst fears were realised when Iranian students stormed the US embassy in Tehran and took sixty-six hostages. When diplomacy failed, Carter authorised a complex and risky rescue mission involving ninety-five commandos, a C-130 transport plane and six helicopters. A series of mechanical failures and accidents, including a collision between one of the helicopters and the C-130, resulted in the mission being abandoned.

The hostage crisis plagued the remainder of Carter’s term, reinforcing perceptions of him as a weak president. It subsequently became clear that the campaign team for Republican nominee Ronald Reagan worked behind the scenes with Iranian representatives to delay the release of the hostages, promising a better deal if he won the election. Yasser Arafat, leader of the Palestinian Liberation Organisation, had negotiated freedom for thirteen of the hostages the previous year and told Carter years later that he had rejected approaches from Reagan officials offering an arms deal if he could delay the release of those remaining.

The hostages were released on the day after Reagan’s inauguration following his landslide win in the 1980 election. Soon after taking office, the new administration, despite publicly maintaining Carter’s embargo on arms sales to Iran, secretly authorised Israel to sell military equipment to Iran.

The hostage crisis was not the only reason for the relatively rare election loss by a first-term president. Carter’s support was sapped by the 1970s ailment of stagflation — high inflation and stagnant economic growth — together with the energy crisis. Reagan, the former Hollywood actor, had an appealing personality and a now-familiar slogan: “Make America great again.”


James Fallows, speechwriter for the first two years of the administration, says that Carter invented the role of former president. He certainly had an active four decades of public life following the presidency, with the 110-strong staff of the Carter Centre in Atlanta working on human rights, preventive health care, election monitoring and international conflict resolution.

Carter raised millions of dollars for a program that virtually eradicated guinea worm, a parasitic disease that had disabled and disfigured 3.5 million people a year in Africa and India. His centre helped distribute twenty-nine million tablets in Africa and Latin America for the treatment of river blindness, another disease caused by a parasitic worm. “Americans got used to seeing this ex-president, dressed in blue jeans with a carpenter’s belt, hammering nails into two-by-fours for a house under construction by a team of volunteers for Habitat for Humanity,” Bird writes.

In the 1980s, he spoke out about the concerns he had developed about the Middle East when he was president but he had judged were too dangerous to express publicly. “Israel is the problem towards peace,” he said, citing particularly the expansion of settlements on the West Bank. Accused of bias, he responded that “a lot of the accusations about bias are deliberately designed to prevent further criticism of Israel’s policies. And I don’t choose to be intimidated.” In 2006, he published his twenty-first book with the provocative title, particularly then, of Palestine: Peace Not Apartheid, earning him epithets such as “liar,” “bigot” and “anti-Semite.”

By then Carter had been awarded the 2002 Nobel Peace Prize for “decades of untiring effort to find peaceful solutions to international conflicts, to advance democracy and human rights and to promote economic and social development.”

After he was diagnosed with cancer in 2015 he said, “I’d like for the last guinea worm to die before I do.” Nine years later, aged ninety-nine and in palliative care, he is still going, if not strongly — a metaphor for a lifetime of indefatigability. •

Ben Chifley’s pipe

A stalwart supporter of the Labor leader emerges from history’s shadows

I once had the task of combing through a digitised file of letters to prime minister Ben Chifley held by the National Archives of Australia. Clicking away, I noticed one from a man named W.H. Reece, sent in August 1946.

“Would you please send me one of your pipes that you may have laid aside and you will not be likely to be using again,” wrote Mr Reece. “If it should be a bit strong, no matter. I know of a process that will overcome that. I have not been able to get a decent pipe for years.”

A quick glance was enough to tell me that this was not what I was looking for. But I printed the letter out for a closer look anyway. The writer was an aged pensioner, he said, twenty days short of seventy-five years, living alone in New Norfolk, Tasmania. He had raised a family of six daughters and three sons. All of the sons had served in the recent war, he added, with one still with the occupying force in Japan.

Reece had “battled for Labour” since he joined the Amalgamated Miners Union in 1889. “I started in poverty and I’m ending ditto, but I’ve no regrets and have no apologies to offer for my support of the ‘Grand Old Labour Movement.’”

If Mr Chifley were to visit Hobart during the forthcoming federal election campaign, and if he was spared that long, Reece promised to be in the audience. He was very optimistic that the Chifley government would be returned with a strong majority (it was). “I wish you and your good colleagues all the good luck that wishes can express.”

I was busy that day and so, having studied the letter for a few minutes and enjoyed a giggle about the pipe thing (what was that all about?), I tossed it aside and moved on. Fortunately, the pile I tossed it into was the “do not throw out under any circumstances” pile, where it stayed until the inevitable desk clean-up late last year when, at last, Mr Reece finally had my full attention.


This is my favourite thing, the deep study of a single archival record. It could be a letter, a telegram or a bunch of postcards discovered in a junk shop. It is remarkable what can be gleaned from seemingly insignificant clues, especially now that these clues can be run through so many newly digitised sources. Becoming deeply immersed in someone else’s life, trying to see the world through their eyes, must be my form of meditation.

Why this Mr Reece though? What is it about him in particular? Partly it was his surname that guided my hand that day towards the “do not throw out” pile rather than the recycling bin. I grew up in Tasmania and I remember my parents talking about the redoubtable Eric Reece, a former long-time Labor premier known as “Electric Eric” because of his ardent support for hydroelectric projects. Surely it had to be the same family.

But mainly I was captivated by what I perceive as a yearning on Reece’s part to stay connected with the world. It’s unintentionally expressed, but it’s there. Looking back over his long life, this proud and, I think, lonely man tells of the things that most matter to him: his work, his family and the labour movement. Not only that, he also imagines Labor’s next victory even if he is not alive to see it.

And the pipe thing? Chifley made his pipe a signature accessory and was rarely seen without one, but it does seem awful cheek to expect him to simply hand one over on request. Chifley wrote back: “Dear Mr Reece, thanks for your letter… I am sorry that for the present I haven’t a suitable pipe to send you. As you say, good pipes are very scarce these days.” (Actually Chifley usually had several on hand, gifts from family and well-wishers.) “I was interested to read of your lengthy support of the Labour Movement. You must have many memories to look back on.” And he signed off with best wishes.

Reece didn’t get his pipe but I doubt he was disappointed. Pipe smoking was a companionable habit the two men shared but Reece’s request, I suspect, was just an opening gambit. It has been said of Chifley that he used the lighting of his pipe as a stalling tactic while he thought through a response to a problem. And so, preliminaries over, Reece felt perfectly free to address his prime minister as an equal, one Labor man to another, to tell his story.

The letter wasn’t really about the pipe, and — fair warning — this essay is not really about it either.


William Henry Reece (often known even in official records as Will Harry Reece) was born in 1872, and he was indeed an uncle to Eric Reece. Fortunately for me, there is a biography of Reece the younger, Jillian Koshin’s Electric Eric: The Life and Times of an Australian State Premier (2009).

Koshin’s book begins with an examination of the Reece family’s working-class origins in mining towns in the northeast and west of Tasmania. The discovery of minerals — gold, silver, copper, tin — in the 1870s brought a sudden and massive economic boom to the colony based on interstate investment, higher export income, higher wages and increased incoming migration. In his 2012 history of Tasmania, Henry Reynolds describes the 1880s as one of Tasmania’s “sunniest” decades.

Patriarch Owen Charles Reece established himself as a miner in the 1870s but was frequently on the move looking for work. Koshin is at pains to show how the wealth that enriched investors and beautified the cities rarely trickled down to the poorest folk who had laboured to produce it. Across three generations, even in so-called good times, little changed for the Reece family.

Owen and his wife Jane had fourteen children but the first three, triplets, died in infancy. Jane was thirty-eight when she died in Scottsdale hospital giving birth to twins, who also died. Owen was left a widower with nine children to raise; our man Will (“I started in poverty…”) was the eldest. A few brothers down the line was George, eventually to become the father of Eric, who was born in 1909.

The Reeces’ lives were characterised by insecure and dangerous work and the strain and expense of constantly moving from one primitive slab-and-shingle hut to another in remote and isolated settlements. Because these clusters of dwellings were expected to be temporary, authorities would rarely invest in public amenities. Close-knit families relied on one another.

Out of these struggles emerged a writer, Marie E.J. Pitt. Originally from Victoria, she was married to a miner, William Pitt, and for about a decade beginning in the 1890s went with him to mining settlements in the northeast and west of Tasmania. They had four children, one of whom died.

Scribbling by lamplight, Pitt wrote of “an austere land of mountain gorges of ice and snow, and raging torrents of creeping mist and never-ending rain.” The land spoke another language, “superb in its silence, appalling in its melancholy grandeur.” Her pen was also driven by anger. This is how she begins her poem “The Keening”:

We are the women and children
Of the men that mined for gold:
Heavy are we with sorrow,
Heavy as heart can hold;
Galled are we with injustice,
Sick to the soul of loss —
Husbands and sons and brothers
Slain for the yellow dross!

Over nine more bitter stanzas she attacks mine owners, politicians and churchmen for having averted their gaze from the misery right in front of them. “The Keening” was published in 1911, but by then the Pitts had moved to Victoria because William had contracted miner’s phthisis. He died in 1912.


Will Reece, his siblings, nieces and nephews were among those children of the men that mined for gold. All the Reece men became union men. Poetry aside, trade unionism was the practical agent of change, the structure within which to advocate for safer working conditions, better wages and political representation.

Reece was a seventeen-year-old apprentice blacksmith at the tin mine in Ringarooma when he joined the Amalgamated Miners Union in 1889, the year of its formation in Tasmania. For some reason, though, he broke away from the family and left the mines behind. His parents were married with Baptist rites but Will appears to have converted to Catholicism, a most unusual thing to do in those sectarian times, and certainly enough to cause a family rift.

From the late 1890s he roamed through several agricultural districts in the northeast and in 1909, at St Mary’s, he married a woman named Catherine Cannell. In 1912 they went south to New Norfolk, a town nestling in the Derwent valley thirty-five kilometres northwest of Hobart. The landscape was far kinder than anything Will Reece had known growing up, and here the family settled for good.

Literate, articulate and gregarious, Reece would join anything. He played cricket and football, would swing an axe at a local woodchopping event and was always ready to chair a meeting, MC a church fundraiser or write a letter to an editor about some local grievance. Forced in 1915 to give up blacksmithing because of an accident, he opened a photographic studio; it failed, and he was declared bankrupt in 1921.

Clearly this man had bucketloads of self-belief. He stood twice, unsuccessfully, for the municipal council and then, undeterred, turned to state politics and was a candidate for Labor in the elections of 1919, 1922, 1925 and 1928. He failed each time.

Meanwhile he became an organiser for the Australian Workers’ Union, and here he found his métier. His nephew’s biographer noticed Will Reece signing up shearers, shed-hands, miners, labourers and roadmen across the state, including in mining centres on the west coast. New heavy-industry projects provided fresh fields for the AWU, and there was Will Reece, visiting the new carbide factory at Electrona in the south and the hydroelectricity works at Waddamana in the central highlands. With regular reports (this one is typical) he made himself well-known to the readers of the AWU’s national paper, the Australian Worker.

But the 1930s brought reversals. In 1931, more than a quarter of Tasmanian trade unionists were unemployed because of the depression. All the Reece men let their union membership lapse. Will Reece returned to manual labour and in 1934, aged sixty-two, was severely injured in an explosives accident while quarrying for gravel. He sustained burns to his face and temporarily lost his sight. In 1935 his wife Catherine died suddenly, leaving him with a clutch of children and teenagers.

In 1939 Will’s fifty-year commitment to the labour cause was celebrated at a special meeting of the New Norfolk branch of the Labor Party. Local MP Jack Dwyer spoke of Reece’s work to “uplift” the condition of the masses. Many of the privileges now enjoyed by the workers were due to his efforts, Dwyer noted, and the party was much indebted to him.

At about that time Will’s nephew Eric was embarking on his own (in his case spectacularly successful) political career. After failed attempts in 1940 and 1943, Eric was elected Labor member of the state House of Assembly in November 1946. He was in office as premier between 1958 and 1969, and again from 1972 to 1975, and was federal president of the Labor Party between 1952 and 1955.

His formative years had been similar to his uncle’s: he’d worked in mines and on farms from his early teens — joined the AWU at fifteen — spent most of the 1930s depression unemployed — got a job at the Mount Lyell copper mine in 1934 — was appointed organiser for the AWU there in 1935. Strangely, there does not seem to have been a strong association between uncle and nephew. In his 1946 letter to Ben Chifley, Will could have mentioned Eric as a promising youngster to keep an eye on, but he does not.

Still, Will and Eric Reece — and Ben Chifley as well, of course — were haunted by memories of hardship, and all strove for the same things: economic growth, full employment, increased standards of living, and social welfare for those who needed it.


There was nothing in Eric Reece’s makeup to prepare him for the social upheavals and cultural shifts of the 1960s and 1970s. He had grown up believing that the state’s natural resources — its water, timber and minerals — were there to be used for the common good. Famously, he rode roughshod over opposition to the hydroelectric scheme in southwest Tasmania that was to flood Lake Pedder in 1972–73.

Where some people wept at Pedder’s beauty, Eric Reece was belligerent and autocratic. In 1966 he taunted his opponents with the remark that Tasmania’s southwest contained only “a few badgers, kangaroos, wallabies, and some wildflowers that can be seen anywhere.” (Badgers? Did he mean wombats?) Tough old trade unionists like Reece knew what destitution looked like and were lit with a determination to do more than just overcome personal hardship; they were committed to structural reforms to improve the lives of all working people.

By this time, however, there had begun a great grinding of gears in progressive politics as young, idealistic, tertiary-educated people drifted away from Labor to the green movement. While this also happened elsewhere, perhaps the grinding came earlier in Tasmania.

Will Reece didn’t live to see any of this. Perhaps, as promised, he made it to Hobart in September 1946 to hear Ben Chifley’s two-hour campaign speech given to a capacity crowd at the town hall. “The whole country is prosperous,” Chifley declared that night. “That is the first ideal we have, and we go to the people on that record.”

Labor’s election loss in 1949 and Chifley’s death in 1951 must have saddened Reece. He died in 1953, with his boots on (so to speak) I hope, and his certainties still intact. •

Victors’ justice?

A major new book revisits the moral and legal ambiguities of the Tokyo war crimes trial

Now is a good time to be reassessing the Tokyo war crimes trial. Across East Asia and the world, the postwar global settlement is crumbling. This process has been very evident in Japan, though it has unfolded quietly there and attracted surprisingly little attention in the English-speaking world. Internationally, debates continue to rage about the definition of war crimes and processes for bringing war criminals to justice.

The Allies’ trial of Japanese wartime political and military leaders was intended to lay the foundations of a new, peaceful and democratic Japan by punishing the militarists who had led the country into a disastrous conflict. The notion that victors could judge the vanquished evoked controversy, both within Japan and internationally; yet in the late 1940s the pioneering Japanese feminist Kato Shizue could confidently write that “intelligent Japanese long ago decided that the punishment of the war criminals was inevitable, and they think the verdicts were just.”

Today, feelings are very different. Japanese conservative politicians (including prominent members of the present government) rail against what they label the “Tokyo Trial View of History,” which they blame for instilling a darkly masochistic view of the nation’s history in the minds of the Japanese population. The late prime minister Shinzo Abe was particularly emphatic in denying that the men convicted by the International Military Tribunal for the Far East should be regarded as criminals. The seven who were executed for war crimes following the Tokyo trial — as well as others convicted and given lesser sentences — are among those commemorated in the Yasukuni Shrine, where right-wing politicians and some senior military officers go to honour the spirits of the dead. As political scientist Gary J. Bass argues in his monumental new book Judgment at Tokyo, “the Tokyo trial misfired and fizzled,” revealing “some of the reasons why a liberal international order has not emerged in Asia, despite the wishes of some American strategists.”

The paradoxes at the heart of the Tokyo trial began to be visible well before the International Tribunal opened its hearings on 3 May 1946. Bass’s book starts by guiding readers through the concluding stages of the Pacific war and the impassioned debates among allied leaders about the treatment that should be meted out to the vanquished. (US secretary of state Cordell Hull was among those who initially favoured summary executions of Hitler and Japan’s wartime prime minister, Tojo Hideki.) A central figure in the early part of Bass’s narrative is Henry Stimson, US secretary of war at the time of the defeat of Germany and Japan, who played a key part in creating the conceptual framework that underlay both the German Nuremberg war crimes trials and the Tokyo trial.

In Nuremberg and Tokyo, the wartime leaders of the defeated nations faced three classes of criminal charge. Class A was the crime of waging (or conspiring to wage) aggressive war; Class B covered the war crimes set out in the existing Geneva Conventions, including mistreatment of prisoners of war; and Class C encompassed crimes against humanity. The difficulties lay in Classes A and C. There were no legal precedents for prosecuting people for waging aggressive war, nor for crimes against humanity, and even within the victorious allied nations some leading legal commentators were concerned that the trials were imposing newly invented laws retrospectively on the defeated.

The horrors revealed at Nuremberg helped to embed the notion of crimes against humanity both in public consciousness and in international law. But in Tokyo the key charge (though not the only one) was the crime of waging aggressive war — an offence for which no one had ever been prosecuted before the Nuremberg and Tokyo trials, and for which no one has been prosecuted since.

As Bass vividly shows, unease and disagreement about the moral and judicial basis of the International Tribunal’s proceedings haunted the Tokyo trial. Even Sir William Webb, the acerbic Australian judge who presided over the International Military Tribunal, privately questioned whether waging aggressive war could be treated as a crime, though he managed to suppress these doubts sufficiently to concur in, and hand down, the tribunal’s guilty verdicts on all twenty-five defendants who survived the trial. (Two died during the proceedings, and another was found mentally unfit to be tried.)

A further obvious paradox of the Tokyo trial was the fact that Emperor Hirohito, in whose name the war had been fought and hundreds of thousands of Japanese soldiers had gone to their deaths, never appeared in court. By the time Japan surrendered, the US government had decided that it would be politically expedient to retain the emperor as symbolic leader of the new Japan. Despite protests from Australia, he remained immune from prosecution.

Judgment at Tokyo, though, is not a dry analysis of judicial principles and legal arguments. It is a vivid blow-by-blow account of the trial, filled with colourful characters and moments of farce as well as tragedy. The Tokyo tribunal, though dominated by the colonial powers, was more international than its Nuremberg counterpart. Its eleven judges represented the United States, Canada, Britain, France, the Netherlands, Australia, New Zealand, the Soviet Union, China, India and the Philippines, and each judge brought with him (they were all men) his own experiences, professional training and personal prejudices. They spent their time in war-devastated Tokyo living an isolated existence in the Imperial Hotel, and relations between them were often tense. Chinese judge Mei Ruao took a deep dislike to Indian judge Radhabinod Pal; the British judge, Lord William Patrick, was derisively dismissive of his Filipino counterpart, Delfin Jaranilla. They were united, it seems, only in their shared aversion to the court’s president, William Webb.

Yet this is not a simple litany of fractiousness and failure. What the Tokyo trial achieved, in very difficult circumstances, was the collection of a mass of vivid and often searing evidence of the horrors of war, including many conventional war crimes: among them, the massacres and mass rapes of civilians in the Philippines and China, the mistreatment and killing of prisoners of war, and the brutal forced labour inflicted on tens of thousands of Southeast Asians and allied prisoners of war on the Thai–Burma Railway and elsewhere.

While taking readers through this evidence, Judgment at Tokyo also points out the silences: most notably, the absence from the trial of any serious discussion of Japan’s use of biological warfare in China. The US and Soviet authorities were well aware of this dark story but made sure it was kept out of the trials, because each was trying to obtain Japan’s biological warfare expertise for its own purposes.

Bass explores not only the events of the trial itself but also the subsequent destinies of the judges — particularly the very different fates of Mei Ruao and Radhabinod Pal. Mei, who had been appointed to the court by the Chinese Nationalist government of Chiang Kai-shek, decided hesitantly to return to mainland China in 1949 and throw in his lot with the new People’s Republic of China. Ironically, he fell foul of the communist authorities because of his fierce criticism of Japanese war crimes at a time when China’s government was trying to improve the country’s political relationship with Japan. He was publicly condemned during the Cultural Revolution and died soon after — only to be elevated to the status of national hero under current Chinese leader Xi Jinping, whose nationalist rhetoric echoes Mei’s own insistence that China should never forget the wartime horrors inflicted on its people by Japan.

The Indian judge Pal, by contrast, famously wrote a dissenting judgment that sweepingly rejected the right of the International Tribunal to judge the defendants. (Later, he also questioned the Nuremberg judgements and the reality of the Holocaust.) Pal’s lengthy statement of dissent made him the hero of the Japanese right, who feted him on his later visits to Japan and have cited his judgement ever since as justification for their own revisionist views of the war.


Judgment at Tokyo is based on a mountain of court records, government archives and interviews with the descendants of the judges and defendants, and Bass skilfully weaves all this together into a fascinating narrative. Despite the scale and scope of the book, though, there is one odd lacuna. It barely mentions a crucial counterpoint to the Tokyo trials: the story of the 4000-odd Japanese soldiers and military auxiliaries who were found guilty of Class B and C war crimes in trials held throughout East and Southeast Asia and the Pacific, of whom almost 1000 received the death sentence.

As Utsumi Aiko and other Japanese scholars have pointed out, these were the most tragic of the war crimes proceedings, for many of those who received the harshest sentences were low-ranking auxiliaries — some of them drafted from Japan’s colonies of Taiwan and Korea into the violent world of the Japanese wartime military only to be abandoned to their fate by the collapsing military machine that had recruited them.

As Gary Bass shows, the Tokyo trial had far-reaching implications for Japan and its Asian neighbours. Its fundamental flaw was its shakily based attempt to define the waging of aggressive war as a crime. The spectre of double standards and retrospective justice raised by this concept has never been laid to rest. This in turn allows historical denialists today not only to dismiss the trial as “victors’ revenge” but also, by extension, to whitewash the history of the war and depict the Tokyo defendants as innocent martyrs to a just cause. And the growing influence of that denialism, as Bass trenchantly observes, risks shackling Japan to a narrative of the war that is both “morally odious and historically dubious.” •

Judgment at Tokyo: World War II on Trial and the Making of Modern Asia
By Gary J. Bass | Picador | $39.99 | 912 pages

A dynamic of acceptance and revolt https://insidestory.org.au/a-dynamic-of-acceptance-and-revolt/ https://insidestory.org.au/a-dynamic-of-acceptance-and-revolt/#comments Tue, 27 Feb 2024 04:36:21 +0000 https://insidestory.org.au/?p=77396

Why the extraordinary Jack Lindsay deserves to be better known

Few people have known so much about so many things as Jack Lindsay. Even fewer have published so much. Lindsay grew up in Brisbane in the early years of the twentieth century, moved to Sydney in 1921, and then embarked on a sixty-year career as journalist, publisher, poet, critic, translator, novelist and historian. Living in England after 1926, he produced an astonishing number of books that found readers around the world; in a multitude of direct and mediated ways he made a major contribution to mid-twentieth-century culture and thought. Thirty-five years after his death comes Anne Cranny-Francis’s Jack Lindsay: Writer, Romantic, Revolutionary.

Well known to Lindsay enthusiasts, Cranny-Francis has written articles and organised conferences about his life and work, maintains a website, arranged the publication of his “political autobiography,” The Fullness of Life, and edited a volume of his selected poems. In this first book-length single-author study of Lindsay’s life and work she has hit on an elegant solution to the problem posed by her subject’s hyperactively full life. He was someone whose works demand attention to his ideas, and whose ideas demand attention to his life. Jack Lindsay is structured around a core of six chapters, each dedicated to one of Lindsay’s book-length studies of English authors: John Bunyan (1937), Charles Dickens (1950), George Meredith (1956), William Morris (1974) and two on William Blake (1927 and 1978). This frame is filled in with chapters that provide biographical and intellectual context and discuss his other relevant works, helping the reader to understand, without being overwhelmed, how Lindsay’s approach to writing was influenced by his experiences and ideas.

This structure works well to illuminate Lindsay’s eclectic, self-fashioned life-philosophy, with its associated preoccupations, values and imagery: the struggle for unity, culture as expressive work, the archetype of death and renewal. The system evolved over time, but many elements were present from the first.

Inevitably Cranny-Francis omits or barely glances at much of Lindsay’s output. She makes scant mention of his forty-three novels and seven biographies of artists, and it would be hard to guess from her book that Lindsay’s most cited study is about alchemy in Roman Egypt, or that the book of his most discussed by academics is a historical novel set in the British civil war.

Depending on what counts as a book, Lindsay published about 160 in his lifetime, as well as hundreds of articles, stories and poems. About half of his writing was historical and biographical, a quarter fiction, and the remainder criticism, social theory, translations, polemics and poetry. Most of his publications were concerned with the past, usually the ancient Greek and Roman worlds. Lindsay’s classical training is apparent in the eclectic character of works in which history, mythology, philology, archaeology, anthropology, aesthetics and philosophy are seamlessly blended.


All of Lindsay’s mature writing was underwritten by a self-fashioned philosophy or credo. Its most fundamental principle was what Cranny-Francis describes as the “embodied connectedness” of things. He often called it “vital unity,” “wholeness,” “Life” or “the fullness of life.”

In Lindsay’s thought the concept of vital unity assumes as many guises as energy does in physics. One of his symbols for it was Dionysus, the mysterious deity of wine and rebirth, leader of a disorganised band of enthralled creatures — satyrs, maenads, nymphs, centaurs, Pan the god of shepherds — who found no place on Mount Olympus. Another symbol was the figure of “the people,” which he sometimes called “the folk,” and occasionally “the masses,” each term with its particular political inflection. Human unity implied solidarity, equality, ethical responsiveness and mutual aid.

As Cranny-Francis observes, Lindsay extends the idea of unity to all spheres of human activity, including the natural world. John Bellamy Foster, noting Lindsay’s evocations of a “patient earth… ‘eternally reborn’ through labour and ritual practice,” identifies him as a forerunner of Marxist ecology.

Lindsay found the origins of the idea of unity in Plato, or even further back in Parmenides and Pythagoras, but a slightly less distant inspiration was the sixteenth-century excommunicated priest Giordano Bruno (1548–1600), who melded Renaissance humanism with materialism. Lindsay was stirred when he encountered Bruno in the early 1930s, subsequently writing a novel about him (Adam of a New World, 1936), and translating De la causa, principio e uno (Cause, Principle, Unity, 1962). Later he would claim that reading Bruno led him directly to Marxism.

Lindsay’s intense awareness of the interconnectedness of the living world had implications for his everyday life. Cranny-Francis quotes an episode from The Fullness of Life, set during his years with the poet Elza de Locre in the early 1930s, when he lived in desperate poverty.

A local farmer had given them a couple of rabbits as a neighbourly gesture. Confronted with the reality of having to skin and disembowel the animals before cooking, Lindsay found himself unable to proceed. He contemplated the economy of death on which a meat-eating society is based, particularly when social organisation has reached a point where meat protein is no longer essential to the diet: “One’s symbiosis with the earth is therefore in terms of unceasing violence and murder; and one knows, deep in one’s being, that one lives only by a system of blood-victims.”

“A communist society which is not vegetarian,” he concluded, “seems to me a hopeless contradiction.”


The young Lindsay called the absence of unity abstraction or dissociation; later, under the influence of Hegel and Marx, he favoured the word alienation. He argued that alienation has always been present in human life and has always provoked resistance. Throughout history that resistance has taken many forms — initiation rituals, shamanic flights, alchemy, art and poetry, and political revolt. The struggle against alienation shapes people’s relationships with one another and the world, motivates the protests of the wretched and exploited, and underlies attitudes to nature. Great thinkers and creative artists throw light upon its diverse manifestations.

Blake’s prophetic books explore the “world of false consciousness, of alienation,” according to Lindsay, and he praised Dickens for “the discovery of dissociation and the alienation of man from his fellows and his own essence, the stages of struggle against the dissociative forces, and the intuition (uttered in symbolic forms) of the resolving unity.”

Lindsay regarded religion as both a product of alienation and a form of protest against it. His vision of the world was also infused with hope for a fulfilment somehow always just out of reach. In a letter to Edith Sitwell on her conversion to Roman Catholicism in 1955 he confessed to having been at times “very close to the catholic creed… indistinguishable perhaps from ekklesia of the faithful — the people who are Christ.”

Affinities between his system and Christianity are not difficult to uncover: sin as alienation, humanity crucified, Life the Eucharist, Paradise a vision of love and freedom. He was familiar with such syncretisms in the ancient world: in a book about Roman Egypt he references a tomb in the Roman catacombs of Praetextatus on which Dionysus is identified with the Lord Sabaoth, the Lord of Hosts, and burials in the Vatican Necropolis of Christians who also worshipped Isis and Bacchus.

Alienation has become all-pervasive in the modern world, chiefly because of money and science. Following Thomas Carlyle, Lindsay often referred to the institutions and customs associated with money as the “cash-nexus.” From all the possible elements of human relationship associated with the exchange of goods, money abstracts a single factor, that of utility, and makes the remainder redundant. The dehumanisation implicit in the use of money reaches its apogee with capitalism, which turns life itself into a commodity. In his study of William Morris he declares that “a genuinely new society can be born only when commodity-production ends, and with it division of labour, money, market-systems, and alienation in all its many shapes and forms — above all alienation from labour.”

The other powerful alienating factor of modernity is the scientific method stemming from Galileo and Descartes, which Lindsay consistently attacked as “mechanical,” “divisive” and “quantitative.” Cranny-Francis notes that “Lindsay returns repeatedly… to Blake’s criticisms of science and the post-Enlightenment rationalism on which it is based.” Lindsay was not at all opposed to scientific inquiry, nor wholly dismissive of the achievements of post-Enlightenment science. But in Marxism and Contemporary Science (1949) and a later trilogy on alchemy, astrology and physics in Greco-Roman Egypt he refused to separate knowledge of “nature” from other kinds of knowledge. There is a single interconnected world, and all ways of knowing it are likewise interconnected. The “sciences” discussed in Marxism and Contemporary Science are not physics, astronomy or chemistry, but biology, anthropology, art criticism, psychology and history.

For Lindsay, the decisive proof that contemporary science had taken a wrong turning was the atomic bomb, the culmination of alienation’s will to self-destruction. Today he would no doubt make the same criticism of the digital revolution and genetics.


But there is a nagging problem with alienation, one that Lindsay, more of a poet than a philosopher, seems never to have addressed, and that Cranny-Francis does not address either. It parallels the problem of evil in religions that postulate a benign creator. Where does alienation come from? How can the world be a vital unity and at the same time a site of struggle against division?

Some cosmologies have an explanation. An idealist can say that the world of the senses is a flawed copy of a perfect and eternal world that is glimpsed only in thought. The unity is “above,” the struggle “below.” But Lindsay was trenchantly opposed both to idealism and to hierarchy. For him mental and spiritual phenomena are autonomous, but in the final analysis dependent on matter. Cranny-Francis mentions his debt to the Sydney-born philosopher Samuel Alexander. Alexander was an early twentieth-century advocate of emergence, the theory that complex systems produce attributes and activities that do not belong to their parts. Could emergence explain the origin of alienation? It isn’t clear how.

At a psychological level, though, Lindsay’s biography provides a paradigm case of a conflict between longed-for unity and actual division. His father was the writer and artist Norman Lindsay, one of Australia’s best-known humourists in the first half of the twentieth century, notorious for his sexual libertarianism and hostility to Christianity. Cranny-Francis dwells sensitively on Jack’s difficult relationship with Norman. “The story of father-son relationships threads through all of Lindsay’s writing, fiction and non-fiction,” she writes. When Jack was nine years old, Norman left his wife and three sons. The fatherless family moved to Brisbane, where young Jack lived in a state of genteel but disorganised impoverishment, loved but neglected by his vague and increasingly alcoholic mother until her sister’s family finally took charge and sent him to school. Unsurprisingly, the theme of a lost birthright appears often in Lindsay’s novels and histories.

Norman renewed contact with his son only after his academic achievements had earned him scholarships to Brisbane’s elite Grammar School and the University of Queensland. Lindsay, ecstatic to be restored to his famous father’s attention, was Norman’s devoted acolyte for the next decade. Then they fell out bitterly.

Norman’s entire life was a fierce act of will to sustain the exhilarating freedom of his adolescence, when he had followed his older brother out of a shabby, mined-out gold town to marvellous Melbourne and lived in careless poverty, pursuing a self-directed course in drawing, reading, flaneuring and witty companionship. Jack’s conception brought that delightful life to a sudden end. For the rest of his life Norman acted out his ambivalence, alternately praising and denouncing his son. In 1967 he wrote to him, “I can’t help but laugh when I think of what our biographers are going to make of the break and reunion of our relations. They will have to do the best they can with its human dramatics for it is quite impossible for them to realise the compulsions behind them.”

Jack Lindsay did not have children until his late fifties. He was an anxious, self-critical parent, and never ceased to yearn for his father’s distracted attention.

Turn for a moment I say
Turn from your obdurate place
In that clarity of stone,
That terrible folly of light,
Turn for a moment this way
Your abstracted face.

Lindsay understood the importance of this personal history for his literary career, confessing to a close friend that “if my parents hadn’t parted I doubt if I should have become a writer at all.” Cranny-Francis suggests that his description of William Morris also applies to himself:

From one aspect there never was a more impetuously frank man than Morris; he lives restlessly in the open and follows his convictions out without concern for the consequences to himself or anyone else. From another aspect he appears a hidden figure, moved by a passion of which the multiple effects are plain but the central impulse obscured. I suggest that along the lines I have sketched we can bring the man and the artist into a single focus, and see the way in which his personal dilemma was transformed into a dynamic of acceptance and revolt, of deepening insight into the nature of his world and into the ways in which the terrible wounds of alienation can be healed.


A succession of recent British scholars has sought to recover Lindsay as a forerunner of cultural studies, an influential field of interdisciplinary research instigated by British theorists — among them Richard Hoggart, Stuart Hall and Raymond Williams — in the 1970s. Although they didn’t reference Lindsay, the founders of cultural studies were almost certainly familiar with some of his work, and there are strong points of similarity in their ideas. In particular, they all affirmed the political significance of culture.

Marx had suggested a base–superstructure model of social formation, according to which economic relationships ultimately determine the organisation of politics, law, religion and creative expression. The implication was that economic interests always trump cultural factors. The practical effect was to concentrate efforts to build socialism in workplaces, which in effect meant trade unions. This left little place for cultural creators. Like the founders of cultural studies, Lindsay steadfastly rejected that model.

Another tenet of cultural studies that Lindsay anticipated was the idea that significant cultural change comes from “below.” Lindsay believed that plebeian practices and values, and their fraught and contradictory clashes with the practices and values of ruling elites, are the major source of cultural innovation. He made the point forcefully in a letter to his friend and fellow critic Alick West:

The concept is that culture is created by the expropriators, fundamentally expresses their position and needs, and has no close relation to the concrete labour-processes and the producing masses. I should like to suggest that something like the reverse is the truth. The people are the producers and reproducers of life, and in that role are also the begetters of culture in all its shapes and forms — though in a class-divided society the ruling class expropriates culture.

Lindsay’s view stemmed from the conviction — shared with Ruskin and Morris — that work and aesthetic production had once “been harmoniously united, and that they still ought to be, despite the general movement towards degradation and mechanisation.” Before commodity production alienated workers from the products of their labour — in this historical sketch uncommodified slavery is conveniently forgotten — work was done in order to create both necessary means of living and pleasing or profound emotions. Each was a joyful undertaking. Once, communal work had always been accompanied by singing and chanting. Understanding this had motivated William Morris to take on, in Lindsay’s dated language, “the full political and social struggle which alone could have as its aim the achievement of brotherhood and the ending of commodity-production.”

In A Short History of Culture Lindsay traced the essential identity of art and work back to the movement of bodies in space. From the classicist Jane Harrison he took the observation that the repetitive, rhythmic behaviours that create the necessities of life — poundings, liftings, plantings, weavings, cuttings, stalkings, throwings — are shared with dancing. Like her, he considered dance to be the primal kind of cultural creativity. Citing another book of Lindsay’s criticism, After the Thirties, Cranny-Francis writes:

Lindsay identifies in dance the rhythmical control of movement that characterises human activity and being. It bodily enacts the purposive behaviours that enable the group to maintain social coherence, engaging them through the rhythm of the breath: ‘Body and mind are thus keyed together in new adventurous and interfused ways.’ The dance becomes an exploration of the embodied being required to achieve a specific purpose, such as a hunt. It lifts the dancer (and observer) into the realm of ‘pure potentiality’ where ‘desire and act are one’; where the bodily disposition required to engage successfully in a particular activity is achieved and communicated. In this process, Lindsay argued, human beings imaginatively engage aspects of everyday life and rehearse the modes of being, thinking and acting that enable them to achieve their needs and desires. For Lindsay this is the role of culture in the formation of being and consciousness, whether it be the ritual art of early societies or contemporary literature, visual art, theatre and dance.


If communism means opposition to capitalism and desire for a future free of oppression and exploitation, Lindsay was certainly a communist. No one seems to know exactly when he joined or if he ever left the British Communist Party, but he was actively affiliated with it from the late 1930s until at least the 1970s. MI5 put him under surveillance. He stayed in the party when it demanded he recant his ideas, and again after Khrushchev’s denunciation of Stalin’s brutality in 1956. There is no doubt about the strength of his allegiance. But was Lindsay a Marxist communist? He certainly called himself one. Cranny-Francis, along with just about everyone else who has written about him, takes it for granted.

Yet there are grounds for wondering about Lindsay’s Marxism. What kind of Marxist converts on account of a Renaissance philosopher? Marxism profoundly shaped his thinking but it was not Lindsay’s foundational postulate. He came to it as a plausible derivation from a more fundamental constellation of ideas about culture and history that he had already arrived at. Some of his creed was shared with Marxism, some was dissonant with it. If, in the manner of a party apparatchik, one were called on to prepare a list of his heresies, it would be an easy brief: he largely discounts or ignores economic forces, flirts with idealism, sees revolutionary potential in “the people” rather than “the working class,” and has a Romantic, even reactionary, understanding of Communist aims.

Late in life, Lindsay began to concede the point. The Crisis in Marxism (1981) is highly critical of most prominent twentieth-century Marxist theorists, particularly Adorno and Althusser. In one of his last essays he declared that he was “diametrically opposed to all closed systems,” including Lenin’s. “I have found all Marxists, orthodox or not, to be hostile.” Among an eclectic list of influences ranging from Keats to Harrison to Dostoyevsky, only two Marxists appear: Lukacs, and Marx himself.

In a sense, of course, debating whether Lindsay was “really” Marxist is as futile as debating whether Mormons are Christian or Alevis Muslim. In another sense, though, it matters. As long as Lindsay is seen as first and foremost a Marxist, his ideas remain submerged beneath the complexity and weight of a hundred and fifty years of Marxist theorising. To perceive what is most original in his thought, it needs to be disentangled from what has become a distracting integument.


Promised a scholarship to Oxford after he graduated from the University of Queensland but told that he would have to wait a year, Lindsay refused to enrol. For most of his life the lack of a higher degree and his oppositional politics would have made it difficult if not impossible to work as an academic. He gave no sign of wanting to. Even his most esoteric books were not aimed primarily at academics, nor did they please many of them. Ironically, today it is chiefly academics who keep his memory alive. Anne Cranny-Francis’s book is no exception, but it deserves a broader readership. We need not agree with Lindsay’s controversial opinions to hope that this remarkable thinker will become better known. •

Jack Lindsay: Writer, Romantic, Revolutionary
By Anne Cranny-Francis | Palgrave Macmillan | €119.99 | 416 pages

“Am I the one who’s missing something?” https://insidestory.org.au/am-i-the-one-whos-missing-something/ https://insidestory.org.au/am-i-the-one-whos-missing-something/#comments Mon, 26 Feb 2024 22:40:01 +0000 https://insidestory.org.au/?p=77390

A returned soldier’s belief in American virtue and progress is shaken

Brent Cummings — “a white male pickup-driving ex-soldier living in a Georgia county where in 2016 Donald Trump received 71 per cent of the vote” — might not seem a sufficiently interesting protagonist for a biographical study. Stereotypes of race, gender, occupation and region pile up to create an expectation that he is one of Hillary Clinton’s deplorables. As author David Finkel puts it:

He’d been born in Mississippi in 1968 and lived there in his formative years, so obviously he was a racist. He’d been raised in New Jersey, where he played centre on his high school football team, and then went on to play rugby in college, so of course he was brutish and crude. He had spent twenty-eight years in the US Army and had been in combat, so surely he had killed people.

Obviously, of course and surely, Brent Cummings eludes these reductive inferences. In An American Dreamer, Finkel, a Pulitzer Prize–winning writer for the Washington Post, unfurls Brent’s inner complexities and outer contradictions.

Brent appeared fifteen years earlier as an army major in Finkel’s The Good Soldiers, an embedded account of the 2007 troop surge in Iraq, and Finkel’s long connection to him has built the foundation for a work of gripping intimacy. An American Dreamer gets inside Brent’s skull, and those of his wife Laura and neighbour Mike, to capture the emotional landscape of contemporary American life from three diverging vantage points.

Brent is now working stateside at a college with his retirement from the army looming. His soul is troubled. He feels his country has lost its way in the last couple of decades, as if he’s come “out of one war and into another” against enemies on the home front. In a revealing slip, he remarks that the earlier time “felt… clean. No that’s not the right word… It’s slipping.”

What the pollutant might be is not clear to him. Trumpism is part of it. Despite being “probably more Republican than Democrat, probably more conservative than liberal,” he loathes the man for his egotism, ill-discipline and bullying more than for his policies. But the problem runs deeper: Brent has lost confidence in his country’s goodness and shared purpose. “Everything was fraying. That’s what it felt like.”

Brent’s concerns have more to do with meaning than with material or political realities. His belief in American virtue and progress is shaken, and while that abstract dream is disintegrating a real one disturbs his sleep. Not the post-traumatic image of desert horrors we might expect but a chorus of mocking voices from a profound darkness.

His sense that the ground has shifted under him is reinforced by a series of bafflements. He is shocked by the lack of support he receives from colleagues when he challenges the use of a Confederate flag on an insignia, upset by activist attacks on his beloved military, appalled by the unthinkable assault on the Capitol. He finds himself in a vanishing middle where the mental habits of a lifetime, grounded in ideas of honour and fair play, have lost their traction. “Am I the one who’s wrong? Am I the one who’s missing something?”

Laura and Mike play second and third fiddle to Brent, but Finkel gives voice to them with the same empathic immediacy. Laura’s main register is anxiety rather than disorientation. She fears violent crime, feels a rising sense of menace in her neighbourhood and worries about the fate of her intellectually disabled daughter when she is no longer around.

Mike, for his part, overlays fear with anger, going full-bore MAGA while railing against the “socialist and communist” treachery of the Democrats. Why Mike, a quadriplegic of modest means, would set aside his early doubts about Trump and come to see him as his infallible saviour is a mystery. His political conversion creates tension with his neighbours, a microcosm of the severing of connections that has played out across the country.

Finkel is a wonderful guide to the inner terrain of his characters. He shows rather than tells, keeping their dialogue and the private thoughts behind it direct and relatable. Brent in particular is brought to vivid life through confrontations with events that confound him. Very occasionally these episodes seem a little forced, notably in the parallels between an encounter with the security wall on a visit to Jerusalem and Trump’s border wall. Mike’s characterisation can also appear ever so slightly two-dimensional by comparison with Brent’s, but the book as a whole is a triumph of compassionate and sympathetic attention.

Finkel inhabits Brent in a rare way, better than a life-long friend could hope to do. More a finely tuned recording instrument than a buddy, he makes no attempt to elevate Brent, hide his flaws or turn him into a morally instructive Everyman. He is an ordinary guy, standing somewhere on the slippery hump of the political bell curve, but he is also a creature of a specific time, place and tradition, not just a symbol of averageness. Witnessing his puzzlement at how things have changed, we might wonder how much his sense of loss comes from occupying a political centre that cannot hold and how much it is a sign that he is getting older and his generation is being unseated.

We hear so much about the growing polarisation of American life. Books like this one help to humanise the conflict, not only by plucking individuals from their political tribes but also by exploring the quieter emotional dimensions of their experience. Beyond the primal fears and hatreds, Finkel suggests, there are people seeking solutions to big, existential questions about purpose, meaning, legacy and value. An American Dreamer shows us that behind all the yelling and distrust there is vulnerability and hope. •

An American Dreamer: Life in a Divided Country
By David Finkel | Scribe | $36.99 | 256 pages

Russia’s war against Ukraine: a longer view https://insidestory.org.au/russias-war-against-ukraine-a-longer-term-view/ https://insidestory.org.au/russias-war-against-ukraine-a-longer-term-view/#comments Thu, 22 Feb 2024 06:36:47 +0000 https://insidestory.org.au/?p=77324

With the full-scale invasion entering its third year, the stakes remain high

Russia has been waging war against Ukraine for ten years now, if we start the clock back in 2014 with the illegal annexation of Crimea and the invasion of Ukraine’s east. The war remained geographically contained for its first eight years, though, and when the conflict became frozen life went on largely as normal in Kyiv, Lviv and elsewhere in unoccupied Ukraine, even if soldiers kept dying at the frontline.

This state of affairs came to an abrupt end with Russia’s all-out invasion on 24 February 2022. Not only did the fighting reach deep into Ukraine’s heartland, but life far behind the frontline also became militarised. Russia frequently bombards civilian infrastructure as well as cities in a type of terror warfare intended to break the will of Ukraine’s defenders. There is no longer any hinterland.

How long will this slaughter last? In August last year I warned against overly optimistic expectations, writing that “supporters of Ukraine’s democracy should prepare themselves for long-term, costly support.” Another six months on, it is even clearer that patience and endurance will be needed if we want to see Ukraine survive and thrive. We have to stop thinking in terms of short and decisive campaigns. This war has become a war of attrition.

Like Vladimir Putin, we need to think in the geographical and historical categories of what historian Timothy Snyder has memorably called the “bloodlands” — the vast territories between Russia in the east and Germany in the west, with Ukraine in the middle. This viewpoint expands the time horizon dramatically. The last three wars fought in this region were far from short campaigns. The first world war’s “eastern front” lasted from August 1914 to March 1918. The wars of the Romanov succession began in Central Asia in 1916 and elsewhere in 1918, only ending, depending on the region, in 1920, 1921, 1922 or even 1923. The German–Soviet war — constantly invoked by Putin both in the run-up to the war and during Russia’s continuing cultural mobilisation — extended from the (northern hemisphere) summer of 1941 to the spring of 1945.

Hence the normal duration of a full-scale military conflict in this part of the world seems to be three to four years. Ukraine has survived two so far.

But it’s not just the region’s history that suggests a long haul. Once battle lines are fully entrenched, conventional war takes time. The first world war’s western front was bogged down in costly trench warfare, with massive casualties but little territorial gain, for four years.

By the time the second world war rolled around, military specialists in all armies had found the technical means to overcome trenches, barbed wire and machine-gun emplacements. And yet it took the Allies close to a year after the invasion of Normandy in 1944 to defeat Germany, a country under assault from the east by the steamroller of the Red Army, from the west and south by the United States, British Empire forces and the Free French, and from the air by the indiscriminate attacks of the combined US and British air forces. Both Ukraine and Russia are in much stronger positions today.

Historical analogies are miserable predictors. But they matter when historical actors think in and through them. Putin is an avid reader of history, constantly pondering where he fits in. He thinks in categories and time-spans informed by Russia’s historical experience.

While he didn’t expect Ukraine to resist so effectively and survive the initial onslaught, he had long prepared his country for a drawn-out conflict with the outside world. One indicator is the effort his regime spent on making Russia’s food system relatively independent of outside supplies. At a time when everybody praised the virtues of globalisation and international networks of trade and mutual dependence, Putin insisted Russia should be able to feed itself.

As a recent study points out, this is the kind of food system you build when you expect a long-term confrontation that might throw your country back on its own resources. Putin embarked on it over decades, at a time when barely anybody in Europe could imagine a war of this magnitude on the continent.

Putin also entrenched his dictatorship, another move made in anticipation of war. First came the slide towards authoritarianism that began on the first day of his presidency. More recently came its acceleration. The death last week of opposition figure Alexei Navalny is just the latest escalation of a massive crackdown that began in 2021 and quickened with the start of the all-out war in 2022. Russia is now a full-blown dictatorship.

Thus entrenched in the Kremlin, Putin expects the democracies of Europe to run out of breath first. The way Ukraine has become a political football in US domestic politics might well feed this expectation.

We need to appreciate that this is Putin’s theory of victory: to pound Ukraine with artillery and air attacks; to bleed the defenders white by sacrificing large numbers of his own citizens; and to wait until “the decadent West” loses interest and returns to business as usual, depriving Ukraine of the weapons and economic support it needs to defend itself.

As things stand, he might well be proven right. As I wrote a year ago about the then unlikely prospect of a Russian victory:

Winning the war would require Russia to ramp up its military production and mobilisation of manpower and increase the quality of its training and leadership. It could do that over the long run, just as the Soviet Union did during World War II… It could do so particularly if some of the countries which today are sitting on the fence decide to defy the United States, NATO and the European Union and circumvent or ignore sanctions; the United States reverts to isolationism; NATO disintegrates into squabbles between its members; and the European Union implodes among disagreements between old and new, and rich and less prosperous nations.

This pessimistic scenario has not yet come to pass. Yes, Russia currently has the whip hand. It has massively increased its armaments production, found ways around sanctions and continued to field large numbers of men while avoiding all-out mobilisation. Meanwhile, the United States has shaped up as the weakest link in the chain of democracies supporting Ukraine.

But Russia has not won yet. Ukraine still has “a viable theory of victory,” as two leading military analysts recently wrote. Its military has become expert at war by attrition, which it fights intelligently, minimising its own losses while maximising the enemy’s. Supplied adequately, it will become even better at this terrible art, denying Russia victory and eventually turning the tide.

For this to happen, though, Ukraine needs the continued support of the outside world: from NATO countries, from the Europeans and from friends further afield, such as Australia. But these friends need to appreciate that this war is now a war of attrition. And those wars are not won in a day or a season.


What about negotiations? A strong commitment to long-term support should unite all friends of Ukraine, no matter whether they think that ultimately the war will end in Kyiv’s forces retaking all occupied territories, if necessary by military means (the current official Ukrainian position), or in a negotiated settlement of some sort, with compromises on both sides.

There are indeed models for a negotiated peace which, while painful, might satisfy Ukraine and guarantee its safety rather than simply giving Russia breathing space to rearm for the next assault or the chance to insist on Ukraine’s unconditional capitulation. The much-discussed “West German” solution is one: Ukraine would be divided, with some of its eastern territories occupied or even annexed by Russia, while the democratic west would be integrated into NATO and the European Union and developed with a massive aid program similar to the Marshall Plan. This is certainly not an acceptable solution for either side at the moment, but it might well become one once exhaustion eventually sets in.

The key term here is “eventually.” Negotiating now only aids Russia in its imperialist and anti-democratic goals. Forcing Ukraine to negotiate at a moment when, with delayed and insufficient support from its democratic friends, it is on the defensive amounts to asking a democratic nation to surrender to a dictatorship. Negotiations are best held from a position of strength. If not backed by the ability to resist and indeed to inflict damage, talks with a militarily stronger opponent quickly lead to a loss of territory and sovereignty.

The Ukrainians learned this lesson in 1918 when they signed the first treaty of Brest–Litovsk with the Germans and Austrians, who subsequently occupied the country and squeezed out food reserves to feed their own war effort. The Russian Bolsheviks learned the same lesson shortly thereafter, when, devoid of the fighting force they themselves helped dissolve, they had to sign a punishing peace with the Germans just to get out of a war they could no longer fight. And, in an instance of remarkable historical justice, the Germans learned the same lesson in 1919, when they could do nothing but sign the famously unfriendly Versailles treaty.

Ukraine needs to be helped to avoid such a situation and to negotiate from a position of strength, if a negotiated settlement is indeed to end this war. •

Red flags https://insidestory.org.au/red-flags/ https://insidestory.org.au/red-flags/#comments Thu, 08 Feb 2024 04:01:14 +0000 https://insidestory.org.au/?p=77149

Communist or not, postwar refugees from the Soviet Union and Eastern Europe attracted the attention of Australia’s security services

Jakob came of age in occupied Germany’s American zone not long after the second world war had ended. Living in a refugee camp, he heard rumours about what happened to people like him — a teenager wrenched from his home to become a forced labourer in Nazi Germany — if they returned to their homeland, which was now part of Soviet Ukraine. He chose resettlement in the West instead.

When the International Refugee Organization sent him to faraway Australia in 1948, it probably sounded like an adventure. But the nineteen-year-old found himself doing back-breaking work in an isolated mine surrounded by dense Tasmanian forest. He would later tell government officials that it was “200 years behind European working conditions.”

After a year, Jakob decided he was finished with capitalist Australia and would return to the Soviet Union. Many of his peers were unimpressed by his decision — it even sparked a brawl during which he was stabbed. But his pro-Soviet migrant friends considered him a true patriot. Celebrating with them and a little drunk, the young refugee boasted that he would give the Soviets intelligence on Australia and go to Korea to fight the Western capitalists.

Unbeknown to Jakob, his audience of friends and acquaintances that night included two spies: a Soviet MVD colonel and an undercover agent for the Australian Security Intelligence Organisation, or ASIO. Concerned by their informant’s report, Australian security officers began keeping an eye on Jakob. They followed him all the way to the docks when he sailed for the Soviet Union. Dissatisfied with the West and full of praise for his Soviet homeland, he was considered a threat to Western security.

This is not the familiar refugee story told in countries like Australia: a story of desperate, hard-working migrants who gratefully become loyal contributors to their new homeland. Jakob had certainly been desperate — he became a forced labourer at just fourteen — and, for the most part, he had worked hard in Australia. But the war and displacement produced complex, shifting identities that didn’t simply disappear when the shooting stopped. And life in the West didn’t always live up to its promises.

The second world war had left forty million or more people displaced in Europe. Some wanted nothing more than to return to their homes, but for others, particularly those from now Soviet-occupied Eastern Europe, the home they had left no longer existed. As the International Refugee Organization worked to solve this “refugee problem,” thousands of Russians who had lived through the war in East Asia were being displaced by China’s communist revolution.

Most of these refugees, whether in Europe or China, were stridently anti-communist. Many had good reason to be, having lived as exiles after the 1917 Bolshevik revolution or through the Stalinist terror of the 1930s. The views of “White Russians” and Eastern Europeans who considered their homelands “captive nations” would fit neatly into the West as the fresh storm clouds of the cold war built on the horizon. Increasingly, each Soviet refugee was a propaganda victory for the West: these were individuals choosing freedom, expressing hatred of communism by voting with their feet.

Some, however, harboured more ambivalent views. A few could even be called “Red”: communists, socialists, trade unionists or, most commonly, pro-Soviet patriots who were proud of the victorious Red Army and their homeland’s achievements since the communist revolution. “Displaced persons,” known as DPs, were resettled primarily in countries that now defined themselves as the anti-communist West, with the largest contingents going to the United States, Australia, Canada and Israel.

The lives and experiences of anti-communist DPs — the refugees who became model migrants in the West — have been chronicled in the rich scholarship on postwar migration that has proliferated since the 1990s. Yet Soviet refugees with left-wing views, DPs like Jakob who did not fit the model, have remained essentially invisible.

Surveillance and the persistent shadow of espionage were central parts of their lives in the West. Former or current Soviet citizens who were Russian speakers and left-wing sympathisers threw up multiple red flags for Western intelligence organisations, which often struggled to understand their traumas, experiences and intra-community politics. Many had been socialised in the Soviet Union, their political views shaped by complex lives in Europe and China.

In the cold war West, their ideas took root in new ways. Ideological convictions — that the world could be better and fairer, or that the worker’s lot was difficult — mingled with personal ones, shaped by memories of lost homes, murdered family members or forced labour. These ideas made them potential threats, forcing them to negotiate the incursions of state security into their everyday lives.

In many ways, it is because these refugees loomed so large in the eyes of intelligence agencies that we struggle to catch sight of them. The lives of “ordinary” people are often difficult to locate in official records, but that marginalisation was compounded by cold war anti-communism and surveillance.

Left-wing Soviet DPs had particular cause to recede from view by lying about their politics and backgrounds or simply keeping their own counsel. They knew they were being watched; most were aware that both the state and other migrants regarded them with suspicion; very few recorded their experiences. History maintains a sense of irony, though: the very surveillance dossiers that marginalised these migrants can now provide the historian with a window into their worlds.

Intelligence agencies are notorious for their secrecy and reluctance to reveal the details of even decades-old operations. When they do reveal information, it is typically on their own terms and in the service of their public image — take, for example, the declassification of the CIA’s Canadian Caper operation, which formed the basis of the film Argo.

In some cases, researchers can appeal to legislation. In the United States, the Freedom of Information Act provides a well-trodden path to accessing FBI and CIA files. A similar provision in Canada allows requests for the Royal Canadian Mounted Police’s files. But both have, to differing degrees, proven limited in recent years. Britain’s MI5 is subject to very few access measures, releasing files only as it chooses. Further, its release policy targets higher-profile individuals, leaving the files of more ordinary subjects unknown and unknowable for historians.

By comparison, access procedures in Australia are quite liberal. A dedicated application process via the National Archives of Australia provides greater access to security files if one is sufficiently patient. These dossiers are still redacted, equivocal and frustrating, but they provide unique glimpses of a left-wing presence among the DPs. Presumably, similar migrants ended up elsewhere in the West.


Though they had chosen life in the West rather than the East, and in some cases had experienced the worst that Soviet communism had to offer, these migrants continued to align themselves with the political left. For the most part, they were not activists. They tended not to join Australian political parties and their ideas did not often fit neatly under labels like “communist,” “Marxist” or “Trotskyite.”

Their views were idiosyncratic patchworks rather than refined political doctrines, reflecting lives lived across East and West in turbulent times. Their experiences of Soviet terror and state support, Nazi and Japanese occupation, concentration camps and forced labour often informed their understanding of the twentieth century’s prevailing political philosophies more than books or manifestos. Their politics played out at street-level: in living rooms, church halls, night clubs, theatre groups, factory floors and discussions over glasses of wine (or vodka) at parties.

Though some refugees chose Australia specifically for its distance — the furthest they thought they could get from the Soviets — the cold war arrived there, too. By 1948, as the revolution in China compounded still-heightened fears of invasion by neighbouring Asian countries, anti-communism gained a firm foothold in Australia.

As the historian David Lowe has written, the cold war was “Australianised” with settler-colonial anxieties about maintaining white racial homogeneity and preventing territory loss. Australia saw itself as part of the English-speaking world but was surrounded by a decolonising Asia-Pacific region with a growing socialist and communist presence, and so sought the security of close ties with Britain and the United States.

One result was the formation of ASIO in response to American concerns about Australia’s lax security and a Soviet spy ring in Canberra. Domestically, the cold war flared in 1950–51 as Australian troops were shipped to Korea and prime minister Robert Menzies attempted to ban the Communist Party. A referendum on the ban saw the public drawn into an increasingly heated debate about communism, national security and civil liberties.

Similar tensions were sparked in 1954 by the defections of Soviet officials (and spies) Vladimir and Evdokia Petrov — an incident soon christened the Petrov affair. Vladimir Petrov had socialised extensively among Soviet migrants in Sydney and many of them waited with trepidation as ASIO investigated and a royal commission enquired.

Both moments were cold war watersheds for Australians, a time when debates about communism and espionage hit close to home. But they hit even closer for Soviet refugees as their homelands and the ideologies they had lived under and knew intimately were discussed in daily newspapers and nightly news broadcasts. Many of the refugees knew Petrov personally; the affair played out in their lives in distinctive ways, providing new, rich layers to our history of this event.

The Petrov affair’s most iconic and enduring moment — Evdokia Petrov, her husband having already defected alone, being escorted across Sydney’s airport tarmac by two Soviet couriers — was heightened by the presence of thousands of anti-communist Eastern European migrants. They turned out to protest what they saw as the forcible return of a terrified Russian woman to a dire fate in the Soviet Union. Many had themselves felt at risk of a similar fate, in Europe’s DP camps, and arrived with placards and raised voices to warn Australians and their government of the Soviet Union’s cruelty.

These anti-communist exile groups existed alongside and often in conflict with smaller communities of left-wing migrants. For some, joining a left-wing group had more to do with opposing diaspora norms — the vitriolic anti-Soviet rhetoric and strong attachment to the church — than with cold war politics. Less conservative social mores and better entertainment often helped too, especially for young refugees. But whether they intended it or not, many were then cast into cold war conflicts.

Sydney’s left-leaning Russian Social Club brought DPs into the orbit of the broader Australian left and the Petrov affair. A corresponding Social Club was also set up in Melbourne, in 1952, though it seems to have been short-lived. These clubs facilitated migrants’ connections with Soviet embassy officials stationed in Australia, who were often working covertly as spies. A host of left-wing Jewish organisations were also established by, or drew in, postwar migrants, such as the Jewish Councils to Combat Fascism and Anti-Semitism in Sydney and Melbourne, the Volkscentre in Darlinghurst and Kadimah in Carlton.

Left-wing migrants often participated across multiple groups and sometimes became involved with Australian-run organisations as a result. The typical “communist front” groups which proliferated across the West — Australia–Russia societies (later renamed Australian–Soviet friendship societies) and peace councils — were also hubs for left-wing Soviet refugees. The Melbourne friendship society even had, for a time, a DP as chairman. These clubs facilitated migrants’ connections with Soviet officials but also attracted Australian surveillance, and thus, interactions with spies on both sides.

Most put down roots in Australia, establishing themselves in new communities and becoming neighbours, friends, fellow churchgoers and colleagues of both other migrants and those born in Australia. Some shifted between communities, burying their earlier years, and some became more conservative with age. Most were naturalised, giving up Soviet passports or statelessness in favour of Australian citizenship — though, again, they pursued this to access specific benefits, rights or stability just as often as out of a desire to become Australians.

With naturalisation, they became Australian voters. Soviet refugees’ voting patterns are near impossible to ascertain, but both Labor and Liberal parties tried to some extent to cultivate migrant votes. Few of the left-wing group (even if pro-communist) appear to have associated directly with the Communist Party of Australia, but some refugees joined or maintained connections to the Labor Party.

But not everyone settled down. Australia was not typically a refugee’s first choice, and some moved on to other countries, such as Canada or the United States. Some never made it past the two-year work contract, deported for absconding from their assigned employment. Others did their best to get themselves deported: one way to obtain a cheap ticket back to Europe.

The other way, for Soviets, was voluntary repatriation. The Soviet Union wanted its “stolen” DPs back, and Soviet citizens who wanted to return could often do so at Soviet expense. Repatriation figures were only ever a tiny fraction of the tide of westward migration during the early cold war — between 1947 and 1952, some twenty-eight Soviet DPs returned from Venezuela, twenty-two from Argentina, sixteen from Canada, nine from South Africa and only two from the United States. Nevertheless, they reflected the fact that life in the capitalist world could also be harsh, especially if you were a refugee.

In Australia, the two-year work contract was often a catalyst and some, like young Jakob, left soon after completing it, homesick and dissatisfied. Others remained longer, even decades, before making the decision to repatriate. China Russians could also return if they secured the appropriate paperwork, though the Soviets likely would not foot the bill. Nevertheless, some did repatriate.

But whether they chose to stay in Australia or not, many Soviet refugees lived through the early years of the cold war in the West. As these battle lines were drawn, they had to pick a stance: leave politics behind and remain quiet, become anti-communist “cold warriors,” or accept the surveillance and suspicion that came with life as a pro-Soviet “enemy alien.” •

This article is adapted from Ebony Nilsson’s new book Displaced Comrades: Politics and Surveillance in the Lives of Soviet Refugees in the West, published by Bloomsbury Academic.

The younger Menzies https://insidestory.org.au/the-younger-menzies/ Tue, 06 Feb 2024 05:49:32 +0000

Australia’s longest-serving prime minister emerges sympathetically from the first two of a projected four-volume survey

More than most prime ministers, as befits his longevity, Robert Gordon Menzies has been the subject of a significant number of books, articles and commentary — including his own memoirs, political tracts and broadcasts made during and after his political career. For interested researchers, Menzies’s papers and recorded interviews and the many books in his own library are all housed at the Robert Menzies Institute at Melbourne University.

The sheer volume of material continues to fuel efforts to document and analyse the career of Australia’s longest-serving prime minister. The latest is a multi-author, multi-volume (four are promised) appraisal edited by the Menzies Institute’s Zachary Gorman. Based on a series of conferences, the books aim to promote “discussion, critical analysis and reflection on Menzies, the era he defined and his enduring legacy.” Contributions are not limited to those of unabashed admirers; writers from the other side of the political fence also offer their assessments, as do ostensible neutrals.

The first volume, The Young Menzies: Success, Failure, Resilience 1894–1942, covers the period from Menzies’s birth in 1894 to 1942, though not all chapters fit neatly within those boundaries. James Edelman and Angela Kittikhoun’s useful chapter on Menzies and the law, for example, takes in the Communist Party Dissolution Bill, eight years beyond 1942.

Following political scientist (and ex-MP) David Kemp’s introduction, the book’s early chapters focus on the family environment into which Menzies was born and the social and political culture of the era. As most readers will be aware, his father ran a general store in the western Victorian town of Jeparit, saving the son from any credible charges of having been born with a silver spoon in his mouth. But while the small business ethos had a crucial impact on Menzies’s political philosophy, his maternal grandfather, John Sampson, an active trade unionist, exposed him to a different worldview, though without persuading him to change his own emerging outlook.

Menzies’s academic record in Melbourne University’s law faculty was outstanding and he also took part in student politics and campus journalism. His failure to enlist during the first world war — a family decision prompted by the fact that two brothers were already serving — is well known, and journalist Troy Bramston reveals how it may have contributed to Menzies’s fiancée’s ultimate decision to break off their engagement. Menzies had no doubt that his failure to enlist propelled him away from a brilliant legal career and onto the parliamentary path. He needed to offer “public service.”

For this reviewer, one of the most interesting chapters is historian Greg Melluish’s account of Menzies’s advocacy of liberal education and its connection with his ideas about democracy. That Menzies was a “scholarship boy” at both school and university is reasonably well known and, Melluish argues, helps explain his support for “meritocracy” rather than inherited and entrenched privilege (with an obvious exemption for the monarchy). This commitment seems crucial in explaining Menzies’s insistence that he (and later, his party) was liberal, not conservative.

Of course, conservatism existed (and exists) in Australia, and the parties Menzies joined and led garnered the vast preponderance of that vote. He revered English political and legal institutions as springing from liberal values, but their defence surely entailed a conservative outlook. Melluish stresses that Menzies understood English democracy as reflective of a specific common culture; in contrast to the Americans, “he did not see democracy as being universally applicable.” This could help explain why conservatives may view multiculturalism as a problem, undermining the necessary foundations of their version of democracy — a question that will perhaps be tackled in later volumes. Of course, Menzies’s view could also lend itself to the darker idea that democracy is not suitable for all, especially those viewed as “backward.”

Among other prime ministers, probably only Gough Whitlam could be as closely identified with the case for liberal education. For Menzies, writing in the 1930s, British history demonstrated that such an education “would produce the sorts of people who possessed the capacities to make that system of government [Westminster] work properly.” Ironically, in view of today’s emphasis on utilitarian degrees, Menzies can be seen as enlisting the (now) maligned bachelor of arts in defence of the practical aim of good government.

Melluish also usefully distinguishes between Menzies’s idea of a liberal education and the wider idea of “Western civilisation.” Menzies was fixated on Australia’s British heritage; the Greek and Roman stuff could, it seems, be left to people like Whitlam.

Menzies’s version of the university was obviously not the “oppositional” one. But, as Melluish points out, this critical variant was emerging at the time Menzies was writing. It would probably approach its zenith during the second half of Menzies’s long term in office — which should make for an interesting discussion in the final volume in this series.

Political scientist Judith Brett explores the parallels between Menzies and Alfred Deakin, sons of small businessmen, both of them influenced by the liberalism of the Victorian goldfields, both following very similar educational paths, and of course, both having more than one go as prime minister. It is Deakin, she writes, “whom Menzies might have looked to as an exemplar of national leadership.”

A useful reminder of the important role religion could play in forming political beliefs comes in historian David Furse-Roberts’s chapter on the impact of Menzies’s Presbyterianism. The connection between his faith and his political philosophy seems so strong that a liberal atheist might have felt less than welcome in the party Menzies would form. And, had he been around, Menzies may well have been puzzled to observe some Liberal staffers take an affirmation rather than an oath when they appeared in the defamation case brought by Bruce Lehrmann against Network Ten and one of its journalists.

By contrast, it would be an oddity today if any senior politician identified mainstream religion (as opposed to the “prosperity gospel” variant embraced by some prominent conservatives) as a key factor in their political outlook. As judged by Furse-Roberts, Menzies’s version of Presbyterianism emphasised a “selfless individualism,” acknowledging the ameliorative role of the state but also its limitations: “it fell primarily to the compassionate spirit and self-sacrifice of individuals to succour the needy and further the common good.” This clearly eschews socialism, but Furse-Roberts suggests it goes “far beyond John Stuart Mill’s minimalist ethic of ‘no harm’ to others.” One might observe how that reference to the “common good” contrasts with the overwhelmingly individualist emphasis of the more recent version of the Liberal Party.

Historian Frank Bongiorno’s chapter, “Menzies and Curtin at War,” is a finely balanced contribution, acknowledging the positives of Menzies’s first prime ministership and also (in anticipation) recognising his “postwar nation-building achievements,” which “look better every year, as we contemplate the policy failures of our own century and the conspicuous absence of compelling vision.” This generosity from a Labor-leaning historian suggests that the defensiveness of Liberal partisans in certain chapters may to some extent have been directed at a shrinking target.

Anne Henderson mounts a characteristically robust defence of Menzies from charges of appeasement and softness on Nazi Germany, stressing the absence of a perfect record among any of the key players. Mindful of the passage of time, I was left wondering how many Australians would know to whom “Pig-Iron Bob” refers. How many in the press gallery?

Journalist Nick Cater examines the role of Menzies’s famous “The Forgotten People” radio address in 1942, highlighting the importance of the family home as the central focus of that talk. While a Labor minister could deride this support for increased home ownership as turning workers into “little capitalists,” Menzies’s philosophy emphasised the “social, economic and moral value of home ownership.” Saving for a home was a “concrete expression of the habits of frugality and saving.” National patriotism, in other words, “inevitably springs from the instinct to defend and preserve our own homes.” How might the renters on the battlefields in 1942 have responded to this observation, I wonder?

Political scientist Scott Prasser sums up the learning experiences that would enable Menzies to resurrect his career and become Australia’s longest-serving prime minister. This involves some projection, for he still had much learning to do (during seven more years as opposition leader) after the notional end date for this volume. That quibble aside, Prasser’s contribution is a useful one since Menzies’s success can’t be attributed mostly to luck and dud opponents. The checklist: modest promises, sound coalition relations, a willingness to adopt new directions, and an awareness of the nation’s political architecture. His return to power and the use to which he put his learning experiences await us in the next volume.


In his introduction to the second and latest volume in the series, The Menzies Watershed, editor Zachary Gorman acknowledges the limitations of the “call for conference papers” method the project employs, which risks missing “certain topics of great interest and relevance.” This dilemma is reflected in the ensuing chapters, with some likely to appeal to the general political scholar–aficionado and others falling more into the niche category. My focus will be largely on the former.

In his chapter on Menzies and the Movement, Lucas McLennan makes the case for a good deal of similarity of emphasis between Menzies’s Anglo-Protestantism and the version of Catholic social teaching (and consequent public policy) embraced by lawyer–activist B.A. Santamaria and his disciples in the (Catholic Social Studies) Movement. It is certainly the case that both men would have seen their vigorous anti-communism as having a strong religious component, especially reflected in the anti-communist foreign and defence policies embraced by Menzies’s party and endorsed by Santamaria and (after the Labor Party’s split in 1955) his political creation the Democratic Labor Party.

McLennan’s case is possibly less convincing on the domestic front. While the Movement may have preferred subsidiarity over centralism, it seems unlikely that Menzies would have seen much merit in the (frankly weird) land settlement proposals advanced by Santamaria. And we can be fairly confident that the Movement’s view (as expressed in 1948) that Christians should seek “to break up concentration of wealth” would not have secured much support at a meeting of the Kooyong branch of the Liberal Party. Ultimately, even Santamaria’s version of Catholic social teaching necessarily involved an element of collectivism that would not have appealed to Menzies.

Anne Henderson’s brief chapter on Menzies’s successful opposition to Labor’s bank nationalisation plans possibly tells the reader as much about the Chifley government’s ideological rigidity (or commitment to principle — take your pick) and misreading of the public mood as it does about Menzies’s deft exploitation of the issue. Two decades after the Depression, the anti-banks sentiment was clearly not what it used to be, although Henderson’s depiction of the banks battle as “class war as Australia had never seen it” might have been challenged by some survivors from that period. In passing, it might be observed that since Labor lost the double dissolution election it provoked on this issue in 1951, it has not held a Senate majority on any occasion.

Tom Switzer evidences and reinforces the generally accepted wisdom that Menzies was no radical right-wing reformer. He retained and relied on several of the senior bureaucrats who had advised Chifley, and his economic policies were of the Keynesian variety, reflecting a consensus that would persist until the end of the Fraser period. In his introduction to this volume, Gorman had noted Menzies’s good fortune in not being “exposed to a centre-right echo chamber of policy advice,” insulating him from big overreaches (with the exception of the attempt to ban the Communist Party).

Keynesianism is again a key theme in David Lee’s chapter on economic management. It also contains a useful outline of cabinet and public service structures and processes in the early years of the Menzies government.

Troy Bramston’s chapter, “The Art of Power,” draws on his well-received biography of Menzies and hence comment here will be minimal: Menzies had been an effective political campaigner, “but campaigning is not government” (wise advice). Building on his previous experience, consultation, reflection and wide reading, he developed a capacity for management and administration that served him well.

Charles Richardson examines aspects of Menzies’s approach to the crown and imperial relations, the Statute of Westminster and the office of governor-general, drawing some comparisons with the attitudes of his nemesis H.V. Evatt. In referring to Menzies’s concern about the “separate status of the crown in right of the different dominions” — the question of how the monarch could be at peace and war at the same time in relation to the same foreign power — Richardson delightfully describes this as an “absurdity” that we still live with. The fact that most wars are now waged without formal declarations of war may help, at least at a technical level.

Richardson endorses the view that Menzies should have made the switch from a British to an Australian governor-general before Casey’s appointment in 1965, but notes the prime minister’s quaint criterion that it was essential with any appointment that “the Queen knew them.”

Lyndon Megarrity seeks to correct the misconception that Australia’s involvement with overseas students only commenced with the Colombo Plan. He outlines the history of such activity (which could involve some fancy manoeuvring round the White Australia policy) and describes policy before the second world war as “ad hoc and reactive.” The Chifley government entered the soft diplomacy business of scholarships, but Megarrity sees any potential benefits as being negated by immigration minister Arthur Calwell’s notorious hardline attitude on deportations: no grey areas in the White Australia policy for him.

The role of the new external affairs minister Percy Spender in the creation of the Colombo Plan in 1950 is well known. While acknowledging the Chifley government’s creation (pre-Colombo) of a relevant policy management framework, Megarrity credits the Menzies government with a defter handling than Labor of tensions between the Plan and the White Australia policy, assisting with the overall enhancement of Australia’s reputation in the region. In the cold war context, the scheme could “help maintain stability in Southeast Asia and increase resistance to Communism.”

Chapters on the creation of the Australian Secret Intelligence Service and on the role of Spender in (among other things) negotiating the ANZUS treaty serve to highlight the electoral supremacy the Menzies government would establish as the guardian of national security, an advantage his party has largely retained to the present day. Nicolle Flint revisits the issue (it probably no longer qualifies as a “debate”) over whether Menzies’s role in the Liberal Party’s creation has been overstated (spoiler alert: no). Lorraine Finlay, addressing the dilemma of “what liberty should be provided for the enemies of liberty,” focuses on the attempts to ban the Communist Party, though current trends may remind us of the timelessness of that dilemma. Andrew Blyth provides an account of think tanks’ influence on the Menzies government, but to some extent the title is misleading: the Institute of Public Affairs was effectively the only player in that game, although pressure groups and committees of inquiry are also covered in the chapter.

Christopher Beer’s chapter uses the federal electorate of Robertson, on the New South Wales Central Coast, to make some observations about the impact of early Menzies government policies. He includes useful electoral information about the seat, which serves (for this reviewer) to highlight the absence of comparable nationwide electoral data and commentary on the elections of the period. Clearly, the “call for papers” did not elicit the relevant interest.

By the end of the period covered in this volume, Menzies had won three elections as Liberal leader, disarming his internal critics, and even greater dominance lay ahead: Labor partisans might like to look away now. •

The Young Menzies: Success, Failure, Resilience 1894–1942
Edited by Zachary Gorman | Melbourne University Publishing | $44.99 | 222 pages

The Menzies Watershed: Liberalism, Anti-Communism, Continuities 1943–1954
Edited by Zachary Gorman | Melbourne University Press | $45 | 256 pages

John Curtin’s potato https://insidestory.org.au/john-curtins-potato/ Thu, 25 Jan 2024 23:48:35 +0000

A gift to a prime minister gives a glimpse of the life of an Australian toiler

On 9 September 1942, Mr W. Frith, an aged pensioner giving his address as Wattle Flat via Bathurst, sent prime minister John Curtin a small package containing a potato. So important was this potato that Mr Frith felt obliged to include detailed instructions on its use.

The prime minister was to put the potato in his pocket, specifically in his left pocket if he was right-handed. In “a few weaks time” it will get a bit soft, Curtin was told. Take no notice of that but leave it there and it will flatten out “like a half crown” and then go “has hard as a pice of wood.” After three years it will “whear away to nothing.” And then the prime minister should repeat the process. “While you carrie a Potato in your pocket you will never suffer with any Pains.” Frith himself had been doing so for the previous twenty-seven years, he said, and suffered no akes or Pains.

The prime minister’s private secretary wrote to Mr Frith acknowledging with thanks — but no further comment — the arrival of the package. Frith’s letter was carefully filed with hundreds of other personal and official representations under “Correspondence F” for the year 1942.

In 2017, while I was working at the National Archives of Australia, a colleague of mine stumbled with delighted amazement upon the Frith correspondence. John Curtin was a popular prime minister, yes, but to send a potato as a gift? There were peals of laughter in the office that day, let me say, at the thought of a potato-induced protuberance in the prime ministerial pocket.

When one of us finally got around to doing some actual research, we discovered that carrying a potato in one’s pocket was a Victorian-era cure for rheumatism. Exactly how it was thought to work is unclear — folk remedies and superstitions do not admit of much close investigation anyway — but it was commonly believed that the potato had to have been stolen for it to work. (Frith makes no mention of this in his letter to Curtin.) The Pitt Rivers Museum at Oxford University includes a number of withered therapeutic potatoes among its holdings of folkloric material.

So, would Curtin have given the potato cure a try? Could a potato have been a silent witness at the next war cabinet meeting, in Canberra on 21 September 1942? I suspect not. Curtin’s health was poor, but rheumatism is not known to have been one of his afflictions. If he knew about Mr Frith’s gift — and his staff may well have thought he would enjoy the diversion — Curtin may simply have kept it in his pocket until he could hand it to domestic staff at the Lodge for use in the kitchen. Nothing was allowed to go to waste in those austere times.

Surprised to learn that folklore and superstition still lingered in 1940s Australia, I wondered if Frith’s offering to Curtin was considered odd at the time. As it turns out, yes, just a little. In late 1942 and early 1943, several major newspapers ran stories poking gentle fun at the weird and wonderful letters and packages Curtin often received. Each of these pieces was essentially the same, and probably drew on a compilation of letters (writers’ names withheld) offered to the press by Curtin’s indefatigable press secretary, Don Rodgers. His aim, I imagine, was to rub some edges off his boss’s rather stern public image.

Christians sent religious tracts, widows sent wedding rings (goodness!), a lot of people sent money (which went straight to Treasury), inventors sent war-winning suggestions, and one woman sent a cushion embroidered “God Bless Our Prime Minister.” The public was entertained with excerpts from letters to Curtin from various charmers and crackpots, among whom Mr Frith comes off as comparatively sane. Who knows if a copy of any of these ever reached him at Wattle Flat?

Years later, Frith’s words still astonish me with their specificity and conviction. Tempting though it is to dismiss him as a bit of a weirdo, it’s good to remember that few of us are completely rational all the time. Even though the evidence for its efficacy is slender, I keep a bottle of echinacea on hand for when I feel a cold coming on. Which of us has not done something similar? A well-known chain of Australian discount chemists devotes several aisles in its enormous stores to complementary medicines and dietary supplements, and people obviously buy them. If we laugh at Mr W. Frith of Wattle Flat via Bathurst, we also laugh at ourselves.


The other reason I often think of Mr Frith is that he reminds me of when I first met the peasant Bodo during my undergraduate days. I still have my copy of Eileen Power’s wonderful book Medieval People, which was first published in 1924 and went through many subsequent editions. Power chose six people and wrote a chapter on each to personify ordinary life in the Middle Ages. Bodo is the first. He was a peasant living in the early ninth century on an estate attached to an abbey near Paris, owned by the emperor Charlemagne. Because of Charlemagne’s close interest in how his lands were managed, the records are extremely rich.

Power discovered Bodo, his wife Ermentrude and their three children, Wido, Gerbert and Hildegard, in the abbot’s estate book. With enormous skill and imagination she presents them to us as living, breathing people. We learn of a typical day in their lives by watching Bodo as he sets out on a frosty morning with his ox for a day’s ploughing, little Wido coming along to help. Ermentrude’s morning was spent at the big house, where she had to pay the chicken rent (a fat pullet and five eggs), and her afternoon at home weaving cloth. Power goes further, boldly proposing not just what her people did but how they thought and felt about it. Bodo wasn’t happy on that cold morning, having to plough the abbot’s fields when his own were crying out for attention, but he sang lustily to cheer himself and Wido.

We learn that Bodo and Ermentrude spent Sundays and saints’ days singing and dancing to ribald pagan songs, a practice that greatly annoyed church authorities. Frankish Christians such as Bodo still clung to much earlier rites and superstitions, but these the church wisely left alone. Charms were said over sick cattle and incantations over fields to make them fertile. The cure for a stitch in one’s side, or any bad pain, was to lay a hot piece of metal next to it and say a charm to draw out the nine little worms that were eating one’s bones and flesh. (The sensation of the hot metal probably distracted the mind from the stitch, thus making this cure a mite more rational than Frith’s potato remedy.)

If Eileen Power speculated beyond the evidence in conjuring up the inner lives of her medieval people, her thorough immersion in a broad range of sources enabled her to, as she put it, “make the past live for the general reader.” She was a pioneering social historian and for her book’s epigraph she quotes a famous verse in the book of Ecclesiasticus: “Let us now praise famous men and our fathers that begat us.” The problem for many of her fellow historians, she said, was that they had forgotten the fathers that begat us. Her aim was to recognise the “unnamed, undistinguished mass of people, now sleeping in unknown graves,” upon whose slow toil “was built up the prosperity of the world.”


John Curtin is absolutely one of those famous men, and William Frith one of the toilers. What can be learned about him? If I have my genealogical research correct — and there is some ambiguity in the records — William Thomas Frith was born in the small town of Hartley, in central west New South Wales, in 1869, the son of British migrant parents. His father Oscar was a labourer who, in 1882, appeared before a magistrate for failing to send thirteen-year-old William to school. Probably the boy’s labour was needed at home. I have not discovered any evidence that it was a large family, but not all parents bothered to register the births of their children then.

Frith’s story can be told only through snippets; in fact we probably know less about him than we do about peasant Bodo. The Friths were living in the Carcoar region in 1904 when Oscar and William were charged with assault; William was found guilty but the case against Oscar was dismissed. In 1907 Oscar, aged sixty-six and still working, was seriously injured and nearly lost an ear when his horse and cart toppled over an embankment. The first world war offered an escape (of sorts) for rural families living on the edge of poverty but not so much for the Friths. William was too old to enlist, although his younger brother John did scrape in at age forty-four, in 1915. He was returned to Australia medically unfit in 1917.

By 1930 their parents had died and the brothers were living in Wattle Flat, a village thirty-two kilometres north of Bathurst. This, of course, is the famous region of New South Wales where gold had been discovered in 1851, and Wattle Flat apparently once boasted a population of 20,000. A small renewal of mining activity during the Depression might explain why the Friths were living there, listed as miners (“fossickers” might be more accurate) on the electoral roll. John gave up eventually and “went on the track,” but William stayed.

He was apparently unmarried and had no evident involvement in any church, sporting club, trade union, friendly society or any other of those organisations that were the glue that held society together in those times. In 1935 the National Advocate, Bathurst’s main newspaper, noted that Mr W. Frith of Wattle Flat had been admitted to hospital for “medical attention” (for something beyond the powers of a potato, we assume), suggesting that he did have some standing in the community, but in general he appears to have been a loner.

He must have been paying attention to what was going on in the world, however, or he would not have written to John Curtin. The National Advocate was a left-leaning newspaper (it had future prime minister Ben Chifley on its board of directors) and would have been his main source of news. In its pages Frith could have learned of the Japanese entry into the war in December 1941, its aggression in the Pacific in 1942 and the gravity of Australia’s position as a consequence. He could have read Curtin’s exhortations to his people to expect that each and every Australian would have to make sacrifices. The paper covered Curtin’s appeal to the United States for support and his declarations about the need to reorganise labour and industry, introduce rationing and raise funds through war loans. The Advocate supported Curtin throughout. He was one of the “greatest leaders in Australian history,” the paper claimed.

Historians have noted how Curtin’s background as a journalist helped him craft the messages he needed to gain the nation’s support for the unprecedented interventions in social and economic life necessary to win the war. In this he was assisted by press secretary Don Rodgers, but Curtin already had a natural ease with journalists and was frank and informal with them in his twice-daily briefings. He also spoke directly to millions of people in his frequent radio broadcasts, and by adopting a plain and direct style of address came across as a hardworking, humble and honest man.

Not everyone could have afforded a wireless, I suppose. I wonder if William Frith had one in Wattle Flat, or could have joined a neighbour to listen in. If so, back in November 1941, shortly after Curtin became prime minister, Frith might have heard Curtin proclaim:

This Australia is a land of cities and golden plains, of great rivers and vast spaces. It is a land in which countless thousands of plain, ordinary men and women have toiled long, mostly for little reward; who sacrificed and who built our heritage. If this heritage was worth their lives to build, it is worth ours to preserve.

It’s almost as if whoever wrote the broadcast script (Curtin? Rodgers?) had read and remembered Eileen Power’s Bodo and Ermentrude, those slow toilers who built the prosperity of the world. In any case, rhetoric of that kind was exactly what was needed to inspire people like William Frith, whose family had indeed toiled long for little reward. He may have felt (yes, I am speculating beyond the evidence) that now, at last, there was a place for them in the national story.

The effect of that could have been profound, certainly enough for Frith to decide eventually to devise something out of his own small means, in the form of a curative potato, as an offering back to Curtin. And quite possibly he also gave something that Curtin would have valued much more: his vote. In the federal election of August 1943, Curtin’s Labor government defeated the Country–United Australia Party coalition by a landslide. It remains one of the greatest victories in Labor history.

History, as Eileen Power said, is largely made up of Bodos. •

Making a meal of it https://insidestory.org.au/making-a-meal-of-it/ Mon, 22 Jan 2024 05:49:21 +0000

How technology, migration and population transformed crops, foods and ways of eating

Anthropologists, archaeologists and historians embarked on “Food Studies” long before the discipline arrived in universities — and long before the twentieth century fostered an almost obsessive interest in food origins, recipes and exotic cuisines in the wider population. Culinary journalism and recipe books now frequently include evocative stories of the makeup of meals and the origins of ingredients, along with techniques for creating something approximating the accompanying carefully curated photographs.

Historian Benjamin Wurgaft and anthropologist Merry White’s Ways of Eating: Exploring Food through History and Culture features no images produced by a food stylist; nor does it include instructions about how to make any dish. Instead, the authors interweave stories and analyses of food — its production, its preparation and the meanings people attach to eating — to provide a fascinating cultural and historical overview.

People hunted and gathered food for thousands of years before they developed systems of agricultural subsistence. Wurgaft and White concentrate on how food was produced after agriculture’s arrival, noting the debates that rage in archaeology about its origins. Did people invent cultivation, drawing on observations of the reproductive cycles of animals and plants? Or did climatic change foster the conditions for sedentary settlements that required more intense food production?

Technological changes, migration and population pressure all contributed, and it is probably impossible to isolate any specific causal chain. But whatever the essential conditions were, the results were transformative — of landscapes, ecologies, work, crops, foods and ways of eating.

Wurgaft and White begin by exploring how the domestication of plants and animals spread across the world. Drawing on James Scott’s historical analysis of state formation, they dismiss the notion of simple linear progress from nomadic barbarism to settled civilisation. Pastoral nomadism and sedentary farming coexisted for millennia. But they note that farming does appear to “encourage a particular style of cooperative work and social life” and that the material qualities of grain — it can be stored, transported and exchanged for other goods — “aided the rise of the state.” Wheat, rice and corn fed courts, armies and bureaucrats.

The relationship between imperialism and agriculture is complex and the authors succinctly summarise debates about their interaction. Roman and Persian empires, for instance, were built on the wheat that flourished in the regions they originally occupied. The Han Chinese empire was based on rice, and — as the authors write — “no other civilisation, until the rise of industrial agriculture in modernity, reached the same heights of agricultural productivity.” Deforestation, terracing and irrigation, nitrogenous fertilising and soil modification enabled intensification on a grand scale.

All along, productivity and population growth were interacting with changes in agricultural practices and cooking techniques. Deforestation, for example, meant that food preparation had to be quick in order to use a minimum amount of fuel; hence, the invention of the wok and a cuisine using small, thinly sliced meat and vegetables.

As Wurgaft and White observe, we know much more about the dining habits of the wealthy than we do of the poor. The feasts of Roman emperors, medieval courts and aristocratic households were far more likely to be documented than the everyday meals of peasants. Moreover, they were more varied and abundant. Descriptions of patrician feasts, from the Romans to the British Edwardians, reveal an astonishing range of meats, imported fruits and beverages. Patterns of consumption have always reflected economic and social status, with bread and cakes made from fine, white flour exclusively for the rich, and coarse grains providing bread and porridges for the majority. When famines strike, the poor starve.

The history of changing food and eating habits is the history of the movement of people, plants and animals across continents and between nations. During the Middle Ages, people from northern Europe encountered new foods as they waged wars and made pilgrimages. Conquerors brought back new ingredients and slaves who knew how to prepare them; pilgrims returned with a taste for “foreign” dishes and drinks. The use of rare and exotic ingredients, then as now, was indicative of wealth, social status and worldly sophistication. Spices, imported from China, India and the Middle East, were used not only to preserve food but also to display social status and cultural capital.

But the most dramatic transformation of European and Asian cuisines occurred during the “Columbian exchange” that followed the conquest and colonisation of the Americas. Historian Alfred Crosby, who coined the term in his 1972 book on the subject, revealed the complexity and extent of transatlantic exchange and the magnitude of its impact across the globe. Wurgaft and White endorse his view that this constituted a “tectonic shift” in agriculture, staple foods, national cuisines and eating habits.

Plants and foodstuffs now associated with Mediterranean cuisines, such as tomatoes, capsicums and corn, were initially treated with suspicion. Potatoes — disparaged as suitable only for peasants and their animals — were embraced by the bourgeoisie after cooks discovered their delicious flavour when combined with cream and butter. It is difficult to think of Italian food without tomatoes and astonishing to imagine the foods of Korea, India and other Asian countries without chillies.

People and plants flowed in both directions. Sugar, originally from India, was an established crop but a luxury foodstuff in Spain by the sixteenth century. Until the eighteenth century, honey remained the main culinary sweetener for rich and poor throughout Europe; then, with colonisation and the exploitation of African slave labour, sugarcane plantations flourished in the Caribbean.

English sweet puddings, German cakes, Belgian chocolate and French patisserie, all relatively recent inventions, evolved in the context of the Atlantic slave trade. Rice varieties from West Africa were introduced to feed slaves in the Caribbean and Central America, and were only gradually replaced by Asian varieties a century later. Peanuts arrived in Northern Africa from Peru and Bolivia, and were incorporated into many regional African cuisines. Creole cuisines in the southern states of America were dominated by rice and Old World vegetables, especially okra.


Ways of Eating, a broadbrush history written for a general readership, is full of fascinating stories. Vignettes interspersed between chapters describe specific food producers, foodstuffs, culinary techniques and cultural ideas about food. White, recounting a visit to a coffee plantation in Panama where the highly prized gesha beans are produced, compares her tour to a hajj, not only a signal of “a coffee person’s seriousness of intent” but also a means of gaining esoteric knowledge and status in the world of coffee connoisseurs. Gesha coffee’s apparently unique flavour ranges “from a tea-like smokiness to something like grapefruit peel.”

Novelty, rarity and heritage varieties continue to lure the gourmet and the chef. Pepper and cinnamon, once rare commodities, are now so common as to be mundane. Even so, spice’s exotic appeal persists, and for the discerning consumer Tellicherry pepper from Malabar or Kampot pepper from Cambodia are more prestigious than common black pepper, their use in a recipe lending cachet to dish and chef.

The emphasis on authenticity or the exact replication of a dish from a region or a restaurant menu is a recent phenomenon. White suggests that those who denigrate dishes that don’t match some culinary Platonic ideal make “a fetish of the social and environmental conditions that make an ingredient or dish possible.” Food has fashions and recipes have always depended on the availability of ingredients as well as the skill and imagination of cooks.

In fact, all “national” cuisines have adopted novel foreign ingredients and adapted recipes to local tastes. Japanese Hawaiians invented Spam sushi. After Senegalese soldiers in the French colonial army developed a taste for nem, sold as street food in Hanoi, some returned home with Vietnamese wives whose adaptations of the recipes using local ingredients naturalised these fried rolls. Senegalese nem are different from the Vietnamese originals — but they are not ersatz, just distinctive. The same can be said of Japanese croissants or Australian gelato. White and Wurgaft are clearly connoisseurs of food, but their book challenges ideas about refined taste, authenticity and tradition.

Colonisation, commoditisation, industrialisation and globalisation have transformed diets at an unprecedented rate. Rare and exotic ingredients that were formerly delicacies for the wealthy can now be found on supermarket shelves. Food has always provided ways of expressing cultural identity, regional differences, degrees of sophistication and economic status. Wurgaft and White trace these processes over centuries and across the globe. Their conclusions are both celebratory and thought-provoking.

Agriculture has brought humans extraordinary benefits, but it has also resulted in disastrous depletion of soils and environmental devastation. Many foods arrive in our homes with a heavy carbon footprint. The most common foods touted as “fair trade” are coffee, bananas, tea and cocoa — all grown in countries where many people, including growers, continue to live in poverty. There are ironies and paradoxes in contemporary ways of eating, and the combined forces of history and anthropology are excellent ways of thinking about them. •

Ways of Eating: Exploring Food through History and Culture
By Benjamin A. Wurgaft and Merry I. White | University of California Press | $45.95 | 256 pages

China’s underground historians https://insidestory.org.au/chinas-underground-historians/ Thu, 04 Jan 2024 22:50:23 +0000

A veteran China watcher uncovers a network of counter-historians

Over the past decade, under president Xi Jinping, China’s Communist Party has stepped up its efforts to subjugate history. Interlinked and increasingly high-tech mechanisms of surveillance, control and censorship are today on high alert for outbreaks of what the party calls “historical nihilism” — any telling of history that deviates from the official narrative in which the party is and always has been Great, Glorious and Correct.

A famine that killed tens of millions of people? Blame it on natural disasters and that damn Khrushchev. Political campaigns that became wildly murderous? Not our fault — those excesses were the work of overly zealous, even rogue, local officials. Any awkward truths that can’t be swept under the carpet must be explained away, woven together with half-truths and lies into the fringe of the carpet itself.

Journalist Ian Johnson’s new book, Sparks: China’s Underground Historians and Their Battle for the Future, relates the stories of people, extraordinary in their tenacity and courage, who persist in peering at the mess under that carpet and unpicking the tightly knit threads. They sneak their cameras into former labour camps to reveal human bones still protruding from the soil, interview the last survivors of famines and massacres, and create online archives and offline samizdat journals to record their findings. Among their number are the “citizen journalists” who record history in the making, including those who documented scenes in hospitals and elsewhere in Wuhan during that city’s draconian Covid-19 lockdown in early 2020.

For thousands of years, as Johnson notes, history has been “inseparable” in China from the concept of moral instruction. The independent researchers devoted to historical investigation he is writing about believe that “a moral society cannot be based on lies and silence.” But to refute the lies and break the silence, these intrepid men and women, sometimes armed with little more than curiosity, a smartphone and internet access, must play a dangerous cat-and-mouse game with security forces.

Many do their work under suffocating levels of surveillance. Others have been put under house arrest or worse, with some prison sentences longer than those handed out to convicted rapists. If they risk their freedom and even their lives to shine a light into some of contemporary Chinese history’s darkest corners, they do so because they believe that “the party’s monopoly of the past” is “the root of their country’s current authoritarian malaise.”

These “counter-historians” are generally less interested in the elite machinations behind catastrophic events than in the “degradation of the individual” after the events have been set in motion. To discuss the culpability of party leaders would be asking for even more trouble, of course. But they are genuinely devoted to recovering and honouring the stories of ordinary people. At the same time, Johnson writes, they tend to avoid “heroizing” the victims. The histories they produce may necessarily be incomplete, but they are persuasively nuanced.

The carefully constructed official history, by contrast, is intolerant of nuance or deviation. Engraved in textbooks, promoted in films, enshrined in museums and embodied in the sacred sites of “Red tourism,” it lies at the heart of the party’s legitimacy. It narrates the story of how the communists saved the Chinese people from a “feudal” past as well as the Japanese enemy without and the class enemies within. It tells how the party has kept China safe in a hostile world, governed it wisely and justly, and raised it from poverty to prosperity and power.

Over the past eighty years the party has produced three historical resolutions, “each a cartoonish version of history” intended to justify the rule of the latest leader. The official history endorses the party’s right to rule China today, more than seventy years after the revolution that brought it to power. It paves the way for that rule to continue into the future without any need for checks and balances or popular elections.

To raise questions about the great famine or the systemic nature of the violence during the land reform era or the Cultural Revolution is to ask, in effect — why are you still the boss of us?

The party watched with apprehension and then with horror as the policy of glasnost (transparency) championed by Gorbachev in the mid 1980s led to a rush on the Soviet Union’s historical archives. Soviet citizens were suddenly free to remember and discuss the savagery of the Stalinist era: the political purges, the famines, the midnight knocks on the door, the desolate and murderous gulag of labour camps, the ruined and wasted lives. Just six years later, the Soviet Union collapsed.

Lesson taken. The Chinese Communist Party’s post-Mao leadership, also shaken by the mass pro-democracy protests of 1989, tightened control over political and intellectual discourse. Yet independent thinkers, many of whom had personal experience of upheavals like the famine and Cultural Revolution, both as participants and victims, were compelled to record, research and analyse. Work that couldn’t be published in the mainland was frequently published in Hong Kong.

Among those who laid the path walked by the generation described in Sparks were the oral historian Sang Ye, the writer Liu Binyan, the historical investigator Dai Qing and the journalist Yang Jisheng. If there is one criticism I have of Sparks, otherwise an exemplary, well-researched and vital book, it’s the author’s failure to mention these pathbreakers, the post-Mao pioneers of the movement to which the people he writes about belong. Another curious omission is Wang Youqin, whose epic archival work on the victims of the Cultural Revolution was published in English in an abridged and edited form in 2023.

Johnson’s focus, however, is on more recent times. He observes that a confluence of events and trends in 2003 led to a surge in grassroots history writing. Contributing factors included popular outrage over the government’s suppression of news about the SARS epidemic that year and the application of market forces to Chinese media, which led to a partial liberation from direct control by the party. Xi Jinping’s ascension less than a decade later marked the end of this brief golden age and the beginning of what Johnson describes as Xi’s “forever crackdown” on “historical nihilism.”

And yet the independent historians persist, driven by the importance of what they are doing. The focus of their work may be as narrow as the experience of a single county in a single month of the Cultural Revolution or as broad as the question of guilt and the value of apologies. Collectively, their work reveals that even when the Communist Party shifts the blame for mistakes and crimes onto a few bad eggs, it rarely punishes them, and still more rarely to any degree commensurate with their crimes. They also demonstrate that violence has always been far more pervasive and systemic than the official story suggests.

It’s not just the Communist Party that resists telling these stories. Many of those who have suffered through the events these historians are studying don’t want to talk about them. Some just want to put the trauma behind them. Others don’t want to get in trouble or jeopardise their children’s futures. They have buried the past to rebuild their lives as though, Johnson writes, “The suffering somehow cheapened this world of newfound prosperity, a reminder that it was built on violence.” In a different context (the Wondery podcast Ghost Story) the British historian Nicholas Hiley has discussed the “destabilising” nature of revealed historical truth — the past is not always a happy place and the truth does not always set people free. And yet we carry that past around with us — and it informs the present whether we like it or not.


Ian Johnson is a veteran, Pulitzer Prize–winning China journalist and a Sinophone whose work balances academic rigour with good storytelling. Sparks is the culmination of years of meeting with and even going on reporting trips with the underground historians he profiles here. He inserts between the chapters short vignettes, “Memories,” that offer, in his words, “sketches of people, places, and iconic works of counter-memory that demonstrate the ambition of China’s underground historians: to write a new history of contemporary China in order to change their country’s future.”

Sparks takes its title from a samizdat journal from the 1950s whose history has been uncovered by one of the historians Johnson profiles. The book joins a growing list of publications in English that together are creating a far richer picture of China’s history than that to which non–Chinese speakers have previously had access. They include works like Yang Jisheng’s Tombstone and Wang Youqin’s Victims of the Cultural Revolution, both of which were edited and translated by Stacy Mosher and Guo Jian; Louisa Lim’s brilliant Indelible City, about Hong Kong; Jonathan Clements’s essential new history of Taiwan, Rebel Island; and Tania Branigan’s Red Memory, in which Wang Youqin features heavily.

Johnson contends that the “vibrancy of China’s counter-history movement” — which also includes creative reconstructions of historical events and personages by artists and writers — “should force us to retire certain clichéd ways of seeing China.” These include the tendency to see its authoritarianism as successfully monolithic. While not denying that “these are dark times,” he champions the counter-history movement as a significant form of resistance. As one of the young members of the group behind the original samizdat journal Spark put it back in the 1950s, “If you do not break out of silence, you will die in silence.” •

Sparks: China’s Underground Historians and Their Battle for the Future
By Ian Johnson | Allen Lane | $55 | 400 pages

Ancient autocrats https://insidestory.org.au/ancient-autocrats/ Wed, 03 Jan 2024 02:41:14 +0000

The dangerous appeal of absolute rulers

Mary Beard insists that we shouldn’t look to the ancient Romans for answers to modern political problems. But in the final pages of her latest, compulsively readable history of Rome, the emerita professor of classics at Cambridge University does issue a clear warning about the dangerous appeal of one-man rule.

Beard, a deeply read classicist, is also a commentator, TV star and bestselling author; one of the great populist-scholars of our time, she has brought the ancient world into contemporary consciousness like no other.

She is right to insist that the inhabitants of the past can’t be expected to project any sort of ready-made solution onto our troubles. Besides, judging by the findings of her latest research, the kinds of suggestions the Romans themselves would make might not be palatable; they might indeed hasten the decline of our familiar institutions and make the slide towards one-man rule inexorable.

Beard’s third book-length analysis of Rome — following her narrative history SPQR (2015) and the quirkier art history Twelve Caesars (2021) — focuses on the emperors who ruled Rome for nearly three centuries, from Augustus to Alexander Severus. Her latest book, Emperor of Rome: Ruling the Ancient Roman World, is not a chronological narrative of the careers of individual emperors but a thematic, institutional description of what Beard calls the “category” of emperor: the Roman system of one-man rule.

This institutional approach allows her to discern patterns and themes, and to ask questions like: How did one-man rule work? How did the emperors get things done, in Rome and abroad? And, more pertinently, how did they get a republic — albeit a deeply flawed oligarchy already succumbing to warlordism and civil war, and a slave state to boot, but a vibrant political and social culture all the same — to accept, comply with, and ultimately embody a system of autocracy, a culture of political strongmen?

The first part of Beard’s answer is that the emperors purchased stability by bringing the military under close personal control. Under the Roman republic, legions had theoretically been controlled by the Senate but increasingly, in practice, by wealthy warlords. From Augustus, the emperors put the legionaries, auxiliaries and veterans on the state budget and acted as commander-in-chief — to use the American term — to exert force against external enemies on the distant frontiers and to protect the regime against internal ones in Rome.

This did not mean Rome was “full of men in uniform and march-pasts… such as Trooping of the Colour or Bastille Day,” Beard notes. “The city of Rome itself was strikingly demilitarised even by the current standards of Western capitals.”

Indeed, a second part of Beard’s answer is that the emperors were also careful to preserve the trappings of republican norms and institutions. Even if the Senate became a powerless debating forum, and even if the consular officials served for only two months instead of twelve as formerly, the emperors kept the wealthy “senatorial elite,” as Beard calls them, busy, respectable and ever looking to promotion and proximity to imperial power. And, as Beard dryly notes, replacing elections with imperial appointment saved the elite the tedium and expense of populist politics.

With the imperial palace now the real source of executive authority, who did the actual work of running the city and the empire? Beard describes a system of “government by correspondence,” with letters flowing between the emperor and governors around the empire. The emperor was also personally involved in receiving petitions and adjudicating tricky lawsuits. Literate and loyal staff were needed.

But this kind of work was too menial for the wealthy elite. They would rather govern a province or command a legion than push paper in the palace. Besides, allowing a powerful citizen to become established in the back rooms of the palace was a risk an emperor might well have wanted to avoid.

Better, it seems, to rely on trusted and tractable slaves. In the imperial court, it was slaves and ex-slaves, or freedmen, who did most of the actual work. Cooks, doctors, footmen, hairdressers, gardeners and the all-important food-tasters: they provided the personal service to keep the ruler, and his family, comfortable and alive.

And they also provided the administrative, managerial and clerical muscle — the financial controllers, secretaries, letter writers in Latin and in Greek, librarians, petitions clerks, advisers and counsellors, and trusted emissaries — that the imperial system needed to control a boisterous city and a huge empire. Skilled and experienced officials provided administrative continuity from one emperor to the next.

But this politico-administrative logic generated what Beard calls “pressure points” in the imperial system. Senators were aghast that freedmen could enjoy imperial trust and exercise imperial power. Ultimately, it forced the question: who was really running the show? Pliny complained that the “chief sign of a powerless emperor was powerful freedmen.” As Beard acutely observes, that “d” in freedman is crucial.

Eventually, though, even strongmen weaken and die, exposing the ultimate vulnerability of one-man rule: mortality and the problem of succession. All despots are would-be dynasts, but hereditary succession can be messy when it involves feckless sons or scheming brothers; mutinous generals and seditious senators pose further risks. Augustus hit upon a novel solution. His natural heirs having all died, he ended up choosing his wife’s son by a former marriage, Tiberius, and made his wishes clear by “adopting” him as his son.

Adoptive succession became the norm. It served to widen the talent pool, even creating the impression of an imperial meritocracy, Beard observes, “while still presenting the transmission of power in family terms.” It was not until 79 CE, after more than a hundred years of imperial rule, that a biological son (Titus) actually succeeded his father (Vespasian); it didn’t happen again for another century. Meanwhile, for eighty years, five emperors in a row from Nerva to Marcus Aurelius “adopted” their successors.

It is as if an American president could nominate a vice-president into a powerless but prominent role on the explicit understanding that she will succeed when the president dies (and not before). There is no “lesson” in this for today’s politics, of course.

Though plenty of assassinations and two further civil wars caused rapid turnover in the imperial throne, the practice of adoption did stabilise and strengthen the imperial system.


Beard’s entertaining style rests on a seemingly effortless command of the scholarship that underpins Emperor of Rome. She ranges across time and across source material; she tirelessly draws on the archaeological evidence of palaces, statues and coins, the literary evidence of poetry, speeches, letters and histories, and the epigraphic evidence of tombstones and miraculously preserved legal documents; she takes nothing at face value, and is constantly challenging received historiographical wisdom. The bibliographical essays that summarise each chapter’s relevant scholarship in place of footnotes are a particular pleasure.

But at times the great populariser is guilty of over-popularising. Her focus on the institution of the emperor doesn’t prevent her from indulging some of the juicier tales. “This was a world of toddlers and teenagers as well as grown-ups and greybeards” is one of several sentences that could safely have been subbed out. Mary Beard is also unaccountably mean to Marcus Aurelius’s Meditations, retitling the great Stoic’s philosophical autobiography as Jottings to Himself.

Anyone familiar with the murderous intrigues of Hilary Mantel’s Tudor court in Wolf Hall, or the tamer Windsor jostling in The Crown, will be right at home in Beard’s imperial Rome. Less fictionally, the cronyism and flattery of Trump’s (first?) White House, its performative bombast and self-indulgent fakery, may also be recognisable in Beard’s account of the same pathologies displayed by the ancient autocrats. Don’t forget that Rome, cynical and secular, even posthumously declared some of its emperors divine.

As one-man rule arrived and stayed, Beard points out, scarcely anyone complained. Senators became, in the words of Tacitus, ineffective dissidents or cowards, flatterers and job-seekers; “power dining” with the emperor became, as she brilliantly demonstrates, an occasion of risk and uncertainty as well as an opportunity for promotion and proximity to power.

In short, “despite the loud protests against the crimes and misdemeanours of individual rulers, or the discontent with some aspects of one-man rule, there is hardly any trace of significant resistance to one-man rule as such.”

Indeed, as Beard observes in the closing paragraphs of this book, autocracy throughout history has depended on people at all levels accepting and adjusting. “It is not violence or the secret police, it is collaboration and cooperation — knowing or naive, well-meaning or not — that keeps autocracy going.” This is the lesson we are to mark. •

Emperor of Rome: Ruling the Ancient Roman World
By Mary Beard | Allen & Unwin | $65 | 512 pages

The post Ancient autocrats appeared first on Inside Story.

]]>
https://insidestory.org.au/ancient-autocrats/feed/ 32
To Paris, from the land of fire https://insidestory.org.au/to-paris-from-the-land-of-fire/ https://insidestory.org.au/to-paris-from-the-land-of-fire/#comments Fri, 22 Dec 2023 09:02:29 +0000 https://insidestory.org.au/?p=76821

Newly translated, Azerbaijan-born Banine’s memoirs chronicle her extraordinary early years

The post To Paris, from the land of fire appeared first on Inside Story.

]]>
At the recent celebration of my eighty-fifth birthday my children surprised me by asking what I thought was the best decade of my life. I shrugged and said there was good and bad in each of them. I knew even then it was a fairly limp answer to such an important question, and wished I could come up with something better, at least with a little more flair. Something more along the lines of this: “When I look back over my already very long life I am always surprised, astounded even, by its not very poetic resemblance to a Neapolitan ice cream with its layers of different colours and flavours.”

That delicious sentence was written by a woman born Umm El-Banu Assadullayeva, and comes from Days in the Caucasus, her memoir’s first volume. It reveals a distinctive juxtaposition in her prose, in this book and in its sequel, Parisian Days. There’s a curious self-effacement combined with a resolute lightheartedness and flashes of wry wit, the work of a woman whose life was a rollercoaster of heartache, love and adventure.

She was born in 1905 in Baku, the capital of Azerbaijan on the Caspian Sea, and came to be known in twentieth-century Paris as the writer Banine. Her mother had died giving birth to her, her three sisters were quite a bit older and her father didn’t remarry for many years, though the family “welcomed polygamy and disapproved of celibacy.” The family she wrote of were “oil millionaires” — stupendously, one might say ridiculously, rich — who in one generation had leapt from peasantry to plutocracy from the oil discovered on their land.

She was a lonely but happy and imaginative child. Her father, still in his thirties and, like his brothers, thoroughly Europeanised from his travels, had hired a Baltic German governess for his daughters. Fräulein Anna was Banine’s mainstay, a mother substitute and “guardian angel” who schooled her in German and encouraged her to learn the piano.

But her paternal grandmother, “a large, fat, authoritarian woman, veiled and excessively fanatical,” ruled the roost, sticking to the old traditions. She loathed Christians, spoke only Azeri, the Turkic language of Azerbaijan, wore the clothes typical of observant Muslims at the time, and preferred sitting on floor cushions to any of the sumptuous European-type furniture to be found in the “reception rooms” of Banine’s father’s apartment.

Thus, here was a young girl buffeted between two radically different influences and traditions, though apart from the grandmother the family was not particularly religious. Banine took refuge in books and daydreaming, the necessary humus for any writer it seems, although it took many years before she became one.

Azerbaijan (Persian “land of fire,” for the spontaneous fires occasioned by its oil slicks) was part of the Russian empire. Its people were mainly Christian Armenians and Shiite Azerbaijanis who, as Banine describes it, periodically massacred each other in revolving reprisals. A smattering of Georgians and Russians also lived there. In the year of her birth the empire was in turmoil, until Tsar Nicholas II made his small, grudging concession to democracy.

Then, early in 1917, the year Banine turned twelve, Nicholas was forced to abdicate; the following year the province became the Azerbaijan Democratic Republic and Banine’s father, now remarried and father to a son, was its minister of commerce. When the Bolsheviks solidified their control, the province lost its independence. Her father was thrown into prison.

The family’s traditionally pragmatic attitude to sex and marriage is relevant here. Polygamy was normalised in Islam, as was same-sex coupling for young unmarried males. For Banine’s father and others of his generation this was changing, but marriage in the upper class was still essentially a business proposition with love reserved for extramarital liaisons.

In this scheme of things the hymen was the husband’s trophy, pleasure an incidental consideration. Banine’s cousin Gulnar, for instance, was eager to get married so she could indulge her sexual appetite with a succession of partners in addition to her promised husband. But Banine, the dreamer, longed for a different trajectory, and had fallen deeply in love with a dashing Bolshevik commissar. Unlike any of Gulnar’s conquests, hers was an intensely romantic affair fuelled by a mutual love of literature (he her Prince Andrey, she his Natasha) but had yet to be consummated. There were plans, though, for her to elope with him to Moscow and be wedded there.

Knowing nothing of this, the family had two other suitors in mind. One was another cousin, the other a man who’d ingratiated himself by helping get Banine’s increasingly weak and emaciated father released. Then there was the problem of getting her father to Paris, where his wife and young son were waiting, and it was this same man’s connections he depended on for that. Still the dutiful daughter, and even though she hated her father for “blackmailing” her, she agreed to marry the man.

“Filial affection,” as she wistfully defined it, won the day. Without a word to her commissar, she failed to turn up at the designated rendezvous that would have swept her off with him to Moscow. Instead she was yoked to a man twenty years her senior whom she loathed with all her heart. She was all of fifteen.

The tone of the memoir’s sequel is even more bittersweet. In Days in the Caucasus she had written of her father and two sisters eventually finding refuge in Paris. Parisian Days finds her on the Orient Express to join them. In Paris her father and stepmother are renting a large, luxurious apartment on the fashionable Rue Louis Boilly, where they stay until they run out of jewellery: “the sole, slim remains of our oil barons’ fortune, democratised, collectivised, nationalised, volatilised in the revolutionary explosion, which consumed all our privileges in its flames.”

From the moment of her arrival, Banine is enthralled with Paris. She is even happy when her father’s “last pearl” is sold and they are all forced to move from the Rue Louis Boilly apartment. Now on her own, she is lent a maid’s room seven flights up in a building on the Champ de Mars, and like many Russian émigrés of the day, some of whom were princesses, she finds work as a mannequin in an upscale Parisian fashion house.

What are they to make of her too-Oriental looks, her large derrière, not to mention the over-fuzzy Azerbaijani hairstyle? She moves to another, more simpatico house, and there she picks up tricks of the trade. But although she makes friends easily there and the job is her only means of survival, she is unrelievedly bored. The women, who augment their pitiful wages by working as courtesans, talk exclusively of beauty, clothes and catching ever-wealthier men. They dub Banine the “little Caucasian goose.”

Salvation comes in the form of an older sister. Zuleykha, a painter, had settled in Paris long before, and she and her Spanish husband José, another painter, set up a bohemian salon in their studio compound. (Banine referred to it as Josézous.) “The guests drank, ate, debated and danced with the passion of youth and exotic temperaments prone to excess of all kinds. We couldn’t get away without a bullfight, almost as noisy as a real one.” Her sister and brother-in-law introduce her to the Montparnasse nightclubs and Paris’s huge community of Russians who’d fled the revolution.

These are the Années folles, those crazy years that spanned the end of the first world war and the onset of the Depression. And though she is definitely the young hanger-on, the timid third wheel, she revels in the company and ambience. She is watching, listening, slotting it all into memory.

In a curious way, poverty has released her, as it has softened her father. Regretting her coerced marriage, he readily sanctions divorce. (Because of her refugee status and the husband’s Turkish residence, this is more easily said than done.) Nonetheless the conjugal experience leaves her resolutely chaste for years. The Montparnasse campaigns to correct this routinely fail, even when intensified by the surprise arrival of long-lost cousin Gulnar, who has finally made it out of Baku through her own particular version of the legerdemain that émigrés were forced to adopt. Within a matter of minutes, Gulnar has Banine abandoning her seventh-floor maid’s room and sharing a flat with her.

Was Gulnar really the full-blown sexual predator portrayed here? The relationship was doubtlessly complicated, yet I detect the writer at work. Striking, full-lipped Gulnar is the perfect foil, a gift to any memoirist. As is Jerome, the cultured Frenchman who acts as a kind of psychopomp, ushering the two women through the high life of Paris, its sparkling nightlife and the tangles of their love lives. As for Banine, she finally succumbs to the blandishments of one of Jerome’s rich friends, an older widowed surgeon from Orléans to whom she is unaccountably mean and who, after some time and hardly surprisingly, unceremoniously dumps her.

And so Parisian Days ends. Gulnar has sailed off to America, having bagged a handsome, young, fabulously rich Texan. As generous as she is acquisitive and life-loving, she has left behind all her money for Banine, the handsome husband offering her a pension. Needless to say, Banine is stunned. “My cousin whom I had so often envied and hated overwhelmed me with largesse.”

Alone now, she finds her way to the Bois de Boulogne, considering her future. Because of Gulnar’s wholly unexpected legacy, she can contemplate leaving the fashion house and chance her arm at writing. The book’s last sentences encapsulate the special amalgam of bravery and self-deprecation that characterises its protagonist throughout: “Life was waiting for me. I had to go and meet it despite the burden of my reluctant heart.”


Banine’s first published work was a novel, Nami. Set in Baku and Russia, and based on her experiences of the revolution and civil war, it appeared in 1942. She made her name in Parisian literary circles with Days in the Caucasus, published three years later. Parisian Days appeared in 1947. She wrote in French, which by then had become her natural language. I Chose Opium deals with her conversion to Roman Catholicism. It too had a sequel, After. She also supported herself translating Dostoevsky’s books and those of other writers into French.

Banine is in the process of being rediscovered. Anne Thompson-Ahmadova, the translator of these two books into English, tells us that Days in the Caucasus was reissued in French in 1985. Banine revised Parisian Days in 1990, and it is this version that Pushkin Press has published. The Soviets invited Banine to Baku after Days in the Caucasus appeared, but she declined the invitation, a decision she regrets in an author’s note to its reissue. An Azerbaijani translation didn’t appear until 1992, the year of Banine’s death.

Not having read Banine in her original French, I can only take Thompson-Ahmadova’s rendering on trust, as is the case with any such translation. Once or twice I came across a phrase where the English rang just a little too colloquial, but overall she seems to have captured the flavour of the author’s voice, and the vividness of the people and events she brought to life.

It’s always exciting to see a long-neglected writer resurrected, and what a gift to readers Days in the Caucasus and Parisian Days are. Others have praised Banine for being another Colette, and there is some truth in that. But I doubt if there’ll ever be another Banine. •

Days in the Caucasus
By Banine | Pushkin Press | $34.99 | 274 pages

Parisian Days
By Banine | Pushkin Press | $34.99 | 255 pages

The post To Paris, from the land of fire appeared first on Inside Story.

]]>
https://insidestory.org.au/to-paris-from-the-land-of-fire/feed/ 25
Domino days https://insidestory.org.au/domino-days/ https://insidestory.org.au/domino-days/#comments Thu, 14 Dec 2023 04:59:20 +0000 https://insidestory.org.au/?p=76757

Fifty years later, the Vietnam war still echoes around Southeast Asia and across the Pacific

The post Domino days appeared first on Inside Story.

]]>
The fifty-year anniversaries of the Vietnam war — America’s greatest strategic blunder of the twentieth century — keep arriving. January marked fifty years since the signing of the Paris Peace Accords in 1973, March commemorated the departure of the last American combat soldier from Vietnam, and this month brought the fiftieth anniversary of the Nobel Peace Prize awarded to Vietnam’s Le Duc Tho and the United States’ Henry Kissinger for negotiating the ceasefire.

Amid those anniversary moments, US president Joe Biden flew to Vietnam in September, the fifth sitting American president to visit since Bill Clinton re-established diplomatic ties in 1995 and, with his own visit in 2000, “drew a line under a bloody and bitter past.”

In Hanoi, Biden and Communist Party general secretary Nguyen Phu Trong “hailed a historic new phase of bilateral cooperation and friendship,” creating a strategic partnership that expressed US support for “a strong, independent, prosperous, and resilient Vietnam.”

With such flourishes, history delivers irony garnished with diplomatic pomp. Expect many shades of irony in April 2025, the fiftieth anniversary of the end of the war, when Saigon fell to North Vietnamese forces. (Note the way the war is named: Australia joins America in calling it the Vietnam war; the Vietnamese call it the American war, the concluding phase of a thirty-year conflict.)

The shockwaves that ran through Asia after the second world war were driven by geopolitical fears that imagined nations as dominos toppling into communism. As France fled Indochina and Britain retreated from Southeast Asia, the United States stepped in to stabilise what it saw as a series of tottering states in Southeast Asia.

The proposition that the Vietnam war was “fought for, by, and through the Pacific” was the focus of a conference at Sydney’s Macquarie University that is now a book with nineteen chapters from different authors.

The editors of The Vietnam War in the Pacific World, Brian Cuddy and Fredrik Logevall, describe a wide gap between US rhetoric and the military reality of the region. The US claimed it was acting to save the whole of Southeast Asia, they write, but “the documentary record suggests that Washington lacked a suitable appreciation of how the war in Vietnam was linked to the politics of the wider region.”

In a chapter on “the fantasy driving Australian involvement in the Vietnam war,” the historian Greg Lockhart, a veteran of the war, writes that the “red peril” rhetoric of the Menzies government “disguised its race-based sense of the threat from Asia.” By 1950, he writes, Australian policy had been shaped by an early British version of domino thinking and the “downward thrust of communist China,” a thrust that linked the perils of geography to the force of gravity.

Just before the defeat of French colonial forces at Dien Bien Phu in 1954, US president Dwight Eisenhower proclaimed the fear that drove US policy: “You have a row of dominos set up, you knock over the first one, and what will happen to the last one is the certainty that it will go over fairly quickly.” The theory held that the Vietnam domino, with pushing by China, would topple the rest of Indochina. Burma and Malaya and Indonesia would follow. And then the threat would cascade towards Australia and New Zealand.

Lockhart scorches the way these fears led Australia to Vietnam:

Between 1945 and 1965, no major official Australian intelligence assessment found evidence to support the domino theory. Quite the reverse, those assessments concluded that communist China posed no threat to Australia. Shaped by the geographical illusion that “China,” or at least “Chinese” were “coming down” in a dagger-like thrust through the Malay Peninsula, the domino theory was the fearful side of the race fantasy, the nightmare that vanished once it had fulfilled its political function.

The US strategic ambition of containing communism in Asia “had been very largely achieved before the escalation of US forces in Vietnam in 1965,” Lockhart concludes, because Thailand, Malaysia, Singapore, the Philippines and Indonesia were already “anti-communist nation-states.”

Indonesia’s place on that list was sealed by the military takeover in 1965, a decisive shift towards the United States that destroyed the largest communist party outside the Eastern bloc. Yet US president Lyndon Johnson used Indonesia to proclaim what American historian Mark Atwood Lawrence calls “the domino theory in reverse.” LBJ’s argument by 1967 was that the Vietnam war was necessary as a “shield” for a virtuous cycle of political and economic development across Southeast Asia.

Lawrence laments that few in Washington followed the logic that “Indonesia’s lurch to the right, far from justifying the war in Vietnam, made that campaign unnecessary by successfully resolving Washington’s major problem in the region.” He cites evidence to a Senate committee in 1966 by a legend of US diplomacy, George Kennan, that events in Indonesia made the risk of communism spreading through the region “considerably less.”

In 1967, the US Central Intelligence Agency appraised the geopolitical consequences of a communist takeover of South Vietnam. Lawrence says a thirty-three-page report “concluded that the US would suffer no permanent or devastating setbacks anywhere in the world, including even in the areas closest to the Indochinese states, as long as Washington made clear its determination to remain active internationally after a setback in Vietnam.” The study, as he observes, had no discernible impact on LBJ’s thinking. Instead, Washington stuck with its “iffy” and “problematic” assumptions about falling dominos and the interconnections among Southeast Asian societies.

For the new nation of Singapore, separated from Malaysia in 1965, the era offered the chance to build links with the United States and hedge against bilateral troubles with Malaysia and Indonesia. S.R. Joey Long writes that prime minister Lee Kuan Yew used Washington’s Vietnam focus to cultivate America for both weapons and investment: “The inflow of American military equipment and capital enhanced the Singaporean regime’s capacity to defend its interests against adversarial neighbours, further its development strategies, distribute rewards to supporters, neutralise or win over detractors, and consolidate its control of the city-state.” A later chapter quotes a CIA report in 1967 that 15 per cent of Singapore’s gross national product came from American procurements related to the war.

During his long leadership, Lee Kuan Yew always proclaimed the one remaining vestige of an argument for the US war — the “buying time” thesis, which claims that the US provided time for the rest of Southeast Asia to grow strong enough to resist domino wobbles.

Mattias Fibiger’s chapter on buying time calls the idea a “remarkably durable” effort to transmute US failure into triumph. What president Ronald Reagan later called a “noble cause” is elevated to a constructive breathing space. “America failed in Vietnam,” according to the Henry Kissinger line, “but it gave the other nations of Southeast Asia time to deal with their own insurrections.”

From 1965 to 1975, the region “became far more prosperous, more united and more secure,” Fibiger notes, and he finds “some truth to the claims that the Vietnam war strengthened Southeast Asia’s non-communist states, stimulated the region’s economic growth, and led to the creation of ASEAN — all of which left the region more stable and secure.”

The creation of the Association of Southeast Asian Nations in 1967 (with an original membership of Indonesia, Malaysia, the Philippines, Singapore and Thailand) is a milestone in the region’s idea of itself. ASEAN’s greatest achievement is to banish — or bury deeply — the danger of war between its members. This is region-building of the highest order. Earlier attempts at regional organisation had failed. Indeed, Fibiger notes, conflict seemed so endemic that a 1962 study was headlined, “Southeast Asia: The Balkans of the Orient?” ASEAN has helped lift the Balkan curse.

The founders of ASEAN certainly looked at Vietnam and knew what they didn’t want. While the war inspired “fear of American abandonment,” Fibiger thinks any relationship between the conflict and the strength of the region’s non-communist states is indirect. American military actions had little bearing on the ability of governments outside Indochina to command the loyalty of their populations.

Commerce, not conflict, became the region’s guiding star. In the quarter-century after 1965, the economies of East and Southeast Asia expanded more than twice as quickly as those in other regions. The eight “miracle” economies — Japan, South Korea, Taiwan, Hong Kong, Indonesia, Singapore, Malaysia and Thailand — grew more prosperous and more equal, lifting huge numbers of people out of poverty.

Fibiger writes that the Vietnam war served as an engine of economic growth in Southeast Asia and fuelled exports to the US market. Growth legitimised rather than undermined authoritarian regimes in ASEAN, and deepened oligarchy. The war, he says, helped create strong states, regional prosperity, and ASEAN.

Beyond that summation, Fibiger attacks the buying time thesis as morally bankrupt because it is a metaphor of transaction, “implying that the Vietnam war’s salutary effects in Southeast Asia somehow cancel out its massive human and environmental cost in Indochina.”

America’s allies joined the war to serve alliance purposes with the United States. South Korea sent 320,000 troops to South Vietnam between 1965 and 1973, Australia 60,000, Thailand 40,000 and New Zealand 3800. The Philippines’ contribution was a total of 2000 medical and logistical personnel. Taiwan stationed an advisory group of around thirty officers at any one time in Saigon but sent no combat troops for fear of offending China.

For their part, Australia, New Zealand and South Korea fought “not for Saigon,” writes David L. Anderson, “but in keeping with their established practices of protecting their regional interests and constructing their national defence with allies.” By 1970, Australian opinion was divided over the war, Anderson notes, but the alliance with the United States still had popular support:

The war polarised the politics of the US, Australia and New Zealand. Antiwar sentiment in the three countries did not alone bring an end to their military engagement, but protest movements conditioned the political process to accept negotiation and withdrawal when government strategists decided national security no longer required the cost and sacrifice of the conflict.

In the years after the Vietnam war, Anderson says, the former junior partners maintained friendly relations with Washington even though the United States “was seen as a less reliable partner.” The new need was “greater self-reliance and independence from the US.”

Editors Cuddy and Logevall conclude that studying the regional dynamics of the Vietnam war is not purely of historical interest: “American foreign policy is turning its attention — even if haltingly and haphazardly — back to the Pacific… Understanding how the region reacted to the American war in Vietnam and how the war changed the region might help the United States and its Asia-Pacific partners navigate the currents of competition in the future.”

The Vietnam history offers cautions about the new competition between the United States and China. The United States again seeks regional allies and is gripped by vivid fears about the threat China poses to the system. The region again ponders the level of US commitment and its reliability.

The two giants compete to hold friends close and ensure no dominos fall to the other side.

Vietnam is a haunting demonstration that the Washington consensus can misread or even obscure Asian understandings and the complex politics of the region. Those truths from history matter again today. As America’s greatest strategic blunder of the twentieth century was in Asia, so in this century America’s greatest strategic challenge is in Asia. •

The Vietnam War in the Pacific World
Edited by Brian Cuddy and Fredrik Logevall | University of North Carolina Press | US$29.95 | 382 pages

The post Domino days appeared first on Inside Story.

]]>
https://insidestory.org.au/domino-days/feed/ 35
Demythologising the frontier https://insidestory.org.au/demythologising-the-frontier/ https://insidestory.org.au/demythologising-the-frontier/#comments Wed, 06 Dec 2023 00:24:08 +0000 https://insidestory.org.au/?p=76641

David Marr’s intergenerational account of colonisation challenges us to think differently about truth-telling

The post Demythologising the frontier appeared first on Inside Story.

]]>
True histories are often not for the faint-hearted, and David Marr’s ambitious and sweeping account of his own family story is among those that challenge the reader not to look away. Killing for Country is framed by the exploits of four figures — Richard Jones, his brother-in-law Edmund B. Uhr and Edmund’s sons Reg and D’Arcy Uhr — each of whom embodies ambition, entitlement, conquest and brutality. Taken together, their stories reveal the real price of nation-building in a colonial country through the experiences of the kinds of pioneers Australian history has often mythologised.

Marr paints his characters meticulously. Richard Jones, an astute businessman and Christian evangelical, amassed large tracts of land on which to graze his sheep. His accumulation of wealth rested on clearing the land of its original occupants. Aboriginal people were slaughtered with no protection from the colony, a process that Jones was able to keep at arm’s length by giving others the bloody work of “dispersal” or “reprisal.”

Like many other Christians, Jones failed to see the humanity in Aboriginal people and so his charity couldn’t extend to them. At the height of his landholding, he would have more than 600,000 acres. His is the story of the “entrepreneur gentleman,” a type whose wealth was often portrayed as coming from acumen and savvy but who, in truth, took land with brute force (even if not by his own hand) and then used his wealth, power and position (in parliament, commerce and banking) to ensure that laws favoured his interests and didn’t protect those who had been removed.

Marr’s profile of Edmund B. Uhr reveals a man with his own aspirations for large landholdings and status, whose appointment as magistrate gave him the power to condone violence or turn a blind eye to it, often for his own convenience. Marr then tracks the exploits of two of Uhr’s sons, brothers Reg and D’Arcy, as they move through Queensland and into the Gulf country as officers of the Native Police. They helped clear Aboriginal populations from traditional lands, suppressing resistance while avoiding having literal blood on their hands.

This is a two-generational account of the colonisation process. Through the stories of his subjects, Marr shows how land was taken by force and without payment, and how these men used their wealth, status and power to create the rules that validated their theft and turned a blind eye to the violence used to take the land. Marr also illustrates how, on the rare occasions when colonial law sought to temper the excessive violence and murder and hold its perpetrators to account, new strategies emerged to achieve the same end with less accountability. Humanitarian concerns were met with derision by those who charged that sympathetic city-dwellers had no understanding of life on the frontier and in the newly conquered lands.

Marr’s recounting of barbarities against Aboriginal men, women and children is factual and to the point. He understands that this is no place for timidity and euphemisms. But nor is there any need for exaggeration or hyperbole. His account is thorough, searing and unflinching, the stories of these four men compellingly framed by his knowledge of the legal frameworks and politics of the time and his attention to the public debates playing out in the era’s newspapers.

Killing for Country also sets what was occurring in Australia in the context of colonisation processes around the globe, a facet that is too often overlooked. He notes, for instance, that the American war of independence had been sparked by a refusal to grant more land to the colonisers, and the British weren’t keen to make that mistake again. When the line around the Sydney colony’s nineteen counties was breached, there was little appetite to rein anyone in.

Through these prisms, Marr presents a broader narrative of Australian history. In bringing together the personal and the political, he presents a story of power, privilege and the process of aggressive colonisation — of brutal, concerted and bloody dispossession — with not even the facade of a treaty offered to those being conquered.

But Killing for Country is also the story of resistance. Unmasking the violence needed to take country and keep it exposes as a convenient and necessary colonial lie the myth that Aboriginal people simply faded into the background, inevitably ceding ground to a superior force. What emerges clearly from the conflict explored in Marr’s book is that Aboriginal people fought tenaciously for their land, at many moments repelling the onslaught of colonisation ferociously and fearlessly. It was the depth of this refusal to cede that prompted the more shamelessly brutal force in which Marr’s four subjects played their roles.

Marr has been one of the great intellectual contributors to the critique of Australia’s national narrative. Through his body of work, including his books Dark Victory and Panic, and his Quarterly Essays The White Queen and His Master’s Voice, he has been a persuasive critic of the divisive race politics of the Howard era and their legacy, and a compassionate contributor to debates about the type of country we should be. Killing for Country continues his thoughtful interventions in and critiques of the story Australia wants to tell about itself.


Marr’s meticulous, time-consuming research is evident on every page, yet he couldn’t have anticipated the exact moment in history when his book would be released. His account of how the country was taken comes just as Australia is processing the fallout of the failed Voice referendum.

The lead-up to the vote saw an increase in the visibility and vitriol of racist tropes — that Aboriginal people are savage and backward, that they were getting “something for nothing” and that recognition of their distinct place would “divide the nation.” Killing for Country reminds us of the seeds of those antiquated and racist ideologies and the purpose they serve.

Marr would also be fully awake to the ripples in the pond that a book like this creates. As a public intellectual who has constantly interrogated what racism and prejudice does to the fabric of Australian society, he is aware of the history of ideological resistance to the forceful telling of frontier history. There is no longer a black armband or white blindfold: they were part of a contest of ideologies over what story Australia wanted to tell itself. Marr slices through this. There is no place for the heroic “white man conquering the land” narrative and there is no excuse for shying away from words like “conquest,” “invasion,” “genocide” and “massacre.” These debates about whether there were massacres or stolen generations were never about a truthful Australian history.

But it is hard to read Marr’s book as an Aboriginal woman and not feel it personally. Massacre sites litter the rivers of my traditional country, and accounts like Killing for Country are a form of validation, however tough the read is at times. The violence perpetrated against our ancestors, often coupled with sexual violence against Aboriginal women, was sparked not just by the taking of land but by Aboriginal communities standing up against settler abuse of women and children. The treatment of Aboriginal women in the process of colonisation still feels under-researched; it is not adequately captured in the archives or written colonial records.

I read Killing for Country while travelling on my own traditional country, where family members can still point out where massacres took place along our rivers. The material in Marr’s book felt raw, but it is also incredibly important that the gaslighting of a history that remains strong in the oral histories of Aboriginal people is replaced with an honest account of what it takes to claim a continent, and of its human cost.

This is a deeply personal historical account for Marr as well. These are his ancestors. His approach challenges us to think differently about what a truth-telling process might be. It is not just about creating the space for Aboriginal and Torres Strait Islander people to tell their truths. It also requires the kind of historical accountability that sits on the other side of the ledger.

What is admirable about Marr’s approach in Killing for Country is that there is no handwringing as he lays bare the exploits of his ancestors. Instead, he poses the question that should be asked: if we can accept true accounts of our past, what will it mean for the shape of our future? What happens if the true history is acknowledged and we can admit that this is a country that was acquired by conquest?

The impact of this type of truth-telling should not be a sense of collective guilt but instead the impetus for meaningful and collective action. With that message, Killing for Country could not be more necessary or more timely. •

Killing for Country: A Family Story
By David Marr | Black Inc. | $39.99 | 432 pages

The post Demythologising the frontier appeared first on Inside Story.

]]>
https://insidestory.org.au/demythologising-the-frontier/feed/ 166
Continent of fire https://insidestory.org.au/continent-of-fire/ https://insidestory.org.au/continent-of-fire/#comments Wed, 06 Dec 2023 00:05:05 +0000 https://insidestory.org.au/?p=76644

Australia’s fatal firestorms have a distinctive and mainly Victorian lineage, but the 2019–20 season was frighteningly new

The post Continent of fire appeared first on Inside Story.

]]>
One of the arguments deployed to dismiss global warming and the uniqueness of the long, gruelling fire season of 2019–20 was that Australia has always had bushfires. Bushfire is indeed integral to our ecology, culture and identity; it is scripted into the deep biological and human history of the fire continent. But some politicians and media commentators used history lazily to deny that anything extraordinary is happening and drew on the history of the Victorian firestorm as if it represented national experience.

We need to bring some historical discrimination to debates about what was new about the Black Summer. In particular we need to look at the history of firestorms, the distinctive fatal fires of southeastern Australia that culminated in named days of terror: Black Thursday 1851, Red Tuesday 1898, Black Sunday 1926, Black Friday 1939, Black Tuesday 1967, Ash Wednesday 1983 and Black Saturday 2009. How did the summer of 2019–20 relate to this grim lineage?

Black Thursday, 1851

The British colonists of Australia came to “this continent of smoke” from a green, wet land where fire was cosseted and coddled. They had rarely, if ever, seen free-ranging fire at home for it had been suppressed and domesticated over generations. They had so tamed fire that they had literally internalised it in the “internal combustion” of the steam engine.

These representatives of the industrial revolution brought to Australia many new sources of ignition, yet they also introduced houses, cattle, sheep, fences and all kinds of material belongings that made them fear wild fire. And they found themselves in a land that nature and human culture had sculpted with fire over millennia, a land hungry for fire and widowed of its stewards by the European invasion. It was an explosive combination. They did not know what the bush could do.

The foundational firestorm of Australian settler history occurred a few months after the residents of the Port Phillip District heard the news that British approval had been given for their “separation” from New South Wales. The impending creation of a distinct colony, soon to be called Victoria, was a cause for much celebration in Melbourne in November 1850, and a five-day holiday was declared.

Three months later, on Thursday the sixth of February 1851, in the soaring heat of a scorching summer, terrifying fires swept across the forests, woodlands and farms of the southeast. “Separation” had been celebrated with hilltop bonfires and now it was sealed by a scarifying firestorm. It was right that fire should forge the political identity of the most dangerous fire region on the planet.

“Black Thursday,” wrote the visiting British writer William Howitt, who arrived the year after the fire, “is one of the most remarkable days in the annals of Australia.” “The whole country, for a time, was a furious furnace,” he reported, “and, what was the most singular, the greatest part of the mischief was done in one single day.” He then went on to make some startling parallels. “It is a day as frequently referred to by the people in this colony as that of the Revolution of 1688 in England, of the first Revolution in France, or of the establishment of Independence in the United States of America.” In Australia, Howitt seemed to be suggesting, it was nature more than politics that would shape our identity.

Black Thursday, “the Great Bush Fire,” was a revolution of a kind. It was the first of the Black Days to be named by Europeans, the first recorded firestorm to shock and humble the colonists. Although the newcomers had quickly learned to expect bushfires, this was something else; its magnitude and ferocity terrified all who experienced and survived it.

At first the Melbourne Argus could hardly credit the reports from the bush, but then the breathless testimony kept tumbling in. Drought, high temperatures and ferocious northerly winds fanned the flames into a giant conflagration. People rushed to fight with green boughs “as in ordinary bushfires,” but all were forced to flee. Flames leaped from tree to tree like lightning; the fire careered “at the rate of a horse at full gallop”; sheep, cattle, horses, kangaroos and smaller native animals hurtled before it and hosts of birds were swept up in it: “the destruction of the wild creatures of the woods, which were roasted alive in their holes and haunts, was something fearful to contemplate.” People “went to bed, or lay down (for many did not dare go to bed), in a state of the greatest suspense and doubt as to whether they should see daylight next morning.”

Four days after the fire, Frances Perry, wife of the Bishop of Melbourne, recorded that “in some parts of the country the people are completely panic-struck. They thought, and well they might, that the world was coming to an end.”

The words of survivors painted a picture strikingly similar to the grand panorama of Black Thursday (1864) by artist William Strutt. For his imagery he drew on reportage as well as his own experience of the heat, smoke and fear of the day. Over three metres in breadth, the painting depicts what Strutt called “a stampede for life,” where people and animals, eyes wild with panic, flee southwards in terror.

Stampede for life: William Strutt’s Black Thursday, February 6th 1851 (1864). State Library of Victoria

The “Great Bush Fire” of 1851 was the first large-scale firestorm to terrorise the British colonists. It wreaked its havoc just a decade and a half after British pastoralists invaded the Port Phillip District of New South Wales. Sheep, cattle and people had swiftly moved into the grasslands of the southeastern corner of the continent, but in 1851 the invaders had only recently outnumbered Aboriginal peoples and Indigenous burning regimes persisted in some places.

Because of its timing on the cusp of this change, Black Thursday was an intriguing amalgam of old and new Australia. It was an event embedded in the unravelling ecological and cultural rhythms of the southeastern corner of the continent. But Black Thursday was also an outrageous outbreak of disorder, the first schism in the new antipodean fire regime, a portent of things to come.

Red Tuesday, 1898

European settlers feared and suppressed fire near their properties and towns, and misjudged its power in the bush. But it did not take them long to begin to use fire for their own purposes, even if clumsily and dangerously. “The whole Australian race,” declared one bushman, has “a weakness for burning.” The language the bush workers used — “burning to clean up the country” — was uncannily like that of Aboriginal peoples.

In the drier forests of the ranges (but generally not the wet mountain ash forests, which had less grass), graziers used fire as Aboriginal peoples had done: to keep the forest open, to clean up the scrub, to encourage a “green pick,” and to protect themselves and their stock from dangerous bushfire. But, unlike Aboriginal peoples, the newcomers were prepared to burn in any season. And the legislative imperative for settlers was to “improve” the land they had colonised — and “improvement” first meant clearing. The Australian settler or “pioneer” was a heroic figure depicted as battling the land and especially the trees.

This fight with the forest assumed theatrical dimensions in South Gippsland, where each summer neighbours gathered to watch the giant burns that, they hoped, would turn last year’s fallen and ring-barked forest into this year’s clearing. They needed to establish pastures as quickly and cheaply as possible. Small trees were chopped, undergrowth was slashed, and sometimes large trees were felled so as to demolish smaller timber that had previously been “nicked,” thereby creating, as one settler put it, “a vast, crashing, smashing, splintering, roaring and thundering avalanche of falling timber!” The slashed forest was left to dry until the weather was hot enough for the annual burn, the frightening climax of the pioneer’s year.

In the mostly wet sclerophyll forest of the South Gippsland ranges, some of it mountain ash, it was often hard to get a “good burn” because of the heavy rainfall and the thick scrub’s resistance to wind. Farmers therefore chose the hottest summer days for these burns, “the windier and hotter the day the better for our purpose.” These settlers of the world’s most fire-prone forests awaited the most fatal days.

A “good burn” could so easily become a firestorm and in Gippsland in 1898 it did. “Red Tuesday” (1 February) was the most terrifying day of the “Great Fires” that year, a whole summer of fear and peril. Intense clearing fires had accompanied ringbarking, ploughing, sowing and road-making in Gippsland for two decades, but settlers were still shocked by the Great Fires, which were like nothing they had ever experienced. Although they were stunned by the speed and violence of the firestorm, the new farmers understood that it was a product of their mode of settlement. Their principal pioneering weapon had run amok. As farmers burned their clearings into the encircling edges of the wet, green forest, they might have guessed that soon the fires would link up and overwhelm them.

Just as Black Thursday was memorialised in a great painting, so was Red Tuesday captured in a grand work of art. When historian Stephen Pyne surveyed fire art around the world, he found Australian paintings to be exceptional for their gravitas, their capacity to speak to cultural identity or moral drama. “Bushfires did not simply illuminate the landscape like a bonfire or a corroboree,” he wrote, “they were the landscape.”

This is vividly true of John Longstaff’s depiction of Gippsland, Sunday night, February 20th, 1898. Longstaff was born on the Victorian goldfields a decade after Black Thursday and travelled to Warragul to witness the long tail of the 1898 fires. Whereas Strutt’s painting was intimate in its terror and chaos, showing us the whites of the eyes of people and animals, Longstaff evoked the drama through its magisterial setting. Human figures are dwarfed by towering mountain ash trees and the immensity of the bush at night, and appear encircled and illuminated by fire. Flames lick at the edge of the clearing and a leaping firestorm races towards us from a high, distant horizon.

Longstaff exhibited his grand painting in his Melbourne studio in August of that year, lit by a flickering row of kerosene-lamp footlights. Gippsland, Sunday night, February 20th, 1898 is a painting of a landscape, and it focuses on the forest as much as the fire and the settlers. “The Great Scrub,” the enemy of the settlers, is a powerful presence in the panorama; it inspires as much awe as the flames. The people in the painting, who are seeking to “settle” this fearful forest, are enclosed and entrapped by its vast darkness. The erupting bushfire is both a threat and a promise.

Burning off

Firestorms became more frequent in the twentieth century, as sawmilling and settlement moved more deeply into the mountain forests of Victoria. The greatest of them came on Friday 13 January 1939, the grim climax of a week of horror and a summer of fire across New South Wales, South Australia, the Australian Capital Territory and Victoria. In that week, 1.4 million hectares of Victoria burned, whole settlements were incinerated, and seventy-one people died. Sixty-nine timber mills were engulfed, “steel girders and machinery were twisted by heat as if they had been of fine wire,” and the whole state seemed to be alight.

Judge Leonard Stretton, who presided over the royal commission into the causes of the fires, pitied the innocence of the bush workers, immigrants in a land whose natural rhythms they did not yet understand:

Men who had lived their lives in the bush went their ways in the shadow of dread expectancy. But though they felt the imminence of danger they could not tell that it was to be far greater than they could imagine. They had not lived long enough. The experience of the past could not guide them to an understanding of what might, and did, happen.

Stretton investigated the settlers’ culture of burning, taking his commission to bush townships and holding hearings in temperatures over 100°F (38°C). His shocking finding was that “These fires were lit by the hand of man.” Yet rarely were they malevolent arsonists. Mostly they were farmers and bush workers, and their fire lighting was casual and selfish, sometimes systematic and sensible, and increasingly clandestine and rebellious. They were settlers burning to clear land and graziers firing the forest floor to promote new grass. Burning was a rite — and a right. They were landowners who, when they saw smoke on the horizon, threw a match into their home paddock.

Settlers felt “burning off” helped to keep them and their neighbours safe. Travellers to the Yarra Valley in the first decades of the twentieth century wouldn’t have been surprised to see “half a dozen fires on the sides of mountains.”

When the Forests Commission of Victoria was founded in 1918, it assumed control of the state forests and forced graziers out if they did not stop burning their leases. Forest officers, charged with conservation of timber, tried to suppress fire, but farmers and graziers believed that their burning kept the forest safe from fire by keeping fuel loads down. George Purvis, a storekeeper and grazier at Moe in Gippsland, explained to the 1939 royal commission that everybody used to burn off many years ago: “We could meet a few of our neighbours and say ‘What about a fire’… Nowadays, if we want a fire we nick out in the dark, light it, and let it go. We are afraid to tell even our next door neighbour because the Forests Commission is so definitely opposed to fires anywhere, that we are afraid to admit that we have anything to do with them.”

As a result, Purvis explained, the bulk of farmers did not burn their land as much as they wished. And so, as fires gathered force in the week before Black Friday, people desperately burned to save their property and their lives. It was considered better to burn late than never, and these fires (indeed “lit by the hand of man”) “went back into the forest where they all met in one huge fire.”

Perhaps fire was so much a part of the Australian landscape and character that it could never be eliminated or suppressed. It had to be accepted and used, and perhaps it could be controlled. The 1939 royal commission signalled a new direction. In his recommendations, Stretton gave official recognition to a folk reality and tried to give focus and discipline to the widespread popular practice of burning to keep the forest safe. He recommended that the best protection against fire was regular light burning of undergrowth at times other than summer. Only fire could beat fire.

Vivid word-picture: the report of the 1939 royal commission.

As Stephen Pyne observed, this “Australian strategy” was in defiant counterpoise to the North American model of total fire suppression. The strategy was reinforced by another royal commission, this one following the 1961 Dwellingup fires in Western Australia, which endorsed systematic, expansive, hazard-reduction burning of the jarrah forests of the southwest.

It took time for official “controlled burning” to supplant unofficial “burning off.” In 1967, a Tasmanian firestorm provided dramatic evidence of the persistence of rural traditions of burning. On 7 February, which became known as Black Tuesday, a “fire hurricane” stormed through bushland and invaded Hobart’s suburbs, coming within two kilometres of the CBD. The fire caused the largest loss of life and property on any single day in Australia to that time.

Black Tuesday had strong elements of Black Friday 1939 embedded within it. Of the 110 fires burning on that Tuesday, ninety started prior to the day and seventy were uncontrolled on the morning of the 7th. Significantly, only twenty-two of the 110 fires were started accidentally; eighty-eight were deliberately lit. In other words, bushfires were common, deliberate and allowed to burn unchecked. “No one worried about them too much,” reflected Tasmanian fire officer John Gledhill, echoing Stretton.

Tasmania’s 1967 Black Tuesday fire, with its heart in the expanding suburbs of Hobart, signalled a new type of firestorm in Australian history. The bush had come to town. But the town had also come to the bush, insinuating its commuters and their homes among the gums. This event initiated an era of fires that would invade the growing urban interface with the bush: Ash Wednesday 1983 (Adelaide and Melbourne); Sydney 1994; Canberra 2003, when more than 500 suburban homes were destroyed in the nation’s capital; and Black Saturday 2009, when only a wind change prevented the Kilmore East fire from ploughing into Melbourne’s densely populated eastern suburbs.

During the second half of the twentieth century, casual rural fire lighting gradually became criminalised. The law was enforced more strongly and public acceptance of open flame declined. Fire was gradually eliminated from normal daily experience as electricity took over from candles, kerosene and, eventually, even wood stoves. Firewood for the home became more recreational. “Smoke nights” — once part of the fabric of social life and an especially masculine ritual — went into decline as smoking itself became a health issue. Instead of being a social accompaniment and enhancement, smoking was pushed to the margins of social life, even becoming antisocial.

It had been different in the interwar years: in 1939 the Red Cross, “concerned about the health of the bush fire refugees,” appealed to the public for “gifts of tobacco.” Even for victims of fire, smoke was then considered a balm. On Black Sunday 1926, Harry King, a young survivor at Worrley’s Mill where fourteen people died, crawled scorched and half-blinded for four kilometres through the smoking forest to tell his story in gasps. At the end of his breathless account, he opened one badly burned eye and whispered: “I’m dying for a smoke, dig.”

The ferocity of “the flume”

The years of the most fatal firestorms were burned into the memories of bush dwellers: 1851, 1898, 1926, 1939, 1967, 1983, 2002–03 and 2009. Stretton’s vivid word-picture of Black Friday 1939, which became a prescribed text in Victorian Matriculation English, joined the paintings by Strutt and Longstaff in forming a lineage of luminous fire art.

The most frightening and fatal firestorms have all roared out of the “fire flume.” That’s what historian Stephen Pyne called the region where hot northerly winds sweep scorching air from the central deserts into the forested ranges of Victoria and Tasmania. In the flume, bushfires strike every year, firestorms every few decades. Firestorms are generated when spot fires ahead of the flaming front coalesce and intensify, even creating their own weather. They entrap and surround. Firestorms are bushfires of a different order of magnitude; they cannot be fought; they rampage and kill. Their timing, however, can be predicted. They come at the end of long droughts, in prolonged heatwaves, on days of high temperatures, low humidity and fierce northerly winds.

The firestorms are intensified by particular species of trees — the mountain ash and the alpine ash — that conspire to create a raging crown fire that kills and then reproduces the whole forest en masse. These tall ash-type eucalypts need a hot, fast-moving crown fire, upon which their regeneration uniquely depends, to crack open their seeds. The ecology of the forest depends on firestorms, so we know they also happened under Aboriginal ecological management.

In the last 200 years, the clearing, burning and intensive logging of the new settlers exaggerated and intensified the existing rhythm. In many remaining forest districts firestorms have come too frequently for the young ash saplings to grow seed, and so towering trees have given way to scrubby bracken and acacia. Those two colonial paintings captured the fatal, colliding elements of the Victorian firestorm: the peril, horror and panic of the people, and the indifferent magnificence of the tall, fire-hungry trees.

In 2009, I resisted use of the word “unprecedented” to describe Black Saturday because it was the familiarity of the firestorm that horrified me. Although the event was probably exacerbated by climate change, the recurrent realities were more haunting. As I wrote in Inside Story at the time, “the 2009 bushfires were 1939 all over again, laced with 1983. The same images, the same stories, the same words and phrases, and the same frightening and awesome natural force that we find so hard to remember and perhaps unconsciously strive to forget.” As a historian of the fire flume, I was disturbed by Black Saturday’s revelation that we had still not come to terms with what we had already experienced.

In the months following Black Saturday (2009), I was invited to assist the small community of Steels Creek in the Yarra Valley to capture stories of their traumatic experience. Working with historians Christine Hansen, Moira Fahy and Peter Stanley, I wrote a history of fire for the community that presented the ubiquity and sheer repetitive predictability of the phenomenon in that valley. One bushfire after another, year in year out. As we set out this rhythm, a deeper pattern emerged, which was the distinction in this region between bushfires and firestorms. The ferocity of the firestorms was generated not necessarily by trees near a settlement but by forests more than ten kilometres away, perhaps thirty or forty kilometres away. Survival in summer is not just a matter of clearing the gutter but also knowing what forests live in your region.

It has proven too tempting and too easy for Australians to overlook or deny the deep local history of the Victorian firestorm. Sometimes Aboriginal mosaic burning, which was applied to so many drier woodlands across the continent, is assumed to have been used in the wet ash forests too. For example, in his book Dark Emu, Bruce Pascoe argued that “a mosaic pattern of low-level burns” was used in mountain ash forests and suggested that wild fires in the forests affected by Black Saturday “were largely unknown before the arrival of Europeans.” But this cannot have been the case, for when Europeans arrived they found mature, even-aged ash forests, the very existence of which was evidence of historical, powerful crown fires.

For example, botanist David Ashton identified one old stand of mountain ash at Wallaby Creek as dating from a firestorm in 1730. Furthermore, ash forests would have been destroyed by frequent fires, and low-level burns are not feasible in such a wet ecosystem. Aboriginal peoples would have used low-level cool burns to manage the drier foothill forests but not the ash forests themselves, for mature mountain ash trees can easily be killed (without germinating seed) by light surface fire. Woiwurrung, Daungwurrung and Gunaikurnai peoples used the tall forests seasonally and probably burned their margins, maintaining clearings and pathways along river flats and ridgetops. They were familiar with the forest’s firestorms and would have foreseen and avoided the dangerous days.

Even six generations after Black Thursday 1851, we stubbornly resist acknowledging the ecological and historical distinctiveness of the Victorian firestorm. It is astonishing that the Black Saturday royal commission cranked through 155 days of testimony but failed to provide a vegetation map in either its interim or final report. In one of my submissions to the inquiry, I drew the commission’s attention to this absence in their interim report, but it was not remedied. Senior counsel Rachel Doyle was more interested in pursuing the former Victorian police chief Christine Nixon about her haircut on 7 February than in directing the commission’s attention to the unusually combustible forests through which the fires stormed.

The royal commission went some way towards being more discriminating about the variety of bushfire, weather, topography and ecology, but not far enough. Forests featured in the commission’s report mostly as “fuel.” “The natural environment,” the commissioners explained in opaque bureaucratic language, “was heavily impacted.”

Thus the firestorm’s origin in the ecology of the forest was ignored even by a royal commission. Or people explained it away by interpreting such outbreaks as entirely new, as products of either the cessation of Aboriginal burning or of anthropogenic climate change. Indigenous fire and global warming are highly significant cultural factors in the making of fire regimes, but both work with the biological imperative. It is clearly hard for humanity to accept the innate power of nature.

The same tendency led Victorians up the garden path of fire policy. The most shocking fact about Black Saturday 2009 was that people died where they thought they were safest, where they were told they would be safest. Of the 173 people killed on Black Saturday, two-thirds died in their own homes. Of those, a quarter died sheltering in the bath.

As I wrote in Inside Story in 2009 and 2012, the “Stay or Go” policy was a death sentence in Victorian mountain communities in firestorm weather. Although the policy guided people well in many areas of Australia and had demonstrably saved lives and homes elsewhere, it misled people in this distinctively deadly fire region to believe that they could defend an ordinary home in the face of an atomic force. And it was this confidence in the defensibility of the home and denial of the difference of the firestorm (coupled with a faith in modern firefighting capacity) that underpinned the lack of warnings issued by authorities to local residents about the movement of the fire front on Black Saturday.

For much of the history of these forests, including their long Aboriginal history, no one believed their homes were safe in a firestorm. Evacuation was the norm. Sometimes the elderly and vulnerable were extracted by force from their homes by caring relatives and friends. Most people fled of their own accord. A “safe place” was a creek, a bare or ploughed paddock, a safely prepared or quickly excavated dug-out, a mining adit or railway tunnel, or just somewhere else. If you were trapped at home, there was an art to abandoning it at the right moment. The acknowledged vulnerability of homes made it essential for those caught in them to get out. And people in those earlier times were more inclined to look out the window, go outside and watch the horizon, sniff the air.

In 2009, the internet was a killer. The private, domestic computer screen with its illusion of omniscience and instant communication compounded the vulnerability of the home.

The Black Summer

The fire season of 2019–20 was completely different in character from Black Thursday (1851) and its successors. It might be compared best with the alpine fires of 2002–03, which were also mostly started by lightning in remote terrain and burned for months.

Coming after severe drought and more record heatwaves, the summer of 2019–20 tipped fire patterns into widespread rogue behaviour. It is not unusual for Australians to have smoke in their eyes and lungs over summer — the Great Fires of our history are remembered not only for their death tolls but also for their weeks of smoke and dread. But in the summer of 2019–20 the smoke was worse, more widespread and more enduring, the fires were more extensive and also more intense, NSW fires started behaving more like Victorian ones, and the endless “border fire” symbolically erased the boundary anyway.

Australia was burning from the end of winter to the end of summer, from Queensland to Western Australia, from the Adelaide Hills to East Gippsland, from the NSW south coast to Kangaroo Island, from the Great Western Woodlands to Tasmania. Everywhere, suddenly, bushfire was tipping into something new.

As spring edged into summer and the fires worked their way down the Great Dividing Range and turned the corner into Victoria, people who remembered Ash Wednesday (1983) and Black Saturday (2009) braced themselves. January and February are traditionally the most dangerous months in the southern forests. But this time central Victoria’s good winter rainfall and wetter, cooler February prevented the flume from ripping into full gear.

Therefore an unusual aspect of the fire season of 2019–20 was that these Great Fires did not explode out of the firestorm forests of Victoria and Tasmania. It was one reason why the death toll for such extensive and enduring fires was relatively low; they did not break out in the most fatal forests. Another reason was that Black Saturday had led to a new survival policy: to leave early rather than to stay and defend. Early evacuation thus became the enforced strategy of authorities well beyond the firestorm forests. Again, a regional and ecologically specific strategy became generalised as a universal policy. But at least this time it erred on the side of caution and surely saved lives.

The sheer range, scale, length and enduring ferocity of these fires made them unprecedented. The blackness of the named days of Australia’s fire history describes the aftermath of the sudden, shocking violence of a firestorm; it evokes mourning, grief and the funereal silence of the burned, empty forests. Black and still.

But when the fires burn for months, a single Black Day morphs into a Black Summer. There seemed never to be a black day-after; instead the days, the weeks, the months were relentlessly red. Red and restless. The colour of danger, of ever-lurking flame, of acrid orange smoke and pyrocumuli of peril. The smoke killed ten times more people than the flames. The threat was always there; it was not over until the season itself turned — and only then was it declared black. But the enduring image is of people cowering on beaches in a red-orange glow, awaiting evacuation. I think of it as the Red Summer.

Living with fire

A long historical perspective can help us come to terms with “disasters” and even ameliorate them, but most significantly it can also enable us to see beyond the idea of fire as disaster. There will be more Black Days and, under the influence of climate change, longer Red Summers. We have to accept and plan for them, like drought and flood. We should aim to survive them, even if we can’t hope to prevent or control them. We must acknowledge the role of global climate change in accelerating bushfire and urgently reduce carbon emissions. And we should celebrate, as I think we are already beginning to do, the stimulus that bushfire can give to community and culture.

In the quest for how to live with fire, Indigenous cultural burning philosophies and practices have much to offer all Australians. Sometimes we can even see a fired landscape (of the right intensity and frequency) as beautiful or “clean,” as Aboriginal peoples do. We are slowly learning to respect cultural burning and its capacity to put good fire back into a land that needs fire. But we must go further and actually allow Indigenous fire practitioners to take the lead again.

Victor Steffensen, a Tagalaka descendant from North Queensland, has written a humble and hopeful book, Fire Country (2020), which is as much about negotiating the bureaucratic hierarchies of fire power as it is about fire itself. As his mentor, Tommy George, declared in frustration, “Those bloody national park rangers, they should be learning from us.”

But cultural burning is not the same as prescribed burning. Sensitive controlled burning might, in some ecosystems, render the land safer for habitation, although it has proven difficult to achieve required levels in a warming world. And in a landscape of transformed ecologies, greatly increased population and rapidly changing climate, it is unreasonable and dangerous to expect Indigenous peoples to make the land safe for the proliferating newcomers; it would again set vulnerable people up to fail. Anthropologist Tim Neale has argued that the settler “dream of control” places an “impossible burden” on Aboriginal peoples, trapping them again within an idealised expectation of unchanging ancient behaviour.

Renewing and reviving Indigenous fire practices is important, first and foremost, for human rights, native title and the health, wellbeing and self-esteem of First Nations communities. We are fortunate that an additional opportunity presents itself: for a rapprochement between the exercise of Indigenous responsibility to Country and modern Australia’s need for labour-intensive and ecologically sensitive fire management on the ground. There is much creative promise in that partnership, and developing it will take time, patience and respect.

Throughout 2019, fire experts pleaded with the federal government to hold a bushfire summit to prepare for the dreaded summer, but the prime minister refused, fearing that acknowledging the crisis would give credence to climate action. Yet at the end of the summer he established another retrospective bushfire inquiry, the fifty-eighth since 1939. Many of the sensible, urgent recommendations of those earlier commissions have been ignored and await enactment. Rather than spending millions of dollars on lawyers after the flames, the nation would do better to spend a few thousand on environmental historians to distil and interpret existing, hard-earned wisdom.

Australian scholars of fire need to work on at least three temporal scales. First, there is the deep-time environmental and cultural history of the continent and its management over millennia. Second, there is the century-scale history of invasion, documenting the changes wrought by the collision of a naive fire people with the fire continent. And third, there is the long future of climate-changed nature and society. Black Thursday was the first firestorm after the invasion, an ancient ecological cycle with new social dimensions. Red Tuesday, Black Sunday and Black Friday were exaggerated by settlement and rampant exploitation. Black Saturday was more like the past than the future, a frighteningly familiar and fatal amalgam of nature and culture. But the Red Summer of 2019–20 was a scary shift to something new, fast-forwarding Australians into a new Fire Age. •

This is an abridged version of “The Fires: A Long Historical Perspective,” Tom Griffiths’s contribution to The Fires Next Time: Understanding Australia’s Black Summer, edited by Peter Christoff (Melbourne University Publishing, 2023).

Kissinger and his critics • https://insidestory.org.au/kissinger-and-his-critics/ • Thu, 30 Nov 2023

How does the former secretary of state feel about being called a war criminal?

What was Henry Kissinger thinking on his hundredth birthday last month? He was surely gratified to be feted by the world’s foreign policy elite, who still crave his counsel on today’s global challenges and the reflected glow of his celebrity. The grandees who thronged to various celebrations included US secretary of state Antony Blinken.

But Kissinger’s pleasure was surely mixed with bitterness at the outpouring of vitriol from the anti-imperialist left, who have long condemned him as a war criminal who deserves prison rather than praise. Marking the centenary birthday, Mehdi Hasan on MSNBC said he wanted to talk about “the many, many people around the world” who didn’t get to live even to the age of sixty because of Kissinger. He should be “ashamed to be seen in public,” Bhaskar Sunkara and Jonah Walters wrote in the Guardian.

They and many others trotted out the usual charges from his days as top foreign policy adviser to Richard Nixon and Gerald Ford: brutally and unnecessarily prolonging the Vietnam war, bombing neutral Cambodia, trying to help overthrow a democratically elected leader in Chile, greenlighting the Indonesian invasion of East Timor, abetting genocide in East Pakistan, winking at state torture and killings in Argentina, and more.

The former secretary of state has heard this line of criticism for half a century, and the evidence suggests that it stings. Deeply protective of his honour, he has typically reacted angrily to challenges to his integrity and intelligence. In early 1970, for example, at Johns Hopkins University, a student asked whether he considered himself a war criminal — presumably referring to heavy civilian casualties in the Vietnam war. Kissinger walked out and refused to speak there again for the next twenty years.

In later years, his reaction to such questions changed very little. What did change was his willingness to be put in situations where he could be asked such questions. Being grilled about “crimes” was something that happened only when Kissinger was taken by surprise.

In 1979, for example, British journalist David Frost shocked Kissinger by posing hard-hitting questions about the bombing and invasion of Cambodia. Frost suggested that the policies Kissinger promoted had created conditions that led to the Khmer Rouge takeover and the genocide that left up to two million Cambodians dead. After the first taping, an irate Kissinger complained long and hard to the top brass at NBC, who leaned on Frost to go softer on the famed diplomat in the next taping.

There is little evidence that his skin has thickened since then. In 1999, when Kissinger was plugging the third and final volume of his memoirs, British journalist Jeremy Paxman challenged him about Cambodia and Chile. Kissinger answered, testily, and then walked out. At a State Department event in 2010, historian Nick Turse asked him about the number of Cambodians who were killed in the US bombings. “Oh, come on,” Kissinger said angrily. When Turse followed up later, Kissinger became sarcastic — “I’m not smart enough for you,” he said — and stalked off.

Last month, when Kissinger sat down to talk about his hundredth birthday with his long-time friend Ted Koppel, former host of the popular television show Nightline, he probably expected a softball interview like the ones the American media usually serve him. But Koppel felt obliged to point out that some people consider him a war criminal. He brought up the bombing of Cambodia, intending to suggest that Kissinger had valid strategic reasons for supporting it. “You did it in order to interdict…,” Koppel said, heading to the explanation Kissinger has always given: it was not so much Cambodia that was being bombed, and certainly not Cambodians, but North Vietnamese supply lines.

Kissinger heard only criticism. “Come on,” he interjected in an irritated tone. When Koppel tried to press him on the price Cambodia paid, Kissinger again interrupted with “Come on now.” For Kissinger, the topic is not worthy of discussion.

As this exchange indicates, Kissinger’s belief in his own righteousness is unbudgeable. In his view, then and now, the bombing of Cambodia was a strategic necessity. He long claimed, falsely, that bombs hit areas “either minimally populated or totally unpopulated by civilians.” In fact, American bombs killed and wounded tens of thousands of innocent Cambodians who were simply trying to live their lives in their own villages.

It’s not that Kissinger wants to argue that the costs of his (and Nixon’s) policies were worthwhile. He prefers to ignore the costs altogether. To him, counting the lives lost, in Cambodia and elsewhere, is a distraction. What matters is that the policies he advocated, in his perception, prevented American deaths and led to a more peaceful world order that saved millions of lives — indeed, potentially saved humanity from nuclear conflagration.

Detractors obsessed with the costs are, in his view, disingenuous and even deranged. In the 1970s he sneered at anti-war protesters as driven by “self-hatred.” Dismissing their arguments as irrational, he has said that leftists just want to “feel sorry for themselves.” Those who talk about his alleged criminality, in his view, merely show their “ignorance.” If you question the bombing of Cambodia, he told Koppel, you simply don’t want “to think.”

Hurt and resentful at being denounced, Kissinger has as little empathy for his critics and their perspectives as he had for the Cambodians and others who bore the brunt of his choices. If he had shown anything other than smug indifference to the price paid for his diplomacy, he might have diminished some of the zeal of his tormentors. But he remains locked in a maximalist position: unwilling to express any remorse, he ensures that his antagonists see only his guilt. •

The world after John Curtin • https://insidestory.org.au/the-world-after-john-curtin/ • Fri, 24 Nov 2023

What guidance for the challenges facing the planet can we find in the words of one of Australia’s greatest prime ministers?

The statement for which John Curtin is most renowned came early in his prime ministership, at the end of 1941. It is recalled now almost as a sacred text. As news from Malaya worsened and the Japanese forces swiftly advanced south, Curtin readied Australians for war in their own hemisphere. The war against Japan, he explained, was “a new war.” “The Pacific struggle” was distinct; this war in Australia’s own region, he implied, was equal in gravity to the war against Germany.

Curtin’s famous statement came in late December — and I will quote it because it is meet and right so to do. The prime minister said: “Without any inhibitions of any kind, I make it quite clear that Australia looks to America, free of any pangs as to our traditional links or kinship with Britain.”

With those carefully chosen words — “inhibitions,” “pangs” and “kinship” — Curtin acknowledged that this geopolitical pivot carried an emotional cost for Australians. The population was still overwhelmingly of British descent and “home” was Britain, even for many of those born here. Curtin’s words therefore implied a national coming of age, a relinquishment of childhood dependence, a step into maturity. A British dominion was asserting an independent foreign policy. Australia, facing peril, was insisting on a direct, unmediated relationship with the United States of America.

When we think of Curtin, it is so often this declaration that comes to mind for it represents a cool Australian assessment of geopolitical realities at a moment of existential threat for the nation. My predecessors as lecturers in this series have often revisited this declaration too. They have analysed the geopolitical world of Curtin and its transformation through the decades that followed: superpower rivalry and the cold war, the reconstruction of postwar society, the strengthening American alliance, the rise of China, empire and decolonisation, the reckoning with a settler nation’s colonial past, Australia’s defence and security in a globalised world. These are all extrapolations of the world Curtin knew; he either played a part in bringing them about or might reasonably have foreseen them. His words echo down the years with enduring meaning.

But there is a dimension of the future that he could not possibly see or even imagine. Indeed, it has blindsided us all. That is my subject tonight.

When John Curtin died in office in 1945, his legendary status was confirmed and his words gained even more weight. The year of his death became another turning point: the loss of a revered prime minister, the end of the second world war, a new era of social reconstruction in which Curtin had invested, the beginning of a long economic boom such as Australia had not known since the 1880s, and the unleashing of the atomic bomb.

The atomic era was born eleven days after Curtin’s death. On 16 July, the world’s first nuclear device was exploded at the Trinity test site in New Mexico. Stratigraphers identify geological eras by residues in rocks, and 1945 is marked in sediment by the abrupt global geological signature of nuclear fallout.

Curtin was acutely conscious of Australia’s place in the world. “World-mindedness” was a common phrase in the 1940s, expressing an aspiration for peace and understanding after decades of war. Curtin also thought globally, for he was a citizen of an empire that spanned the Earth, a pacifist and a politician keenly aware of the international labour movement. He was conscious that a land at the bottom of the globe could not isolate itself from an increasingly connected world. He revived and extended immigration and joined international negotiations leading to global institutions like the International Monetary Fund and the World Bank. His colleague, Dr Evatt, would later serve as president of the United Nations General Assembly.

So there was world-mindedness and there were global social and political perspectives, but did Curtin ever think in terms of the planet, a living, breathing, vulnerable Earth? Probably not. This requires environmental thinking in deep time and deep space, a consciousness that has evolved in our own lifetimes. It’s a perspective and an understanding that Curtin and his contemporary leaders could not have foreseen or even imagined.

John Edwards writes beautifully in the first volume of his book, John Curtin’s War (2017), of Curtin’s sense of time and space. Edwards reconstructs Curtin’s regular commute across the Nullarbor — his crossing of the vast treeless plain by train from Perth to Canberra, a journey that took him five nights and four days on six different trains with five changes of gauge. He describes Curtin and his fellow passengers smelling “the faint dry fragrance” of saltbush and mallee scrub “as it had been for millions of years.” When stretching their legs during the stops, they walked the bed of an ancient sea and “crunched fossils of sea creatures underfoot.” Edwards reminds us that “In its entire length the Trans-Australian track did not cross a single permanent stream of water.”

What a path to the parliament! There were 500 kilometres of “precisely straight track” surrounded by desert where Curtin “could see the circle of the plain around him from horizon to horizon.” At night through the right-hand windows he could pick out the points of the Southern Cross. He preferred not to fly, and anyway, the air services were neither frequent nor comfortable. But later during wartime, when he was forced to fly the Atlantic, Curtin told his secretary that he placed his hopes of making the crossing in the skill of the pilot, the rotation of Earth, and God Almighty. That is, human ingenuity, the steady old reliable planet, and God.

It is that view of the steady old reliable planet, the unchanging Earth, that has been disrupted in our lifetimes. How has our understanding of the world — the planet — changed since John Curtin’s death?


In the first decades of the twenty-first century we are living in “uncanny times,” weird, strange and unsettling in ways that question nature and culture and even the possibility of distinguishing between them. The modern history of the Western world — the Renaissance, the expansion of European peoples across the globe, the Scientific Revolution of the seventeenth and eighteenth centuries, the dawning of the Enlightenment, the Industrial Revolution — these are chiefly stories of the separation of culture from nature; indeed, they are stories of the mastery of culture over nature. Now in our own time we find nature and culture collapsing into one another all around us. No wonder it feels uncanny.

The Bengali writer Amitav Ghosh uses the term “uncanny” in his book The Great Derangement: Climate Change and the Unthinkable (2016). For him, the word “uncanny” captures our experience of what he calls “the urgent proximity of non-human presences.” He’s referring to other creatures, insects, animals, plants, biota, the very elements themselves — water, earth, air, fire — and our renewed and long-forgotten sense of dependence upon them.

The planet is alive, says Ghosh, and only for the last three centuries have we forgotten that. We have been suffering from “the Great Derangement,” a disturbing condition of wilful and systematic blindness to the consequences of our own actions, when we are knowingly killing the planetary systems that support the survival of our species. That’s what’s uncanny about our times: that we are half-aware of this predicament yet also paralysed by it, caught between horror and hubris.

We inhabit a critical moment in the history of Earth and of life on this planet, and a most unusual one in terms of our own human history. To understand the implications of the present, we have to learn to think in deep time.

It’s very hard for us humans to comprehend or even imagine deep time. If you think of Earth’s history as the old measure of the English yard, that is, the distance from the King’s nose to the tip of his outstretched hand, then one stroke of a nail file on his middle finger erases all of human history. The discussion of deep time is full of these sorts of metaphors — human history as the last inch of the cosmic mile, the last few seconds before midnight, the skin of paint atop the Eiffel Tower. Metaphor is possibly the only level on which we can comprehend such immensities of time.

In the last couple of decades we have developed three powerful historical metaphors for making sense of the ecological crisis we inhabit. One is that we live in the Sixth Extinction. Humans have wiped out about two-thirds of the world’s wildlife in just the last half-century. Let that sentence sink in. It has happened in less than a human lifetime. This is an extinction rate a hundred to a thousand times higher than was normal in nature.

There have been other such catastrophic collapses in the diversity of life on Earth, five of them sudden shocking falls in the graph of biodiversity separated by tens of millions of years, the last one in the immediate aftermath of the asteroid impact that ended the age of the dinosaurs sixty-five million years ago. We now have to ask ourselves: are we inhabiting — and causing — the Sixth Extinction? In 2014 the American journalist Elizabeth Kolbert wrote an influential book called The Sixth Extinction, and she subtitled it An Unnatural History. It is unnatural because the Sixth Extinction involves, to some extent, our consciousness and intent.

Another metaphor for the extraordinary character of our times is the idea of the Anthropocene. This is the insight that we have entered a new geological epoch in the history of Earth and have now left behind the 12,000 years of the relatively stable epoch known as the Holocene, the period since the last great ice age. The new epoch of the Anthropocene recognises the power of humans in changing the nature of the planet, its atmosphere, oceans, climate, biodiversity, even its rocks and stratigraphy. It places humans on a par with variations in Earth’s orbit, glaciers, volcanoes, asteroid strikes and other geophysical forces.

There is debate about exactly when the Anthropocene began, but one definition is that we were first jolted into the new epoch by the Industrial Revolution in the late eighteenth century, when we began digging up and burning fossil fuels. That brilliant and profligate exploitation of a finite, buried resource underpinned population growth and economic expansion — and it also unleashed carbon on a massive and accelerating scale and began changing the atmosphere of the planet.

Another date given for the beginning of the Anthropocene is around 1945, the year of Curtin’s death. It was, as we’ve seen, the beginning of the atomic era. It also initiated an exponential shift in the impact of humans on the planet. In the mid twentieth century, the human enterprise exploded dramatically in population and energy use and rapidly began to outstrip its planetary support systems. World population, water use, tropical forest loss, ocean acidification, species extinction, carbon dioxide and methane emissions, fertiliser consumption and so on, all soared after 1950. This turning point is known as the Great Acceleration.

So I’ve talked about the Sixth Extinction and the Anthropocene. And there is a third potent metaphor for the moment we inhabit. It concerns the history and future of fire. It suggests that we are entering not just the Anthropocene but also a fire age that historian Stephen Pyne has called the Pyrocene. The planet is heating due to human greenhouse gas emissions and it is heating so quickly that it threatens to tip Earth into an escalating cycle of fire. In other words, we are entering an extended fire age that is comparable to past ice ages.

Let’s take a moment to think about those ice ages.

Some 2.6 million years ago, Earth entered a period of rhythmical ice ages — a geological epoch called the Pleistocene — and during this epoch average global temperatures dropped 6–10°C and ice sheets at the poles extended dramatically across Eurasia and North America. These repeated glaciations were harsh and demanded innovation and versatility; they were a selective pressure on evolution and promoted the emergence of humanity on Earth. Throughout the Pleistocene, the ice ages were punctuated by brief warmer periods known as interglacials, which generally lasted about 10,000 years.

We are living in an interglacial right now; geologists have separated it off from the Pleistocene and called it the Holocene, which means “recent.” But it is really part of the same rhythmic pattern that has prevailed since we evolved. We humans are creatures of the ice. The Pyrocene — the fire age — is something we’ve never seen before. The Pyrocene threatens to knock Earth out of the steady planetary rhythm that has seen the birth of our own species.

How do we know about these ancient rhythmic ice ages? By reading the rocks, of course, but now also by studying the ice itself. I’m fortunate to have visited both of Earth’s ice caps, and the most awesome one is definitely ours, the southern one, Antarctica. I twice voyaged south with the Australian Antarctic Division, on the second occasion at the invitation of the Australian government to mark the centenary of Douglas Mawson’s Australasian Antarctic Expedition of 1911–14. After a long wait for a break in the weather, we held a ceremony on the ice at the historic huts, the place Mawson called “the home of the blizzard.” Through the years of Curtin’s political life, Antarctica was becoming a primary site for Australia’s world-mindedness, and in 1959 our nation was one of the original twelve signatories to the Antarctic Treaty, which was effectively the first disarmament treaty of the nuclear age.

Antarctica is where nine-tenths of the world’s land ice resides. Seventy per cent of Earth’s fresh water is locked up in that ice cap. That’s a discovery humans made in my lifetime. Antarctica is not only the coldest and windiest continent; it is also paradoxically the driest — and it is the highest. It has the highest average height of any continent because it is a great dome of ice four or five kilometres thick that has built up over millions of years. In the 1950s we discovered that the driest of all continents is actually a vast elevated plateau of frozen water. The implications of that discovery are immense: it means that world sea levels are principally controlled by the state of the Antarctic ice sheet. If the southern ice cap melted, oceans would rise by more than sixty metres.

As we enter the Pyrocene, Antarctica is vulnerable and fragile, more brittle than we expected. This year the expanse of winter sea ice around Antarctica diminished dramatically below its average by the size of Western Australia. The continent of ice is a precious glistening jewel that holds the key to our future and to our past. It’s a giant white fossil, a luminous relic, a clue to lost ages: it enables us to travel through time to the Pleistocene Earth. The ice is an amazing archive. Embedded in an ice cap are tiny air bubbles from hundreds of thousands of years ago. When you drill into an ice cap kilometres thick, you can extract a core that is layered year by year, a precious archive of deep time. I think of ice cores as the holy scripts, the sacred scrolls of our age.

The deepest Antarctic cores currently retrieve 800,000 years of climate history. Right now, the search is on for the first million-year ice core, and Australia is involved in the quest.

In the 1990s, a long 400,000-year Antarctic ice core was extracted from the inland ice sheet. It produced a rhythmic, sawtooth graph of past ice ages, revealing the heartbeat of the planet. The brief peaks on the graph represented warmer interglacials; the extended troughs were the cold ice ages. The ice core charted four full cycles of glacial and interglacial periods and established that the carbon dioxide and methane concentrations in the atmosphere moved in lockstep with the ice sheets and the temperature. It’s the barometer of the planet’s health — a graph of its nervous system — through hundreds of thousands of years.

Ice cores also revealed that present-day levels of greenhouse gases are unprecedented during the past 800,000 years. The level of carbon dioxide in the historical air bubbles has leapt since the Industrial Revolution, and especially since 1950. So, before Antarctica was even seen by humans, it was recording our impact. And it was this glimpse of the deep past as revealed in the archive of ice that shocked people into a real sense of urgency about the climate crisis.


These three metaphors — the Sixth Extinction, the Anthropocene and the Pyrocene — are historical concepts that require us to travel in geological and biological time across hundreds of millions of years and then to arrive back at the present with a sense not of continuity but of discontinuity, of profound rupture in our own time. That’s what Earth system science has revealed: it’s now too late to go back to the Holocene. It may even be too late to hang onto the Pleistocene, the long epoch that birthed our species. We’ve irrevocably changed the Earth system and unwittingly steered the planet into an uncertain future; now we can’t take our hand off the tiller. We have to use our awesome power wisely.

The metaphors of deep time that we’ve been considering have some visual counterparts in deep space that have also emerged in the last half-century. In 1968, the historic Apollo 8 mission launched humans beyond Earth’s orbit for the first time, out and across the void and into the gravitational power of another heavenly body. For three lunar orbits, the three astronauts studied the strange, desolate, cratered surface below them and then, as they came out from the dark side of the Moon for the fourth time, they looked up and gasped:

Bill Anders: Oh my God! Look at that picture over there! Here’s the Earth coming up. Wow, is that pretty!

Frank Borman: Hey, don’t take that, it’s not scheduled.

They did take the unscheduled photo, excitedly, and Earthrise became famous, perhaps the most famous photograph of the twentieth century, the blue planet floating alone, finite and vulnerable in space above a dead lunar landscape. Frank Borman said, “It was the most beautiful, heart-catching sight of my life.” And Bill Anders declared, “We came all this way to explore the Moon, and the most important thing is that we discovered the Earth.”

A few years later, in 1972, a photo taken by the Apollo 17 mission and known as The Blue Marble became one of the most reproduced pictures in the world, showing Earth as a luminous breathing garden in the dark void. Earthrise and The Blue Marble had a profound impact on environmental politics and sensibilities.

Within a few years, the American biologist Lynn Margulis and the British scientist James Lovelock put forward “the Gaia hypothesis”: that Earth is a single, self-regulating organism. In the year of the Apollo 8 mission, Paul Ehrlich published his book The Population Bomb, an urgent appraisal of a finite Earth. During the years of the Moon missions, British economist Barbara Ward wrote Spaceship Earth and Only One Earth, revealing how economics failed to account for environmental damage and degradation, and arguing that exponential growth could not continue forever.

Earth Day was established in 1970, a day to honour the planet as a whole, a total environment needing protection. In 1972, the Club of Rome released its controversial and enormously influential report The Limits to Growth, which sold more than thirteen million copies and went into more than thirty translations. Authors Donella Meadows and Dennis Meadows wrestled with the contradiction of trying to force infinite material growth on a finite planet. The cover of their book depicted a whole Earth, a shrinking Earth.

Two decades later, on Valentine’s Day 1990, the Voyager spacecraft was tracking beyond Saturn, six billion kilometres away, when it unexpectedly glanced over its shoulder. Again, Voyager was not programmed to look behind as it journeyed into the unknown, but scientists decided to take a risk and commanded the spacecraft to look back. And so we have a picture of Earth as a mere speck of dust in space, an image that astronomer Carl Sagan called Pale Blue Dot. “Look again at that dot,” wrote Sagan. “That’s here. That’s home. That’s us.”

These images from outer space of the unity, finiteness and loneliness of Earth helped escalate planetary thinking. From a colossal integration of Earth systems data came a keen understanding of planetary boundaries — thresholds in planetary ecology — and the extent to which the human enterprise is threatening or exceeding them. Three identified thresholds have already been crossed: changes in climate, biodiversity and the nitrogen cycle. At least we now understand our predicament even if we are perilously slow to act. The fossil fuels that got humans to the Moon now endanger our civilisation.


Now let’s bring this story back home to our place on this Earth. Australia is uniquely exposed to the grim, rough edges of these new world narratives. Shockingly, we are leading the world into the Sixth Extinction. Modern Australian history is like a giant experiment in ecological crisis and management. Ecologists working in Australia today often feel like they are ambulance drivers arriving at the scene of an accident. The southwest of Western Australia, for example, is one of the world’s biodiversity hotspots and it is experiencing an exceptional loss of habitat. It is the site of what literary historian Tony Hughes-d’Aeth has called a “radical disappearance,” “an extinction event on a grand scale.”

And we inhabit the continent of fire, the driest inhabited continent, a land of drought and flooding rains that is held in the grip of the El Niño–Southern Oscillation, which means that Australia is on the frontline of the Pyrocene. Southwest Western Australia, with its sudden 30 per cent decline in rainfall since the 1970s, is one of the first places to experience the climatic shift expected with global warming. The Black Summer fires — when more than twenty-four million hectares of Australia’s southern and eastern forests burnt, including a million hectares of the Great Western Woodlands — were a symptom of our condition and became a planetary event. Smoke from those fires encircled the globe.

Furthermore, our modern history is a by-product of the Anthropocene. The British invasion of Australia was part of the age of empire and took place as the Industrial Revolution gathered momentum in England. Thus ancient Australia’s transformation into a colony coincided with the start of the fossil fuel era. The Endeavour was a repurposed coal ship. The new nation became highly dependent on fossil fuels, especially on coal, and in recent decades it drew world attention by persisting with the political denial of climate change. Modern Australia, we have to remember, was built on denial: the denial of Aboriginal sovereignty and cultural sophistication, the denial of frontier violence and warfare. At the recent referendum about the Voice, we witnessed a further national expression of denial.

But we have many opportunities here too. Our robust democracy, our active citizenship, our capacity for creativity and innovation, our impressive community leaders (many of them young, most of them women), our unique and inspiring environment, our destiny as a renewable energy superpower. And the continent’s deep Indigenous human history. In just a generation we have turned upside down the way we understand the history of Australia.

When I was in primary school, the history of this country was told as a footnote to the story of the British empire. In my classroom, the book we used was A Short History of Australia, written in 1916 by Professor Ernest Scott. It began with what he declared was “a blank space on the map” and it ended with “a new name on the map” — that of Anzac. So the story of Australia climaxed with a national sacrifice on a beach on the other side of the world. Australia at that time was seen as a new, transplanted society with a short and derivative history, a planned, peaceful and successful offshoot of imperial Britain. Aboriginal peoples, depicted as non-literate, non-agricultural, non-urban and non-national, could have no “history” and did not constitute a “civilisation” — thus they could find no place in the national polity or the national story or even as citizens of the Commonwealth.

But in the half-century that followed, Australians realised that the New World they thought they’d discovered was actually the Old, and that the true “nomads” were themselves, the colonisers who had come in ships. From the early 1960s, archaeologists confirmed what Aboriginal people had always known: that Australia’s human history went back eons, into the Pleistocene, well into the last ice age, earlier than Europe’s. The timescale of Australia’s human history increased tenfold in just thirty years and the journey to the other side of the frontier became a journey back into deep time.

We now recognise the first Australians as the most adventurous of all humans, pioneer sea-voyagers who, over 60,000 years ago, saw the beckoning, burning continent of eucalypts glowing over the horizon of the ocean. The island continent girt by sea was transformed into a complex jigsaw of beloved and inhabited Aboriginal Countries and ecologies. Aboriginal societies were — and are — diverse, innovative and adaptive; over 300 languages flourished here. Now our histories of Australia strive, as the Uluru Statement puts it, to let “this ancient sovereignty… shine through as a fuller expression of Australia’s nationhood.” This challenge is not going away, no matter how many toddler tantrums the nation has. Reckoning with our colonial history is a daily responsibility of living on this continent.

Therefore we can now see more clearly that, on Australian beaches in the late eighteenth and early nineteenth centuries, there took place one of the greatest ecological and cultural encounters of all time. Peoples with immensely long and intimate histories of habitation encountered the furthest-flung representatives of the world’s first industrialising nation. The circle of migration out of Africa more than 80,000 years earlier finally closed.

This is a land of a radically different ecology, where climatic variation and uncertainty have long been the norm — and now those extremes are intensifying. Australia’s long human history spans great climatic change and also offers a parable of cultural resilience. The history of the Aboriginal peoples of Australia takes humans back, if not into the ice, then certainly into the ice age, into the depths of the last glacial maximum of 20,000 years ago and beyond, into and through periods of average temperature change of 5°C and more, such as those we might now face.

When Europeans and North Americans look for cultural beginnings, they are often prompted to tell you that humans and their civilisations are products of the Holocene and that we are all children of this recent spring of cultural creativity over the last 10,000 years. By contrast, an Australian history of the world takes us back to humanity’s first deep sea navigators and to the experience of people surviving cold ice-age droughts even in the central Australian deserts. It brings us visions of people living along fast-retreating coastlines as they cope with the dramatic rising of the seas.

Human civilisation here was sustained in the face of massive climate change. This is a story that modern Australians have only just discovered, and now perhaps it offers a parable for the world. The continent of fire will lead the world into the new age of fire. But it also carries human wisdom and experience from beyond the last ice age.

Living on a precipice of deep time has become, I think, an exhilarating dimension of what it means to be Australian. We can now see that the modern Australian story, in parallel with other colonial cataclysms, was a forerunner of the planetary crisis. Indigenous management was overwhelmed, forests cleared, wildlife annihilated, waters polluted and abused, the climate unhinged. Across the globe, imperial peoples used land and its creatures as commodities, as if Earth were inert. They forgot that the planet is alive.


In the third decade of the twenty-first century, it is clear that Australia is facing a new existential threat, quite different from that which Curtin addressed in 1941. We are embroiled in a climate emergency and biodiversity crisis that threaten to destroy our security and way of life.

It’s not just a threat; it’s actually going to happen unless we act swiftly and decisively. It is a planetary event, but Australia and its region are especially vulnerable to its effects. National security assessments and reports from Australian defence chiefs have acknowledged our predicament, identifying the climate crisis as “this clear and present danger,” “the greatest threat to the security and future of Australians” and “the Hundred Year War” for which we are seriously unprepared. To meet the challenge, we will need to recognise that we do indeed face a crisis, an emergency, and that we will be required to mobilise with a grave sense of urgency as if in a war.

In that December 1941 address to the people, Curtin sought to wean Australians off a subconscious cultural reflex to trust to luck, isolation and Britain. “I demand,” he said, “that Australians everywhere realise that Australia is now inside the firing lines.” He spoke of the need to shake citizens out of false assumptions of security; he talked of awakening “the somewhat lackadaisical Australian mind” and of the “reshaping, in fact revolutionising, of the Australian way of life until a war footing is attained quickly, efficiently and without question.” “We can and we will,” he promised.

What would a brave but realistic geopolitical pivot look like in our own time? What would constitute a Curtinesque act of visionary leadership now?

I think it would entail a recognition that, because of our extreme ecological and economic vulnerability in this escalating crisis, Australia needs to lead the world into the energy transition. Not to drag its feet, not to wait for other nations, but actually to demonstrate the path to zero emissions. To provide global direction and inspiration. And to do so out of intelligent national self-interest as well as out of “world-mindedness.”

Australia needs to grasp its opportunity as a renewable energy superpower. It needs to wean itself swiftly off its fossil fuel dependency, not cling to old, polluting forms of power and vested interests. A Western Australian like John Curtin would have to take on that challenge in the mining state, reminding constituents of the long-term significance of minerals in the renewable future. Of course it will be difficult and fraught. But that is what leadership is about: stepping wisely into the future that is coming for you.

Yes, it will be difficult but it is also simple. The physics of the planet are simple and we know what we have to do and what will happen if we don’t. The enemies of action are either ignorant and short-sighted or selfish and greedy. The pathway to electrification has been laid down clearly. The technologies are there or fast developing, as is the business momentum.

But the free market can’t move fast enough and government must lead. Even funding for the transition is readily available in the form of massive government fossil fuel subsidies that can be diverted, and windfall profits to the oil and gas industry that demand to be taxed. The economic, social and environmental benefits to the nation will be immense. I believe that the people are ahead of government on this and that they will welcome bold leadership. To paraphrase John Curtin, we should step into that future now, quite clearly, without any inhibitions of any kind, and free of any pangs as to our traditional links or kinship with coal, oil, gas, Murdoch and Rinehart. •

The Lebers, a family of ratbags https://insidestory.org.au/the-lebers-a-family-of-ratbags/ https://insidestory.org.au/the-lebers-a-family-of-ratbags/#comments Wed, 22 Nov 2023 22:28:14 +0000 https://insidestory.org.au/?p=76511

Shaped by history, Sylvie Leber and her forebears have campaigned for social change

Sylvie Leber describes herself as a “ratbag.” It’s in the blood, she says. Sylvie attended her first protest in 1967, aged sixteen, joining a crowd gathered at Melbourne’s Government House to oppose a visit by Nguyễn Cao Kỳ, prime minister of South Vietnam and a vital American ally in the prosecution of the Vietnam war. Many more protests have followed. She’s been roughed up and worn bruises but never arrested, she says with a hint of surprise. It’s probably only a matter of time. Now in her seventies, she’s still raising her voice for social justice.

Her causes are many and diverse, but linked by a unifying thread: always, Sylvie sides with the oppressed. For nearly sixty years she has fought for women’s rights, refugee causes, and for anyone whose treatment she deems unfair. Perhaps the best measure of her conviction is that she holds fast to causes, even at risk of personal cost.

Sylvie traces her radical roots to her Jewish paternal grandparents, David Leber and Rivka Szaladajewska, whose motivating creed was social and political change. Rivka was born on 26 September 1896 to an observant Jewish family in the Polish city of Łódź. She would later reject religion, and her family would in turn reject her, but she maintained a cultural and social connection to Judaism, working at the Grosser orphanage for Jewish children in the central Polish city of Piotrków Trybunalski. Her fierce commitment to the politics of the left, at a time when Jews were among the most prominent advocates for social democratic causes in eastern Europe and Russia, was another point of connection to her Jewish heritage.

David Leber was born in Piotrków Trybunalski, Poland, on 10 January 1887. While the details of his early life are sketchy, he was motivated from a young age by the tenets of social democracy. He was schooled first at a yeshiva, but left religious education to embrace Bundism, the influential secular Jewish movement that agitated for social and labour reform.

Bundism led David to Russia, and trouble. The February revolution of 1917 saw Bundists and Mensheviks align in a union of social democratic parties. When the February, or Menshevik, revolution was supplanted in October by the more radical Bolsheviks, the Bundists who supported Menshevism became pariahs, dismissed by the new regime as ineffectual gradualists and enemies of the communist state.

Though not a Menshevik himself, David was damned by association. His link to the Menshevik cause appears to have led to his arrest and deportation to Siberia. An accusation that he had sought to assassinate a public official may have been the pretext for his arrest. Whether he escaped from Siberia or was released, he is thought to have been rescued from the Soviet Union on a British ship.

Now back in Poland, David found work as a waiter, and met Rivka. Worsening anti-Semitism prompted them to leave their homeland for France in 1922. Rivka was pregnant when they made their way west, and a son, Samuel, was born in Paris on 3 December 1922. Against Jewish custom he was not circumcised, and David and Rivka didn’t marry until twelve years after his birth, with Rivka keeping her maiden name. Her choice to be known as Rivka Szaladajewska was both a stab at patriarchal custom and an affirmation of identity. Others might have seen the Polish suffix “jewska” as a millstone, but not her.

David and Rivka became part of a Parisian left-wing milieu that included other Jewish émigrés, among them the Russian-born artist Marc Chagall, with whom they became good friends. Jean-Paul Sartre and Simone de Beauvoir were part of the same circle. David and Rivka brought their passions with them to Paris: they were united in their dislike of conventions for which they saw no purpose and in their commitment to social democratic principles and secularism. David continued to see these beliefs as inherent in Bundism: his experiences in Russia didn’t dim his enthusiasm for the Bundist approach to social change. A manifestation of his commitment to community and history was his involvement in founding the Medem Library in 1929, which is still the most important site of Yiddish learning in Europe.

David and Rivka were two of the thousands of Jews ensnared in 1942 by Operation Spring Wind, in which officials of the Vichy French state cooperated with the Nazi regime to arrest foreign and stateless Jews living in France. The operation was the first step in a plan to send Jews east to Auschwitz and their deaths. The two of them were arrested on 16 July in the infamous Vél d’Hiv round-up, interned at the Drancy transit camp in Paris, and then deported to Auschwitz on 24 July as part of convoy number 10. They were killed at Auschwitz, probably later in 1942, though when exactly isn’t certain. Rivka is thought to have taken her own life, throwing herself on an electric fence after she learnt that she was to be a victim of one of Josef Mengele’s depraved experiments.

Two years earlier, when the Germans marched on Paris, Rivka and David’s son Sam was a seventeen-year-old school student living with his parents in the 20th arrondissement. His response to the German advance was to cycle to the port of Royan on the Atlantic coast in the hope of finding passage to England. This plan failed and he returned to Paris, where he remained until November 1941, when David and Rivka compelled him to leave the city for Lyon in the zone libre, where he lived with friends.

German occupation of the zone in November 1942 prompted Sam to move to Grenoble, where he worked as a lathe turner before being corralled into the Chantiers de la Jeunesse Française, a national service scheme imposed on French youth by the Vichy state. Released from this obligation in mid 1943, he managed to avoid another, more insidious labour scheme — the Service du Travail Obligatoire, which sent young French men to Germany as indentured labour — by joining the Resistance in late 1943 or early 1944.

For life as a maquisard, Sam chose the stirring alias Serge Rebel, his new surname a testament to his task and heritage: Rebel is the anadrome of Leber. From David and Rivka he had inherited a commitment to the ideals of Bundism and socialism. He had joined the SKIF, the youth wing of the Bundist movement, in 1931, the year he turned nine. A strong anti-communist streak may have been another inheritance, though his opposition was fed also by his own experiences.

During the Spanish civil war of 1936–39, he had travelled to Spain to fight with the Republicans against Franco’s Nationalist forces. He was turned away on account of his youth, but took from the war an understanding that communists had undermined the Republican cause by concerning themselves more with anarchists than fascists.

His dislike of communism hardened when the French Communist Party, echoing Moscow’s line, adopted a neutral position at the start of the second world war, a stance he thought amoral and hopelessly naive. Later, in the Resistance, he objected to the division of the organisation along communist and anti-communist lines as a needless distraction. In his thinking, communists too often missed the point of the fight. And the point of any battle was to act, not to posture.

In the Resistance Sam worked in intelligence and sabotage. He and his fellow maquisards couldn’t spare explosives to destroy railway tracks, so they prised them out of position, ensuring that carriages travelling the tracks at Grenoble, an important railway junction connecting different parts of France, would tip over. Precious explosives were reserved for attacking factories that sustained the German war effort. In one instance, Sam recalled, bombs were used to kill German soldiers, but more often their targets were objects rather than people, collaborators aside. For traitors, direct violence always seemed justified.

Sam served with the Resistance until the liberation of France. His rewards were the Croix de Guerre, citations for brave conduct and good service, and a bullet wound, sustained during a firefight with German soldiers in March 1944, which led to three months in hospital, a shortened leg and a permanent and painful limp. Several decades on, Sam was diagnosed with motor neurone disease, which doctors linked to spinal damage caused by his limp.


In later years Sam mentioned his war rarely, and usually only when pressed. School students and Holocaust historians sought him out for interviews, seemingly surprised to find a decorated maquisard living in McKinnon in suburban Melbourne. Sam obliged these requests, with humility and a trace of bemusement. He had fought the war against fascism as a solemn and obvious duty, a position that precluded the shaping of recollections as personal achievement. The fact he was speaking in his third language, after Yiddish and French, may also have shaped his responses, which could seem blunt.

“Sometimes an action went wrong and people got killed and things like that,” he told two interviewers in the 1980s. Nazi collaborators, he added, were “interdicted” on their way to or from work. These answers, on first reading dispassionate and perhaps even callous, did not reflect the man. Rather, they hint at Sam’s lifelong and noble belief in the primacy of the collective cause over the claims of the individual.

After the war Sam returned to Paris, a city that had visited both kindness and cruelty on the Lebers. His parents had found blessed sanctuary there in the 1920s; twenty years later, it was the place of their betrayal. He met Madeleine Benczkowski, and they married in 1948. His new wife, also of Polish Jewish heritage, had been born in Paris on 20 January 1926. Her parents, Herschel and Chaya Benczkowski, had emigrated west from Poland in the years after the first world war.

Herschel was murdered at Drancy in 1942. Madeleine, her brother Sam and their mother Chaya survived the war thanks to the people smugglers who spirited them from Paris to Lyon, where they lived under false names and Madeleine was able to earn money as a furrier’s apprentice. In Madeleine’s vocabulary, “people smuggler” could be a term of endearment and a pejorative. She knew three types of people smuggler — humanitarians, money-makers and “bastards” who betrayed Jews to Nazis. The Benczkowskis’ saviour was a humanitarian and a money-maker, having taken payment in jewellery.

Sam and Madeleine began their married life in Paris as tailors, making men’s trousers from home. Their daughter Sylvie was born on 30 May 1950. The next year they resolved to emigrate to Australia, their decision to leave France prompted by the Korean war and the threat of another world war. They considered Canada, but chose Australia on the advice of Rose and Leon Goldblum, Sam’s cousin and her husband, who were living in Melbourne and recommended the city as a good, safe place to raise children. Rose and Leon were Auschwitz survivors. A preference for a warmer climate may also have influenced Sam and Madeleine’s choice. The Lebers sailed on the Italian ship Sydney, arriving at Station Pier, Port Melbourne, in February 1952.

The family settled into Australian life in Grey Street, St Kilda, within a milieu that offered comfort and connections to the world from which they had come. Melbourne in the 1950s, and St Kilda in particular, was home to a community of French-speaking Jews from France and Belgium. In their company Sam and Madeleine found friends with whom they shared a common language and aspects of a common heritage. As for so many other migrants across time and place, such connections to the familiar were a sustaining tonic in difficult years.

Before the war, Madeleine had hoped to be an accountant, Sam an engineer. After the war, steady work and a safe home were aspiration enough. Madeleine sought work as a jewellery shop assistant but was rejected on account of her French accent, so she returned to what she knew, working from home as a seamstress. Sam worked as a toolmaker, and fitter and turner. He joined the Australian Metal Workers’ Union: the union movement, and the postwar Australian Labor Party, reflected some of his Bundist ideals.

For Sylvie, the initial contrast between life in Paris and life in Melbourne was less abrupt than it was for her parents. She spoke French at home and Yiddish at her kindergarten at the Bialystoker Centre at 19 Robe Street, St Kilda, which served also as a hostel for Jewish migrants and refugees from Europe. The Alliance Française, where Sam and Madeleine borrowed French-language books, was on the same street. Such was Sylvie’s immersion in this European milieu that she knew little English when she started at St Kilda Park Primary School. Daniel, her brother, was born in 1959.

***

If Sam, who died in 2011 aged eighty-eight, was an “activist,” he probably didn’t recognise it. His engagement with the political was not a conscious choice but the manifestation of a commitment to social democratic ideals; in his conception, actions gave honour and worth to thoughts. To be political, if that’s what others called it, was simply his way of being.

Sylvie has followed the same path, her activism inseparable from her work and passions. In this regard she is her father’s daughter. Madeleine, who died in 2015, was a quieter social democrat than her husband: she voted Labor and hoped for a society ordered on fairness and merit rather than money and privilege, but was not overtly political.

In 1979 Sylvie and her friend Eve Glenn formed Girl’s Garage Band, a seven-woman punk rock band with Sylvie on bass guitar, Eve on lead guitar, and Fran Kelly, not yet an ABC journalist, on vocals. The band became better known as Toxic Shock, the name a pointed reference to the bacterial syndrome associated with tampon use that at the time was harming and killing many women. The band’s 1981 single “Intoxication,” written by Sylvie, protested at the complicity of tampon manufacturers in the prevalence of the syndrome.

Through Toxic Shock, Sylvie could voice specific protest, rail against the patriarchal nature of the punk and post-punk scenes and the music industry generally, and express her passion for music. Give-Men-a-Pause, a women’s music show she hosted on 3RRR in the early 1980s, offered another stage to voice thoughts on life and music. In a 2015 article about the contemporary Australian popular music scene, she wrote of her enduring love for playing and listening to music, and her dismay at the persistence of the boys’ club that Toxic Shock strove to disrupt.

For Sylvie, music has been a passion, a motivation and, on occasion, a refuge from horror. In Queensland in 1972, some years before forming Toxic Shock, she was raped and very nearly murdered. She has written with compelling honesty of these crimes, the toll they have taken on her mind and body over half a century, and her determination always to fight back lest “the bastards win.”

Her response to the assault might be described as Leberian, for its hallmarks are concern for others and a remarkable and enduring capacity to resist. Initially she sought to shield her parents from the attack, worried that they, as European Jews who had lived through the war, had experienced enough anguish. Later, her understanding of the Lebers’ commitment to social justice motivated her to speak publicly about what she had suffered. A year after she was assaulted, she and a group of friends founded Women Against Rape, Victoria’s first rape crisis centre, housed within the Women’s Health Centre on Johnston Street, Collingwood. Women Against Rape supported victims in every way possible, while advocating simultaneously for legal change and community education.

Sylvie is a passionate advocate for the rights of refugees and asylum seekers, the experiences of her parents and grandparents having taught her something of the pain and indignity of being denied a home. When Arne Rinnan, captain of the infamous MV Tampa, made his last voyage into Australian waters before retirement, Sylvie and other Melbourne members of the Refugee Action Collective took to a small boat so that they might approach his cargo ship and salute him for his role in rescuing imperilled refugees during the Tampa affair of 2001. Rinnan’s moral example elicited an idiosyncratic touch: to signal her admiration, Sylvie fashioned a placard decorated with a love heart. Love, Sylvie believes, “is a revolutionary emotion.”

Sylvie named her daughter Colette Anna — Colette for the pioneering French author and feminist, and Anna for a great aunt who survived Auschwitz. Colette is a social worker, committed to many of the same causes as her mother. She works to prevent violence against women, and argues for the rights of refugees, including protesting their abysmal treatment by Australian governments, Liberal and Labor. Colette is another Leber ratbag, which makes her mother proud. It’s in the blood. •

Writing the history of the present https://insidestory.org.au/writing-the-history-of-the-present/ https://insidestory.org.au/writing-the-history-of-the-present/#respond Tue, 21 Nov 2023 04:55:55 +0000 https://insidestory.org.au/?p=76487

Russia’s war against Ukraine is generating a rich historiography

“Whenever I read a book about the current war,” writes Andrey Kurkov in his preface to Olesya Khromeychuk’s heartbreaking account of the combat death of her brother, “I get the strange impression that this war is over. These books transport the reader into the past, even if it is just yesterday.”

Kurkov is right. Reading books about this war can have a soothing effect. Only once we look up from the pages that have captured our imagination are we propelled back to the awful knowledge that Russia’s war against Ukraine continues. Blood keeps flowing. People keep getting injured and killed. A country is being destroyed. There seems no end in sight.

Looking at the pages in front of them, historians have another experience as well. Trained to read for argument and to classify books into schools of thought, they begin to think about the books dealing with this war as part of its “historiography” — a corpus of texts engaged in a discussion with each other attempting to understand the past and its meaning for the present.

To see a historiography forming while the event it describes is still unfolding is unusual. Normally decades, even centuries, pass before historical schools solidify around a particular event. In this respect, the literature on Russia’s war against Ukraine resembles the historiography of Stalinism in its formative stages, but its historiographical schools are developing much faster in our present, pressurised environment.

Somewhat schematically, we can distinguish five schools of thought about the origins of this war. One group of writers sees the West at fault, and particularly NATO. Another blames Russia and the Russians, opponents of peace and democracy. A third group sees Russia’s imperialist past at the core of a war that expresses the legacy of a lost empire and the failure to overcome its culture. A fourth group, the intentionalists, focus on Vladimir Putin and his perhaps irrational, or at least idiosyncratic, motivation for waging war. A final group sees the war as a struggle between dictatorship and democracy reflected in Putin’s attempt to quash any potential challenge to his rule, at home or abroad. The war, on this reading, is part of Putin’s “preventive counter-revolution.”

There are, of course, combinations of these viewpoints. Blaming NATO can go together with a notion of Russia’s ongoing imperialism: given the latter, the argument goes, NATO should have abstained even more emphatically from expanding into Russia’s supposed backyard. Russia’s imperialism and continued quest for a great power status can be linked, in turn, to its hostility to Ukraine’s quest for democracy. Intentionalists, too, can see Putin as drawing on a wider Russian culture of imperialism, which can go hand in glove with a quest for dictatorship and therefore preventive counter-revolution.

Perhaps the most well-defined position in the debate on the war’s origins is the first: blaming NATO for provoking the conflict by expanding into Russia’s “sphere of influence.” This path has been taken most prominently by international relations scholar John Mearsheimer. As critics of this view have pointed out, it has its history backwards. It was not when NATO enlarged into Eastern Europe that Russia became aggressive but when it showed weakness — by failing to agree on a response to the accession hopes of Georgia and Ukraine at the Bucharest Summit in 2008, for example. Moreover, it is difficult to see Putin’s aggression, “riddled” as it is with “irrationalities,” as some kind of logical response to a putative Western threat.

NATO-blaming has recently lost some of its dominance over public discourse. But it continues to be popular on the far left, sometimes amended to an opposition to both Russian and NATO imperialism. Beyond the extremes lingers the view that, whatever bad things happen in the world, “the West” must be to blame.

Historian Philipp Ther is clearly affected by such sensibilities. He feels “comfortable” in the company of “leftists,” he writes in the introduction to his latest book, and he feels that “the West” (whatever that might be) has lost its way. The multiple crises we encounter today, including the war against Ukraine, are part of a “wrong turn” in economic policy after 1989–91. “The West lost the peace” after the end of the cold war, he says, because it became self-satisfied and embraced unfettered capitalism (or “neoliberalism,” defined as the conglomerate of “liberalisation, deregulation, privatisation, the reduction of state influence on the economy, and global financial capitalism”).

NATO expansion, then, is not at the centre of Ther’s argument in How the West Lost the Peace: The Great Transformation Since the Cold War. A social historian, Ther is much more interested in political economy than international relations. He flags in his introduction that he feels uneasy about NATO expansion, and at one point suggests Russia should have joined the alliance. But he hastens to add that his critique is “absolutely not meant to relativise Russia’s attack on Ukraine.” The latter “deserves the full support of the West and the entire world — otherwise Russia’s pursuit of a multipolar world order with a Russian sphere of influence in Eastern Europe will instead lead to maximum global disorder.”

Elsewhere in his book, Ther declares Russia’s enduring “imperial legacy” to be the core cause of the war. The early 1990s might have been a moment to cast off the legacy, but it passed, first because of Boris Yeltsin’s shelling of a recalcitrant (and imperialist) parliament in 1993 and then because of the rouble crisis in 1998, “when most of the Russian middle class sank into poverty again.” The latter is something of an overstatement, as a middle class continued to exist thereafter, but Ther’s overall point is well taken: “Neither of these traumatic moments had anything to do with the expansion of NATO or the EU; they were domestic problems first and foremost.”

Rather than an outcome of international relations, Russia’s trials and tribulations were part of the global malaise Ther is exploring. His great bugbear is the failed prophet of the “end of history,” Francis Fukuyama, whom he sees as the chief ideologist of Western triumphalism; his intellectual hero is the sociologist Karl Polanyi, analyst of the “great transformation” of the long nineteenth century.

Like Polanyi in the interwar years, Ther argues, we find ourselves at the tail end of another transformative period: the one that ran from 1989 to 2022. This transformation had two aspects — post-Soviet transformation in the former Soviet empire and “late-capitalist transformation in the West” — and they were held together by a shared framework of “neoliberal globalisation.”

Far from being the end of history, this period was one of profound economic, social and political upheaval, with winners and losers dotted around the globe, both between countries and within them. The claim that unfettered capitalism would somehow lift all the boats, making us all more prosperous, happy and democratic, turned out to be a pipedream at best and ideological obfuscation at worst.

Instead came the global financial crisis, followed by the annus horribilis of 2016, with Donald Trump’s victory in the United States and the Brexiteers’ in Britain. Then the “one-two punch of the pandemic and the biggest war in Europe since 1945… brought to an end the era for which historians have not yet found a name.” In line with Polanyi, Ther proposes calling the period from 1989 to 2022 “the age of transformation.”

Transformation to what? We don’t yet know, but it might not be good. Overall, Ther is quite pessimistic, but he does hope that his exploration of the history of our present will help open up “new opportunities for a progressive politics and society,” a political thrust that fits in well with other recent attempts to reconstruct the social democratic project for the twenty-first century.

To Ther, this is an existential quest. Like the unfettered capitalism of the nineteenth century, he fears that the new age of transformation might lead to some kind of fascism. The pendulum, to use his metaphor, having swung all the way towards a neoliberal abandonment of state protection, is now swinging back the other way. But Ther’s pendulum, in a hard-to-visualise twist, can swing in two ways: “left towards democratic socialism, or right towards fascism.”

Does any of this explain the war against Ukraine? Not really. Ther struggles to make Russia and Ukraine fit his explanatory scheme. True, the economic crisis triggered by Gorbachev’s reforms and deepened by the breakdown of the Soviet Union “plunged much of Russia’s population into destitution and misery,” which is never a good foundation for a democratic polity. But the same was true for Ukraine, which became democratic. And while Ukraine’s economy remained sluggish, Russia’s has grown by leaps and bounds since Putin came to power.

Neoliberalism, then, is of little use as a scapegoat for Russia’s aggression. Instead, Ther evokes a combination of the preventive counter-revolution argument and the anti-imperialist paradigm. Putin’s goal, he argues, is “to rewrite the end of history — with the creation of a new Russian empire.” The “larger dimensions of the conflict” also include the confrontation between “an authoritarian system” (Russia) that has evolved into a “hard dictatorship since Putin’s second term,” on the one side, and a country (Ukraine) that “has continually moved in the direction of liberal democracy ever since the Orange Revolution, and especially since 2014.”

Ukraine is far more democratic than Hungary, an EU member. Its governments have repeatedly transitioned smoothly and peacefully after elections, “something that unfortunately cannot be said for the USA since the storming of the Capitol.” Putin’s war on Ukraine is “also a war on democracy,” Ther writes, a “declaration of war against the EU and a free Europe.”


By evoking imperialism, a global transformation of capitalism, and a systemic confrontation between dictatorship and democracy, Ther avoids the position of authors who find the origins of this war in Russia’s national character, its history or its culture. I have argued elsewhere against such views as ahistoric and simplistic. Mikhail Zygar’s recent magnum opus, War and Punishment: The Story of Russian Oppression and Ukrainian Resistance, can serve as evidence of some of their shortcomings: it is the book of a Russian democrat, a Russian anti-imperialist and a Russian enemy of this war. Were the arguments about Russia’s national character as militaristic, imperialist and anti-democratic correct, he should not exist. And yet he wrote a book — a long and eloquent one at that.

Zygar is a Russian intellectual, and he knows it. And he’s an anti-imperialist. His book is framed as a long letter to his Ukrainian friend Nadia, at whose house in Bucha — the scene of one of this war’s massacres — he wrote much of his earlier book, All the Kremlin’s Men. “Nadia no longer speaks to me,” he writes in distress. “Because I am Russian, she considers me an ‘imperialist.’” He hopes his book will change her mind, and the minds of his compatriots: “Nadia, I am not an imperialist, and I am writing this book so that others will not be either.”

War and Punishment is made up of two, very different, parts. In fact, it is two books pressed into one volume. The first is a series of historical essays on major moments in Ukrainian–Russian relations that have been turned into myths in both Russian and Ukrainian historiography. They include Hetman Bohdan Khmelnytsky’s ill-fated alliance with Muscovy against Poland in 1654, which Russian historians have used to claim that Ukraine voluntarily subjugated itself under Moscow’s tsars; Ivan Mazepa’s alliance with Sweden against Peter the Great in 1708, which was declared a “betrayal” by Russian imperialists; Catherine the Great’s destruction of the remnants of Cossack Freedom; the life and work of Taras Shevchenko, Ukraine’s national poet of the nineteenth century; the Ukrainian revolution and Ukraine’s independent state at the end of the first world war; the Great Famine, or Holodomor, of the early 1930s; and nationalist resistance against German and Soviet occupation during the second world war.

Zygar provides a fresh and readable account of the historical background of each of these episodes and how they persist as anti-Ukrainian myths in Russian historiography.

Like anti-Russian authors, then, Zygar is well aware of the imperialist mainstream of Russian culture. His book is an attempt both to condemn it and to reconstruct, or strengthen, its anti-imperialist counter-current. Like anti-fascist German intellectuals after 1933, and for similar reasons, he is scathing about the culture in which he grew up:

Many Russian writers and historians are complicit in facilitating this war. It is their words and thoughts over the past 350 years that sowed the seeds of Russian fascism and allowed it to flourish, although many would be horrified today to see the fruits of their labour. We failed to spot just how deadly the very idea of Russia as a “great empire” was… We overlooked the fact that, for many centuries, “great Russian culture” belittled other countries and peoples, suppressed and destroyed them.

But his reaction is not to treat Russia as some kind of historical anomaly but to change what it means to be a Russian. “Russia as an empire has been consigned to the past, as a direct and irreversible consequence of the war,” he writes. What remains, however, is imperialism, a mindset, an emotional state, that needs to change, not just because of what it does to others, but also because of how it deforms Russia and the Russians:

Imperial history is our disease; it’s inherently addictive. And the withdrawal symptoms will hurt. But this is inevitable. We have to return to reality and realise what we’ve done.

We have to learn this lesson. To stop believing in our own uniqueness. To stop being proud of our vast territory. To stop thinking we’re special. To stop imagining ourselves as the centre of the world, its conscience, its source of spirituality. It’s all bunk.

Decolonising the Russian mind means democratising the country. Or, put the other way, democratisation can only succeed with the defeat of the imperialist mindset, which legitimises the subjugation of citizens as subjects:

We must strip the state of the right to impose its own view of the past on us. We have to roll up our sleeves and completely reinterpret our history, or rather the history of the peoples who fell victim to the empire…

Looking back, we see a horrific sight; our ancestors, indoctrinated to believe they were victors, were themselves victims. They were forced to kill, to rejoice in the killing, to take pride in the killing. And they were good at it. They were proud; they got high; they wrote beautiful poems, songs, and books glorifying blood and violence, the crunching of bones. And they forgot it was their own blood, their own bones.

This position is a radical departure from Russian liberal thought, both past and present, which often remained deeply imperialist (and racist), while espousing individual autonomy and democratic governance for Russians. At the same time it builds on anti-imperial Russian thinkers and the work of critical historians working, for example, in the now illegal organisation Memorial.


The second part of Zygar’s book is very different. It tells an integrated story, with a huge number of characters, reminiscent of the big Russian novels the title alludes to. This story begins in 1991 and ends in the present. While readers with little background in Russian and Ukrainian history will benefit greatly from the punchy and often inspired historical vignettes of the first part, they will likely get lost in the details of the second. It provides political history in its purest form: a tangled web of personalities and the relations between them; a history of power, corruption, loyalty and betrayal; and a history of powerful men and women: politicians and powerbrokers, oligarchs and gangsters, businesspeople and soldiers. In between, we learn about the unlikely rise of the comedian-turned-politician-turned-wartime-leader Volodymyr Zelenskyy.

This focus on personalities, their relations and the complex history of events over three decades — the same decades covered by Ther’s great transformation — sits somewhat uneasily with the framing of the book as an exploration of Russian imperialism. Given that many actors in Ukraine also pursued their own interests, connected to Russia as much as to Ukraine, the emergent story of oligarchic politics is much more messy than the subtitle of the book suggests: this is not just a story of Russian oppression and Ukrainian resistance.

When it comes to explaining the outbreak of the all-out war in 2022, Zygar is an intentionalist: the fourth emergent school of the history of this war. Intentionalists focus on the decision-makers and their motives. My own recent book, informed in many ways by Zygar’s earlier work, was intentionalist in this regard: while I saw, again like Zygar, Russian imperialism as one of the underlying structural causes, I also argued that the timing of the invasion becomes intelligible only when we understand that Putin, his seventieth birthday approaching, was looking for a legacy. He had spent the Covid years in splendid isolation, stewing in his own juices and reading Russian imperialist history. He wanted to get into the history books as an empire builder alongside Peter the Great, Catherine the Great, and Stalin.

Zygar tells a similar story, but he sees Putin driven less by his own historical ambition than by domestic politics. Spooked by the failed Belarusian revolution of 2020–21, the Russian president decided to remove the most prominent opposition leader, Alexei Navalny. Putin’s agents poisoned Navalny on 20 August 2020 but the attempt on his life failed. Evacuated to Germany, Navalny launched a counterattack: a YouTube video with the results of his team’s investigation into the poisoning, which was watched by some twenty million people at the time.

Navalny returned to Russia on 17 January 2021, triggering anti-regime protests and his own arrest. Two days later his team released a video about a private palace owned by Putin on the Black Sea coast. “This revelation,” writes Zygar, “strikes perhaps the most powerful blow to Putin during his entire reign. The video is watched by 120 million people, that is, almost the entire adult population of Russia.” Demonstrations have to be clubbed out of existence “Belarus-style.” The “damage to Putin’s credibility is colossal,” and he fears losing control. It is in this context, Zygar argues, that Putin’s administration is beginning to hatch new war plans: Ukraine can serve as the successful little war that saves Putin’s rule.

This is why Putin returns to history and, with the help of his former culture minister Vladimir Medinsky, writes his notorious July 2021 essay on the alleged historical unity of Russians and Ukrainians — the historical treatise that part one of Zygar’s book is trying to counter. It is a historical justification for the coming war, but not its origin. The origin is the attempt to forestall revolution at home.


Here, then, Zygar partakes of the final emerging school of thought on the origins of this war: that it is an attempt at preventive counter-revolution. As the Australian political scientist Robert Horvath has argued, Putin has long attempted to immunise his regime against the “colour revolutions” that seemed to be breaking out periodically both in Russia’s immediate vicinity and further afield.

One way to link this anti-revolutionary quest to aggression against Ukraine is to see Ukraine as a vibrant democracy at Russia’s doorstep and hence an example of what could be for the domestic opposition. The problem with this interpretation is that there was no renewed democratic revolution in Kyiv in 2022, and hence no reason to quash it.

Zygar’s interpretation is closer to Horvath’s original reading: Putin went to war not because Ukraine posed a democratic threat to his rule but because he faced a democratic threat at home. The war was a distraction: an attempt to reignite the imperialist jingoism of the Crimea annexation of 2014 that propelled Putin’s approval ratings upwards.

Two interpretive problems remain. For one, Putin didn’t order the invasion when he needed the distraction but well after the domestic crisis had passed. The 2021 protests were well and truly over by the time he published his Ukraine manifesto in the middle of that year. By early 2022, when he sent in his troops, there was no challenge to his regime.

Second, as Zygar himself documents, the war plans were hatched in secret. If the regime as a whole had been under threat and the war was part of an attempt to prevent revolution, it is hard to explain why even Putin’s closest advisers remained unaware of those plans right up to the eve of the invasion.


Be that as it may: Putin’s invasion on 24 February 2022 started a new historical epoch. As Ther points out, that might well be true for the globe as a whole but it is certainly true for Russia and Ukraine. At the centre of this new epoch is the war, its history being written as events unfold.

We already have military history in the narrower sense of the term: an appreciation of unfolding events at the frontline; analyses of the technical aspects of the fighting, the evolution of tactics and weaponry; and a focus on what lessons professional soldiers can learn from this fighting. More readable for non-specialists are initial narrative accounts of this war. Among the steady stream of these, some are penned by historians but more by journalists. The latest addition is Andrew Harding’s A Small, Stubborn Town: Life, Death and Defiance in Ukraine, an account of the battle of Voznesensk in March 2022.

Harding’s slim volume is a gem. A masterpiece of journalistic storytelling, it has the qualities of a good novella. It may be the most readable book about this war published to date. Based on interviews with some dozen survivors of the battle — soldiers and civilians, men and women, Russians and Ukrainians — the book tells a tale of survival and resistance on Ukraine’s side as well as aggression and frustration on Russia’s. It also explores the sometimes unclear loyalties, and indeed identities, of both Russians and Ukrainians, and doesn’t shy away from unsentimental depictions of war crimes.

Harding’s book thus explores some of the complexities of the real history of this war without falling into relativism: it is clear that Harding’s sympathies are with the defenders rather than the aggressors and that he doesn’t find it difficult to distinguish between the two. He leaves us with the despondent nightmares of his interviewees. They are haunted, he writes, “by the notion that this conflict may never end, and by the fear that Russia’s capacity to absorb suffering and its unflinching willingness to continue inflicting it will eventually enable it to grind out some kind of victory.”

As Ther warns, such an outcome would be catastrophic. It can be avoided if Ukraine’s friends in what is left of the democratic world stay the course. The biggest threat to Ukraine’s independence today derives from fantasies that this war might be stopped if Russia were to be accommodated by reasonable diplomacy. As Zygar notes, Russia in its current configuration cannot be accommodated. Defeat, not victory, might set Russia on the path Zygar proclaims with grim optimism: “Future generations of Russians will remember with horror and shame the war that Putin unleashed. They will marvel at how archaic hubris came to dominate the minds of twenty-first-century people. And they will not tread the same path if we, their ancestors, bear the punishment today.” •

How the West Lost the Peace: The Great Transformation Since the Cold War
By Philipp Ther | Translated by Jessica Spengler | Polity Press | $36.95 | 304 pages

War and Punishment: The Story of Russian Oppression and Ukrainian Resistance
By Mikhail Zygar | Weidenfeld & Nicolson | $34.95 | 424 pages

A Small, Stubborn Town: Life, Death and Defiance in Ukraine
By Andrew Harding | Bonnier | $32.99 | 160 pages

The spies who went into the cold https://insidestory.org.au/the-spies-who-went-into-the-cold/ https://insidestory.org.au/the-spies-who-went-into-the-cold/#respond Thu, 09 Nov 2023 05:53:41 +0000 https://insidestory.org.au/?p=76395

Calder Walton’s lively global survey takes in a century of espionage

One cold February day the British intelligence service received a secret update from an agent in Central Europe. The Russians were refusing to treat Ukraine as a separate country, the agent reported, and were willing to back that up with force. A reliable Ukrainian informant living in exile in Poland had asked how much international support Ukraine could expect if it asserted its right to independence.

Remarkably, that report was written not in February 2022, on the eve of the full-scale Russian invasion predicted by Western intelligence, but a century earlier, on 7 February 1922. The coincidence underlines both the scope and one of the themes — continuity across time — of Calder Walton’s ambitious and thought-provoking new book, Spies: The Epic Intelligence War between East and West.

Stretching from 1917 to the present day, Spies covers the intelligence contest between Russia, Great Britain and the United States that extends over more than a century. Perhaps surprisingly, Walton argues that Russia has invariably been one step ahead of the West. Especially before 1945 and after the collapse of the Soviet Union, the Russian intelligence services were an underestimated threat. Dating back to the conspiratorial traditions of the Cheka, the secret police created by Lenin, they were simply better spies and used deception more effectively.

The greatest, perhaps even “epic,” achievements of Soviet foreign intelligence occurred in the 1930s and during the second world war. Soviet agents penetrated the highest levels of government and the security services in Britain and the United States.

Walton, a distinguished British historian currently at Harvard, covers well-trodden ground but his analysis is sharp. Soviet agents like Alger Hiss, Harry Dexter White, Laurence Duggan and Ted Hall (in the United States), and the Cambridge Five, Alan Nunn May, Klaus Fuchs and George Blake (in Britain) are all familiar to scholars in the field, but Walton’s discussion is enriched by his engaging prose, his access to fresh archival records (some only declassified in 2022), and his sketching in of the military, political and cultural tapestry into which espionage was woven.

Although the intelligence provided by Kim Philby, the Soviet mole inside MI6, cost the lives of dozens of Allied agents, among other things, its immense potential value to the Soviets (and incalculable damage to the Allies) was undercut by Stalin’s paranoia or hubris. For the Soviet leader it was a case of “too good to be true.”

Stalin’s suspicion of disinformation was also evident when his spies warned of Germany’s imminent invasion of the Soviet Union in June 1941. He dismissed the warning from the celebrated spy Richard Sorge, whom he called a “lying shit,” and his pencilled response on the report of a German agent inside the Nazi regime (which I still recall seeing in the British National Archives) was “Fuck him.”

Stalin’s failure to believe the warnings about Hitler’s attack didn’t apply to material Moscow was receiving from spies who had penetrated the Manhattan Project in the United States. Stalin knew well before Harry Truman did that America was developing the atom bomb and, as is well known, expressed no surprise when the American president informed him of a new super weapon at the 1945 Potsdam conference. He also knew of the Venona operation, to which I’ll return, six years before Truman or the CIA.

For the United States, the cold war began in 1947. Three pivotal documents — the Truman Doctrine, the National Security Act (which created the CIA) and the text of Cominform’s “two camp” thesis — all appeared in that year. For Walton, however, it began with the Bolshevik revolution. In the 1920s, the Cheka had a division of officers coordinating foreign operations (and more than 100,000 agents inside Russia); at the time, MI5’s counterespionage unit had just five officers. In 1929 the US secretary of state shut down the government’s code-breaking agency because “gentlemen do not read each other’s mail.” In 1936, a decision to open an MI6 station in the British embassy in Moscow was thwarted because it was “liable to cause embarrassment.”

By the beginning of the second world war, Walton wryly notes, Soviet intelligence “had more graduates of British universities than Britain’s own intelligence services.” Despite the wartime alliance against Germany after 1941, the Soviets intensified their espionage; Western intelligence operations, especially from Bletchley Park, were meanwhile preoccupied with the Nazi threat and dutifully ignored the Soviet Union.

The one exception was the Venona project. This ultra-secret operation was launched in 1943 to decode cables sent from Moscow to its embassies and went on to expose networks of Soviet spies operating in the West. (These included Walter Clayton’s KLOD network in Australia, which Walton doesn’t mention.) Although MI5 and MI6 had fewer than 200 officers between them in 1947 while the KGB was soon employing about 200,000, the process of redressing the imbalance had begun.

But the Russians were unrelenting. Despite the Venona crackdown, they set about interfering in Western elections. In one of the more startling revelations in Spies, obtained from Russian archival records, Walton contends that Stalin colluded with the Progressive Party candidate in the 1948 US presidential elections, Henry Wallace, formerly Roosevelt’s Soviet-friendly wartime vice-president. Wallace appears to have secretly liaised with Stalin, who aligned himself closely with Wallace and vetted some of his campaign material. This, according to Walton, turned Wallace “into an asset for Stalin, if not a recruited Soviet agent.”

That may be an overstatement, but it at least confirms that the far more extensive election interference conducted by Russia in favour of Trump in 2016 (and before then in support of Gerald Ford in 1975) was not unprecedented. It was part of the arsenal of “active measures” against Britain and the United States that included bribery, forgery, misinformation, assassinations and the planting of deep-cover “illegals.”

Walton also probes the Soviet Union’s main adversary, the United States. American covert operations, termed “back-alley actions” by secretary of state Dean Rusk, became the weapon of choice in postwar Washington. They stretched from the CIA’s intervention in the 1948 Italian elections to the CIA-backed coups against democratically elected governments in Iran (1953), Guatemala (1954), the Dominican Republic (1961), British Guiana (1962), Iraq (1963), Bolivia (1971), Chile (1973) and many more, to the proxy war in support of the anti-Soviet mujahedeen in Afghanistan (from 1980). One fact of which I was unaware is that a secret annex of the Marshall Plan channelled reconstruction funds to the CIA for clandestine political-warfare activities in postwar Europe.

Walton’s most chilling, and disturbing, account of covert action concerns the American destabilisation of Congo and its complicity in the assassination in January 1961 of Patrice Lumumba, the country’s popular, left-leaning prime minister. Walton cites Eisenhower telling his national security adviser, Gordon Gray, that he was “very eager indeed that Lumumba be got rid of.” Lumumba was got rid of and the brutal, corrupt and despotic — but US-friendly — Joseph Mobutu ruled Congo (later Zaire) for the next thirty years. (In a wonderful vignette, Walton describes MI6’s head of station in Congo, Daphne Park, who helped coordinate Lumumba’s murder but who “looked, and acted, like Miss Marple from Agatha Christie’s novels.”) By taking in Africa, the Middle East, Latin and South America, Walton emphasises the global dimension of this long intelligence war.

The roles of high-ranking Soviet defectors to the West and moles working within the Russian intelligence services were crucial in the great cold war struggle, and their stories are compellingly told. Once again, most are familiar and well documented: Walter Krivitsky, a foreign intelligence officer who was eventually assassinated by Soviet intelligence (1941); Igor Gouzenko, who first exposed Moscow’s atomic espionage (1945); Oleg Penkovsky, perhaps the most prized agent, who played a pivotal role in the Cuban missile crisis (1962); Oleg Gordievsky, whose intelligence helped defuse the war scare surrounding Able Archer, the NATO exercise Moscow feared was cover for a nuclear first strike (1983); Vasili Mitrokhin, who defected with a tranche of Moscow’s innermost intelligence secrets (1992); and Alexander Poteyev, who escaped an assassination attempt on US soil (2020).

A surprising omission is Vladimir and Evdokia Petrov, who defected from the Soviet embassy in Canberra in 1954 and provided immensely valuable intelligence over the next five years. It was Vladimir who revealed the whereabouts of the “missing diplomats,” Burgess and Maclean, a revelation absent from Walton’s extensive discussion of the Cambridge Five. ASIO is also absent from the text (and index), though listed in the glossary, and Australia, despite its membership of the unprecedented Five Eyes intelligence-sharing network, is similarly overlooked.

What are treated in some detail are the continuities. Boris Yeltsin dismembered the KGB, but the security services reconstituted under former KGB officer Vladimir Putin have retained and expanded their power. As Walton writes, the FSB and SVR (the domestic and foreign agencies) “inherited the KGB’s infrastructure, archives, agents, skill set, ideology and operational approach.” Only the acronyms changed.

Similarly, the cold war didn’t end with the collapse of the Soviet Union in 1991. Diverting its attention to counterterrorism after 9/11, the West failed to appreciate that the Russian security apparatus was becoming even more aggressive or that a revanchist Putin would use asymmetric espionage — hijacking the internet to disseminate disinformation, for example — to Russia’s advantage.

Once again, the West had to play catch-up. By 2019, 77 per cent of Kremlin staff had a background in the security services. Intelligence in Russia was intensely politicised, as it always has been, which helps explain why the planned swift invasion of Ukraine in February 2022, based on prewar intelligence analysis and briefings that could not contradict Putin, was a failure.

Christopher Andrew — that doyen of intelligence historians with whom Walton collaborated on his history of MI5 — calls Spies “a masterpiece,” but it does contain errors. Russian tanks never “rolled in” to Prague to enable the Czechoslovak coup d’état in 1948, and nor was it a “military takeover” (unlike 1968); it was engineered by the NKVD — the People’s Commissariat for Internal Affairs — and the local Communist Party. Ethel and Julius Rosenberg never “confessed to spying,” and nor was Julius “a Soviet agent in Los Alamos”; he coordinated a New York–based spy ring and engaged in industrial rather than atomic espionage. Stalin’s death in March 1953 did not “bring the Korean War to an end”; the reasons for the armistice signed three months later lay elsewhere. During Gordievsky’s exfiltration from Moscow, crisps, not soiled nappies, were thrown from the car window at the Finnish border to deter sniffer dogs; the nappies did exist, but were changed on top of the car boot directly over the hidden Gordievsky to successfully foil Soviet guards and Alsatians — an improvisation perhaps unique in espionage history.

Notwithstanding these quibbles and Walton’s questionable conclusion that “the age of the secret service is over,” Spies brims with insights and fascinating details, encompassing a full century in a global setting, and should attract an audience otherwise unacquainted (beyond film and TV) with the murky world of espionage. •

Spies: The Epic Intelligence War between East and West
By Calder Walton | Simon & Schuster | $34.99 | 625 pages

Freeing Bennelong and Phillip https://insidestory.org.au/freeing-bennelong-and-phillip/ https://insidestory.org.au/freeing-bennelong-and-phillip/#respond Fri, 20 Oct 2023 01:07:15 +0000 https://insidestory.org.au/?p=76138

Nothing is preordained in Kate Fullagar’s dual biography

The first thing that strikes you about Kate Fullagar’s Bennelong and Phillip is the unusual way she has organised her material. There is a good deal of serious purpose in the structure she has chosen to impose on old stories, and it is this structure that I will try to spell out here. It matters very much, because in arranging things as she does she wrestles with two problems of central importance for Australian history.

Number one: she takes infinite trouble in giving equal time to her two subjects, the Wangal man Bennelong and Arthur Phillip, first governor of New South Wales. These two came to know each other as a result of the British invasion and yet they led largely separate lives. Throughout the book, in a spirit of strict equity, Fullagar moves backwards and forwards from one to the other in a process of interweaving. For the reader moving through the book it is like handling particoloured rope.

And then, secondly, she tells their story, or rather their two stories, backwards.

Fullagar’s project for equality reminds me of the famous passage in Eleanor Dark’s The Timeless Land (1941) where the fictional elder Tirrawuul advances to meet Phillip during the first hours of invasion. Each man watches the other, eye to eye. “Tirrawuul saw a smallish man, quite incredibly ugly, with a pale face and a very large nose [and so on]… Phillip saw an elderly savage, quite incredibly ugly, with greying tangled hair, and alert dark eyes [and so on].” That is literary equity par excellence. What Dark achieved through the liveliness of historical fiction Fullagar manages in a more assiduous, methodical way. Reading Dark and Fullagar together shows up better the purpose of each.

Dealing one after the other with invader and invaded, Eleanor Dark gives a keen impression of mutual strangeness and of how each man searched the other’s face for a shared humanity. Kate Fullagar’s method is more roundabout — not so much literary as ethnographic. She gives descriptions, side by side, of how each man and his people enacted the rituals of death and burial, and their different uses of violence, including ceremonial violence, and of dance, dress and display.

Questions of culture and personality necessarily intersect. Neither Bennelong nor Phillip was perfectly typical of his kind, whatever that might mean. Bennelong seems to have had a strong emotional dependence on women, for instance. Phillip, on the other hand, seems to have needed female company markedly less than most other Englishmen of his day. One of the best things about Fullagar’s book is how she uses the grid of culture as a powerful background for personality. The tension between culture and individual character vividly complicates the tension between the two men, played out as it is in the highly dramatic circumstances of invasion.

There is another complication, less obvious but more profound. In her book Upheavals of Thought: The Intelligence of Emotions, the American philosopher Martha Nussbaum sketches the varied ways in which culture shapes feeling, while acknowledging it is often hard to distinguish the thing from its expression. Grief among the Balinese, says Nussbaum, looks and sounds very different from grief among the Ifaluk, a people of the Caroline Islands. The Utku people of northwest Canada condemn anger as childish but the ancient Romans saw it as manly and noble; the Utku keep a lid on it but the Romans made all the noise they could.

The anecdotes in Fullagar’s book show differences of the same kind, in feeling and expressions of feeling, between First Nations peoples and the invading British. Fullagar makes good use of this material, though she probably doesn’t push the question as far as Nussbaum might have liked. She quotes David Collins, Phillip’s secretary and judge-advocate, describing the mysterious combination of feeling and violence among Indigenous people. Men known to be good friends fought each other, so Collins said, “with all the ardour of the bitterest enemies,” apparently intent on wounding if not murder, and yet they were friends again afterwards. The officers were also baffled by Bennelong’s violence towards the young woman Kurubarabula, his promise to kill her, her running towards him all the same, and then their marriage.

The feelings of the invaders themselves must have been just as impenetrable to First Nations people, as they often are to us today.

Throughout the book Fullagar shows an ongoing interest in the possibility of a treaty engineered by Phillip, as governor, with Bennelong, as a representative of the invaded peoples. Phillip was anxious, for instance, that Bennelong and Yemmerrawanne should meet George III during their stay in England, 1793–94, because, Fullagar says, such a meeting might have led to a “formal agreement” between the British government and the Indigenous population. Something might have been done, in other words, to register Indigenous “consent” to the settlement at Port Jackson.

There was certainly talk now and again of the need for “consent,” but there is no surviving evidence that the British government ever thought of making an agreement of this kind. Unlike other Indigenous communities affected by colonisation, the people at Port Jackson were understood to be wanderers, “not attached to any particular spot.” After five years on the ground, Phillip knew this was not true. Individuals and groups were obviously attached to certain places, though exact ideas about possession were so far hard to decipher.

And yet, as far as we know, even Phillip never argued for any kind of agreement about land use. It would have been a feather in his cap if the king had deigned to notice Bennelong, however briefly, with an “audience,” but it seems likely that that was the limit of the governor’s hopes.

All the same, by circling as she does around the idea of a treaty, Fullagar hints at a larger and deeper question — the possibility of ongoing mutual respect, including the invaders’ capacity to listen, in an official sense, to Indigenous voices. Presentation to the king was called an “audience” because it involved listening by the executive. In this case, for whatever reason, the king chose not to listen. While in England, Bennelong and Yemmerrawanne were dressed as English gentlemen and entertained with all sorts of display, military, theatrical and cultural, but they were not invited to perform themselves as a representative Voice. The more things change, the more they stay the same.


So we come to the second distinctive feature of this book’s structure. Its story or stories are told backwards. In describing the lives of Bennelong and Phillip, Kate Fullagar begins with the end and ends with the beginning. In the first couple of chapters we hear about how each man has been understood since his death — the ups and downs of each “afterlife.” Combined with that and just as important is detail about the network of friends and kin each left behind him when he died. So we are introduced to each life as a creation of human circumstance, and to each individual as a focal point of feeling and attention. After all, other people make us what we are. It is a subtly powerful point, and it tends to pervade the book.

Ending with the beginning can have the same sort of effect. So, in the last chapters we read of Bennelong and Phillip, each in his own way, born into a family network and into a store of traditional knowledge — a rich cultural inheritance. Here is another point of creative tension. Quite apart from the inevitable push and pull that goes on between the two men during their period of contact, we sense a striking ambivalence in the way each of them stands out from their crowd of friends and associates while at the same time being continuously drawn back in. Such is life, now as then.

Fullagar offers an interesting explanation as to why she has chosen this back-to-front approach. Partly, it is another part of her project of equity between Bennelong and Phillip. Telling stories in the normal sequential way means giving preferential treatment to Phillip. Phillip represented an empire on the move. British energies and British achievements gave men like him a right to possess the future. He is a founder of nationhood, a cultural hero, and as such inevitably mythical in some sense. That makes Bennelong his antithesis, a figure attractive enough but doomed to fail. Phillip represents high-principled government, good order and the inevitable progress of Western civilisation. Bennelong stands in his shadow, childlike, irresponsible and ultimately tragic.

Fullagar is not entirely free herself from this framework of thought. In the European context she pits “conservatives” of the 1790s, including Phillip himself, against “the liberal spirit of French popular democracy,” and yet such terminology is surely imposed by us in retrospect. In those days, “conservative” implied a power to nurture and sustain life. The sun, for instance, was called “a conservative,” and by the same token words like “liberal” and “democracy” seem to jar with the actual methods of the Jacobin revolutionaries in France.

Altogether, British ideas about the relationship between past and future were still fairly fluid in the late eighteenth century. The mindset of the first invaders is a topic of enormous complexity and weight, and in tackling notions of time, “progress” and so on, it might be better to avoid such words altogether.

But then, Fullagar is right. Our own assumptions about “progress,” as she says, make the violence of empire “the unfortunate means to a justifiable end.” Telling the story in the old-fashioned way would also give a shallow idea of individual character, including moral character. An iconic and ideal Phillip, assimilated to the statue in Sydney’s botanic gardens, cannot be genuinely human. The same straitjacket gives a kind of narrative uselessness to Bennelong’s life during his post-Phillip years. No longer a valued go-between, he seems to be trampled underfoot by “galloping global empire.”

Freeing herself from this old paradigm, Fullagar also frees Bennelong and Phillip. The various life phases of each take on a new significance. More than that, invasion, occupation and settlement can be more clearly understood because nothing is preordained.

In 1788 nobody could have known whether there would be a second fleet or whether the settlement would survive at all. By 1800, if those questions were answered, no one could have known whether the occupiers would ever be more than a circumscribed small-farming community. When Bennelong died in 1813, settlers were beginning to take up large grants, suitable for sheep and cattle, and in the same year country west of the Blue Mountains was opened up for settlement. Even so, no one so far could have predicted occupation from shore to shore.

This was indeed a galloping empire, a brutal impulse of power that in the end passed Bennelong by. Whether his latter years were miserable or not, the balance of power that had seemingly existed between himself and Phillip had been radically undone.

The book is subtitled “A History Unravelled,” and in a couple of places Fullagar talks about lives “unspooled.” It is an interesting image. Cut loose from its conventional framework of long-term achievement and/or loss, the old Bennelong–Phillip tapestry comes apart, falling into a multitude of brightly coloured episodes and life phases. Each man is caught more completely in his own time — but caught, as it were, with no memory of earlier events, because with Fullagar’s chosen structure he has not got to those yet.

In short, her biographical method is not problem-free, but it serves a vital purpose. It will appeal to some readers more than others, but no one can avoid having their ideas about invasion challenged to some extent by this remarkable book. •

Bennelong & Phillip: A History Unravelled
By Kate Fullagar | Scribner | $55 | 320 pages

Western civilisation and its discontents https://insidestory.org.au/western-civilisation-and-its-discontents/ Fri, 13 Oct 2023 20:01:57 +0000

A mix of ingenuity, creativity, contradiction and collaboration unsettles the much-vaunted concept of “the West”

A few months ago Inside Story asked me to review The West: A New History of an Old Idea, the latest book by historian Naoíse Mac Sweeney. It might have “a little extra resonance in Australia” at this time, editor Peter Browne suggested. Shortly afterwards, my professional life imploded.

I found myself among the forty-one academics suddenly “disestablished” from their continuing positions at the Australian Catholic University in the name of creating a “more agile and sustainable” education environment. I was then invited to compete in a Hunger Games for fourteen replacement positions. As one of the senior historians among the targeted, more privileged than several others, I have absolutely no intention of doing so.

It’s been an interesting, and unexpected, context for my reading of Mac Sweeney’s brilliant book on the history of the idea of Western civilisation. Back in August I thought I might consider it from the position of one of the few universities in Australia to have recently invested in those disciplines most strongly associated with the West — history, philosophy, religious studies, literary studies and political theory.

Where other universities have been trimming or at least following a course of “natural attrition” when it comes to these subjects, ACU pursued over the last few years a deliberate push to elevate its profile in what is also often called the humanities. The university systematically hired humanities researchers from around the nation and world at senior, middle and junior levels.

In 2020, ACU also became the third and final university in Australia to partner with the private Ramsay Centre for Western Civilisation, with which it now runs a Bachelor of Arts program in “the great books, art, thought and practices” that have shaped the West. This program remains largely segregated from both ACU’s regular BA offerings and its humanities research institutes.

In the face of a baffling and partial reversal, however — a reversal that slashes most of its recent research hires but leaves the Ramsay program intact — I now read Mac Sweeney’s book as one of the ousted, a living effect of the extreme tenuousness of the hold of the idea of “Western civilisation” in Australian society.

Tenuousness, in fact, is Mac Sweeney’s primary point. Her central argument is that our modern notion of Western civilisation is not only much newer than we thought but also unstable, compromised, frequently incoherent and indeed “factually wrong.” That notion, briefly, is that Western civilisation emerged from a shared Graeco-Roman antiquity that melded with the rise of European Christendom and led to the Renaissance, the scientific revolution and the Enlightenment, in the process giving birth to liberal capitalist modernity. All these developments occurred on an ever-westward sweep from southern Europe to North America.

Mac Sweeney’s secondary point, pursued more subtly throughout, is that grand narratives of the West have always been used ideologically to justify political ends, either for or against its central claims.

Such a twofold argument might suggest a book that covers only the span of time in which Mac Sweeney reckons our current notion of Western civilisation has been brandished — roughly from the late seventeenth century to the present day. But The West traverses all the ages currently understood to fall within the narrative, starting with the fifth-century BCE Anatolian thinker Herodotus. In fact, more than half the book is dedicated to the pre-seventeenth-century world. The subtitle should more accurately, though surely less cutely, have been “A New History of a Newish Idea, With a Good Chunk of Its Prehistory.”

Mac Sweeney was aware from the outset that her rather abstract subject matter could “easily get stuck in the realms of the theoretical.” To avoid this, she explores her ideas via fourteen lives, each with his or her own chapter. A couple are expected: Herodotus himself, as well as the English Tudor polymath Francis Bacon. The others are mostly surprises, including the Islamic philosopher Abū Yūsuf Yaʻqūb ibn ʼIsḥāq al-Kindī, the Ottoman Sultan-Mother Safiye and Hong Kong leader Carrie Lam.

All up, Mac Sweeney’s fourteen lives comprise two Anatolians, two Romans, two Britons, two Africans, a German, a Baghdadi, an Albanian, an American, a Palestinian and a Hong Konger. There are eight men to six women. Around two-thirds are scholars of some description; about half of them are significant political rulers.

The book’s biographical method and its vivacious and direct writing style are among its best features, turning Mac Sweeney’s compelling ideas into an enticing, delightful and sustaining read. But not all the figures do the same work. Some are discussed as presumed contributors to the Western tradition while others exemplify its contrary aspects.

The starting line-up shows these two modes neatly. Herodotus, in chapter one, has long been thought an original contributor to the Western tradition, inaugurating the binarism at its heart between “us” and “them” with his depiction of Greek heroes and barbarian enemies. Mac Sweeney insists, though, that Herodotus invoked this binarism solely to undermine it. He was repelled by the increasingly xenophobic triumphalism of his contemporary Athens, writing instead a history that showed equivalent heterogeneous societies in all the regions of his known world. He eventually abandoned Athens in disgust at its invention of a singular, superior Greek culture.

Chapter two’s character, meanwhile, the Roman powerbroker Livilla, represents the people who are usually considered the inheritors of Greek culture, the Romans of the first century CE. Livilla’s turbulent life adds great colour but its pertinent part concerns how much Livilla — granddaughter of Augustus — nodded to “Asian Troy” as her most important heartland. Her Rome was an empire born of Trojan refugees and powered by absorbing every set of people within reach. It had no understanding of itself as being oriented towards Europe over Asia, and especially not to the conquered Hellenes.

The Baghdadi al-Kindī demonstrates how much the Abbasid world of the ninth century engaged with the Greek and Roman philosophers of the past. In fact, Mac Sweeney holds that the Abbasids took the thought of the varied ancients more seriously than did anyone else in this era, proving that Greek and Roman influence did not “flow in a single channel” to western Europe but instead “sprayed rather chaotically in all directions.”

Godfrey of Viterbo and Laskaris of Albania both feature as warnings of how uncomfortable was the blending of medieval Christendom into Greek, Roman or Byzantine history. They similarly refute any sense of a single flowing channel: the retrofitting of Christian theology into pre-Christian traditions was awkward, painful and sometimes frankly denied.

Chapters six to eight traverse a long Renaissance, showing how this era “stitched together… the uneasy hybrid we now call ‘Greco-Roman antiquity.’” Even so, the Roman writer Tullia d’Aragona shows that there remained much respect in the sixteenth century for the traditions of Mesopotamia, Egypt and Ethiopia. So, too, Safiye Sultan shows that the attraction of the east for many Protestant Europeans was often greater than that of an increasingly papalised West. And Francis Bacon exemplifies how emerging scientists, though they learned much from ancient texts, remained wary of any attempt to dictate how or what to think.

American revolutionary Joseph Warren brings the reader finally to the birth of the modern idea of Western civilisation, a truncated version that burgeoned through the prior two centuries but was, we now know, almost incomprehensible to thinkers of earlier ages. Mac Sweeney argues that the idea finally became “mainstream” as a helpmate to the success of the American Revolution. American-tinctured Western civilisation not only instantiated the idea that the West came from a fused and exclusive tradition of Greek and Roman practices, but also implied that those living on the latest western frontier — Americans — perfected the Christian, scientific and liberal threads that adhered seamlessly to the tradition along the way.

Chapters on the Angolan Queen Njinga and the West African poet Phillis Wheatley provide searing counterpoints, highlighting the ever-sharpening racial exclusions embedded in the modern idea of Western civilisation.

Perhaps the most contentious of Mac Sweeney’s biographical choices come in the final three chapters, where British prime minister William Gladstone, Palestinian critic Edward Said and Hong Kong leader Carrie Lam stand in for the last 200 years. Those who support the extreme tendency to favour the recent past in historical studies will protest that too much is missed, especially the cold war version of Western civilisation: the Plato-to-NATO narrative that focused so intensely on capitalism and founded so many “Western Civ” university courses around the Western world. As a former member of a “Not the Twentieth Century” reading group, however, I am happy to accept the author’s brevity here.

Gladstone represents the West’s zenith, when the idea bolstered a Western bloc that was also dominating the world. Said represents how the West started to come apart via its own critical methods during the twentieth century. And Lam, intriguingly, is a conduit to the challenges the West now faces from without — from a militant Islamic State, from post-Soviet Russia, and most of all from a soaring China.


Mac Sweeney spends less time on her second theme, the ideological weaponisation of the idea of the West, though it is implied throughout. She is clearest on how Warren and Gladstone wielded the idea to justify the rise to “domination” of Euro-American norms. She suggests that it’s less powerful today, when “most people in the modern West no longer want an origin myth that serves to support racial oppression or imperial hegemony.”

I’m not so sure. Advocates don’t have to carry placards of Donald Trump as a gladiator to reveal a desire to perpetuate certain conventions about Western primacy, exceptionalism and natural linearity.

The fact that Mac Sweeney wrote this book suggests she, too, may realise the idea still has dangerous legs. One of her implied points could well be that while the West wreaked plenty of havoc (dispossession, slavery, colonisation) between 1776 and 2001, it may inflict even more damage when brandished in a fractured, unmoored and uneven manner as it is today.

Most importantly, her book is not a call to cancel the study of what apparently constituted the West. Instead, she contends that by investigating the very tenuousness of its claims we can come to see more than just falsity. We get the chance to discover a richer and more diverse global past than we previously knew.

In endeavouring to trace the genealogy of the West back through liberalism, rationalism, Christianity, Rome and Greece, we will find a kaleidoscope of ingenuity, creativity, contradiction and interconnected collaboration in place of a neat arrow. Together, these complexities point to something far closer to universal humanity than was ever imagined in any Western narrative. They should inspire us to move beyond the binary of the West versus the Non-West that yet inflects much modern thinking.

I have no idea if the Ramsay programs currently being unrolled in Australia present the history of Western civilisation in Mac Sweeney’s critical and expansive way. What I do know is that the possibility of studying the politics, religion, literature and theories of the world in which the West arose is now significantly foreclosed at my university. Many regular students, and scholars, will have to turn elsewhere to continue to discover and to explain the ideas that have shaped all our lives. The West would make an excellent starting point. •

The West: A New History of an Old Idea
By Naoíse Mac Sweeney | WH Allen | $35 | 448 pages

Yes or No, history won’t go away https://insidestory.org.au/yes-or-no-history-wont-go-away/ Tue, 10 Oct 2023 04:34:32 +0000

Regardless of the outcome of the Voice referendum, Australia’s past will continue to unsettle the present

In August, not long before the referendum date was announced, I joined a kitchen table conversation about the Voice. There were eight of us, some acquainted, others meeting for the first time. We were all tending towards Yes, but our levels of certainty varied, along with our knowledge and understanding of the issues.

The host used materials created by the Victorian Women’s Trust to get us talking, including a set of cards laid face down between mugs of tea and plates of biscuits. We took turns picking up a card and reading the text on the reverse side. One was about events in New South Wales in 1881:

Forty-two Yorta Yorta men living at the Maloga Mission petition the governor to grant them land, to support themselves raising stock and cultivating crops. The petition is published in the Sydney Morning Herald and the Daily Telegraph. Six years later, representatives from the Maloga Mission present the governor with a petition to Queen Victoria, again requesting land.

Eventually the NSW government did set aside 730 hectares in the area for a reserve that came to be known as “Cummeragunja,” or “our home.”

Another card told of the 1963 Yirrkala bark petitions:

The Yolngu Nation from Yirrkala in the East Arnhem Region sends Bark Petitions to the federal parliament. They object to land on their reserve being excised for bauxite mining, without consultation. Territories Minister Hasluck rejects the first petition, challenging the validity of signatures. A second bark petition adds the thumbprints of clan Elders…

The petitions led to a parliamentary inquiry, which visited Darwin and Yirrkala to collect evidence. The committee didn’t support a halt to mining, but it did recommend that sacred sites be protected and the Yolngu compensated for loss of livelihoods.

Dating back as far as 1788, the twenty-nine cards detail resistance, protests, pleas, petitions, strikes, walk-offs, court cases and letters to newspapers. They record the creation of new representative organisations — including the Australian Aboriginal Progressive Association (1924), the Australian Aborigines League (1933), the Aboriginal Progressive Association (1937) and the Federal Council for Aboriginal Advancement (1958) — and the release of the 2017 Uluru Statement from the Heart.

Together, the cards tell a compelling story of a 235-year struggle for land, recognition and justice, of which calls for Voice, Treaty and Truth are the latest manifestation.

Since that conversation, I have played a small role in the Yes campaign, handing out flyers outside a supermarket, a train station and a pre-polling centre. Plenty of people have been supportive and I’ve had constructive, civil conversations with individuals who were genuinely unsure about how to vote. I’ve also been labelled a racist and a race-traitor and accused of not acknowledging that the “real” Uluru Statement from the Heart is much longer than just one page. This is trivial stuff compared with the abuse copped by First Nations representatives on both sides of the campaign, but the atmosphere feels like it has become increasingly polarised as the vote approaches.

Perhaps that was inevitable once the Coalition made the vote partisan and turned the campaign slogan, “Vote No to the Voice of Division,” into a self-fulfilling prophecy. But anger and resentment at the idea of a Voice haven’t come out of thin air. Misinformation and sheer falsehoods need receptive ears. Just as the push for Yes is informed by the long struggle for recognition and rights, so the No campaign draws on deeper wellsprings, including an entrenched defensiveness about Australia’s past.

In a talk at the Byron Writers Festival in August, historian Henry Reynolds recalled the intellectual environment he encountered when he started teaching at the University College of Townsville in 1965. Although it later became James Cook University, at the time the college was a northern outpost of the University of Queensland, and the main textbook set by Reynolds’s southern professors was Gordon Greenwood’s Australia: A Social and Political History.

Greenwood’s collection, with essays by six researchers, was reprinted twelve times between 1955 and 1975, and widely used in teaching around the nation. But it contained nothing about Aboriginal and Torres Strait Islander peoples. Reynolds scoured academic reviews of the book and found that “not one of the eminent historians who reviewed it realised there was something missing.”

This is evidence of what art historian Bernard Smith called “the white blanket of forgetfulness” in his 1980 Boyer Lectures. Twelve years earlier, in his own Boyer Lectures, the anthropologist W.E.H. Stanner had introduced a similar concept — the great Australian silence. Reflecting on the lack of Indigenous voices in histories and commentaries, Stanner said that “inattention on such a scale cannot possibly be explained by absent-mindedness” but must be structural, like “a view from a window which has been carefully placed to exclude a whole quadrant of the landscape.”

More recently, in Telling Tennant’s Story, Inside Story contributor Dean Ashenden showed how this view has been constructed and maintained. Stopping in the old railway town of Quorn near the Flinders Ranges on the road north from Adelaide, Ashenden finds lots of information about the Ghan but nothing about the Aboriginal people of the area, “who they were or how they fared when the inexorable frontier arrived.” The story was similar all the way up the Stuart Highway.

This is my story too. Growing up in South Australia in the 1960s and 1970s, my main exposure to Aboriginal Australia was seeing people sitting under trees in Victoria Square and the Adelaide Parklands. I had an inspiring fifth grade teacher who introduced us to the culture and lifestyle of the central deserts, and around the same time I met a group of Pitjantjatjara elders who were staying with a neighbour who had worked at the old mission of Ernabella (now Pukatja).

Despite these experiences, though, I never thought to ask who had lived on the lands around Adelaide prior to 1836. I don’t recall hearing the name of the Kaurna people until I was in my early twenties. Like so many, then and now, I was blanketed in forgetfulness.


When he arrived in Townsville, Reynolds was struck by the very visible presence of Aboriginal people in North Queensland — something he was not accustomed to in Tasmania. When he started researching local history with his students he knew they had to include the story of relations between coloniser and colonised. And once they went looking, they discovered records of dispossession, conflict and war waiting to be found, not just in the oral stories handed down through Aboriginal and non-Aboriginal families but also in newspapers, court records and diaries.

The first newspaper in North Queensland, the Port Denison Times, was established in 1861. Reading through copies in the Bowen Council Chambers, Reynolds found that frontier violence was openly acknowledged in the nineteenth century. What’s more, the morality of colonialism was fiercely debated in its pages.

Yet when Reynolds first pitched his landmark 1981 book, The Other Side of the Frontier: Aboriginal Resistance to the European Invasion of Australia, Penguin knocked it back because there were already “too many books about Aborigines.”

“Invasion” is still a rarely used word to refer to the origins of the Australian state. “Settlement” remains far more common, suggesting a benign process that met with little resistance and was long ago complete. The Voice referendum is unsettling because it tugs at the corners of the blanket of forgetfulness to destabilise the dominant sense of who we are as a nation.

Many within the No camp believe that nothing is to be gained by looking back and it is time to draw a line under history. After all, we’re all Australians with equal rights. To give Aboriginal and Torres Strait Islander people a Voice to Parliament amounts to special treatment and breaches the basic liberal-democratic tenet that every citizen has one equal vote and equal standing before the law.

If we are to move forward together based on a shared commitment to liberal principles, though, we must surely confront the fact that the colonisation that shaped Australia and its institutions was entirely illiberal. It did not treat First Nations peoples equally. It ignored their rights, stole their property, suppressed their languages and cultures, denied them voice and votes. The liberal state is supposed to uphold freedom and equality, but the Australian state denied both of those things to Aboriginal and Torres Strait Islander peoples and faced down their fierce resistance with violence, segregation and imprisonment.

The push for constitutional recognition of Australia’s First Nations peoples anchored in the Voice reminds us of these deep and unresolved wrongs. Its challenge to the legitimacy and identity of the liberal state was bound to be met with anger and resentment. Yet, as political philosopher Duncan Ivison has argued, the Voice also provides a way forward — an opportunity “to reset what seems currently fixed.”

As Ivison writes, “By providing a legal and political framework within which Indigenous peoples’ voices can be heard on matters of deep concern to them, whilst at the same time engaging with the core political structures of the Australian state, it offers a distinctive opportunity for ‘re-founding’ these relations.”

I still hold a hope that the opinion polls are wrong and a surge of undecided voters will swing the vote to Yes on polling day. But I’m not optimistic. Whatever the result, I’m confident that history will keep reaching into the present in unsettling ways.

Each successive generation, Indigenous or non-Indigenous, migrant or locally born, will discover, and rediscover, discomfiting truths that pierce the great Australian silence. Historians and others won’t stop delving into the trove of archival and anecdotal records, stirring up the sediment of the past to cloud the waters of the present. Some Australians, many even, may fail to listen or refuse to hear. But there will always be those who grapple with the insistent moral and political demands history makes on us. There is no foreseeable point in the future where we can draw a line under things and say, we’ve dealt with that, now let’s move on. •

What the Nobel Prize tells us about economics https://insidestory.org.au/what-the-nobel-prize-tells-us-about-economics/ Mon, 09 Oct 2023 23:25:22 +0000

This year’s winner is another challenge to critics of the youngest of the prizes

This year’s Nobel Prize for economics is rightly seen as a victory for women. It is a win for women in economics — Claudia Goldin is just the third woman to receive the award — and it is a win for women’s work as a subject of economics.

But it also confirms one of the most important lessons of the economics Nobel’s past quarter-century: that economics can shed light on all sorts of real-world issues — from whether you should worry about the quality of a used car to the price a woman pays for taking over childcare while her husband builds his career.

These are not the sort of things for which most people think the Nobel Prize in economics is awarded. Indeed, the economics Nobel is the outsider of the Nobel group, and arguably the most misunderstood. Goldin’s win should help shift views about this latecomer among Nobels.

Back in 1898 Alfred Nobel’s will established prizes for the people “who have during the previous year rendered the greatest service to mankind” in each of five fields. The dynamite magnate’s chosen fields were physics, chemistry, physiology/medicine, literature and the pursuit of peace. Economics was not on the list.

The science Nobels quickly gained an unsurpassed reputation. The literature prize, if sometimes quirky, nevertheless acts as national canonisation for whoever receives it; it made Patrick White famous to millions of Australians who never came close to reading him. And the peace prize frequently generates political controversy: comic songwriter Tom Lehrer said of Henry Kissinger’s 1973 peace prize that it made satire obsolete.

The economics prize came seven decades after its siblings, pressed on the Nobel committee in 1968 by the Swedish central bank. It occupies the Nobels’ uncomfortable middle ground, aspiring to the hard-edged epistemological standards of physics but enduring accusations that it is as political as the peace prize.

Economics in 2023 is in a particularly tough corner. According to surveys such as a 2019 YouGov poll of British voters, it is markedly less popular than the physical sciences. And anti-economics views, once confined to the fringe, seem to be spreading into the mainstream.

On the left, economists are widely resented for their tendency to suggest governments should spend money more cautiously on everything from welfare payments to underground rail lines. On the right, which was once less sceptical of economics, the very idea that pointy-headed economic experts might have something to contribute is now often derided: that 2019 YouGov poll of UK voters found Brexiteers less than half as willing as Remain voters to trust economists.

It shouldn’t really come as a surprise that economics is less popular than, say, chemistry. As the economist Thomas Karier wrote in his 2010 book Intellectual Capital: Forty Years of the Nobel Prize in Economics, “human behaviour is notoriously fickle and difficult to summarise with a few fundamental equations.” Another economist, Russ Roberts, hit a similar note in an online essay: “there are too many factors we don’t have data on, too many connections between the variables we don’t understand and can’t model or identify.”

The result: interpretation and the investigator’s biases play roles in the social sciences that they don’t play in the physical sciences.

On top of this, as economists from Roberts to John Quiggin have noted, economics is a discipline in which peculiarly few questions receive definitive answers. When Edwin Hubble and Fred Hoyle proffered different ideas about how the universe is evolving, they had to wait just a few years to figure out who was clearly right. (As it happened, though, astronomers didn’t become eligible for the physics prize until after Hubble’s death.) Economists almost never resolve disputes that way.

When a controversial figure receives the economics Nobel, their political allies often leap clumsily to claim vindication. When Milton Friedman won in 1976, the political right rushed to laud his free-market worldview, though his Nobel-cited monetary-theory work suggested the United States left interest rates too high during the Great Depression. When the equally widely admired David Card won in 2021 for work on minimum wages, left-wing commentators crowed that he had shown governments should raise minimum wages to reduce poverty, a view Card specifically disowned.

A frequent Nobel committee response to this problem usefully highlights the difficulty of reaching categorical conclusions on economic issues. The 1974 economics prize, for instance, went jointly to two men who had previously shared little more than a common continent: Friedrich Hayek, an Austrian economist who championed the importance of market signals, and Gunnar Myrdal, a development economist who had served in one of Sweden’s Social Democrat governments.

That pattern repeated in 2011: Thomas Sargent, a leader of the “rational expectations revolution,” won with Christopher Sims, famous as a critic of rational expectations. Then in 2013 Eugene Fama won for creating the efficient markets hypothesis — and behavioural finance expert Robert Shiller won in part for showing where Fama’s hypothesis went wrong.


Sometimes people detect a more sinister pattern in Nobel wins. This tendency reached a high point in the 2016 book The Nobel Factor, by Swedish economic historians Avner Offer and Gabriel Söderberg. The book’s cover blurb uses the familiar newspaper practice of posing a sensational question without answering it: “Was it a coincidence that the market turn and the prize began at the same time?”

Offer and Söderberg press the case that it was no coincidence at all, that the prize was a Swedish central bank plot to undermine the dominant Keynesianism of the time, to give itself greater power and to push popular perceptions of economics in a more market-oriented direction. Indeed, Offer and Söderberg suggest that the plot worked, with the Nobel successfully promoting what is known as the “market turn” in economics, which lasted from the 1970s to the 1990s.

Look at the overall pattern of prizes though, and you might start to think the plotters have displayed rank incompetence. Yes, the late twentieth century produced a crop of future Nobel Prize winners, mostly American, who wanted to promote the power of markets and limit the power of government in various ways. Hayek in 1974 was followed by Friedman in 1976, who at the time was almost unavoidable: even The Nobel Factor concedes that he was for a time the most cited economist ever, above both Keynes and Marx.

A decade later came more market promoters: James Buchanan in 1986, followed by Robert Lucas (1995), Robert Mundell (1999), Finn Kydland and Ed Prescott (2004), Thomas Sargent (2011) and Eugene Fama (2013). But this gang comprises just nine of the seventy-eight winners between 1969 and 2016.

The timing seems worse still for Offer and Söderberg’s thesis. Check those dates again: the Nobel committee only got around to gonging the third of these hard-edged free-marketeers (Buchanan) in 1986. By then Reagan and Thatcher — and, in Australia, Hawke — had already implemented the most dramatic of their pro-market policy changes. Why plot to change world history and then wait until it became unnecessary before putting your conspiracy fully into action?

Another story fits the timeline much better: after Keynesianism failed to deal well with the 1973 and 1979 oil shocks, the economics profession began to reconsider it and to take more interest in other models. The Nobel committee simply followed that intellectual trend, cautiously, waiting for the passage of time to confirm a particular idea’s lasting effect.

The Nobel Factor barely mentions the effect the oil shocks had on economic thinking after almost three decades of postwar growth. Offer and Söderberg also downplay the extent to which the most intensely pro-market ideas had either been absorbed into the mainstream or discarded before the turn of the century.


In fact, when you look closely at the past twenty-five years of economics Nobel winners, “right-wing plot” is not the phrase that springs to mind. A more likely lesson is that, for whatever reason, the winners of the economics Nobel have often taken economics away from the clichéd idea of calculating, rational Economic Man.

Assigning laureates to categories is a fraught business. But the biggest theme of the past quarter century has probably been behavioural economics, which looks at how psychological and social factors lead people to make decisions classical economic theory might not suggest. Such prizes went to George Akerlof and Daniel Kahneman before the 2008–09 financial crisis, and to Elinor Ostrom, Robert Shiller and Richard Thaler after it. In second position would be issues of poverty (Amartya Sen, Angus Deaton, Abhijit Banerjee, Esther Duflo and Michael Kremer). It’s hard to paint such concerns as disclosing a right-wing agenda.

Various technical methodological breakthroughs have also been rewarded. At the very least this category should probably include James Heckman, Daniel McFadden, Vernon Smith, Leonid Hurwicz, Eric Maskin, Roger Myerson, Joshua Angrist and Guido Imbens — though thinkers such as Paul Romer could probably fit here comfortably too.

Other winners have tackled very real-world problems. George Akerlof explained why you might be right to worry that the used car you just bought is a lemon. Paul Milgrom and Robert Wilson found the best way for governments to auction radio frequency rights. Ben Bernanke showed why bank failures can turn a slowdown into a depression, before getting to test his ideas as US central bank chief in 2008–09, with some success. And David Card’s work suggested that raising the minimum wage might not throw people out of work the way most economists feared.

Card’s prize-winning work in particular suggested that the Nobel committee was growing more interested in economics that actually overturned previous beliefs on a practical question.

Card and his co-author, the late Alan Krueger, identified a nice natural experiment, a legislated rise in New Jersey’s minimum wage, and used it to explore the effects of the minimum wage. They found that the higher New Jersey minimum wage didn’t push the state’s unemployment rate up. As a result, economists are now less certain about the damage caused by increases in low statutory minimum wages.

Yet here we confront again that uncomfortable reality: economics is a social science. And in real-life situations you often just have more going on than you know how to deal with. Natural experiments like Card’s are not controlled experiments capable of replication. Rather, they happen in the middle of very complex societies. As just one example, the New Jersey employers had early notice that the minimum wage would be hiked, and some may have cut their workforce before the raise took effect.

So debate continues about what, if anything, Card’s New Jersey natural experiment proved. Reputable polls of reputable economists suggest a slight majority still think a big increase in the minimum wage across the United States would probably cost jobs.


And yet examining questions like this can yield clear insights about how society might improve itself — and that seems to be the case with this year’s laureate, Claudia Goldin. Perhaps even more than Card, she has convinced many of her colleagues that what they might have thought was going on wasn’t in fact the real story.

When Goldin was growing up, she says, she read most of the Sherlock Holmes stories, and the famous fictional detective’s fascination with mysteries rubbed off on her. “I think of myself as Sherlock Holmes,” she told me last year.

In her work on women’s labour, Goldin could not simply pick up existing statistics. She had to act as a real-world Sherlock Holmes, piecing together the picture of women’s labour market experience from a messy pile of incomplete historical data. As friend and fellow economics professor Deirdre McCloskey notes, “She’s just wonderful about finding new sources of information about the past… that’s what makes her unusual.” Economics has sometimes favoured maths nerds, but nowadays it likes data nerds at least as much.

Goldin has put her hard-won data to many purposes. But among her most important is her investigation of the gender pay gap in most professional and managerial occupations across most Western nations.

People talk about “how we need to devise methods to eliminate bias,” she says. “I couldn’t agree more. But that isn’t going to solve the big issue.” Rather than rely on bias as an explanation, Goldin describes how the gender pay gap arises predominantly from the “couple inequality” between men and women with professional or managerial training. In this dynamic, she argues, women often end up effectively sacrificing their own careers when children arrive so that their husbands can work longer hours and get ahead. What Goldin has identified is the skewing effect of social expectations built up over decades — in a sense, the raw weight of history on today’s labour market.

As McCloskey puts it, Goldin “brings a fresh perspective to what is usually thought of as sheer exploitation.”

Goldin’s explanation for the gender pay gap is particularly remarkable because she has gone to such lengths to assess all the more popular explanations — hiring bias, pushy men, negotiation dynamics, occupational segregation. And perhaps most remarkably of all, given the sensitivity of this territory, her gender work has so far proved immune to attack. It simply fits the facts far better than any of the alternatives.

In an ever more complex social world, Goldin has shown it is still possible to put together a convincing story about what is going on.

Goldin’s pay gap research is in some ways harder to describe than, say, David Card’s minimum wage work. But more so than Card, Goldin’s work points to solutions. Notably, she argues many businesses need to change the way they structure jobs and pay to give professionals and managers more flexibility to take care of children for a few years without destroying their careers.

None of us should overstate economists’ ability to answer complex economic and social questions. But neither should we understate the importance of trying — and of acknowledging those who make truly great attempts. •

Machine questions https://insidestory.org.au/machine-questions/ Tue, 03 Oct 2023 06:12:49 +0000

What does history tell us about automation’s impact on jobs and inequality?

When it appeared twenty-five years ago, Google’s search engine wasn’t the first tool for searching the nascent World Wide Web. But it was simple to use, remarkably fast and cleverly designed to help users find the best sites. Google has gone on, of course, to become many things: a verb we use in everyday language; a profitable advertising business; Maps, YouTube, Android, autonomous vehicles, and DeepMind. Now a global platform with billions of users, it has profoundly changed how we look for information, how we pay for it and what we do with it.

The way we talk about Google has also changed, reflecting a wider reassessment of the costs and benefits of our connected lives. In its earlier days, Google Search was enthusiastically embraced as an ingenious tool that democratised knowledge and saved human labour. Today, Google’s many services are more popular than ever, though Google Search is the subject of a major antitrust case in the United States, and governments around the world want to regulate digital services and AI.

In Power and Progress, Daron Acemoglu and Simon Johnson take the project of critical reappraisal further. Their survey of the thousand-year entanglement of technology and power is a tour de force, sketching technology’s political economy across a broad historical canvas. They chart the causes and symptoms of our contemporary digital malaise, drawing on a growing volume of journalism and scholarship, political economy’s long tradition of analysing “the machine question,” and the work of extraordinary earlier American technologists, notably the cyberneticist Norbert Wiener, the network visionary J.C.R. Licklider, and the engineer Douglas Engelbart.

If, as Acemoglu and Johnson argue, our digital economy is characterised by mass surveillance, increasing inequality and destructive floods of misinformation, then the signal moments from the past will inevitably look different. From this angle, the great significance of Google Search was its integration with online advertising, opening up the path to Facebook and a panoply of greater evils.

The strengths of Power and Progress lie in the connections it makes between the deficiencies of current technology and the longer story of innovation and economic inequality. History offers many opportunities to debunk our nineteenth-century optimism in technology as a solution, and to puncture our overconfidence in the judgement of technology leaders.

A particular target is the idea that successful innovations produce economy-wide benefits by making workers more productive, leading to increased wages and higher living standards generally. The theory fails to capture a good deal of historical experience. The impact of new agricultural technologies during the Middle Ages provides a telling example. Between 1000 and 1300, a series of innovations in water mills, windmills, ploughs and fertiliser roughly doubled yields in England per hectare. But rather than leading to higher incomes for most people, living standards appear to have declined, with increases in taxation and working hours, widespread malnutrition, a series of famines and then the Black Death. Average life expectancy may have declined to just twenty-five years at birth.

The cities grew, but most of the surplus generated by improved agriculture was captured by the church and its extensive hierarchy. A religious building boom proceeded on spectacular lines. Vast amounts were spent on hugely expensive cathedrals and tax-exempt monasteries: the same places, as Acemoglu and Johnson note, that tourists now cherish for their devotion to learning and production of fine beer. The fact that better technology didn’t lead to higher wages reflects the institutional context: a coercive labour market combined with control of the mills enabled landowners to increase working hours, leaving labourers with less time to raise their own crops, and therefore reduced incomes.

If medieval cathedrals give rise to scepticism about the benefits of tech, it follows that we should think more carefully about the kinds of technologies we want. Without that attention, what the authors call “so-so automation” proliferates, reducing employment while creating no great benefit to consumers. The self-checkout systems in our supermarkets today are a case in point: these machines simply shift the work of scanning items from cashiers to customers. Fewer cashiers are employed, but without any productivity gain. The machines frequently fail, requiring human intervention. Food doesn’t get any cheaper.

The issue then is not how or whether any given technology generates economic growth, but which conditions make possible innovations that create shared prosperity. The recent past provides examples of societies managing large-scale technological change reasonably well. The postwar period of sustained high growth and “good jobs” (for some but not all) had three important features: the powers of employers were sometimes matched by unions; the new industrial technologies of mass production automated tasks in ways that also created jobs; and progressive taxation enabled governments to build social security, education and health systems that improved overall living standards.

For technology to work for everyone, the forces that can temper the powers of corporations — effective regulators, labour and consumer organisations, a robust and independent media — play an essential role. The media are especially important in shaping narratives of innovation and technical possibility. Our most visible technology heroes need not always be move-fast-and-break-things entrepreneurs.

Finally, public policy can help redirect innovation efforts away from a focus on automation, data collection and job displacement towards applications that productively expand human skills. Technologies are often malleable: they can frequently be used for many purposes.

Acemoglu and Johnson would like us to divert all that frothy attention on AI to what they call machine usefulness, focused on improving human productivity, giving people better information on which to base decisions, supporting new kinds of work, and enabling the creation of new platforms for cooperation and coordination: a course they see as far preferable to a universal basic income.

Kenya’s famous M-PESA, introduced in 2007, is one of many examples, offering cheap and convenient banking using basic mobile phones. On a larger scale, the web is also a human-oriented technology because its application of hypertext is ultimately a tool for expanding access to information and knowledge. Acemoglu and Johnson concede that the idea at the heart of Google Search can also be understood in this way: a mechanism that works well for humans because it is constantly reconfiguring itself in response to human queries.

The authors’ ideas for positive policy interventions can usefully be read alongside those of the Australian economists Joshua Gans and Andrew Leigh, whose 2019 book Innovation + Equality remains less used than it should be.


One way to read Power and Progress is as a historically informed guidebook for the conflicts of our time — in the courts, where the US Justice Department and Lina Khan’s Federal Trade Commission have launched far-reaching cases against Google and Amazon, in the new regulatory systems emerging in the European Union, Canada and elsewhere, and in the wave of industrial actions taken by screen industry writers and auto workers in the United States.

In Australia, we are also at a point where governments will soon make decisions about the kinds of technology we want to support or constrain. We can have no certainty about the outcomes of any of this, but Acemoglu and Johnson argue that such conflicts are both necessary and potentially productive. They diverge here from one of the main currents of liberal technology critique: where writers like Carl Benedikt Frey, whose The Technology Trap (2019) covers some of the same terrain, see redistributive policies as necessary for managing the consequences of automation, Acemoglu and Johnson point to the positive potential of political and industrial conflict for reordering technological agendas. They want to place more emphasis on our capacity to choose the directions technology may take.

The recently concluded Hollywood writers’ strike offers an intriguing example. The key point is that the screenwriters didn’t oppose the use of generative AIs such as ChatGPT in screenwriting. Instead they secured an agreement that such AIs can’t be recognised as writers and that a studio may not require the use of an AI. If a studio uses an AI to generate a draft script that it then provides to a writer, the credit or payment to the writer will be the same as if the writer had produced the draft entirely themselves; and a writer may use an AI with the permission of the studio without reducing their credit or payment.

The settlement clearly foreshadows the extensive use of generative AIs in the screen industries while offering a share of the benefits to writers. The critical point, as some reports have noted, may be that the revenue-sharing deal with writers preserves the intellectual property interests of the studios, since works created by an AI may not be copyrightable.

Meanwhile, AI raises other important issues about automation, quite apart from the focus on work. When we rely on machines to make or inform decisions, we also move into the domain of institutions, with the obvious risk that existing technology-specific laws, procedures and controls can be bypassed, intentionally or otherwise. This, after all, was what robodebt did with a very simple automated system. In the absence of wide-ranging institutional adaptation and innovation, more complex modes of automation will pose greater risks.

More generally, the authors’ framing of the “AI illusion” appears to be premature. Power and Progress was clearly substantially completed before the appearance of the most recent versions of ChatGPT. Accustomed as we are to AI’s many failures to match its promises, we should now be considering the surprising capabilities and broad implications of large language models. As Acemoglu and Johnson would insist, if generative AI does turn out to be as powerful as many believe, then it will necessarily be capable of far more than “so-so” automation. •

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity
By Daron Acemoglu and Simon Johnson | Basic Books | $34.99 | 546 pages

You’re not going to buy it are you? https://insidestory.org.au/youre-not-going-to-buy-it-are-you/ https://insidestory.org.au/youre-not-going-to-buy-it-are-you/#comments Fri, 29 Sep 2023 06:35:23 +0000 https://insidestory.org.au/?p=75835

A chance find in a Melbourne collectables shop transports the author back to 1988’s “celebration of a nation”

I sometimes think of myself as a ragpicker, someone who salvages the refuse discarded by other people. Ragpickers, or rag-and-bone men, were a common sight in industrialised towns and cities in the nineteenth century. They walked the streets with carts and sacks into which they would gather all sorts of detritus, literally including rags (sold for making paper) and bones (useful for many purposes, from buttons to fertiliser). There was even a market for horseshoe nails scraped from between paving stones.

Be assured that I’m not going to be sifting through your rubbish on bin night. In my day job as a social history curator I interpret historical material for display in exhibitions, and in that work the context and significance of objects is critical. In my downtime, though, I grub for the bits of history left behind in charity shops, collectables shops and markets. I’m not a collector; I just like being in the presence of old stuff.

Fine antique shops bore me because everything in them has already been assessed for its market value. All is tidily identified, with no space for adventure or mystery. I’m drawn to the places where I can be unsettled by orphaned artefacts and random associations. In charity and collectables shops it’s up to the customers to establish significance, and they’ll do this through Google searching of course, but also by drawing on their own imagination and memories.

“Oh, my mum used to have one of those!” is a commonly overheard remark, referring perhaps to vintage Tupperware or a Corningware casserole dish. I once spotted a glass jug exactly the same as the one my mother used for mint sauce, but I didn’t buy it, because really, it was rather ugly. Maybe she thought so too, but it was what she had.

Whether or not someone will buy other people’s discarded stuff depends entirely on how they reimagine its use and reinvest it with new meaning. Inversion of value is something that the French writer Raymond Queneau had great fun with in his 1967 poem “The Bin-Men Go on Strike”:

it’s strike day for the bin-men
it’s a lucky day for us
we can play ragpicker or peddler
junk dealer who knows even antiquarian
there’s a little bit of everything…

A little bit of everything. I like that. It’s a tough call, Queneau goes on, between the “eyeless armless noseless doll” or the tin of sardines “that lost all its sardines on the way” or the “can of French peas that lost all its French peas on the way,” all of it “yawn[ing] in the midday sun… ripe for the picking.” Suddenly you see a work of art abandoned by some “ignorant philistine”: the Mona Lisa is it? Or The Night Watch, the Venus de Milo or The Raft of the Medusa?

Carol Rumens chose “The Bin-Men Go on Strike” for her poem of the week earlier this year in the Guardian. She suggests that Queneau conjures “art from soiled fragmented images” and, in so doing, simultaneously goes in the opposite direction and reduces art back to rubbish. Who gets to declare what is art and what is not art? And so, I thought when I read the poem, who gets to declare what is history and what not? Anyone. Feeling superfluous is very freeing.


On a trip to Melbourne in June this year I was happily playing this game in my head in the Chapel Street Bazaar — one of the largest second-hand markets I’ve ever seen — when I was brought up short by a commemorative plate, one of those limited-edition ceramic pieces that people collect for display on shelf or wall.

After blinking at it for a few seconds I realised it depicts a moment shortly after the arrival of the First Fleet at Sydney Cove in 1788. A couple of ships lie at anchor, a Union Jack has been hoisted, and convicts and marines are busy rowing barrels of supplies to a small jetty. Someone has pitched a tent, and already a few trees have been felled to create a clearing.

It was priced at $95. Gingerly I picked it up and turned it over. The painting was titled “Ships of the First Fleet, Sydney Cove” and had been commissioned by Westminster Australia (a company specialising in commemorative ceramics, I later learned) for a limited firing to mark the Australian bicentenary in 1988. The original work was painted by maritime artist Ian Hansen.

Immediately I was taken back to the raucous year-long “celebration of a nation” that was 1988. Most particularly I remember the promotional jingle that planted a twelve-month earworm in all our heads:

Come on give us a hand,
Let’s make it grand!
Let’s make it great in ’88,
Come on give us a hand!

“The road to the Bicentenary was certainly a winding and treacherous one,” notes Frank Bongiorno in The Eighties: The Decade That Transformed Australia (2017). His remark makes me wish I had been paying more attention to the swirl of entangled ideologies going on at the time, but, living in Hobart and wrapped up in my own life, I wasn’t.

The First Fleet re-enactment did penetrate my world, mainly because the “tall ships,” as everyone called them, visited Hobart in early January 1988 for a race to Sydney ahead of the spectacular re-enactment event on the harbour on 26 January. Also on that day in Sydney a protest was attended by more than 40,000 Indigenous Australians and supporters from across the country. I don’t have Indigenous heritage and I confess it barely registered with me.

Mostly I recall a lot of people running about in period costume and the myriad television specials, concerts, books and so on. The official bicentennial logo — a map of Australia in green and gold diagonal stripes — was impossible to ignore. It was on everything from caps to coffee mugs to commemorative coins.

I thought it would be on the back of this plate too, but no, this was an unofficial production. I put it back on its little stand. It all seems such a long time ago now. Gradually I started to notice the clutter of other things on the same shelf. A matching hen and rooster in ceramic. A glazed figurine of a cat. A couple of lamps. A decanter and glasses. A bunch of artificial tulips in a vase. A stack of video cassettes topped by a biscuity-looking bust of the Madonna and child.

Tucked in next to the plate was a ceramic bell labelled “4 generations souvenir bell $45,” featuring an illustration of four generations of the royal family: the Queen, Prince Charles, Prince William and Prince George. The illustration was obviously taken from a photograph of Prince George’s christening in 2013, and I discovered later that it was posed to match a photograph taken in 1894 of the christening of the future Edward VIII. In that photo, Queen Victoria is seated holding the baby prince while the child’s grandfather and father (later Edward VII and George V) stand behind. In each photo the elderly female monarch is flanked by three future English kings. Extraordinary when you think about it.

Here then, in this crazy jumble of stuff, was a glorious freewheeling rejection of the power of professional museums to control the language of acquisition and display; a laugh-out-loud moment for a curator on a day off. This is what I turn up for in collectables shops.

The centrepiece was the commemorative plate, innocently inviting the viewer to remember the earliest days of white settlement on this continent. The flag seen on the right had been hoisted at an informal ceremony on 26 January 1788 at which Captain Arthur Phillip, having decided that this was the best place to establish the colony, had gathered a small party of officers and others to drink to the success of the new colony and the health of their king, His Majesty George III.

And there, depicted on that other useless ceramic thing — the bell — are George’s smiling descendants. There’s his little namesake, who will one day (presumably) be crowned George VII. The god they all worship makes an appearance too, on that altar of video cassettes, also as a babe in arms.

A little bit of everything, at the heart of which was a yawning absence. “Ships of the First Fleet, Sydney Cove” doesn’t depict a single Indigenous person — not one of the Eora people who had cared for that coast for tens of thousands of years before Phillip’s men planted the Union Jack there.

There is nothing to suggest the complex meeting of two vastly different cultures, none of what Inga Clendinnen, in Dancing with Strangers (2003), called “hugger-mugger accidents, casual misreadings, and unthinking responses to the abrasions inevitable during close encounters of the cultural kind.” Certainly there is no hint of violent dispossession. This was a 1988 view of 1788, and all the manufacturer wanted was to make money by producing something that people would be happy to display in their living rooms.

Actually — and this is no surprise in a collectables shop — I was surrounded by numerous examples of complex cultures and histories reduced to toy-like simplicity for domestic consumption. Walking about with fresh eyes I noticed a moustached Mexican doll in a sombrero, several black baby dolls (one of them ludicrously dressed in a grass skirt), some “golliwogs,” some “African” masks and a couple of very choice examples of “Aboriginalia.”

It’s within the collectables market, in bricks-and-mortar shops and online, that we find the best kitsch, and a lot of it is genuinely good fun. It makes us smile, and sometimes generates fresh inspiration for artists and other creatives. But look again at what lurks. While these objects tell us little about the cultures their makers sought to represent, they tell us a great deal about ourselves. Our ignorance, our insularity and our casual racism take artefactual form and, over time, fall to the bottom to form a giant, heaving slurry of stuff that we often just don’t know what to do with.


“You’re not going to buy it are you?” my son Harry queried when I told him about the commemorative plate that evening. Of course not, I said, although the thought had crossed my mind. But to do so would enhance the market for this kind of thing, and would, I thought, make me complicit in the artefact’s reductive re-enactment of the past. To own it would be to accept its message. For the price of $95 I would be rejecting Clendinnen’s warning that the people of the past are more than “just ourselves tricked out in fancy dress.”

So I walked away. Yet the plain truth of it is that I’d be embarrassed to own the plate myself and I’m hoping that a public museum somewhere has acquired one so that I can shuffle responsibility from the personal to the collective. I did some searching through various online collection databases but had no success with this particular item, although that’s not to say it’s not there somewhere.

But the 1988 bicentenary seems to be fairly well represented in public collections generally, which is heartening. It shows that, after all, there is a role for publicly funded museums (and libraries and archives) to preserve evidence that disturbs and unsettles our comfortable views of ourselves and our history. It is a job too important to be left to chance. At some point, bin-men and curators all need to get back to work.

Post-referendum we are likely to be feeling more than unsettled. What does the future hold? Australia Day 2024 is not that far away. •

Time’s quiet pulse https://insidestory.org.au/times-quiet-pulse/ https://insidestory.org.au/times-quiet-pulse/#respond Fri, 29 Sep 2023 00:22:46 +0000 https://insidestory.org.au/?p=75810

Historian Graeme Davison explores powerful forces below history’s horizon

“Clocks and watches have always fascinated me,” writes Graeme Davison. “I love their precision, their delicate self-regulation and their astonishing craftsmanship.” In My Grandfather’s Clock: Four Centuries of a British-Australian Family, he writes with deep affection of the clock of his title, with its “steady rocking gait and its cheerful metallic ring.” Since it came into his house its quiet pulse, slower than a heartbeat, has provided a mesmeric and “reassuring aural backdrop” to his daily life.

As a historian Davison has long been interested both in material culture and “commonplace objects,” and — as he showed in his earlier book, The Unforgiving Minute — in the history of time-telling itself. The scholars who shaped his vision of history were “preoccupied by the mystery of time and change.” Small wonder that when a grandfather clock, a family heirloom, came into his possession, it should have inspired him to commence an investigation of its place in history and heritage, and a meditation on “the nature of time in both its personal and historical dimensions.”

Notwithstanding the book’s evocative title, it was never really Davison’s grandfather’s clock. His great-aunt “Cissie” (Elizabeth Anne Davison) brought the clock with her to Australia in 1934, twenty-two years after his grandfather emigrated and only a few months before he died. The clock had passed from father to son for generations, but for years it stood in Elizabeth’s crowded bedroom, among other relics of her former life that she had brought with her from England. On her death she bequeathed it to Davison’s father, with instructions that it should pass, eventually, to Graeme himself. Women, Davison acknowledges, “are often the great keepers of family memory,” although patriarchal society and patriarchal sources tend to obscure their role.

The search for the longer history of the clock’s place in the family sent Davison along the path of his ancestors to the moment of its acquisition — and then further still. Digging into his ancestral past, he followed the male line into deep time where “conventional genealogy loses its footing.” He found the misty origins of his family story in the fifteenth and sixteenth centuries, when Davisons and Davidsons were caught up in the blood feuds that convulsed the borderlands between Scotland and England.

Any sense of connection to those Davisons of yore can exist only in imagination; but Davison’s surrender to the romantic myth itself belongs to the pattern of history. The impulse that draws us to such places, he suggests, has its origins in the discontents of modernity. In our “mobile, globalised, urban world,” the family history trail seems to promise the possibility of return to an ancestral homeland and a “more primitive, unspoiled version of ourselves.”

Here, as throughout the book, Davison unostentatiously weaves reflection on his feelings and methods into the account of his findings. The social, cultural and emotional impulses of genealogical research are shown to be themselves a product of the history of modernisation he relates.

For all that, he insists, the possibility remains that “some part of us is indeed a relic of things below the horizon of history.” With that conviction, he peers into “that dark space where heredity and nurture, memory and history combine to make us who we are.”


The Davison surname provides the unbroken thread the historian can follow through multiple generations, up to and even beyond the point where parish records peter out. Davison acknowledges that this is a selective path, albeit one balanced somewhat by his earlier exploration of his mother’s forebears in Lost Relations (2015). But the profound implications of his choice are worth pondering for a moment — precisely because they are so easy to forget.

Little in our society owes more to social convention than surnames, which inherently claim patrilineal descent as the primary defining relationship. To say nothing of the possibility of error in any attribution of paternity, the thread that follows the male line is just one among the thousands, or millions, that make up the spreading fan of our ancestry. When we follow a single line, as sociologist Eviatar Zerubavel points out, we consign to oblivion three out of four grandparents, or sixty-three out of sixty-four great-great-great-great-grandparents.

To come at this from the other side: the most we can inherit from the four times great-grandfather whose surname we may bear is something less than 0.05 per cent of our genetic material. In the “inherently boundless community” of family, argues Zerubavel, the traditions of classification that determine kin recognition are matters of convention, not genetics. Genealogies do not passively document who our ancestors were; they are “the narratives we construct to actually make them our ancestors.” Such narratives exclude far more than they include, and not always with benign intent. So it behoves us to remember how tenuous is their basis in fact, and scrutinise the cultural assumptions they perpetuate.

Davison is rightly sceptical of the idea of genetic affinity with distant ancestors. While he surrenders willingly to the gravitational pull of the male line, he does so out of a combination of sentiment and pragmatism. The Davison name is the most “stable marker of personal identity” available to establish ancestry, while the records of women’s lives — the wives and mothers who might expand the picture — are even more scarce than those for men. Taking the ancestral line on which he can rely with greatest certainty, he uses it to ponder not genetic continuity but cultural continuity in the midst of social change.

The “things below the horizon of history” that form part of our identity, his book ultimately suggests, are not genes so much as elements of culture, beliefs, skills and aspirations that have passed quietly from one generation to the next. Untold generations of Davisons belonged to the “middling sort,” who struggled for life on the margins on whatever terms society offered them. Such experiences form family character in stubbornly enduring ways. The Davisons of whom he can write with personal knowledge “were modest, practical, plain-speaking folk, their manner abraded by the grit of their industrial origins. There was something uptight as well as upright about them; an effect, I suppose, of the hard school in which they had grown up.”


More than a mystical search for origins, this is a historian’s account of the place of the individual, or the family, in history. Davison’s ancestors come dimly into view in the context of great, if gradual, social transformation. Step by step, one decision at a time, sons moved away from fathers to establish new homes and learn new occupations: from a precarious existence in ancestral homelands to the acquisition of craft skills in a farming village, a port town, a factory suburb, and an industrial metropolis — until early in the twentieth century John Davison moved with his wife and children to the other side of the world.

The perspective of family offers a corrective to the generalisations of academic history, Davison suggests. Through the eyes of the people who lived through it, the industrial revolution can be seen as an evolution, the making of the modern world “more like a series of small adjustments than a leap from one way of life to another.” Each step, he argues, “was a one-off response to the map of opportunity at the time, but seen over the longue durée the moves fall into a pattern that suggests the operation of powerful unseen forces.”

In this story the grandfather clock, that accurate keeper of family time, becomes a powerful symbol tying the individual to the historical moment. Measured time, says Davison, was the foundation of modern life: when another John, six generations back, acquired the clock around the turn of the nineteenth century, “my family were joining a very large project indeed.”

For John the clock may have been chiefly a marker of growing prosperity and social status. For his son William, a skilled block printer paid on piece rates for what he and his sons could produce, it had an added utility, regulating the daily activities of his industrious household. Superseded by newer technology in the late twentieth century, it lost much of its earlier importance and stood idle for years, until rescued and repaired by a sentimental historian who likes to think, when winding the clock, of the “foggy fingers” of the ancestors who have wound it before him.

Pondering generational change, Davison offers a persuasive and thought-provoking account of the relationship between the agency of the individual (man) and the impersonal forces of modernisation. Yet I wondered at times whether a more expansive exploration of family within each generation — of the horizontal, as well as the vertical, structures of kinship — might have brought into view more of the “invisible” factors that enabled or constrained their choices.

Let us not forget, for example, how the clock made its way to Melbourne and eventually to Davison himself. Not by direct transmission through the male line, but through the agency of great-aunt Cissie, whose “map of opportunity” after her father’s death in 1930 showed only one path to survival, in her brother’s household on the other side of the world. While her journey and her bequest eventually restored the clock to its traditional path from father to son, her story has deeper significance, as a reminder that for some family members, women especially, survival has often depended on more complex webs of kinship. Though their stories may rarely be preserved, they too imprint a ghostly presence on the family tree, for those who care to look. Their imprint on family culture was profound.

But any account of family origins must leave out far more than it can ever tell. Each journey through family history is as selective as it is idiosyncratic. Though he doesn’t dwell on their implications, Davison doesn’t hide the choices he has made. His gentle, reflective, beguiling narrative invites us to travel at his side as he pursues his individual quest, and to surrender to the charm of the knotted threads of sentiment, imagination and hard-edged research that bind him to his forebears and to history.

My Grandfather’s Clock: Four Centuries of a British-Australian Family
By Graeme Davison | The Miegunyah Press | $50 | 319 pages

A Dunera life https://insidestory.org.au/a-dunera-life/ https://insidestory.org.au/a-dunera-life/#comments Sun, 17 Sep 2023 02:07:11 +0000 https://insidestory.org.au/?p=75639

Sent to Australia as an “enemy alien” by Churchill’s government, Bern Brent spent decades challenging conventional accounts of the internees’ lives

Most of the 2050 or so Dunera internees — or Dunera boys, as they are commonly known — were German and Austrian Jews, many of them refugees from Hitler’s regime. In 1940, on Churchill’s orders, these “enemy aliens” were arrested in Britain and deported to Australia, where they were to be held for the duration of the second world war. The Dunera was the ship that brought them to Australia, and Bern Brent, who died last week at 100 years of age, was among the internees on board.

I was lucky enough to work with Ken Inglis and friends on Dunera Lives: A Visual History, published in 2018. The book is an attempt to tell something of the Dunera story through 500 images, one of which attracted far more comment than any other. This was a grainy black-and-white photograph from Bern’s collection taken on 14 December 1938, a month after the anti-Semitic violence of Kristallnacht.

The photo shows fifteen-year-old Gerd Bernstein (Bern’s original name), dressed in a suit, raising a glass to his family. His mother Helena, father Otto and maternal grandmother Sophie Maas sit at a table in their Berlin home at Wielandstrasse 17. A picture of young Gerd hangs on the wall.

The faces most exposed to the camera are those of grandson and grandmother. Gerd looks thoughtful and hesitant, even sad, while on his grandmother’s face I see both love and terror — love for him, and terror about what the future holds for her Jewish family. The next day Gerd left Berlin on a Kindertransport, bound for Britain. On 17 December, alone in Britain just two days after leaving home for a strange and unfamiliar country, he turned sixteen.

His parents survived the war despite his father Otto’s imprisonment in Theresienstadt concentration camp from 1942 to 1945. Otto was the sole survivor among the hundred people with whom he was transported to Theresienstadt; Sophie died there.

After the visual history appeared, Bern and I discussed readers’ reactions to this photograph and the intense, heavy sadness it prompted in many, me included. Bern would have none of it. He thought this response muddled, our emotions skewed by knowledge of the Holocaust.

In Bern’s mind, this photograph showed a moment in the life of his family, no more and no less. To view the photograph through the lens of the Holocaust was a mistake, and he made a point of reminding me that in 1938 the Nazis had not yet embarked on the systematic destruction of European Jewry.

Bern’s position on the photograph reflected his historical preferences. The substance of any history, Dunera’s included, is in the detail. He saw history through a lens that allowed little room for the floating of partly conceived ideas and none at all for speculation, no matter how worthy the intention. If there were a place for emotion, it was after the facts were established. Perhaps his approach could be called traditional. I can’t imagine Bern thinking much of the modern belief that histories can be retold on the basis of emotions.

Whatever the merits of Bern’s approach, it made him a wonderful informant. His view of the past as the stuff of hard facts meant that he spoke only about what he knew. If he didn’t know the answer, he said so. If he thought the question irrelevant, he made that clear too, then explained why. He punctured myths, of which Dunera has more than its share, and raised questions that forced historians and others to think anew about key moments in the story.

Recently I asked him about an incident that looms large in Dunera history and memory. On the voyage to Australia, British guards treated the internees with calculated brutality in a gross, and in some cases criminal, dereliction of duty. The Dunera canon tells that guards, as part of this sustained assault, forced internees to walk over broken glass strewn across the deck of the ship.

The glass was there, Bern told me, but he doubted it was placed deliberately, and he and others simply stepped around it. While the weight of evidence about this incident is against him, I know that Bern, as a historian of the Dunera, never spoke idly. On that basis alone, his account demands consideration.

Ken Inglis cherished Bern’s clarity and commitment to accuracy. They corresponded from the start of Ken’s Dunera project — which also led to a second book, Dunera Lives: Profiles, in 2020 — exchanging emails regularly until Ken’s death in December 2017. Their voluminous correspondence, now part of the Inglis Dunera papers at the National Library of Australia, reveals two scholars in respectful and admiring conversation, one testing notions and ideas, the other encouraging or discouraging those possibilities.

While Bern was an oracle on Dunera, on one aspect of the story he had no answers. When the Dunera internees arrived in Australia, most were incarcerated first at Hay in western New South Wales. Because the camps there could house only 2000, around ninety-five of the internees, seemingly chosen at random, were taken instead to Tatura in Victoria’s Goulburn Valley, along with other men who had travelled on the Dunera and been deemed dangerous by British authorities.

Bern was part of this Tatura rump, and there he stayed for the duration of his internment, which lasted until January 1942. Thus, he knew nothing of camp life at Hay. Ken would chuckle at this inconvenience, suggesting that Bern had been remiss in not arranging his internment to suit the needs of future historians.

Bern exerted a strong influence on the writing of the two volumes of Dunera Lives, saving us from mistakes and misinterpretations, and suggesting lines of enquiry that emerged as themes in the books. He was our most prolific and important informant. If our telling of a story differed from the one he knew, he always gave our version a fair hearing. On the odd occasion, we might even have convinced him.

On other occasions, not at all. In Dunera Lives we took a strong line on Winston Churchill’s role in the Dunera affair. While Churchill’s wartime government would later issue an apology of sorts to the Dunera internees for the appalling treatment they suffered, by that time many had already concluded that British liberalism was a chimera. Bern was of the opposite view. He held that Churchill had no choice but to act as he did, and that to suggest otherwise was to allow historical judgement to be derailed by the luxury of hindsight.

This position was entwined with another view to which Bern stuck fast. The Dunera had delivered him to Australia, where he made a rewarding and productive life. As he said often, there was nothing for him in postwar Europe, whereas in Australia, as a young man with energy and purpose, he was able to embrace education and new beginnings, free of the restrictions and prejudices that had shaped his life in Germany.

For Bern, his good fortune was the story, and this mattered more than issues such as the question of Churchill’s culpability. He thought the Dunera the luckiest thing to ever happen to him. Perhaps the fact that his parents survived the Holocaust also influenced this position.


Bern was an unusual Dunera boy in other ways, too. While happily Australian, he maintained strong links to his homeland. He returned to Berlin and Germany often, visiting past haunts and chasing up friends. He continued to speak and read German, and listened to German news on the radio. A couple of years ago Bern wrote to tell me about the Exilmuseum in Berlin, after he had heard mention of the nascent institution on a German radio program. He wondered if the museum’s curators would be interested in learning about the strange story of the German and Austrian exiles who in 1940 found themselves interned in rural Australia. They were.

For other Dunera boys, such engagement with Germany and Austria was anathema. Their wartime experiences and knowledge of the Holocaust poisoned their feelings for the land of their birth. Many never returned to Germany and Austria; many chose to avoid Germanic culture and language.

Bern too knew the pain of persecution and the tragedy of the Holocaust. Anti-Semitism forced him and his mother to flee Germany for Britain. Hitler’s regime murdered his grandmother and imprisoned his father. But never did he allow the pain and injustice of the past to determine the direction of his life. It was a remarkably brave choice, and one that not all Dunera boys were able to make, or even wanted to.

Bern’s longevity conferred a sad and perhaps unwanted title. He was the last Dunera boy in Australia, and among the last anywhere. It is thought that there is a Dunera boy alive in France, and another in New York, and possibly others of whom my colleagues and I don’t know, though it is unlikely that these “unknowns” would number more than one or two.

Bern accepted his position with grace, acknowledging the dubious honour as a responsibility rather than a burden, which surely it was. To the best of my knowledge, he never refused a request for an interview, and was diligent in answering questions from scholars and members of the Dunera diaspora. Perhaps he saw duty in these tasks; the Dunera had led him to a good life in Australia, and provided both a scholarly purpose in his later years, and enduring friendships.

On what proved to be his last weekend, Bern travelled to Melbourne, where he delighted in the company of Peter Danby’s family. Danby, originally Peter Danziger, was also a Dunera boy, though the friendship was older than that, the two having met in Britain. Bern was accompanied by Peter and Joanna, two of his three children. The Brent–Danby friendship is now carried by the next generation.

In September 2022 I took the British author and activist Jennifer Nadel to meet Bern in his Canberra home. Jennifer’s father, George Nadel, was a budding scholar when he was deported to Australia on the Dunera. She knows little about his internment, but enough to realise that George’s postwar silences hid deep trauma. For Jennifer, Dunera has been a difficult and painful word.

Aside from their Jewish heritage, George Nadel and Bern had little in common. George was born Austrian, Bern German. Both had a passion for history, though they were driven by differing approaches and emphases. George, who went on to found and edit the venerable journal History and Theory, was an academic who practised history in more formal worlds and ways than Bern. If Bern were ever a reader of that journal, I imagine him warming more to the history than the theory. Bern saw the Dunera as a ship of salvation; for George, the Dunera seems to have heralded only misery.

And yet both men survived the Dunera, and by 2022 Bern was one of very few people anywhere in the world who could talk directly of the experience. Through Bern, Jennifer was given a privileged glimpse of a past about which George never spoke. As we drove away from Bern’s home, she said that his German-accented English, and certain of his mannerisms, evoked fond thoughts of her father.

When I wrote to Jennifer to tell her of Bern’s death, she immediately recalled his bearing and presence, and the importance to her of their meeting. Bern’s willingness to act as a conduit to the past, to talk openly and directly about Dunera, helped many people like her to better understand the story and their part in it. In so doing he aroused emotions. I wonder what he made of that.

Jennifer described meeting Bern as a privilege. It’s the right word. To have known, talked and corresponded with him was a privilege, and something I cherish. Ruhe sanft, lieber Bern. •

Clash of the titans https://insidestory.org.au/clash-of-the-titans/ https://insidestory.org.au/clash-of-the-titans/#comments Fri, 08 Sep 2023 06:46:47 +0000 https://insidestory.org.au/?p=75583

Doc Evatt may have won the battle over banning the Communist Party but Bob Menzies was the ultimate victor

Two scholarship boys, both born in 1894, both drawn to politics and the law, were destined to be fierce rivals on the national stage. Running for the Nationalist Party in 1928, one of them — Robert Menzies — secured election to the Victorian upper house; the following year he moved to the lower house and then in 1934, with the United Australia Party, to federal parliament. The other — H.V. “Doc” Evatt — resigned from NSW parliament to join the High Court at the unlikely age of thirty-six; even more unlikely was his decision to quit the bench in 1940 to run as a Labor candidate in the federal election.

Evatt’s move from court to federal parliament was considered “a most regrettable precedent” by Menzies, who was by then prime minister. (While it may have been regrettable, it wasn’t much of a precedent, never being repeated over the ensuing eighty-three years.) Evatt responded in kind, suggesting that Menzies would lose the next election. (That, too, proved a less than accurate prediction.) As Anne Henderson sees it in her new book, Menzies vs Evatt: The Great Rivalry of Australian Politics, battle was joined from that time.

Reading Henderson’s opening chapters it’s hard not to be staggered by Evatt’s workload as external affairs minister and attorney-general. No minister today would take on these dual roles, and Henderson highlights the difficulties the combination caused for Labor in government, especially at a time of war.

It would have been a punishing load for the best-organised minister (which Evatt clearly was not), and was exacerbated by his frequent absences overseas in the pre-jet age, including a year as president of the UN General Assembly. As an often-absentee attorney-general, he was unable to contribute fully to vital tasks, including defending the government’s bank nationalisation plan before the High Court.

Evatt became Labor leader after Ben Chifley’s death in June 1951. His role later that year in defeating Menzies’s referendum to ban the Communist Party is seen by many as his finest moment, but Henderson downplays the victory. Support for the ban was recorded by polls at 73 per cent in early August but by polling day, six weeks later, it had dropped to just under 50 per cent. (The referendum was carried in only three states.) Henderson cites the history of defeated referendum proposals and asks why the Yes even got close — as if falling support for the ban followed a law of nature regardless of effective political campaigning.

It’s true that early support for many referendum proposals has evaporated by polling day. But it is difficult to think of a question for which Yes campaigners enjoyed more favourable circumstances than this one. The cold war was in full swing, Australian troops were fighting the communists on the Korean peninsula (under a UN flag), and communism was seen as an existential threat, broadly detested within the electorate. Menzies had warned of the possibility of a third world war within three years; strong anti-communist elements within Evatt’s own party supported the ban.

Indeed, one might equally ask why Menzies couldn’t pull it off. I suspect he would have appreciated the irony that it was the internationalist Evatt, not the Anglophile Menzies, who campaigned on British justice’s principle that the onus lies on the state to prove guilt rather than (as the anti-communists proposed) on the accused to prove innocence.

As with most failed referendums, the loss did the prime minister no harm. In fact, Henderson makes the interesting suggestion that it saved him from having to enact legislation that may “have been as divisive and unsettling to civic order” as the McCarthy hearings were in the United States. It’s impossible to prove of course, but Australia definitely didn’t need that kangaroo court–type assault on individuals’ reputations and lives.

Henderson’s account of the Petrov affair and the subsequent royal commission — a disastrous time for Evatt — traverses territory that is probably less contentious than it was a generation ago. On the Labor Party’s 1955 split, she quotes with approval the claim by former Liberal prime minister John Howard that Labor’s rules afforded too much power to its national executive: a more genuinely federal structure (like that of the Liberals) would have rendered Evatt’s intervention more difficult and a split in the party less likely.

Whether a Victorian Labor branch left mostly to its own devices would have sorted out its problems is unclear, but the opportunity was unlikely given the hostility of Evatt and his supporters to the group of Victorians they saw as treacherous anti-communists. Ironically, it was this capacity to intervene that would facilitate a federal takeover of the moribund (and still split-crippled) Victorian ALP fifteen years later. That intervention eventually reinvigorated the state branch, establishing a Labor dominance in Victorian state elections and in the state’s federal seats that persists to the present day.

Henderson also poses the question of whether a different Labor leader could have avoided the split. What if deputy leader Arthur Calwell had been installed after the 1954 election loss? She speculates that Calwell might have been able to offer concessions to the anti-communist Victorians and stresses an absence of intense ideological fervour among many of those who would soon be expelled from the party.

While it is hard to envisage a leader handling the crisis less effectively than Evatt did, Henderson quotes Labor MP Fred Daly’s view that Calwell at the time was “hesitant, uncertain and waiting for Evatt’s job” — hardly the stuff of firm leadership. Arthur was always prepared to wait.

It may be true that most of the anti-communist Labor MPs, even in Victoria, were not fervent ideologues, but possibly more relevant was the ideological predisposition of the powerful Catholic activist B.A. Santamaria, who was able to influence state Labor’s decision-making bodies and preselections from outside the party. Santamaria boasted in 1952 to his mentor, Archbishop Daniel Mannix, that his Catholic Social Studies Movement (the infamous “Movement”) would be able to transform the leadership of the Labor movement within a few years and install federal and state MPs able to implement “a Christian social program.”

This may have been overly ambitious nationally, but Santamaria’s undue influence over Victorian Labor was already a concern for some. Moreover, the party Santamaria envisaged might be viewed as essentially a church or “confessional” party, at odds with traditional Australian “Laborism,” not to mention with the main elements of a pluralist, secular democracy.

Henderson’s most interesting observation, for this reviewer, is her contention that Evatt’s lack of anti-communist conviction owed much to his being “an intense secularist.” It is certainly the case that critics of communism in this era often preceded the noun with the adjectives “godless” or “atheistic.” In a predominantly Christian society like Australia, communism’s atheistic nature was a damning feature, especially among Catholics, including Catholic Labor MPs. Presbyterian Menzies also held strongly to this view.


If this review has focused more on Evatt than on Menzies, this reflects the enduring questions Evatt’s leadership raises — including the state of his mental health, which is seen by some as helping to explain his erratic and self-destructive behaviour. (Henderson doesn’t consider this question, but it was well covered by biographer John Murphy.)

Menzies, having survived the referendum result, was also undaunted by his narrow election victory in 1954, secured with a minority of the vote, a lucky escape to be repeated in 1961. He went for the Evatt jugular whenever it was exposed — which was often, as Henderson shows vividly. John Howard would later claim, on his own behalf, that the times suited him. Menzies had that advantage in spades, and he exploited it artfully.

If there is a central theme to Menzies’s approach to his battle with Evatt, it is his characterisation of the Labor leader as a naive internationalist, oblivious to the emerging threat of monolithic communism, especially to the north of Australia. This is a criticism endorsed by Henderson. A cynic might suggest that the communist threat was not only electoral gold for Menzies but also provided a convenient pretext for him to maintain his unwavering support for European colonialism. Better the colonialists than the communists.

Neither character was a team player by instinct, but Menzies adapted better and learned from mistakes. Among other flaws, Evatt’s lack of self-awareness was both crucial and crippling. There is no doubt that the winner of the “great rivalry” was Menzies.

As a known partisan, Henderson runs the risk that her book will be seen in that light, and that her put-downs of Evatt’s admirers — “a collective of scribblers,” “the Evatt fan club” — will be viewed accordingly. Her failure to acknowledge any merit in Evatt’s referendum victory will seem churlish to some. But Henderson can’t be faulted on the book’s readability: it’s a one-sitting job for those fascinated by the politics of that era.

I was left wondering about the depth of the personal animus between the two men. Henderson quotes Menzies accusing Evatt of being too interested in power — as “a menace to Australia” to be kept out of office “by hook or by crook.” Prime ministers and opposition leaders routinely find themselves in settings where some form of civil, non-political conversation is virtually unavoidable. What on earth might these two have talked about? Well, both of them loved their cricket. •

Menzies vs Evatt: The Great Rivalry of Australian Politics
By Anne Henderson | Connor Court | $34.95 | 236 pages

Odyssey down under https://insidestory.org.au/odyssey-down-under/ https://insidestory.org.au/odyssey-down-under/#respond Fri, 08 Sep 2023 05:33:01 +0000 https://insidestory.org.au/?p=75570

A new kind of history is called for in the year of the Voice referendum. Here’s what it might look like.

In the beginning, on a vast tract of continental crust in the southern hemisphere of planet Earth, the Dreaming brought forth the landscape, rendering it alive and full of meaning. It animates the landscape still, its power stirred constantly by human song, journey and ceremony. Past and present coalesce in these ritual bursts of energy. Creatures become mountains which become spirits that course again through the sentient lands and waters. People visit Country, listen to it, and cry for it; they sing it into being, they pay attention to it. They crave its beneficence and that of their ancestors. Their very souls are conceived by Country; life’s first quickening is felt in particular places and they become anchored forever to that beloved earth.

The stars are our ancestors lighting up their campfires across the night sky. The universe exploded into being fourteen billion years ago and is still expanding. As it cooled and continued to inflate, an opposite force — gravity — organised matter into galaxies and stars. Everything was made of the elements forged by stars. Around billions of fiery suns, the interstellar dust and debris of supernovas coalesced as planets, some remaining gaseous, some becoming rigid rock. Earth, with its molten core, its mantle of magma and a dynamic crust, was born. The planet is alive.

In the shallow waters off the western coast of the continent metamorphosed by the Dreaming sit solid mementos of the beginning of life. They are living fossils, cushions of cells and silt called stromatolites. After life emerged in a fiery, toxic cauldron in an ocean trench, bacteria at the surface captured sunlight and used it to create biological energy in the form of sugar. They broke down carbon dioxide in the atmosphere, feeding off the carbon and releasing oxygen as waste. Photosynthesis, Earth’s marvellous magic, had begun. It was just a billion years after the planet was formed.

To later inhabitants, oxygen would seem the most precious waste in the firmament. But it was a dangerous experiment, for the oxygen-free atmosphere that had created the conditions for life was now gone. Stromatolites hunched in the western tides descended from the creatures that began to breathe a new atmosphere into being.

Two billion years ago, enough oxygen existed to turn the sky blue. The same oxygen turned the oceans red with rust. Thus life itself generated the planet’s first environmental crisis. This ancient rain of iron oxide is preserved today in the banded ores of the Hamersley Range. The universe was then already old, but Earth was young.

The planet was restless and violent, still seething with its newness. When separate lands fused, the earth moved for them. Australian landmasses shifted north and south as crusts cruised over iron-rich magma. Large complex cells fed off the growing oxygen resource and diversified rapidly. For almost 400 million years the whole planet became gripped by glaciation and scoured by ice, and most life was extinguished. The long reign of the ancient glaciers was written into rock.

As the ice withdrew, life bloomed again. Organisms of cooperative cells developed in the oceans and became the first animals. Six hundred million years ago, a supercontinent later known as Gondwana began to amass lands in the south, and their titanic fusion created a chain of mountains in central Australia. Uluru and Kata Tjuta, inspirited by the rainbow python, are sacred rubble from this momentous first creation of Gondwana.

Life ventured ashore, protected now from dangerous radiation by the strengthening shield of ozone gas around Earth. Plants and animals sustained each other, the essential oxygen circulating between them. Gondwana united with other continents, creating a single landmass called Pangaea. When the planet cooled again, surges of glacial ice scoured life from the land once more. But life persisted, and its reinventions included the seed and the egg, brilliant breakthroughs in reproduction. They were portable parcels of promise that created a world of cycads and dinosaurs.

Earth gradually changed its hue over eons. Rusted rock and grey stone became enlivened by green, joining the blue of the restless oceans. Chlorophyll conquered the continents. Pines, spruces, cypresses, cycads and ferns found their way up the tidal estuaries, across the plains and into the mountains, but the true green revolution awaited the emergence of flowering plants. These plants generated pollen and used animals as well as wind to deliver it. Insects especially were attracted to the perfumed, colourful flowers where they were dusted with pollen before they moved to another bloom. It was a botanical sexual frenzy abetted by animal couriers. The variety of plants exploded. Nutritious grasslands spread across the planet and energy-rich fruits and seeds proliferated. As this magic unfolded, Gondwana separated from Pangaea again and consolidated near the south pole, where it began to break up further.

The cosmic dust that had crystallised as Earth, dancing alone with its single moon and awash with its gradually slowing tides, seemed to have settled into a rhythm. The bombardment of meteors that marked its early life had eased. Giant reptiles ruled, small mammals skulked in the undergrowth, and flowers were beginning to wreak their revolution.

Then, sixty-six million years ago, the planet was violently assaulted. A huge rogue rock orbiting the Sun plunged into Earth. The whole planet shuddered, tidal waves, fires and volcanoes were unleashed, soot blackened the atmosphere, and three-quarters of life was extinguished. The largest animals, the dinosaurs, all died. But the disaster of the death star also created the opportunity for mammals to thrive. The comet forged the modern world.


Flat and geologically calm, the landmass that would become Australia was now host to few glaciers and volcanoes. But ice and fire were to shape it powerfully in other ways. About fifty million years ago, in the final rupture of Gondwana, Australia fractured from its cousin, Antarctica, and voyaged north over millions of years to subtropical latitudes and a drier climate. Fire ruled Australia while Antarctica was overwhelmed by ice. The planet’s two most arid lands became white and red deserts.

The newly birthed Australian plate rafted north into warmer climes at a time in planetary history when the earth grew cooler, thus moderating climatic change and nurturing great biodiversity. It was the continent’s defining journey. It began to dry, burn and leach nutrients, the ancient soils became degraded and impoverished, and the inland seas began to dry up. In the thrall of fire, the Gondwanan rainforest retreated to mountain refuges and the eucalypt spread. Gum trees came to dominate the wide brown land. The bush was born.

Three million years ago, when North and South America finally met and kissed, the relationship had consequences. Ocean currents changed and the Pleistocene epoch, marked by a succession of ice ages, kicked into life. Regular, dramatic swings in average global temperature quickened evolution’s engine. The constant tick and tock of ice and warmth sculpted new, innovative life forms.

In southern Africa, an intelligent primate of the forests ventured out onto the expanding grasslands and gazed at the horizon. This hominid was a creature of the ice ages, but her magic would be fire. One day her descendants walked north, and they kept on walking.

By the time they reached the southeastern edges of the Asian islands, these modern humans were experienced explorers. They gazed at a blue oceanic horizon and saw that there was no more land. But at night they observed the faint glow of fire on a distant continent. And by day they were beckoned by haze that might be smoke and dust. What they did next was astonishing.

The people embarked on an odyssey. They strengthened their rafts and voyaged over the horizon, beyond sight of land in any direction — and they kept on sailing. They were the most adventurous humans on Earth. They crossed one of the great planetary boundaries, a line few land-based animals traversed, one of the deep sutures of tectonic earth. This was over 60,000 years ago. The first Australians landed on a northern beach in exhaustion, wonder and relief. They had discovered a continent like no other.

The birds and animals they found, the very earth they trod, had never known a hominid. The other creatures were innocent of the new predator and unafraid. It was a bonanza. But the land was mysterious and forbidding and did not reveal its secrets easily. The people quickly moved west, east and south, leaving their signatures everywhere. They had to learn a radically new nature. Arid Australia was not consistently dry but unpredictably wet. The climate was erratic, rainfall was highly variable, and drought could grip the land for years. The soil was mostly poor in nutrients and there were few large rivers. But these conditions fostered biodiversity and a suite of unique animals and plants that were good at conserving energy and cooperating with one another.

The first people arrived with a firestick in their hands, but never before had they known it to exert such power. For this was the fire continent, as distinctive in its fire regimes as in its marsupials and mammal pollinators. Fire came to be at the heart of Australian civilisation. People cooked, cleansed, farmed, fought and celebrated with fire. The changes they wrought with hunting and fire affected the larger marsupials which, over thousands of years, became scarce. People kept vast landscapes open and freshly grassed through light, regular burning. By firing small patches they controlled large fires and encouraged an abundance of medium-sized mammals. As the eucalypt had remade Australia through fire, so did people.

They had arrived on those northern beaches as the latest ice age of the Pleistocene held the planet in its thrall. Polar ice was growing and the seas were lower, which had made the challenging crossing from Asia just possible. People could walk from New Guinea to Tasmania on dry land. This greater Australia, now known as Sahul, was the shape of the continent for most of the time humans have lived here. People quickly reached the far southwest of Western Australia and the southern coast of Tasmania. From the edge of the rainforest they observed icebergs from Antarctica, emissaries from old Gondwana.


For tens of thousands of years after people came to Australia, the seas continued to retreat and the new coastlines were quickly colonised. Every region of the continent became inhabited and beloved, its features and ecologies woven into story and law. Trade routes spanned the land. People elaborated their culture, history and science in art and dance, and buried their loved ones with ritual and ceremony in the earliest known human cremations. Multilingualism was the norm. Hundreds of distinct countries and languages were nurtured, and the land was mapped in song. This place was where everything happened, where time began.

As the ice age deepened, the only glaciers in Australia were in the highlands of Tasmania and on the peaks of the Alps. For much of the continent, the ice age was a dust age. Cold droughts settled on the land, confining people in the deserts to sheltered, watered refuges. Great swirls of moving sand dunes dominated the centre of the continent but the large rivers ran clear and campfires lit up around the lakes they formed. About 18,000 years ago, the grip of the cold began to weaken and gradually the seas began to rise. Saltwater invaded freshwater, beaches eroded, settlements retreated, sacred sites became sea country. The Bassian Plain was flooded and Tasmanians became islanders. Over thousands of years, Sahul turned into Australia.

The rising of the seas, the loss of coastal land, and the warming of average temperatures by up to 8°C transformed cultures, environments and economies throughout the continent. People whose ancestors had walked across the planet had survived a global ice age at home. In the face of extreme climatic hardship, they continued to curate their beloved country. They had experienced the end of the world and survived.

The warm interglacial period known as the Holocene, which began almost 12,000 years ago, ushered in a spring of creativity in Australia and across the planet. Human populations increased, forests expanded into the grasslands and new foods flourished. Australians observed the emergence of new agricultural practices in the Torres Strait islands and New Guinea but mostly chose not to adopt them. They continued to tune their hunting and harvesting skills to the distinctive ecologies of their own countries, enhancing their productivity by conserving whole ecosystems. A complex tapestry of spiritual belief and ceremonial ritual underpinned their economies. The sharing of food and resources was their primary ethos.

Strangers continued to visit Australia from across the seas, especially from Indonesia and Melanesia. Four thousand years ago, travellers from Asia brought the dingo to northern shores. During the past millennium, Macassans from Sulawesi made annual voyages in wooden praus to fish for sea cucumbers off Arnhem Land where they were generally welcomed by the locals. The Yolngu people of the north engaged in trade and ceremony with the visitors, learned their language, adopted some of their customs and had children with them. Some Australians travelled by prau to Sulawesi.

In recent centuries, other ships nosed around the western and northern coasts of the continent, carrying long-distance voyagers from Europe. One day, early in the European year of 1788, a fleet of tall ships — “each Ship like another Noah’s Ark” carefully stowed with seeds, animals and a ballast of convict settlers — entered a handsome harbour on the east coast of Australia and began to establish a camp. These strangers were wary, inquisitive and assertive, and they came to stay. They were here to establish a penal colony and to conduct an agrarian social experiment. They initiated one of the most self-conscious and carefully recorded colonisations in history on the shores of a land they found both beautiful and baffling.

They were from a small, green land on the other side of the world, descendants of the people who had ventured west rather than east as humans exited Africa. They colonised Europe and Britain thousands of years after the Australians had made their home in the southern continent. They lived in a simplified ecology scraped clean by the glaciers of the last ice age, and were unprepared for the rich subtlety of the south.

For 2000 years before their arrival in Australian waters, the Europeans had wondered if there might be a Great South Land to balance the continents of the north. Early in the sixteenth century they confirmed that the planet was a sphere and all its seas were one. They circled the globe in tall sailing ships and voyaged to the Pacific for trade, science and conquest. The British arrivals were part of the great colonialist expansion of European empires across the world. For them, success was measured through the personal accumulation of material things; for the first Australians, the opposite was true.

On eastern Australian beaches from the late eighteenth century, there took place one of the greatest ecological and cultural encounters of all time. Peoples with immensely long and intimate histories of habitation encountered the furthest-flung representatives of the world’s first industrialising nation. The circle of migration out of Africa more than 80,000 years earlier finally closed.

The British did indeed find the Great South Land of their imagination seemingly waiting for them down under and they deemed it vacant and available. It was an upside-down world, the antipodes. They would redeem its oddity and emptiness. The invaders brought the Bible, Homer, Euclid, Shakespeare, Locke and the clock. They came with guns, germs and steel. With the plough they broke the land. They shivered at “the deserted aboriginal feel of untilled earth.” They dug the dirt and seized it. Sheep and cattle were the shock troops of empire; their hard hooves, let loose on fragile soils, trampled them to dust. Australian nature seemed deficient and needed to be “improved.” Colonists believed that the Australians were mere nomads, did not use the earth properly, and therefore did not own it.

But the true nomads were the invaders and they burned with land hunger. War for possession of the continent began. It continued for more than a hundred years on a thousand frontiers. Waterholes — the precious jewels of the arid country — were transformed into places of death. It was the most violent and tragic happening ever to befall Australia. So many lives were sacrificed, generations of people were traumatised, and intimate knowledge of diverse countries was lost.


Australia entered world history as a mere footnote to empire; it became celebrated as a planned, peaceful and successful offshoot of imperial Britain. A strange silence — or white noise — settled on the history of the continent. Nothing else had happened here for tens of thousands of years. Descendants of the newcomers grew up under southern skies with stories of skylarks, village lanes and green hedgerows from the true, northern hemisphere. And they learned that their country had a short triumphant history that began with “a blank space on the map” and culminated in the writing of “a new name on the map” — Anzac. So the apotheosis of the new nation happened on a distant Mediterranean shore. The cult of overseas war supplanted recognition of the unending war at home, and the heroic defence of country by the first Australians was repressed. They were disdained as peoples without agriculture, literacy, cities, religion or government, and were allowed neither a history nor a future.

The British and their descendants felt pride in their new southern land and pitied its doomed, original inhabitants. Colonists saw themselves as pioneers who pushed the frontier of white civilisation into the last continent to be settled, who connected Australia to a global community and economy. They were gratified that their White Australia, girt by sea, a new nation under southern skies, was a trailblazer of democratic rights: representative government, votes for working men, votes for women. But the first Australians lay firmly outside the embrace of democracy. They continued to be removed from country onto missions and reserves; they did not even have a rightful place in their own land, and every aspect of their lives was surveyed.

The invaders lived in fear of invasion. Had they used the soil well enough, had they earnt their inheritance? Would strangers in ships and boats threaten again? Had they reckoned with their own actions in the land they had seized? There was a whispering in their hearts.

New peoples arrived down under from Europe, the Americas and Asia, and the British Australians lost their ascendancy. Australia became the home again of many cultures, vibrantly so, and a linguistic diversity not seen on the continent since the eighteenth century flourished. Many languages of the first peoples persisted and were renewed. The classical culture of the continent’s discoverers endured; their Dreamings, it was suggested, were the Iliad and Odyssey of Australia. A bold mix of new stories grew in the land.

The invaders of old Australia did not foresee that the people they had dispossessed would make the nation anew. The society they created together was suffused with grief and wonder. The original owners were recognised as full citizens and began to win their country back through parliament and the courts. They believed their ancient sovereignty could shine through as a fuller expression of Australia’s nationhood.

But now the planet was again shuddering under an assault. The meteor this time was the combined mass of humans and their impact upon air, oceans, forests, rivers, all living things. It was another extinction event, another shockwave destined to be preserved in the geology of Earth. The fossilised forests of the dinosaurs, dug up and burnt worldwide since Australia was invaded, had fuelled a human population explosion and a great acceleration of exploitation. Rockets on plumes of flame delivered pictures of spaceship Earth, floating alone, finite and vulnerable in the deep space of the expanding universe. Ice cores drilled from diminishing polar ice revealed, like sacred scrolls, the heartbeat of the planet, now awry. The unleashing of carbon, itself so damaging, enabled a planetary consciousness and an understanding of deep time that illuminated the course of redemption.

The Australian story, in parallel with other colonial cataclysms, was a forerunner of the planetary crisis. Indigenous management was overwhelmed, forests cleared, wildlife annihilated, waters polluted and abused, the climate unhinged. Across the globe, imperial peoples used land and its creatures as commodities, as if Earth were inert. They forgot that the planet is alive.

The continent of fire led the world into the new age of fire. But it also carried wisdom and experience from beyond the last ice age.

Humans, as creatures of the ice, were embarked on another odyssey. It would take them over the horizon, to an Earth they have never before known. •

References: The stars are our ancestors: B.T. Swimme and M.E. Tucker, Journey of the Universe • “the most precious waste in the firmament”: Richard Fortey, Life: An Unauthorised Biography • “The planet is alive”: Amitav Ghosh, The Great Derangement and The Nutmeg’s Curse • iron oxide, the seed and the egg: Reg Morrison, Australia: Land Beyond Time • the true green revolution: Loren Eiseley, The Immense Journey • expanding grasslands: Vincent Carruthers, Cradle of Life • distinctive in its fire regimes and mammalian pollinators: Stephen Pyne, Burning Bush • conditions of biodiversity: Tim Flannery, The Future Eaters • Sahul and the last ice age: Billy Griffiths, Deep Time Dreaming • conserving whole ecosystems: Peter Sutton and Keryn Walshe, Farmers or Hunter-gatherers? • “each Ship like another Noah’s Ark”: First Fleet surgeon George Worgan in Grace Karskens, People of the River • agrarian social experiment: Grace Karskens, The Colony • guns, germs and steel: Jared Diamond, Guns, Germs, and Steel • “the deserted aboriginal feel of untilled earth”: George Farwell, Cape York to the Kimberleys • “the true, northern hemisphere”: Shirley Hazzard, The Transit of Venus • “a blank space on the map”: Ernest Scott, A Short History of Australia • a whispering in their hearts: Henry Reynolds, This Whispering in Our Hearts • “the Iliad and Odyssey of Australia”: Noel Pearson, A Rightful Place • “a bold mix of the Dreamings”: Alexis Wright, The Swan Book • “we believe this ancient sovereignty can shine through as a fuller expression of Australia’s nationhood”: The Uluru Statement 2017 • a great acceleration: John McNeill and Peter Engelke, The Great Acceleration • “the heartbeat of the planet”: Will Steffen • the new age of fire: Stephen Pyne, The Pyrocene.

The post Odyssey down under appeared first on Inside Story.

Anti-globalism’s cauldron https://insidestory.org.au/anti-globalisms-cauldron/ https://insidestory.org.au/anti-globalisms-cauldron/#comments Tue, 05 Sep 2023 06:19:41 +0000 https://insidestory.org.au/?p=75499

The Great War brought the drive for international trade and cooperation to a disastrous end

The post Anti-globalism’s cauldron appeared first on Inside Story.

Countless predictions in recent years have sounded a warning that the 1930s — the modern world’s darkest decade — is back. The decade has become shorthand for rampant nationalism, the rise of the far right and the collapse of democracy. Those were the years when the world appeared to turn its back on globalism, when widespread unemployment and hunger drove advanced economies to the brink, when borders tightened, and when fanaticism triumphed in politics, paving the way for the genocidal 1940s.

Yet the decade as we know it started much earlier than 1930. As Tara Zahra argues in her new book Against the World: Anti-Globalism and Mass Politics Between the World Wars, the retreat from liberalism and international cooperation in Europe and the United States began during the first world war and then intensified when postwar hunger and deprivation drove combatant populations away from the ideals of internationalism and cooperation that had once appeared unstoppable.

From the late nineteenth century, global flows of people, money, goods and ideas crossed borders faster than ever before, as new technologies transformed transportation, communication and refrigeration. Tens of millions of Europeans were on the move, a vast majority of them emigrating to North and South America.

But the war suddenly shut down these globalising forces. As Zahra writes, “European countries devoted all of their destructive energies to damming international flows of people, supplies and intelligence.” The results were catastrophic and far-reaching. Hundreds of thousands of Central Europeans starved to death. In Germany, which relied on imports for about a third of its food supply, imports declined by 60 per cent. Poor seasons and the loss of men to the front killed domestic harvests. In Berlin, food prices rose to 800 times their prewar level.

The crisis was similar in the disintegrating Austro-Hungarian Empire. Although less dependent than Germany on food imports, Austria’s agricultural output fell by almost half. Hungary lost a third of its harvest, and officials stopped sending food to nearby Viennese workers who depended on it. Food rationing exacerbated people’s hunger, and queuing at food depots became a full-time occupation.

Manès Sperber, a ten-year-old in Vienna, recalled long wartime nights of queuing in the cold and wet only to find that “the ‘Sold-out’ sign would be put up just as you finally managed to reach the threshold of the shop.” By the end of the war, the Viennese were surviving on just 830 calories a day. “To obey the food laws is equivalent to suicide,” one middle-class Viennese woman wrote in her diary in 1918. Indeed, it was women who led the protests against the food shortages — protests that often turned violent — across Europe. Police sent to quell the protesters often joined in instead.

Zahra uses her exceptional skills as a historian to show how globalisation (not a term in use at the time, though certainly a phenomenon traceable to the nineteenth century) and its demise divided and politicised millions. She shows how the dissolution of Austria-Hungary, long a focus of her scholarship, left Austria adrift. The Paris Peace Treaty cemented the collapse of the imperial order and its fragmentation into warring economic units. Once part of the largest free-trade zone in Europe, Austria lost much of its food supply and raw materials to the economic nationalist policies implemented by its new neighbours, Hungary, Yugoslavia and Czechoslovakia.

“Of the fragments into which the old empire was divided, Austria was by far the most miserable,” League of Nations official Arthur Salter wrote in 1924. In his memoir The World of Yesterday, Stefan Zweig lamented the disappearance of Austria as a centre of cultural and intellectual cosmopolitanism, represented by its multinational, multilingual and geographic diversity. The empire had stood in for the whole world not only because of its diversity of population and languages, but also because of its economic self-sufficiency. Now Austria had become a head without a body.

Across Europe, back-to-the-land movements emerged as one of the more popular solutions to the food crisis. Supporters from both sides of politics were keen to develop economic self-sufficiency among local populations as a bulwark against future threats and to boost national economies. Autarky became a unifying goal for populations who had experienced hunger and humiliation.

As Zahra writes, “The importance of food security was seared into the bodies of hungry citizens.” In Italy, land was occupied by returning veterans and women, angry they had not received acreages promised in return for their wartime sacrifice. In Austria, calls for the “inner colonisation” of rural land by unemployed men and women appeared to offer the promise of food, jobs, houses and dignity; in reality, unwanted minorities (Slavs and Jews) were expelled from borderlands to free up space. Later, the same ideas were incorporated by the Nazis into the imperial concept of Großraumwirtschaft (greater area economy), which they used to justify their annexation of lands to Germany’s east and the expulsion of millions.

The settlement movement gained even more followers after the Great Depression, as disillusionment with capitalism spread. Faced with bad soil, bad weather, insufficient skills and an almost complete lack of infrastructure, the settlers weren’t always successful.

Women fared the worst. One of Zahra’s most significant contributions is her focus on the experience of women, who often bore the brunt of anti-globalism’s excesses. The back-to-the-land movement was about not only a return to the land but also a return to traditional gender values. Women were expected to work for up to fourteen hours a day doing backbreaking farm labour and unpaid domestic tasks alongside their children to free men up for paid work.

Mass politics on both sides blamed globalism for the drastic decline in living standards, and governments colluded in deflecting blame for the crisis in civilian mortality onto outside forces. Many European countries seemed on the brink of a socialist revolution, a threat that became a reality for a short time in Hungary and Germany, generating counter-revolutionary violence on the right, as fascists and socialists clashed openly in city streets.


As Zahra shows throughout Against the World, the search for scapegoats often led to Jews, who were perceived as “emblems of globalisation par excellence.” Forced by discrimination and persecution into jobs that demanded mobility — as pedlars and traders, for example — they were seen as perennial outsiders, facilitators of global networks of commerce, finance and trade, rootless and without loyalty to the state.

“Jews were targeted as symbols of international finance, unchecked migration, cosmopolitanism, and national disloyalty,” Zahra writes, with alarming echoes of today. German leaders disseminated a “stab in the back” legend that attributed the German and Austrian defeat to internal traitors, namely Jews and communists (the two were often conflated) working for foreign interests.

After Russia’s Bolshevik revolution, the supposed twin “global” threats of Judaism and Bolshevism led to vicious attacks on Hungary’s Jewish population. These pogroms were even more violent in Poland and Ukraine. Between 1918 and 1921, somewhere between 40,000 and 100,000 Jews were killed, around 600,000 displaced and millions of properties looted or destroyed.

Jews were the group most affected by the “epidemic of statelessness” that followed the postwar collapse of empires and the creation of nation-states. These emerging states engaged in a violent new form of political engineering designed to create nationally homogeneous populations. Minorities were persecuted, murdered, expelled or, at the very least, actively encouraged to emigrate; “reliable” citizens were called home or prevented from leaving.

These efforts to purify national populations helped to invent a new kind of migrant: the refugee. In response, a new League of Nations Refugee Commission was created, one of the myriad international commissions and organisations that descended on Europe after 1918 to help those worst affected by the war and its aftermath. The International Save the Children Fund, the Near East Relief Committee and the American Jewish Joint Distribution Committee were just some of the agencies on hand to assist the vast number of stateless people and refugees created by new and closed borders.

Adding to the chaos from 1919 was the Spanish influenza pandemic, which killed as many as thirty-nine million people worldwide, reinforcing political elites’ desire to tighten borders against “diseased” foreigners. In the United States those foreigners were often imagined as Eastern European and Jewish. The 1924 Johnson–Reed Act introduced “national origins” quotas, effectively reducing immigration from Southern and Eastern Europe to America to a trickle.

While it had once been relatively easy for European (though not Asian) individuals fleeing poverty, war or persecution to find refuge in the United States, the closed borders of this new era of anti-globalism left millions in limbo. Ellis Island, repurposed as a detention centre, was emblematic of this shift. The Austrian writer Joseph Roth imagined a Jewish migrant’s fate in 1927: “A high fence protects America from him. Through the bars of his prison, he sees the Statue of Liberty and he doesn’t know whether it’s himself or Liberty that has been incarcerated.”


One of the strengths of Against the World is Zahra’s interest in how the people of the period — the activists, visionaries, nationalists and industrialists invested in globalism, and its discontents — saw the world and themselves in it.

Rosika Schwimmer, a Hungarian Jewish feminist, is one of the more fascinating characters to accompany us throughout this history. We meet her at the beginning of the book as she oversees the annual meeting of the International Women’s Suffrage Alliance in Budapest. For the delegates, internationalism was crucial to the project of enlightening and emboldening (white) women across the globe.

An early pacifist, Schwimmer spent her life attempting to find ways towards world peace. She appears initially as somewhat naive and opinionated, yet also hopeful. By 1919, a victim of anti-Semitism and sexism, refused a passport to leave Hungary, she is forced to smuggle herself first to safety in Austria and then to the United States, where her application for citizenship is denied. Her transformation from “citizen of the world” to “stateless refugee,” writes Zahra, “was emblematic of the fate of internationalism in interwar Europe.”

In one of the more bizarre encounters Zahra describes, Schwimmer convinced the industrialist and anti-Semite Henry Ford to charter a peace ship to end the war, an expedition that failed amid the derision of American journalists. Ford’s politics were self-serving and contradictory: an anti-globalist who relied on migrant labour, he made his workers perform their assimilation in an eccentric ceremony that involved climbing into a giant papier-mâché “melting pot” in national dress and, moments later, “graduating” in American clothing singing “The Star-Spangled Banner.”

Ford also enforced his own back-to-the-land lifestyle for his company employees, demanding his workers move out of the cities and plant gardens to grow food. Yet he was global in his business aspirations, exporting millions of his cars overseas, including to the Soviet Union, and building plants across the globe. His virulent anti-Semitism also found an international supporter: Hitler praised him in Mein Kampf.

Others who make an appearance include Gandhi, whose own program of self-reliance, or swadeshi, Zahra includes within her anti-globalism frame. Gandhi’s determination to free India from Britain’s imperial chains and its subordination in the global economy resonated around the British empire, including in Ireland, where boycotts of British food caused an economic tariff war between the two countries.

“In a world of falling prices, no stock has dropped more catastrophically than International Cooperation,” the journalist Dorothy Thompson lamented in 1931. When Zahra sat down to write this book in 2016, Donald Trump had just been elected president and Britain had voted for Brexit: “There was a refugee crisis, and populist, right-wing parties were winning elections across Europe with anti-migrant platforms.” Covid and the war in Ukraine followed. (Zahra doesn’t mention here the tensions with China or the wars in the Middle East, equally destabilising.) Globalisation’s future, she writes, appeared uncertain.

Zahra’s neat binary of globalism and anti-globalism might bother some, but I found Against the World a refreshing and intelligent account of a period studied perhaps more than any other. This is a book about the fragility of democracy in the face of economic breakdown. Millions across the political spectrum faced hunger, homelessness, financial ruin and family separation in the wake of the first world war. Both the left and the right offered alternatives to the havoc wreaked by reliance on the global economy.

There are clear differences between the anti-globalisation movements of the interwar years that empowered fascism and those of our own times. But there are clear echoes in today’s widespread disenchantment with democracy’s ability to combat the inequalities associated with lost jobs, farms and homes; with the capacity of our international institutions to mediate conflicts; and with foreign competition and free trade. The other frightening echo is in the easy politics of fear, which sees the world’s most vulnerable cast out by demagogues seeking easy targets.

It’s hard to imagine how a world turned inwards will be able to tackle the biggest global challenges of our time. “The earth heaves,” warned a pessimistic John Maynard Keynes in 1919, “and no one is aware of the rumblings.” •

Against the World: Anti-Globalism and Mass Politics Between the World Wars
By Tara Zahra | W.W. Norton & Company | $57.99 | 400 pages

Living toughly https://insidestory.org.au/living-toughly/ https://insidestory.org.au/living-toughly/#comments Mon, 28 Aug 2023 06:17:07 +0000 https://insidestory.org.au/?p=75335

Sydney’s best-known bohemian lived entirely by her own rules

The post Living toughly appeared first on Inside Story.

]]>
Bee Miles first attracted notoriety when she made a sensational escape from Sydney’s Parramatta Mental Hospital in February 1927. She had spent the previous three years in various institutions for the mentally ill at the behest of her father, a wealthy businessman named William Miles.

Embarrassed by her escape, William decided to pay Bee a weekly allowance in the hope she would keep as far away from him and the family as possible. This she mostly did, but she was unable to curb her disruptive and sometimes violent public behaviour. She was constantly being arrested, charged and fined, and was jailed when she could not pay the fines; many times she was forced back into asylums. This was the pattern for almost the rest of the life of the woman widely known as a Sydney bohemian.

During and after the second world war Sydney’s acute housing shortage forced Bee to sleep rough. It is a common myth that she chose homelessness. “No one chooses to be homeless,” notes Rose Ellis, in Bee Miles, the first major biography of her subject. When Bee could no longer afford to rent a room but her allowance meant the city’s social services couldn’t help her, she declared herself a “tenant of the city.” Writes Ellis: Sydney’s “public heart became home, its streets and steps her bed.”

Bee would wake at 5am, hook her blankets to her belt and make her way from wherever she had been sleeping to Mason’s Café in Elizabeth Street, opposite Central Station. She breakfasted there on steak and eggs every morning for nearly twenty years. Afterwards she would go to Dobson’s Turkish Bathhouse where she was given a regular free timeslot to have a bath and wash her hair and clothes. The myth says that Bee was “dirty,” but it wasn’t so. She loved a long, hot bath.

Bee’s working day as a “roving reciter” (Ellis’s words) then began. Passing a delicatessen where she received a free bottle of milk and a barrow where she received a piece of fruit, she would catch a bus from Eddy Avenue to some destination, Watson’s Bay perhaps, where she would offer recitals of poetry and prose for money. Her rates varied from sixpence to three shillings, and Shakespeare was her favourite. To advertise this service she wore a sandwich board.

Back in the city she would perform through the afternoon at a regular spot, such as the steps at the Mitchell Library where there was a regular flow of students. She used to enjoy visits to its reading room until she was banned for smoking. She might end her day with a visit to a friend, not that she had many, or a bookshop or cinema. She dined at 5pm, always curried tongue and peas, and chose her place to sleep for the night, which could be a cave at Rushcutters Bay, under a shed in the Domain, on the steps of St James church opposite Hyde Park Barracks, or in the bandstand at Belmore Park.

After years of being moved on and jumped on, having her blankets and shoes kicked away, and sometimes even being urinated on by the police, Bee finally accepted refuge from Father John Hope (uncle of Manning Clark), rector of Christ Church St Laurence. She slept on the floor of the laundry in the clergy house.

Bee Miles was always on the move. She loved speed — and risk. As a young woman she became known as “mad Bee Miles” for jumping on and off moving trains on her way from the family home in Wahroonga to the University of Sydney. Her university career lasted only a year and it was said (another myth, of course) that her mind was “turned” by too much study.

She would cling to the bumper bars or footboards of cars, or climb right into a car or taxi and order the driver to drive on. She refused to pay on public transport and conductors learned that it was often wiser not to demand a fare, fearful of the scenes she could cause. Some of her most violent confrontations came when taxi drivers, judging her dishevelled appearance, refused to take her as a paying customer. She suffered several serious assaults this way, the driver-perpetrators never charged.


Prodigiously researched (it began as a PhD), Ellis’s life of Bee Miles unfolds elegantly, uninterrupted by personal perspectives or anecdotes of Ellis’s own. She shares nothing about the relationship she must have developed with her subject (surely every biographer has one). If she essayed a night sleeping in the bandstand in Belmore Park, she doesn’t say. She’s not that kind of biographer.

Her book begins serenely enough. We discover a small girl seated at a piano in a room with a vaulted ceiling and long stained-glass windows overlooking a sprawling garden. The girl is Beatrice Miles and she is practising under the careful but kindly gaze of her grandmother, Ellen Cordner-Miles, a celebrated contralto in Sydney in the 1870s. The afternoon light fades but the girl plays on in the otherwise silent house.

Ellen’s son William Miles, Bee’s father, had taken on various family business enterprises and of these Peapes & Co., a men’s clothing store in George Street, was the most successful. William and his wife Maria had five children. Bee (she insisted on “Bee” and not “Bea”) was born in 1902.

William was a man of contradictions, as famous for his business acumen as for his political radicalism. A devotee of the rationalist and free-thought movements, he raised his children as atheists and taught them the rationalist dictum to reject all forms of “arbitrary” authority. During the first world war he took to a speaker’s box on the Domain to rail against the proposed introduction of conscription, and he instructed his three daughters to wear “No” badges at their school, Abbotsleigh College. Bee relished the ensuing controversy, though her sisters did not.

William might have encouraged Bee’s agile mind but he didn’t expect her to reject his own authority. Her adolescent years were torrid. “Family friction is a battle fought daily,” Ellis observes. “Superficial wounds heal quickly in readiness for the next confrontation. But parental rejection leaves scars that are deep and enduring.”

Fifty years later Bee recalled that her father loved her until she reached the age of fourteen, after which he hated her, angered by her “wilful” nature and jealous of her superior intellect. And yet she also claimed that her mother became jealous of the close relationship between father and daughter, which was more than close, Bee said: it was incestuous. Bee believed that William feared that his wife would go to the police or tell a doctor.

Further trouble came when, at seventeen, Bee contracted encephalitis lethargica, known as “sleeping sickness.” She was with her mother buying gloves at Farmer’s department store one day when she fell asleep at the counter and could not be woken. She had fallen victim to a pandemic, brought to Australia by a returning Anzac, that caused 500,000 deaths in Europe and Australia.

Encephalitis lethargica mainly targeted young people, leaving survivors like Bee with lifelong side effects. Unusually, she escaped the Parkinsonism that afflicted other sufferers, but sensitivity to light (in later years she often wore a sunshade), obesity (she put on weight massively in her forties) and, most significantly, her exhibitionism and her addiction to movement: all were probably the after-effects of encephalitis lethargica.

Here then is the “untold story” of the title of this book, and an aha! moment for readers who have heard of or still remember Bee Miles. Ellis treats the subject of Bee’s illness very carefully. Early on she gives enough information about the disease and its effects for the reader to carry forward into the rest of the book because it explains so much about Bee.

But encephalitis lethargica was not the only thing to shape Bee. What with adolescent trauma and her own questing mind, she may never have settled for the life of a North Shore lady anyway. Ellis wants us to know about the joys and freedoms Bee experienced, as well as the pain and loneliness.

By the time she returns to Bee’s illness in the penultimate chapter of the book I was ready and eager to know more. Bee became ill in 1920 and nearly died. Ellis has worked through thirty-six years’ worth of Bee’s medical case notes and finds that although encephalitis lethargica was mentioned many times, specifically or in passing, her doctors condemned Bee through the lens of their own morality. She was “wilful,” “restless,” “impulsive,” “childish,” “arrogant,” “impudent” and “tearful.”

All of this, as well as her attention-seeking behaviour and love of speed and movement, was consistent with well-documented observations of post-encephalitis syndrome. But no one fully explored the link, even though the syndrome was being identified in Australian medical literature at the time. Doctors chose instead to believe her father (who may also have been her abuser) when he claimed that Bee had always been “wilful” and lacked “respect for authority” — even though he himself had actively taught her to reject arbitrary authority.

Was Bee herself aware of the probable impact of her illness? Apparently so. In front of a magistrate in 1932 she shouted at her solicitor to “shut up” when he alluded to the effects of sleeping sickness. Many of us would find relief in a formal diagnosis (“at least I’m not actually mad”), but Bee never did. Refusing to be labelled, she rationalised her behaviour into her own view of herself.

She built this view through her public performances and the many press interviews she gave over four decades. She also wrote prolifically and longed to be published. Some of her short travelogues did appear in regional newspapers, but her longer work, including accounts of her incarceration in the 1920s and the massive journeys she made to northern Australia in the 1930s, never found a publisher. Ellis quotes Bee’s own words extensively, however, and thus ensures that she can be known on her own terms and not just as the construct of a male gaze captured in court records and medical case notes.

At sixty-two, after a life of fiercely resisting authority and convention, Bee finally accepted a place in a Catholic-run nursing home, where she died in 1973. A journalist for the Daily Telegraph who visited her in her cave near Rushcutters Bay in 1948 had listed Bee’s fifteen “rules for living.” They included avoiding covetousness, being content with what you have, singing when you are happy, sleeping when it’s dark, and living “toughly, dangerously, excitingly, exhilaratingly and simply.” •

Bee Miles: Australia’s Famous Bohemian Rebel, and the Untold Story Behind the Legend
By Rose Ellis | Allen & Unwin | $34.99 | 336 pages

Case closed? https://insidestory.org.au/case-closed/ https://insidestory.org.au/case-closed/#respond Wed, 23 Aug 2023 00:04:06 +0000 https://insidestory.org.au/?p=75270

A distinguished historian of France scrutinises the trial of Vichy leader Marshal Pétain and its aftermath

The post Case closed? appeared first on Inside Story.

]]>
Philippe Pétain was born into a farming family in northern France in April 1856, the only son of Omer-Venant Pétain and Clotilde Legrand. Despite his humble origins, he managed to gain admission to the elite Saint-Cyr military training school as a young man. A colonel by the beginning of the first world war, he rose to the rank of general at the relatively late age of fifty-eight, leading the French army to an improbable victory at Verdun.

That triumph, and Pétain’s subsequent success in controlling a mutiny among his troops, gained him considerable prominence. He was named Marshal, a rarely awarded honorific rather than a formal military rank, in 1918. Commander-in-chief of the French army in the interwar years, he twice served briefly as war minister before being appointed ambassador to General Franco’s Spain. He was recalled to Paris in 1940 to take up the position of deputy prime minister but quickly found himself leading the wartime government.

It was this government that would notoriously suspend the French constitution, dissolve parliament and grant him plenipotentiary powers as head of state. In this capacity, Pétain signed an armistice with the Nazi invaders and initiated a policy of collaboration. Three-fifths of France was occupied from 1940 until 1942, then the whole territory. For his role, Pétain was tried in 1945 for treason.

Why was this eighty-four-year-old military figure, effectively a political novice, appointed to the top job? Historian Julian Jackson considers this question early in his new book, France on Trial: The Case of Marshal Pétain. An authority on mid-twentieth-century France, Jackson is best known for France: The Dark Years, 1940–1944 (2001), The Fall of France: The Nazi Invasion of 1940 (2003) and A Certain Idea of France: A Life of Charles de Gaulle (2018).

The urgent need for a new prime minister arose from the rout of the French army by the Germans and the growing political pressure for a settlement with the invading forces. Paul Reynaud, France’s prime minister at the time, “lacked the authority” to resist the push for an armistice, writes Jackson, and when it came to naming a new prime minister the choice was between two generals anxious to settle with the Germans.

Of the two, Maxime Weygand, now the army’s commander-in-chief, was a monarchist and therefore unacceptable to Reynaud. Pétain, meanwhile, “had never been associated with disloyalty to Republican governments.” Pétain was also “revered,” writes Jackson, prudent in his associates, and considered a humane commander.

Harbouring political ambitions, Pétain had “kept in touch with events in Paris” while in Spain; and it was the “impending catastrophe of defeat” by the German army that “gave him his opportunity.” Rumours of a Pétain government had spread during the final weeks of Reynaud’s premiership, and Pétain had been accused of participating in a plot to achieve this end.

Appointed to form a government on 16 June, Pétain was not granted full powers until 10 July, when parliament reconvened in the central French town of Vichy. “The very next day,” writes Jackson, “Pétain issued a series of ‘constitutional acts’ which effectively made him a dictator and put Parliament into abeyance.”

Tellingly, Pétain had topped an opinion poll in 1935 “to discover who would make the most popular dictator for France.” But he had not been the figurehead of a single faction during the 1930s and 1940s. The left had some distrust of him, but generally went along with his image — the aura of military success, the handsome looks and noble bearing that “seduced crowds” — until the suspension of the constitution.

“Pétain’s tragedy,” says Jackson, “was to be an unremarkable person who had come to believe in his own myth.” Those who didn’t believe the myth were the most cynical of his supporters. His close aides considered that he was influenced by the last person he spoke to on an issue. Pierre Laval, the government’s most enthusiastic supporter of collaboration with Germany, thought Pétain deserved only to be a bust on a mantelshelf. His vanity and his public persona, in other words, left him eminently manipulable. But his position as the head of the regime inevitably made him responsible for its actions.

Kidnapped by the Nazis in August 1944, not long after the Allies began their push into France, Pétain was held in Germany ostensibly for his own protection. Jackson recounts the grimly amusing story of Germany’s pretence of a French “government in exile,” not to mention the absurdity of the behaviour of its members and of Pétain’s eventual “release.” In what his defenders presented as a “gesture of noble heroism,” he insisted on returning to postwar France to defend himself before the French population.


Pétain’s trial was held in 1945, very soon after Germany’s surrender. The war hadn’t yet ended, and Charles de Gaulle’s provisional government was barely a year old. Finding “legally robust procedures” for trials after the liberation was imperative but difficult, Jackson writes. Politicians accused of treason in the prewar Third Republic had been judged by the Senate sitting as a High Court, but “it was not even yet decided whether France would keep the same constitution, and most members of the Senate elected under it had voted Pétain full powers in 1940.” A new High Court was created to try the Vichy leaders.

The reputation of the Paris judiciary had been “severely compromised” during the war. The Bar had been purged of its Jewish members, and its remaining members recused themselves on the grounds that impartiality would be difficult for them. Further complicating the picture, both the judge presiding over the preliminary interrogation and the prosecutor in the trial had “murky” wartime pasts. The former was undistinguished in legal circles and probably out of his depth. The conduct of the trial did not maintain decorum.

And what was Pétain to be tried for? Eventually, he was indicted for signing the armistice, for the constitutional acts that made him a dictator, and for “abominable racial laws.” But no evidence was brought concerning this third matter, and Jackson devotes a whole chapter to “The Absent Jews.” France had to wait until 1995 for Jacques Chirac’s acknowledgement of French responsibility for the deaths of 75,000 Jews.

Pétain was eventually convicted of “collusion with the enemy” — treason, that is. Jackson recounts the trial in detail. He profiles the lawyers, jury members and witnesses, and draws on the court record and other contemporary documents to create a blow-by-blow account of the debates. More horrifying than amusing, the proceedings reveal the judicial chaos of post-liberation France.


But what of Jackson’s title, for which he thanks a friend? Did the court action against Pétain really put France on trial? This is the question that pervaded the trial and pervades the book.

From the moment of liberation, de Gaulle insisted that France was a nation of resisters that had been betrayed, in Jackson’s words, “only by a handful of traitors who needed to be punished.” To understand this belief, we need to remember that the Gaullist resistance was a nationalist movement, and ideologically conservative. We also need to remember that de Gaulle’s principal aim from 1944 on was to have France recognised as a participant in the war effort and accepted among the Allies at the negotiating table for decisions concerning postwar Europe.

Jackson is sympathetic to this account, deeming it “necessary.” But it was necessary only in the relatively short term; in the longer term, it has been a major cause of France’s difficulty in coming to terms with its history. It surely lends respectability to the national nostalgia for a “certain idea of France,” a nation — notably not a “state” — whose “greatness” would rest on its ideological and cultural homogeneity.

As Jackson remarks elsewhere, France’s wartime population consisted of resisters — a small number at the start, more towards the end — a great number of supporters of Pétain and Vichy, and many people on the fence, waiting to see how the cards would fall. Even if attitudes had been more homogeneous, France could not be tried, if only because a criminal trial necessarily focuses on the accused person and his or her intentions, as was reiterated several times during the proceedings.

At the same time, some prominent intellectuals acknowledged at the time that France as a whole shared some responsibility, that “each of us was complicit,” in the words of one. One of the defence lawyers argued that “if Pétain was guilty so were the French — so was France,” thus suggesting the grounds for an exoneration of the accused. But neither the armistice, nor the abuse of the constitution, nor France, nor even the “widely shared complicity” of the French in the actions of Vichy was on trial. Hence the prosecutor’s insistent focus on the person of Pétain.

The title promises more than the book can give. In what sense was this trial a “case” of putting France on trial? Granted, we may be dealing only with a metaphor, but the metaphor is not apt. When the French judge their own actions under Vichy, it is for collaboration in all areas of social and economic policy. But collaboration does not figure in the penal code.

Under an alternative construal, “case” might refer to a case study, and hence to the work of the book itself. Jackson suggests this in the introduction when he tells us that the trial affords an opportunity “to watch the French debating their history.” In this sense, Jackson’s research serves “as an example” to those “future historians” whom the prosecutor invokes as doing a different kind of work from the court’s: “We are not historians,” he insisted to his colleagues.

Importantly, case studies acknowledge the singularity of each case. This is also the task of historical research. It is in these terms that we can identify the achievement of Jackson’s book; it is an admirable narrative history whose accomplishment lies in its detailed scrutiny of the particularities of Pétain’s trial and the specific aftermath of its verdict.

The final part of France on Trial demonstrates the continuity from the anti-republican sentiment of the 1930s, through the Pétain cult of the war years, to the persistence of extreme-right politics in France in the present day. It is not paranoid to trace this continuity as far back as the Dreyfus affair — not only because of the persistence of anti-Semitism but also because the nefarious role of the army was central.

For Jackson, “the Pétain case is closed” because the National Rally’s Marine Le Pen has walked away from her father’s fidelity to Pétain. But her more radical niece has not; and this, together with far-right figure Éric Zemmour’s strident anti-Muslimism, leaves me more sceptical than optimistic. Pétain and Vichy governed France; they have been the names of a long strand in France’s “civil war.” If those names no longer attract a following, so much the better, but their echo persists. •

France on Trial: The Case of Marshal Pétain
By Julian Jackson | Allen Lane | $55 | 480 pages

Lady Mary’s experiment, and other infectious stories https://insidestory.org.au/lady-marys-experiment-and-other-infectious-stories/ https://insidestory.org.au/lady-marys-experiment-and-other-infectious-stories/#respond Fri, 18 Aug 2023 03:37:12 +0000 https://insidestory.org.au/?p=75243

Historian Simon Schama spent the pandemic researching smallpox, cholera and plague

The post Lady Mary’s experiment, and other infectious stories appeared first on Inside Story.

Some global events enter into collective memory and others don’t. More people died from the influenza pandemic that began in the last year of the Great War than from the fighting; yet the war is a central part of Australian memory in a way the Spanish flu is not. In recent years the world has experienced a real and common peril — the coronavirus known as Covid-19 — but what will we remember of its enormous toll?

I have been thinking (and writing) about the prospect of emerging pandemics for most of my professional life. I started my career as an infectious diseases physician in the 1980s just as HIV was emerging and have seen Lyme disease, hepatitis C, SARS, MERS, H5N1, Ebola, Nipah, hantavirus, bat lyssavirus and monkeypox (to name a few) appear and recede. I thought the next pandemic would be a bird influenza that had been “humanised,” but wiser observers saw the risk of a coronavirus with the right genetic profile.

Although border closures formed part of existing control plans, I didn’t believe that any liberal democracy would close itself off from the world, and the state border closures in my own country were unthinkable — until they happened. I did expect mass gatherings to be cancelled but never imagined entire populations confined to their homes.

Yet just three and a half years after its sudden appearance, I skim academic articles about the virus and altogether avoid general media coverage. This may be an adaptive response to trauma, and a sign that we trust that the technocracy and polity have learned enough to mitigate the effects of the next inevitable event. Or is it something else?

In his new book Foreign Bodies, historian Simon Schama shows how eighteenth- and nineteenth-century authorities, scientists and societies dealt with large-scale outbreaks of smallpox, cholera and plague. Each case shows the “immemorial conflict between ‘is’ and ‘ought,’” Schama writes,

between short-term power plays and long-term security… between the cult of individualism and the urgencies of common interest… between native instinct and hard-earned knowledge. If it is a happy answer you want to the question as to which will prevail, it is probably best not to ask an historian.

Foreign Bodies opens, appropriately in this age of wilful intellectual amnesia, with the French philosopher Voltaire, who nearly died of smallpox in 1723. Because it is the only infectious disease that has been eradicated from the planet, we have no contemporary experience of the terror it inflicted on local populations when it appeared. But smallpox not only killed a significant proportion of any group of people it infected but usually left survivors scarred and disfigured.

Voltaire was contemptuous of how doctors treated the disease in France. In his Letters Concerning the English Nation (1733) he cites the use of inoculation in Britain as an example of that country’s modernity. What he may not have known is that inoculation had only recently, and indeed reluctantly, been introduced from the East.

Smallpox inoculation was very different from the vaccination that came later. It involved pricking smallpox pustules on an infected individual and transferring the extracted fluid into a healthy subject — usually by making a number of superficial pricks of the skin with the contaminated needle. The vast majority of inoculated subjects suffered only a mild to moderate attack of smallpox, although occasionally they died. We still don’t know why inoculation produced a less severe disease than “naturally” acquired infection.

Britain’s adoption of inoculation, patchy and fragile as it was, was driven by an unlikely eighteenth-century influencer. Before she returned from Constantinople to England in 1718, Lady Mary Wortley Montagu, wife of the English ambassador to an Ottoman sultan, had access to a female world closed to her husband. She had been struck by the absence of smallpox scarring in the women she met socially and observed in harems.

Montagu — who had herself been disfigured by an attack of smallpox — learned that the disease’s effects were minimised by the local practice of inoculation. She brought this knowledge back to London and convinced an English surgeon to inoculate her son. The practice was slow to catch on, but Montagu’s political connections meant that Caroline, Princess of Wales, became aware of her advocacy and had her two daughters inoculated. Even Catherine the Great eventually followed suit.

At this distance, the medical profession’s resistance to inoculation isn’t surprising, but it is ironic, considering that medics at the time lacked effective treatments for almost any disease. When the English profession did adopt inoculation, doctors couldn’t help but combine it with mercury, antimony and other useless (and even dangerous) but popular medications. Lady Mary despaired of this meddling, wishing that the protocol would stick to that of the East.

Schama touches on the trope of wise folk remedies versus mainstream medicine in recounting Lady Mary's story, but he is always aware of the risk of lionising success after the event. Effective treatments have a survival advantage; dangerous ones disappear — eventually.


On balance, inoculation was a useful and relatively safe procedure. But it was supplanted when Edward Jenner introduced vaccination in 1796. Others had already observed that milkmaids, who rarely contracted smallpox, had almost always suffered a trivial infection known as cowpox (or vaccinia in Latin). Jenner was not the first to infect patients with cowpox to prevent smallpox, but he formalised the procedure and promoted its widespread use.

In this case, the close relation of the viruses that caused cowpox and smallpox provided a serendipitous protective cross-immunity. But nature doesn’t provide many similar serendipities. It would be ninety years before Louis Pasteur developed the next human vaccine, in his case against rabies. Although viruses had still not been identified as a cause of disease, he “passaged” the infectious material through generations of rabbits, eventually producing an attenuated or “live” vaccine designed to provoke a protective immune response without causing the disease itself.

The first acknowledged recipient of Pasteur’s vaccine was a nine-year-old boy, Joseph Meister, who had recently been bitten by a rabid dog. As Schama points out, the usual triumphalist history of the discovery is complicated somewhat when we learn that two subjects had received the vaccine prior to Meister, and one of them had died.

Three exciting decades of microbiological discovery followed Robert Koch’s identification of the anthrax bacillus in 1876. The bacterial causes of many of the important epidemic infectious diseases — plague, tuberculosis, cholera and salmonella, to name a few — were elucidated in short order. Because it isn’t easy to prove that a germ actually causes a disease, Koch postulated the criteria that needed to be met.

It is here, in the context of a persisting uncertainty about the true cause of epidemics, that Schama introduces the unlikely character of Adrien Proust, the father of Marcel. Proust the elder, a public health physician in late-nineteenth-century France, advocated an international body to coordinate responses to epidemics and promote the very new science of vaccination. Representing France at a conference on cholera in Constantinople, Proust had heard the Ottoman Sultan Abdulaziz discussing the health risks for home countries of those returning from the Haj — a topic that remains germane today.

Cholera epidemics were widespread in Europe at the time, and international politics and professional differences were hampering control. Schama argues that while the British were motivated partly by mercantile interests anxious to keep ports open, a medico-philosophical argument was also in play.

The latter reflected one of the key moments in the history of infectious diseases: London-based physician John Snow’s discovery in 1854 that contaminated drinking water was responsible for cholera outbreaks. The British set about applying an engineering approach to cholera control — building sewers to separate waste and drinking water and protect potable water at its source — and it worked. But that experience encouraged British health authorities to fixate on cleanliness, above all else, as the way to healthiness. They saw the new vaccines as a distraction from the main game of carbolic acid and better drains.


The bulk of the second part of Foreign Bodies is taken up with the life of Waldemar Haffkine, the Russian-born microbiologist who developed vaccines against cholera and plague in the late nineteenth century. Haffkine, an obscure figure in the history of bacteriology who deserves better recognition, was hampered from the start by the fact that he was not a medical practitioner. Schama effortlessly places him in the scientific and social domains of the time, illustrating the inherent mistrust he faced both as a Jew and as a non-medico. (Pasteur was not a physician either, but his hagiography was unassailable by then.)

Most of the action takes place in the British Raj, where it soon becomes apparent that the outsider Haffkine’s vaccine rollouts relied more on local Indian support than on his colleagues in the British-run Indian Medical Service. Those vaccines probably saved an order of magnitude more lives than the British-sanctioned sector closures and slum clearances, but Haffkine suffered a blow in 1902 when nineteen people died as a result of a tetanus-contaminated vial of plague vaccine. Although he was exonerated five years later by an inquiry helmed by three giants of microbiology, his career had collapsed. Still only forty-two, he undertook no further significant work of discovery.

Haffkine would marvel at the new RNA technologies and the speed and scale of modern vaccine production, but he would understand the fundamentals of what has been hailed as a miracle of modern science. Yet this industrial-scale achievement has to be balanced against the more mundane (and at times uncivil) public health debates that preceded the availability of the Covid vaccines.

Uncertainty about something as basic as how SARS-CoV-2 is transmitted lingered longer than the time it took to produce the first vaccine. Masks were recommended once the virus was found sometimes to linger in fine aerosols carried on air currents (airborne transmission) and not just to travel in larger respiratory droplets (droplet transmission). Hand hygiene, a proven means of preventing respiratory infections, was adopted early by everyone but quickly fell away once mask mandates came into force. You probably need both to prevent transmission; it is notable that the vast majority of Australians' infections occurred after masks disappeared and hand hygiene waned.


The web of life is a fragile gossamer. Subtle, undetectable disruptions in one part of the chain can lead to unexpected downstream consequences. Almost all new human infections are caused by pathogens that were once harmlessly confined to the animal world. The movement from animal to human is happening principally because of habitat loss, an increasing reliance on overcrowded food production and, sometimes, local tastes for exotic meats from undomesticated animals.

Foreign Bodies opens with the words attributed to Pliny the Elder — in the end, all history is natural history — and Schama finishes with the story of the decline of the horseshoe crab. The crab has been unchanged by evolution over millions of years, and in recent decades its blood has been used to test vaccine vials for bacterial endotoxin contamination. Climate change and overharvesting have dramatically reduced the crabs' numbers, and pharmaceutical companies have struggled to find a more sustainable alternative at a time when safe vaccine production is an international industrial priority.

This is an unusual but beautifully written book. Schama admits it was not the one he was planning to write when the pandemic began. It is a mixture of personal observations of North American nature, modern political commentary, microbiological exposition, historical analysis, anecdote and biographical diversion. Occasionally, present-day issues jarringly appear in the midst of a prolonged historical narrative, and I wonder if Haffkine deserves as much space as he gets. But these are quibbles and I am grateful to Simon Schama for painlessly curing me of my Covid-19 avoidance disorder. •

Foreign Bodies: Pandemics, Vaccines and the Health of Nations
By Simon Schama | Simon & Schuster | $59.99 | 480 pages

Enigmatic pariah https://insidestory.org.au/enigmatic-pariah/ https://insidestory.org.au/enigmatic-pariah/#respond Thu, 10 Aug 2023 04:55:03 +0000 https://insidestory.org.au/?p=75152

Two years after their return to power, the Taliban aren’t living up to many of their promises — and the West’s disengagement isn’t helping

Two years after the Taliban captured Kabul, the outside world is still uncertain about the regime's goals, dismayed by many of its actions, and holding back from anything that might signify recognition or approval. Of Afghanistan's thirty-four million people, meanwhile, the only significant beneficiaries of the change of regime are residents of the rural hamlets that bore the brunt of air and drone attacks and night-time raids by Western special forces.

Since the US-supported president Ashraf Ghani fled the capital, the economy has shrunk by 20 per cent or more. Around twenty million people are short of food, and an estimated 3.2 million children are malnourished. Some rural people are reportedly selling organs or even children for cash to survive. Others have streamed into relief camps near provincial capitals for meagre rations.

For its part, the Taliban leadership seems less focused on dealing with this crisis than on applying its interpretation of sharia law to social behaviour. It bears down chiefly on women and girls, restricting or even stopping their access to work and education and their movement outside the home.

Behaviour like this is the reason the world hangs back from helping the country recover from war. Pakistan, China, Russia, Iran and Qatar have kept their embassies running in Kabul, and India rejoined them in August last year. But none of those countries has formally recognised the Taliban’s Islamic Emirate, and nor has any other Muslim-majority country. Australia and other Western countries maintain cautious communication with the Taliban through diplomatic posts in Qatar, and in the United States’ case through occasional fly-ins or third-country meetings.

Around US$9 billion of the former regime’s foreign funds have been frozen by the United States, several European countries and the United Arab Emirates. After seventy top economists, including Nobel prize winner Joseph Stiglitz, urged president Joe Biden last August to let the Afghan central bank tap the reserves — and stop the “collective punishment” of the Afghan people — the United States set up a foundation in Switzerland to allocate half of the reserves in American banks (US$3.5 billion) to pay for humanitarian supplies and electricity from Central Asian neighbours.

But what more can and should the outside world do to alleviate the suffering and starvation of the Afghan people — and beyond that, influence the Taliban towards the more inclusive interpretations of Islam, especially in the treatment of women and religious minorities, that apply in so many other Muslim nations?

In The Return of the Taliban: Afghanistan after the Americans Left, Pakistani-American scholar Hassan Abbas suggests that the immediate prospects for reform in Afghanistan are not great, but that the West must try anyway.

He opens his book by describing how contact between the Taliban and the United States in Qatar from 2012 first acquainted Western officials with some of the figures who were destined to emerge in top positions in the new emirate. After Donald Trump became president in early 2017, this contact developed into negotiations for a US withdrawal.

Zalmay Khalilzad, a seasoned diplomat of Afghan origin, was appointed leader of the American team, and in January 2019 he was cleared by secretary of state Mike Pompeo to offer a drawdown of US forces to zero. In July that year, Trump imposed a nine-month deadline for an agreement. With no gains to show from pulling out of the Iran nuclear pact and talking to North Korea’s Kim Jong Un, Trump needed a deal before the 2020 election.

The Taliban persuaded the Americans to agree on a complete pull-out, including from the huge Bagram air base near Kabul. In return they promised that Afghanistan would not become a base for terrorist attacks on the United States or its allies, and that US forces and their local helpers could withdraw without harassment. Rather less firmly, they also pledged to enter power-sharing dialogue with Ashraf Ghani, and to look after Shia and Hazara minorities and allow female education.

Trump got his peace deal in February 2020, though it was signed in Doha rather than, as he’d hoped, at Camp David near Washington. He overruled Ghani’s objection to the release of 5000 Taliban prisoners as part of the deal. With withdrawal by May 2021 pledged, the Taliban suspended action against American forces and concentrated instead on attacking Kabul’s army. By the time Biden formed his administration, Taliban fighters controlled most of the provinces and were closing in on Kabul. Ghani dithered and postured, losing any opportunity to bargain.

Biden decided not to abandon Trump’s agreement, though he shifted the final departure date to 11 September 2021, exactly two decades after the 9/11 attacks by Afghanistan-based al Qaeda. After a trillion dollars, 2448 Americans killed, 20,722 wounded and many more traumatised, Biden said, a changed outcome was highly unlikely even if America stayed another hundred years.

The reality, says Abbas, is that “the Taliban outlasted the Americans.” Afghans were disabused of any faith that the West and their favoured Kabul politicians would save them. “The glorious myth of the ability of foreign intervention to install a democratic order” was comprehensively debunked.


Parallel with the negotiations in Doha, the Taliban were undergoing successive leadership changes. In tracking these shifts, Abbas gives us important insight into the make-up and views of the men now in charge of Afghanistan.

Mullah Mohammed Omar, the secretive but charismatic Ameer ul-Momineen (Leader of the Faithful) during the Taliban’s first spell of government in the 1990s, resurrected the movement after it was ousted by the Americans and the Northern Alliance in late 2001. Around 1995, he had boldly entered a museum in Kandahar, the country’s second city, taken out a rarely seen cloak said to have been worn by the Prophet Muhammad, and put it on before an amazed and adoring crowd.

In 2013, a little over a decade into the new insurgency, Omar became ill and died in a Karachi hospital. His death was kept secret by the Taliban and their mentors in Pakistan's Inter-Services Intelligence agency, the ISI, while succession plans proceeded. The natural successor might have been Mullah Abdul Ghani Baradar, Omar's young brother-in-law, but he was viewed with suspicion by the ISI because he'd opened contact with a brother of Hamid Karzai, the then US-backed president in Kabul. He was also out of the picture: the ISI had arrested and jailed him in 2010.

In his absence, Mullah Akhtar Mansour was proclaimed the new emir in 2015. A mullah though he was, he was known for his worldly appetites, heading frequently to the Gulf to “buy perfume” — in other words, enjoy Russian sex workers — and hosting Gulf sheikhs for falcon-hunting. It was under his leadership that the Taliban made their first breakthrough in Afghanistan’s north, seizing the city of Kunduz.

Mansour’s term as emir ended when an American drone strike killed him on the road back to Quetta, his Pakistani hideout, after a stay in Iran. The ISI helped target him, Abbas says, so that US forces struck him on the road, rather than at a tea-stand halt, to avoid civilian casualties. With this “help” from the ISI the United States may have lost an emir more inclined to deal with Kabul.

Succession came down to one of Mansour’s two appointed deputies. The victor, Mullah Hibatullah Akhundzada, then fifty-five, was one of the few mullahs who actually knew the Qur’an and hadiths (sayings of the Prophet), though his interpretations diverged from those of most Muslims elsewhere. Apparently strict and calm, Abbas reports, even now he doesn’t know how to use a mobile phone.

Hibatullah retained Mansour’s other deputy, Sirajuddin (or Siraj) Haqqani, a military commander regarded by US intelligence as an ISI asset, who to this day has a US$10 million bounty on his head. One of Omar’s sons, twenty-six-year-old Mullah Yaqoob, was added as second deputy. Baradar, added as a third deputy in 2018 after his release by the Pakistanis at Washington’s request, was soon assigned to the Doha negotiations.


Two years after its return to Kabul, the new Taliban emirate has two centres of power. Hibatullah resides in Kandahar, surrounded by equally conservative mullahs in a council known as the Rahbari Shura. This is the ultimate power centre, its role akin to that of Iran's supreme theocratic leadership.

The other centre is the government in Kabul. Unlike its counterpart in Tehran, it isn’t the product of any form of popular election. Its most powerful figures are Siraj Haqqani and Yaqoob, who seized the interior and defence ministries respectively in August 2021 and remain entrenched there.

The prime ministership went to a seventy-year-old mullah, Mohammad Hassan Akhund, regarded as safe hands by Hibatullah. The important qualifications for the job, according to Abbas, were being in Pakistan's good books, having sat on the Taliban's Peshawar or Quetta shura, and having studied at the Darul Uloom Haqqania seminary, which made him of like mind with Hibatullah.

Akhund heads a cabinet of mostly Pushtun conservatives, nearly half of whom are on a UN terrorism blacklist. His government did become a little more diverse when deputy ministers were added, notably deputy economics minister Abdul Latif Nazari, a member of the Hazara ethnic minority who holds a PhD in political science, and deputy health minister Hassan Ghyasi, a medical doctor who is also Hazara.

From the time of the Doha peace agreement until their first weeks after entering Kabul, the Taliban purported to have changed since the 1990s, when women were forced into the all-enveloping burqa and public executions and amputations took the place of sport. Siraj Haqqani even told readers of the New York Times in February 2020 that "killing and maiming must stop," that the Taliban would work for a new inclusive political system, and that women would have the "right to work" and the "right to education."

There have been glimmers of progress since the takeover. Taliban fighters guard the Shia minority's mosques and festivals. Women in the cities wear headscarves, as they would anyway, rather than the burqa, and women have been appointed heads of maternity hospitals and gynaecological schools. A contest to head the Afghan Cricket Board became a "fistfight," suggesting that attitudes towards sport had changed since the days when the first Taliban regime shaved the heads of a visiting Pakistani soccer team and expelled them for wearing shorts on the field. Hibatullah has also issued a fatwa against forced marriage and the disinheritance of widows.

Mobile phones and social media are allowed. Indeed, Taliban spokesmen have hundreds of thousands of Twitter followers. With seven million Afghans using the internet — “a necessity of the people,” one minister has said — the regime accepts that this particular tide of modernity can’t be ordered back. A new 100,000-strong regular army and a 140,000-strong police force, many with shaven faces, have been formed. Foreign correspondents are allowed to stay in Kabul, and often get interviews with government figures.

Yet if the promises on taking Kabul seemed too good to be true, that’s because they were, according to Abbas. In December 2021 women were told they must be accompanied by a male relative when travelling medium to long distances. Girls’ schools for grade six (age eleven) and above were subsequently closed.

In June last year, a book by chief justice Mullah Abdul Hakim (with a foreword by Hibatullah himself) emphasised the absolute authority of the emir and entertained no notion of a representative mechanism. Modern (non-religious) education was causing all the country's problems, he wrote, so education had to be inherently religious. Women could only be wives and mothers, and their intellectual inferiority meant they could never be the emir. They had to be taught at home by family members and must never study alongside men; if they had to be taught outside the home, the teacher must be a woman.

In October, government guidance said girls shouldn't take college entrance exams for subjects like economics, engineering, agriculture, geology and journalism, which were deemed "too difficult."

Abbas sees two minds at war here, with the advice of the conservative clerics so far prevailing, to the dismay of more progressive elements. It doesn't help that some Western media call this "a return to traditional Islam" — it isn't, he says. The Taliban "routinely mix up their tribal norms with Islam" instead of following sayings of the Prophet such as "Education is incumbent on every boy and girl." Once again, women are the victims of war, Abbas writes. "They have become the bargaining chip, their liberties the sacrifice."


And what of the Taliban’s other promises?

On security, the main terror threat comes from the regional branch of ISIS, known as the Islamic State in Khorasan. Its suicide bombing amid the crowds outside Kabul's airport on 26 August 2021 killed 170 Afghans and thirteen American soldiers, and it has also targeted the Shia and Hazara minorities where it can. The Taliban are said to be seeking aid from the Americans, including signals intelligence, to fight the ISK; outflanked in extremism, they worry that their now-idle fighters might gravitate to the radical group.

But old Taliban friendships persist. In July last year, a CIA drone strike killed the visiting al Qaeda leader Ayman al-Zawahiri in a Kabul residence where he was apparently a guest of interior minister Siraj Haqqani. The muted response of the government showed its embarrassment.

While the ISK, with its many foreign members, might struggle in Afghanistan, a worsening security problem is blowing back on the Taliban’s old puppet-masters in Pakistan. A wave of terror bombings by the Taliban’s counterpart, Tehrik-i-Taliban Pakistan, is aiming to establish an even purer (to its mind) form of Islamic rule in the country whose name means “Land of the Pure.”

As for inclusion, the Taliban resisted bringing figures from the former US-backed government into even symbolic roles. But Hamid Karzai, the former president, and Abdullah Abdullah, the former chief executive, continue to live in Kabul. The Hazara ethnic minority fares better than during the first Taliban period, when they were victims of a genocide that saw desperate journeys to foreign asylum — some to Australia by boat — but Abbas notes reports of Hazara lands being taken by Pashtuns and of Hazara being excluded from relief supplies.

Economic stringency is affecting the Taliban as well, and helping moderate figures. Baradar has come back into the picture as head of economic policy with oversight of the finance ministry. He is not an economist, but his Doha background makes him the figure best suited to approaching foreign partners and donors.

Another frontman is a foreign ministry spokesman, Abdul Qahar Balkhi, who grew up in New Zealand, speaks fluent English and may be a son-in-law of the late emir Mansour. As part of this effort to improve their image abroad, the Taliban have invited foreign correspondents to witness the drive against opium cultivation.

Overall, Abbas says it’s too early to declare that anything resembling a “New Taliban” has arrived. The regime is a toxic mix of “religion gone sour,” patriarchy, tribalism, nationalism and ethnic rivalry — all surrounded by baleful geopolitical rivalries: Saudi Arabia vs Turkey vs Iran; India vs Pakistan; the United Arab Emirates vs Qatar. But change might happen over the next five years as the Omar-era old guard retires.

This is very much an interim book, breezily written, more journalistic than academic, with necessarily vague attributions to the Taliban, diplomatic, intelligence and army figures whom Abbas quotes. It is strong on the who, how and where, less so on the "why." The explanation of the Taliban's theology, derived from the Deoband school in northern India, could be a lot clearer: in distinguishing the Taliban's creed, Abbas assumes a knowledge of the Salafi and Wahhabi purist schools that originated in the Arab world.

But Abbas does buttress his contention that holding back doesn’t help anyone. The Taliban are the de facto government, and the West recognises regimes with equally atrocious human rights records elsewhere. Distinguishing between engagement and endorsement, Abbas argues that only through “creative engagement” can the Taliban be influenced effectively. He concludes: “Not engaging is going to support the view of hardliners that the world is against them — and consequently they will rise further within the organisation.” •

The Return of the Taliban: Afghanistan after the Americans Left
By Hassan Abbas | Yale University Press | $34.95 | 305 pages

Doing “the work that men do” https://insidestory.org.au/doing-the-work-that-men-do/ https://insidestory.org.au/doing-the-work-that-men-do/#respond Wed, 09 Aug 2023 01:09:40 +0000 https://insidestory.org.au/?p=75115

Two talented Liberal senators paved the way for future female ministers


Labor’s Dorothy Tangney made history in 1943 when she became the first woman elected to the Australian Senate. But though she sat in that chamber for twenty-five years, no Labor woman ever joined her. Instead, she watched as the second, third, fourth, fifth, sixth and seventh women were elected to the Senate — all of them Liberals. And while Tangney spent her entire career on the backbench, two of those six Liberals managed to become ministers.

They hardly shattered the glass ceiling. But that first wave of elected Liberal women — six senators, along with Enid Lyons elected to the House in 1943 — were real pioneers, prising open the men’s world of parliament.

Who were these pioneering women? And how did they get there? Recent biographies of two of them, Dame Annabelle Rankin and Dame Margaret Guilfoyle, describe two very different women who took strikingly different paths to power and who, against the odds and in different eras, became ministers.

Rankin, a Queenslander who became the first Liberal woman in the Senate, served from 1946 and eventually became Australia’s first female minister; her biography is written by long-time Canberra journalist and lobbyist Peter Sekuless. Three months after Rankin left the Senate, in 1971, Margaret Guilfoyle, a Victorian, entered; she served until 1987, becoming a senior and powerful cabinet minister. Her life is told by the prolific Anne Henderson of the Sydney Institute.

For both, the path to power and the exercise of it required innovation, political smarts and sheer tireless persistence. But both operated within heavy constraints imposed on them by the masculine character of their chosen career. These biographies tell us important stories about the past that prompt good questions about the present: in particular, they stand as an implicit challenge to the present-day Liberal Party, which, by its own admission, struggles to find and promote female members of parliament.

Annabelle Rankin came from a prosperous middle-class family in Queensland’s coastal Wide Bay region. Her father, a Boer war veteran, was elected to state parliament as a conservative; he then ran a colliery. The elder of two daughters, Annabelle later claimed in a well-worn anecdote that a childhood game had involved imitating her father “being a member of parliament. I would play that I was opening fetes and all that sort of thing and making speeches.”

Rankin attended the all-girls Glennie (Anglican) School in Toowoomba, and her path forward continued via women’s and girls’ associations as state secretary of the Girl Guides and assistant commissioner with the YWCA. But it was the second world war that made her, opening up leadership roles in two women’s paramilitary forces, the Voluntary Aid Detachment and the Australian Women’s Army Service.

Sekuless suggests Rankin’s constant travel and networking within local communities in these roles provided invaluable training for the future senator. Before long, her political potential was recognised and she was encouraged — by a man — to seek Senate preselection with the conservative-leaning Queensland People’s Party, or QPP (which soon merged into the Liberal Party).

In July 1946 Rankin, a thirty-eight-year-old single woman, found herself as one of two women and four men seeking endorsement for two QPP Senate spots. The gender make-up of the interviewing panel is not recorded, but one can assume a predominant male gaze. One of those present, state director Charles Porter, resorted to the language of love to describe the “splendid” impression Rankin made:

She was a strikingly handsome young woman, with a fine lot of auburn hair and she had this ringing clear voice, and she enunciated the principles that she believed in with such a fervour and dedication that was almost a passion…

She also wore her service uniform, which no doubt helped.

After making her speech to the panel, Rankin went home convinced she had lost, walked the dog and went to bed. But she'd won, and within days — a novice and a novelty — she was campaigning around the state. Her first rally, near her hometown of Maryborough, attracted 150 people, two-thirds of them women, and it was the women who led the cheering as Rankin outlined her political philosophy and strategy.

“I honestly believe,” she told the meeting, “that the need of a woman’s voice in the Senate is vitally necessary.” The audience applauded, and she went on: “For a number of years I have worked with women’s and children’s organisations all over Queensland. I have been honoured and privileged to meet and know so many women and men of our fighting services during my service years during the war years. I worked for those women and men during the war, and I want to go on working to help the woman and the wife during the years of peace.”

Rankin and her handlers carefully fostered her image: within a month, she was being widely described in the press as “our Annabelle,” creating what Sekuless describes as “a cosy familiarity” about her. She also carefully deflected questions about her decision to remain single. (Sekuless suggests there may have been a fiancé, who may have died, but he leaves it unclear.) In any event, Rankin routinely generated a high personal vote; in 1946, at third spot, she recorded twice the vote of the man at number two.

Rankin became the first female opposition whip, but was dumped when Menzies won government in 1949. Reinstated as whip in 1951, she gave tireless service but was never promoted to the ministry by Menzies. It was Harold Holt who appointed her as the first female minister in 1966 (in the housing portfolio; Enid Lyons had been made a minister in 1949, but without portfolio — a deliberately toothless honorific).

Rankin then suffered the distinction of becoming the first woman dumped from the ministry (in 1971, by Billy McMahon). She quit the Senate in March 1971, reportedly in tears, and accepted as consolation prize another first — as first female head of a diplomatic mission (high commissioner to New Zealand).


The Belfast-born, state school–educated Presbyterian Margaret McCartney had few of Rankin’s social advantages. Night school at Taylor’s College led to accountancy qualifications, a corporate job, and a friendship with young RAAF veteran Stan Guilfoyle. They married in 1952.

Margaret and Stan quickly got involved in local Liberal Party work. Stan’s mother was a member of the Australian Women’s National League — one of the women’s organisations that later merged into the Liberal Party — and she had enrolled Stan as a Liberal while he was still in uniform; he was destined to be elected to the state executive.

Margaret became branch secretary in South Camberwell, set up her own accountancy business and produced three children. With the state’s Liberal Party division requiring fifty–fifty organisational power-sharing between men and women, Margaret steadily acquired influence and leadership in the Victorian Liberal Women’s section, the state executive and the Federal Council.

But these positions didn’t translate easily into parliamentary preselection. When senator Ivy Wedgwood, elected to the Senate for the Liberals in 1950, prepared to step down in 1971, it was Stan she first approached about replacing her; only when he demurred did Margaret come into the frame.

Even so, of the twenty candidates for Wedgwood’s spot, seventeen were men. Guilfoyle was opposed by the premier, Henry Bolte, and by a (male) member of the interview panel who asked her who would look after the children if she were in the Senate. An unimpressed Beryl Beaurepaire, another member of the panel, put the same question to the next (male) candidate. Guilfoyle won.

Guilfoyle became the third female Liberal senator elected from Victoria (after Wedgwood and Marie Breen) and the seventh overall. In opposition during 1975 she was one of the key Liberal senators, along with Reg Withers and Ivor Greenwood, who hung tough in refusing to pass Gough Whitlam's budget, paving the way for his dismissal. Her reward was a senior position in the incoming Fraser government: she became the first woman to hold a cabinet portfolio, serving as minister for social security (1975–80) and then finance (1980–83).

As a young journalist in the press gallery I had the distinct joy of covering both the Senate and the social security portfolio. To visit Guilfoyle’s office was to undertake quite a trek: she occupied room M152, the most remote point on the southwest corner of the old Parliament House, accessed at the end of a long, gloomy, empty, creaking corridor.

The office was diametrically opposite the prime minister’s office in the northeast corner, and this seemed a metaphor for the way the Senate exercised power in those days — with aloof disregard for the hustle and bustle of executive government. There was no mistaking the silent sense of power in the air. Once admitted, I would sit with her private secretary Rod Kemp, who imparted as background a few carefully selected crumbs of news.

Henderson provides the broad context of Guilfoyle's portfolio battles and crises, informed by interviews with former staffers and departmental officers, and analyses the complex way in which Guilfoyle, even as a Fraser loyalist, defended her social security budget and turf so effectively that she thwarted the prime minister's overall drive for reforms.

These interviews yield the gem that Guilfoyle’s always-assured and measured parliamentary performance was enabled by her “handbag statistics” — a notebook of key portfolio facts maintained by her department. But unfortunately we don’t hear Guilfoyle’s own voice; perhaps because of that same understated style, her Hansard is dull rather than daring.


These easily readable biographies form part of a series of short biographical monographs edited by political scientist Scott Prasser and published by Connor Court. Prasser describes the series as “scholarly rather than academic” — a very fine distinction that seems to mean narrative in form with clear referencing of sources. Fair enough, though a few of the “academic” virtues would not be out of place, such as a critical approach to sources and a more considered acknowledgement of previously published research (for example, Marian Sawer and Marian Simms’s A Woman’s Place: Women and Politics in Australia).

Neither author really probes the institutional obstacles and advantages facing these women. As becomes clear, though, both careers were at least partly subject to the will and whim of the (male) prime ministers of the day. Menzies fully recognised the importance of women for the Liberal Party, as a matter of organisational structure, political philosophy and electoral strategy. But talented women were routinely overlooked in preselections. And as PM he ruthlessly pruned the ministerial careers of colleagues male and female.

Rankin had to wait for her promotion until Menzies had finally gone. Fraser, by contrast, had to repay Guilfoyle's loyalty in 1975 with portfolio heft in government; it probably helped that she was Victorian at a time when all but one Liberal prime minister had been from the jewel-like state.

Equally, it’s clear — though again, not analysed in either biography — that the political careers of these women depended heavily on the dynamics of the Senate. Fewer elections, longer terms and a less volatile statewide electorate helped to protect incumbents, including women. Once Rankin was in, she stayed in. A similar dynamic was at work in Victoria.

In fact, Guilfoyle's replacement of Wedgwood was a watershed moment, effectively reserving one Senate spot in Victoria for women. (When Guilfoyle retired in 1987, she was replaced by Kay Patterson; when Patterson retired in 2008, Helen Kroger was elected; but the sixty-three-year line came to an end in 2013 when Kroger, from third spot, lost to the Motoring Enthusiast Party's Ricky Muir — a perfect symbol of the decline of Liberals, and Liberal women, in Victoria.)

But these institutional explanations deny the agency exercised by each of these women in negotiating a narrow path into and through their male-dominated workplace.

After one setback in 1949 — when she was dropped as opposition whip — Rankin sought the comfort of Enid Lyons. Lyons told her that she would be accepted, “so long as you manage to do the work that men do and do it as well, and at the same time don’t antagonise them.” In remarkably similar terms, the newly elected Guilfoyle was advised by husband Stan “not to take on any responsibilities or portfolios that were women’s issues. If she was to make it, she would make it as a person like any man.”

Both women did indeed do “the work that men do”: long hours, late nights and mute persistence in hard slog. Both had a prodigious work ethic. It might have been harder for Rankin, a curio item in the 1940s and 1950s, than for Guilfoyle, who in the 1970s and 1980s was able to become a serious player. Rankin remained unmarried and lacked the personal support of a family; Guilfoyle had to negotiate a more complicated work–family balance.

But who would offer such advice to today’s female MPs? With unprecedented numbers of women in parliament, ten cabinet ministers, and teal and Green crossbenchers galore, the numbers have changed, thanks in part to Labor quotas. The nature of representative political work has changed as well. In today’s politics, does anyone (even a man) need to work “like” a man or “as well as” a man?

As for "antagonising" male politicians, Julia Gillard and others have shown that outing misogynists is a legitimate and valuable part of a female political career. But it is notable that in an earlier era so many of these Liberal pioneers were rewarded — partly in tacit exchange for not antagonising the men — with the highest imperial honours. Rankin, Guilfoyle, Wedgwood and Lyons were all titled "Dame." Even Tangney accepted one, though it was against Labor policy. •

Annabelle Rankin
By Peter Sekuless | Connor Court | $19.95 | 134 pages

Margaret Guilfoyle
By Anne Henderson | Connor Court | $19.95 | 84 pages

Mixed heritage https://insidestory.org.au/mixed-heritage/ https://insidestory.org.au/mixed-heritage/#comments Tue, 08 Aug 2023 03:23:45 +0000 https://insidestory.org.au/?p=75092

A new survey of heritage protection highlights Australia’s uneven record as it prepares to host next month’s International Council on Monuments and Sites assembly

By the time of the 1960s property boom, developers were demolishing whatever they wanted in Australian cities, with the exception of churches and other colonial properties that caught the eye of the National Trusts. But something changed in the 1970s.

James Lesh’s Values in Cities charts the explosion of heritage advocacy in that decade, from the Whitlam government’s inquiry into the National Estate to the heritage legislation passed in every state — even Queensland, where Joh Bjelke-Petersen’s National Party hoped to find and mine oil on the Great Barrier Reef. With cities now no longer part of the national conversation — apart from the housing crisis and never-ending traffic jams — it is refreshing to find a book that places them centrestage.

As Lesh demonstrates, heritage battles and community advocacy have done more to retain a sense of place and history in our cities than the leadership of the town planning profession, which has been increasingly captured by property developers. Streetscape overlays in the inner suburbs of our cities, from the terraces of Sydney and Melbourne to the timber-and-tin of traditional Queenslanders in Brisbane, owe their existence to patient lobbying by National Trusts, historical societies, a variety of professionals, and state government heritage councils, all garnering community support.

Imagine Sydney without The Rocks and its finger wharves, Hobart without Battery Point, or Brisbane without remnant sandstone colonial buildings (even if one is now a casino). Office block developers — and later apartment developers — were restrained from knocking down much of Melbourne’s Collins Street. Central Perth didn’t survive so well, and Adelaide maintained its dignity partly because there simply wasn’t as much money to be made there as in the other capitals. In all these battles strong community groups emerged, aided by the remarkable green ban movement of the Builders Labourers Federation.

Some heritage protections have been aggressively undermined by developers over the last decade or so — particularly in the West End of Brisbane, where hapless investors have bought into high-rise towers with a view not just of the flood-prone Brisbane River but also of traffic jams on Coronation Drive. Vale, too, the Sirius building in Sydney’s Rocks, social housing refurbished for multimillion-dollar price tags.

In September this year Australia will welcome more than 1000 delegates to our first-ever general assembly of ICOMOS, the International Council on Monuments and Sites, founded in Paris in 1965. As an advisory body to UNESCO, it provides assessments of nominations for much sought-after world heritage recognition.

Australia has achieved notable international landmarks in heritage conservation, especially the Burra Charter, developed here in 1979 at a meeting in the town of Burra. This was the first charter to argue convincingly that heritage conservation should take account of successive uses of a building or site. As Lesh explains, it has had a major influence on heritage practice around the world.

Internationally, the federal government has successfully nominated major "natural" sites, not least the Great Barrier Reef and Kakadu, and notable buildings, including the Royal Exhibition Building in Melbourne and the Sydney Opera House, for world heritage listing. Innovatively, it also nominated a group of convict sites spread across the continent. Only national governments can nominate buildings and sites, and with that comes an obligation to maintain and conserve the site. Even the Morrison government had to look like it was trying hard to curb agricultural runoff into the Barrier Reef.

One of the great strengths of Lesh’s book is his attention to First Nations issues. He points out that many heritage advocates (including me) were slow to appreciate the Indigenous context of the places they were writing about. We were all conscious of the contribution archaeologists and National Parks officers had made to understanding Indigenous landscapes, and the Mabo judgement established beyond doubt claims for continuous Indigenous occupation. But the fact that nineteenth- and twentieth-century structures were also on unceded land has taken much longer to apprehend. Lesh cites the impressive amount of scholarly work on clan recognition over the last three decades that has led to the renaming of many urban places, especially in Melbourne.

Values in Cities identifies the explicit and implicit values that have underpinned the heritage movement, values that have of course changed dramatically over time. TV restoration programs still celebrate the haute bourgeoisie in Sydney and Melbourne doing up homes designed by Walter Burley Griffin and Robin Boyd, and gentrifying Sydney's Millers Point, but they also feature workers on more modest budgets tackling abandoned country churches and even bush shacks.

You don’t have to be a heritage devotee for this book to be worth consulting. Any readers facing redevelopment pressures in their street or the destruction of key structures in their local community will get useful clues about how to protest against what they regard as essential to a sense of locality and history. Given the book’s high price, get your local library — one of our most vital community assets — to buy a copy.


We need a parallel book to Lesh’s that explains and exposes the implications of our private property regime and also how developers, building companies, land-consuming retail giants and wealthy tax-minimisers have ruined whole coastlines and happily demolished heritage structures in many suburbs and country towns.

At a time when most members of federal parliament hold one or more investment properties, we have run out of popularly elected representatives prepared to question the untrammelled rights of property owners to do what they want, not only with their own structures but also to undermine the amenity and sense of place of their neighbours. Local governments, with only modest conflict-of-interest reporting requirements, have given us many examples of the shameless reshaping of our urban environments. The evidence is in your face on the scandal-ridden Gold Coast, the Sydney suburb of Canterbury and the Melbourne suburb of Berwick, but you can also witness it in all those other councils that have been or should have been placed under administration.

As I write, developers are demolishing older blocks of flats — and expelling tenants — to erect grandiose luxury apartments. In the midst of our greatest housing crisis since the early postwar years, heritage advocates and town planners will have to give more thought to saving the apartment blocks that remain the backbone of affordable rental housing in all our cities.

Adaptive refurbishing conserves embodied energy and helps reduce climate change impacts — which is one of the reasons why the heritage-listed Marks and Spencer department store in London’s Oxford Street has survived the threat of demolition. Our design and planning professions, as well as our councils and parliaments, need to give a lot more thought to sustainable buildings and how they provide for current and future residents of our cities. •

Values in Cities: Urban Heritage in Twentieth-Century Australia
By James Lesh | Routledge | $256 | 325 pages

Magnificently crumpled lives https://insidestory.org.au/magnificently-crumpled-lives/ https://insidestory.org.au/magnificently-crumpled-lives/#respond Wed, 26 Jul 2023 01:31:46 +0000 https://insidestory.org.au/?p=74952

A fascinating account of nineteenth-century phrenologists illuminates how ideas spread

If you are not quite certain what “phrenology” is, you are not alone. Many of us are vaguely familiar with the word — something to do with bumps on the skull? — but might struggle to explain its defining principles. In historical memory it sits hazily alongside mesmerism as an arcane oddity: one of those fields of study beloved by the Victorians for their promise to render the mysteries of life legible and manageable. The belief that human character and capability could be determined by “reading” the external landscape of the head has long since been discredited, as has the idea that a magnetic fluid exists between and connects us all. The “science” of these fields has been largely forgotten, though traces of their vocabulary linger still.

Alexandra Roginski’s appointed task in Science and Power in the Nineteenth-Century Tasman World: Popular Phrenology in Australia and Aotearoa New Zealand is not to recuperate the forgotten field of phrenology, nor to restore its place in an intellectual history of science. Indeed, she devotes remarkably little space to explaining phrenology’s foundational principles or rehashing its chief lines of fracture or debate. Instead, she weaves a narrative that offers a counterpoint to that of the professionalisation of science and its establishment as an academic discipline. While “science sprouted tendrils across the settler colonies,” and took root in universities, museums, exhibitions, observatories, scientific societies and other institutions, phrenology was weeded out from such establishment bodies only to flourish in the terrain of local communities and popular culture.

The setting for Roginski’s exploration is the “Tasman World”: an intercultural setting brought into existence by the flow of people and products to and between Australia and Aotearoa New Zealand from the late eighteenth century onward. During the decades of phrenology’s most vigorous public life, from the 1840s to the early 1900s, this was “a region and period seared by European settler-colonialism… a region of immense displacement, mobility and remaking.” It was a place in which a popular practice like phrenology could “shore up momentary power.”

In these mobile worlds, a “cadre of self-appointed professors” took science to the public by offering private consultations, public lectures and popular performances. Some were sincere, dedicated and knowledgeable exponents of phrenology; they offered extensive expositions of its theory and made it their life’s work. But many seized on the potential it offered for commercial or professional exploitation, adapting its vocabulary and gestures to purposes of their own.

Those who bore the “capacious title of phrenologist” might double as “gold miners, fortune tellers, vagrants, petty criminals, ministers, physicians, actors, elocutionists, barbers and journalists.” Some “plucked just one or two things from practical phrenology’s toolbox” to wield in a losing battle for security, income, or reputation. Others borrowed from its platform oratory to spice up their earnest articulation of radical, unionist, racialist or spiritualist views.

These Tasman phrenologists remain, for the most part, shadowy figures. “Their grasp on power was not the expansive sovereignty of the great figures of history,” so it isn’t surprising that their traces on the archival record should be blurred and incomplete. That they come into view at all is to the credit of Roginski’s painstaking searches through digital and paper archives, her drawing together of seemingly inconsequential fragments to create suggestive, although always partial, histories.

Roginski seeks out these eccentric and elusive practitioners, not to give them solid and certain form but rather to hold them to the light, finding historical meaning in what they reflect and refract. She finds them, in abstract terms, wielding “a transnational science in charged negotiations already overlaid by structures of colonisation, class, race and gender.” Yet she finds also — and takes seriously — “the joy, earnestness, theatre, wit, ambition, desperation and sometime tragedy of magnificently crumpled lives.”

The result is a lively, anecdote-rich, grounded, complex, wide-ranging, eclectic and downright fascinating account of the promise, practice and performance of this popular science and its shady practitioners. Roginski’s command of her subject is assured, and her crisp, clear writing is both perceptive and witty. Rival phrenologists in a rural town clash over which of them has correctly identified the “angle of murder”; a Russian-born fortune teller strives to place limits on “the stories that could be stretched to fit around her larger-than-life persona”; Bernard O’Dowd is memorably described as “one of Australia’s most prolific nationalists and omnivorous snufflers in New Thought.” Yet Roginski resists the temptation to poke fun at her subjects or to trivialise the aspiration or desperation that drove their human dramas.

Wherever possible, Roginski highlights the instructive complexity of individual experience and performance. Take, for example, the “Wonderful Woman” whose studio photograph — dressed in an exotic, harem-style costume and apparently pointing to significant bumps on the head of her seated, suited, bearded subject — features on the front cover of the book. Madame Sibly (born Marie Eliments) toured southeastern Australia during the 1870s and 1880s as a phrenologist and mesmeric lecturer, reading the heads of subjects ranging from babies to eminent citizens. Hers was a popular performance, young men proving particularly eager to experience the “frisson” of a public head reading at her hands.

Behind the scenes, Sibly’s relationships ran less smoothly. The victim of a violent assault by her lover, she subsequently launched a string of attacks against various men who sought payment for bills or labour or who harassed her on or off stage. Her favoured weapon was a good horsewhip, but on one memorable occasion she caused a group of mesmerised subjects to rush an aggressive audience member “like a pack of hounds.”

Despite such ruptures, Madame Sibly won the affection and loyalty of audiences during her stays in different rural towns. The phrenological performance was shot through with such interplays of gender and power, reputation and respectability, public and private personas.

Or take the “Professor of Phrenology” Lio Medo, who brought his new identity into existence in Dunedin on the South Island of Aotearoa New Zealand in 1880, imperfectly stamping out a former life as Benjamin Strachan: hairdresser, perfumer, caterer, thespian, amateur elocutionist, bankrupt, petty criminal and alleged sex offender. Strachan was of African descent and had claimed to be “American by birth”; in his new life as a phrenologist, Medo traded on the public thirst for exoticism, eventually presenting himself as a “West Indian scientist.”

While Medo “navigated the exoticist shorthand of Black stage identities” with some skill, he also bore the burden of “the reality of life as a body on display.” His turn to performance coincided with the surging popularity of blackface minstrelsy in the Australasian colonies. On stage he “bristled against minstrel stereotypes” but increasingly found himself the object of racial jeering. Roginski presents his life as one of continual renegotiation of racial tropes and a “deft, sometimes frustrating game of self-representation,” but argues that he turned “the burden of double consciousness into a game of shifting identities that tantalised audiences.”


These are just two of the many practitioners who populate the pages of this book: drifters and dreamers, preachers and physicians, idealists and charlatans. Roginski doesn’t confine her interest to self-defined phrenologists, but turns her attention also to the negotiated performances of those who appeared on stage beside them — for example as allegedly mesmerised subjects who would perform appropriate actions when pressed on different parts of the head. For Indigenous participants, she suggests, these performances could be the site of “fleeting moments of empowerment” — albeit within an oppressive colonial framework.

She is equally interested in audiences, whose participation was essential to the success of popular science. One thoughtful chapter probes the responses of a particular audience, Aboriginal residents of the Maloga Mission on the Murray River, to successive visits by phrenologists in 1884 and 1892. Accustomed to being made the object of the flawed and contradictory conclusions of “racial science,” to which phrenology was often allied, they might have chosen to regard its practitioners with anything from resentment to indifference. But Roginski’s careful analysis uncovers uncertain moments of “nuanced interaction,” in which the phrenologists’ visits could become the locus of humour and play, and even vehicles for sharing culture for “people facing the unravelling of their worlds.”

This is a history of “science from below” that brings a cultural and even ethnographic lens to bear upon a strikingly popular phenomenon. The result is a gloriously illuminating study of the way ideas take off and percolate through a society, the different purposes to which they can be put, and how they endure long after they have been discredited: put to work as entertainment, as vocational identity, in the service of commercial rivalry, or as a mask.

Along the way, Roginski reveals “the contested nature of science and who could claim its authority.” Phrenology shared with more respectable branches of science many of its theatres of practice and performance, and its capacity to function as entertainment as much as authoritative explication. But as science’s less respectable “other,” it also serves as a mirror to the discipline. Popular phrenology indeed proves, as she claims, an “ideal artefact” through which to study science’s multiple functions and purposes.

Roginski ends her history on a cautionary note. If her study of phrenology sheds light on a nineteenth-century world, it may also help to illuminate our own. The favoured rhetoric of its lecturers, who in the face of waning credibility asserted their authority as guardians of “suppressed knowledge,” finds its echoes still today. Phrenology may have few adherents, but its “promise of certainty and self-advancement still beguiles.”

A review of this length can’t offer much more than a sampler of the content of this expansive, intricate book. Each chapter pursues a different facet of the topic; each is rich in character, anecdote and careful, shaded argument. The varied experiences of colonists, the complexities of class and gender, the diversity of Māori and Aboriginal negotiations of phrenology’s power and promise — all are deftly handled through close attention to the particularity of experience.

If there is a narrative history of phrenology in the Tasman world to be found here, it emerges subtly and elusively from the whole. It is not a triumphalist history, but tells of diffusion and transformation rather than decline. But while the book may defy summary, it invites and rewards attentive, immersive reading. •

Science and Power in the Nineteenth-Century Tasman World: Popular Phrenology in Australia and Aotearoa New Zealand
By Alexandra Roginski | Cambridge University Press | $160.95 | 300 pages

What is a university? | https://insidestory.org.au/what-is-a-university/ | Wed, 19 Jul 2023

A long-forgotten experiment throws light on the challenges facing Australian education in the 2020s

At 4.25 on the afternoon of 18 September 1926 a long whistle sounded and the SS Ryndam pulled away from the Holland America Line’s pier in Hoboken, New Jersey. The flags of thirty-five countries flew from bow to stern as the ship made its way down the Hudson River, UNIVERSITY WORLD CRUISE painted on its side. More than 1000 friends and family members stood on the shore, waving handkerchiefs and hats and blowing tearful kisses from the gangway.

The crowd was there to bid farewell to more than 500 excited and slightly trepidatious passengers — 306 young men, fifty-seven young women, and 133 adults who were combining travel with education — and the sixty-three lecturers and staff who had signed up to join the Floating University: an around-the-world educational experiment in which travel abroad would count towards a university degree at home.

Over the next eight months they would meet some of the twentieth century’s major figures, including Benito Mussolini, King Rama VII of Thailand, Mahatma Gandhi and Pope Pius XI, and visit countries in the midst of change: Japan in the process of industrialisation, China on the cusp of revolution, the Philippines agitating against US rule, and Portugal in the aftermath of a coup.

In an era of internationalism and expanding American power, the leaders of this Floating University believed travel and study at sea would deliver an education in international affairs not available in the land-based classroom. It was through direct experience in and of the world rather than passive, indirect engagement via textbooks and lectures that they thought students could learn to be “world-minded.” The trip was promoted as an “experiment in democratic theories of education,” and New York University lent the venture its official sponsorship.

In championing the merits of direct, personal experience as a way to know the world, the Floating University was joining a set of public as well as scholarly debates taking place in 1920s United States about the relationship between professional expertise and democratic citizenship in increasingly complex industrial capitalist societies.

On the one hand, protagonists including secretary of commerce and future president Herbert Hoover and journalist and political commentator Walter Lippmann argued for the principles of scientific management and technocratic governance, and emphasised the importance of well-informed and expert elites. It was specialised knowledge, they believed, that was needed to address the challenges presented by rapidly changing economies and societies.

On the other hand, popular technologies such as photography, film, radio, inexpensive novels and newspapers, as well as cheaper transatlantic travel, jazz and the latest improvised forms of dance, seemed to offer direct, embodied and experiential ways of knowing that were at once deeply personal and widely accessible. Questioning the concentration of power in the hands of experts, labour, social and civil rights activists as well as populist and agrarian groups advocated for more participatory forms of democracy.

Although their differences are often exaggerated, the debates in the 1920s and 1930s between Lippmann and the educational reformer and philosopher John Dewey are often taken to be emblematic of this apparent opposition between technocratic expertise and democratic knowledge and deliberation.

Dewey’s thinking had a huge influence on the founder of the Floating University cruise, New York University’s professor of psychology, James E. Lough. Fascinated by education and the learning process, Dewey argued that knowledge does not flow from experience, but rather is made through experience; it was by doing things in and with the world that students would best learn. As a psychology student at Harvard in the 1890s, Lough was attracted to these ideas and, following his appointment as director of the Extramural Division at New York University, had a chance to put them into action.

Education at university — as at the primary levels of schooling — should be connected to the environment, experiences, and interests of students, Lough argued. From 1913 onwards his Extramural Division began offering credit-bearing courses at a variety of locations across New York City: onsite commercial, investment and finance courses on Wall Street, courses in government in the Municipal Building, art appreciation at the Metropolitan Museum of Art, and engineering courses at Grand Central Station.

Extending this logic, NYU also began offering summer travel courses to Europe to study economic conditions and industrial organisation in Britain and municipal planning in Germany. These courses resumed after the first world war and then — towards the end of 1923 — Lough took his ideas one step further. If summer travel courses could work, why not a whole year at sea? As he told the audience assembled at New York’s Waldorf Astoria hotel the night before the Floating University’s departure, those aboard the ship would experience “a method of study which actually brings the student into living contact with the world’s problems about to be realised.” The difference between it and what was ordinarily served up to students was, as he put it, the difference “between reading a menu and eating the full course meal.”


Putting this educational vision into practice, however, was harder than Professor Lough had anticipated. Despite some hiccups, the formal part of the undertaking was relatively successful. Students took classes while the ship was at sea; when it was in port they took part in a variety of activities, including officially arranged shore excursions, visits to host universities and free time.

Although some professors were more diligent than others, the best among them linked their curriculum on the ship to the experiences students were having onshore. Undoubtedly a good number of students didn’t attend to their studies, but the official Report of Scholastic Work on the University Cruise around the World stated that during the cruise, 400 college-level students had attended classes (79 per cent of whom sought university credit). Their aggregated marks were mapped onto a bell curve: 16 per cent of grades were As; 38 per cent Bs; 28 per cent Cs; 9 per cent Ds; 3 per cent incomplete; and 3 per cent fails. Those who were “negligent in their work on board” were, concluded the Floating University’s academic dean, George Howes, no doubt also negligent in their college studies onshore.

It was the behaviour of the students in port that proved the biggest problem. Reports of sex, alcohol and jazz made their way back to an American press hungry for scandal, and the Floating University became a byword for what could go wrong with educational travel. “Sea Collegians Startle Japan with Rum Orgy” read one newspaper headline. “More than a hundred students, among whom six girls were to be noticed, were doing intensive laboratory work this evening, in the bar of the Imperial Hotel” continued the article.

And there were plenty of unfavourable stories to follow: more trouble with alcohol, rumours of romantic relationships and sexual relations between the students, accounts of a split between the cruise leaders, and even reports of an outbreak of bubonic plague. These accounts proved such catnip to American editors that it is hard to read the newspapers of 1926 and 1927 and not come across the story.

It didn’t matter to the newspapers that unruly student behaviour was a common aspect of life on college campuses across the United States in the 1920s. “There was a certain amount of necking on board,” was how one of the students, George T. McClure, put it, “but not more than I saw at the University of Colorado last year.” Playing on the popular image of the frolicsome college student — the smoking by women, the drinking by men, and the sexual promiscuity of both — was a guaranteed way to sell papers. But not far beneath such discussions of the misconduct of American youth lurked a fear that ungoverned youthful bodies might threaten the foundations of civility at home, while also betraying a lack of national readiness for the new global role the United States was rapidly assuming abroad.


By the end of the 1920s, huge numbers of Americans were travelling abroad. Many of them were students taking advantage of new and cheap “tourist class” transatlantic fares. And while they were away, many enrolled in one of the “educational courses” frequently offered by the shipping companies. During their voyages these travellers were undoubtedly learning something about international affairs and spending huge amounts of money in the process.

In fact, a report of the time suggests that in 1930 more than 127,800 Americans travelled “tourist class” to Europe: that is, 5000 more people than were awarded a BA degree in the United States that same year. This was big business. With the Floating University and his other summer travel courses, Professor Lough had recognised the potential of this market for what was already beginning to be called “international education.”

But on the whole American universities wanted nothing to do with it. Although the trend had begun earlier, the 1920s was the decade in which they really marked out the boundaries of their empire of expertise. With newly established schools in a whole range of fields — from business administration and retailing to journalism and education — they asserted their claim to authority over both how knowledge could be acquired and whose knowledge claims should be trusted.

Rather than crediting educational travel programs, universities set about establishing what the League of Nations’ International Institute of Intellectual Co-operation called the “scientific study of international relations.” While for graduates and academic scholars who were undertaking research this might necessarily have entailed travel, for the much larger American undergraduate population it meant enrolling in credit-bearing courses and degree programs taught on home campuses, with syllabi, reading lists and assessments.

And for universities, it meant an entirely new discipline of teaching and study. It meant journals, conferences, summer institutes, government consultancies, and new paying audiences for university-sanctioned expertise.

None of this was compatible with educational travel of the kind Professor Lough envisaged. It was the university and its qualified faculty members that stood as the source of authoritative knowledge about the world, not the experiences of sundry travellers. In 1926 NYU pulled out of its sponsorship of the Floating University and over the course of the next few years abolished all its other study abroad programs. Although in 1930 the university did offer a course called Literary Tour of Great Britain, it took place entirely in a classroom in Washington Square, with readings supplied. In this 1920s contest between different ways of knowing the world, it was academically authorised expertise that triumphed, and it has undergirded the claims of universities — in Australia as in the United States — ever since.


Why does this matter?

For the last century or more, universities have derived their social standing (not to mention their income) from their claim to have authority over knowledge. They are the institutions that undertake the research, distil the learning, and provide the training so crucial to our economies and societies — or so the generally accepted story runs. Within their walls students learn from experts about the world and each other, developing both general and specialised disciplinary knowledge that prepares them not only for careers but also to be active and informed members of society.

But as anyone paying even a little bit of attention to politics and current affairs over the last decade will be aware, the university’s authority over knowledge is by no means uncontested. On the one hand, a new politics has emerged that challenges experts and their long-privileged authority, and instead prioritises personal, embodied and experiential ways of knowing. On the other hand, the proliferation of highly granular, linked and disembodied big data, and the artificial intelligence algorithms that process it, threaten to make obsolete many of the tasks that experts and knowledge workers have traditionally undertaken. Who gets to know in this new world?

There are many ways of warranting or justifying knowledge claims. In 1926 Professor Lough argued for the legitimacy of personal experience, but doing so brought him into conflict with the universities’ assertion of the authority of academic experts and “book knowledge.” But there are also other warrants for knowledge — authority, testimony, culture, tradition, or even divine revelation; all these can be invoked to support a claim to truth, and frequently they come into conflict with each other. Thinking about these conflicts can tell us a lot about how power and knowledge work in a society, especially in moments of change.

In their book Leviathan and the Air-pump, science historians Steven Shapin and Simon Schaffer examine one such moment of conflict: the historical controversy surrounding the experimental demonstrations of the vacuum pump conducted by Robert Boyle and his assistant Robert Hooke in the seventeenth century. Boyle’s approach, which emphasised systematic observation, measurement and repeatability, represented a new way of producing knowledge that conflicted with Thomas Hobbes’s emphasis on deductive reasoning and mathematical principles. But crucially, as Shapin and Schaffer show, Boyle’s effort to establish the credibility of this new, scientific form of knowledge relied heavily on the social status and reputation of those men who were performing experiments and observing them.

We might think today that scientific experiment and academic expertise are self-evident means of arriving at the truth. But as various people (from feminist, Black and anti-colonial thinkers to Trump supporters) have pointed out, they are underwritten by social conventions and forms of power. Or, to put it another way, the social recognition Robert Boyle was able to mobilise was something Professor Lough failed to muster.

Too often, expertise is cast as a neutral or natural phenomenon, but expertise also has a history, one that is intimately connected to shifts in the nature and mode of power and rule. Thinking about why the Floating University was deemed a failure in the 1920s matters because it highlights the failure in our own times to ground knowledge claims in ways that are recognisable to those outside the community of academically authorised experts.

Experience and academic learning may now not seem so far apart. Internships, service learning, study abroad programs, field studies, work-integrated and simulation-based learning, collaborative research, and capstone projects are all part of the way most universities today deliver their degrees. In the United States, the Semester at Sea program, which claims the 1926 voyage as its progenitor, even allows students to credit time at sea towards their college degree.

But these initiatives don’t really settle the questions the story of the Floating University’s 1926 world cruise ultimately provokes: Who gets to know in our society? What forms of status determine what knowledge counts as legitimate?

These are pressing questions for democracies seeking to navigate change, and they are as relevant for twenty-first-century Australia as they were for Lough and Dewey and Lippmann in the 1920s and 30s United States. •

Three “bloody difficult” subjects | https://insidestory.org.au/three-bloody-difficult-subjects/ | Mon, 03 Jul 2023

Historian Ruth Ross, the Waitangi Treaty and historical mythmaking are the subjects of a provocative account of New Zealand’s founding document that throws light on Australian debates

In 1958, while she was employed to write history materials for schools, New Zealand historian Ruth Ross (1920–1982) made a close study of the development and text of the Treaty of Waitangi, New Zealand’s two-language founding agreement between Māori chiefs and the Pākehā (Europeans). Her first essay about her findings, which she unsuccessfully submitted to Northland Magazine, made two claims that would transform scholarly and eventually public understanding when they were published in 1972.

Ross found, first, that many of the 540 signatures to the treaty were added after the stated date of signing (6 February 1840), and even after the treaty was proclaimed by the British lieutenant-governor, William Hobson, on 21 May 1840. And, second, she found that not one but two proclamations were made on that day, the second correcting the first by adding the South Island and Stewart Island to the North Island, though not because of the chiefs’ cession but because of the British “discovery” of these territories. Privately, Ross said that her research showed Britain to have been “both very well-meaning and very cynical.”

The treaty — especially approached in Ross’s painstaking manner — was seen as too esoteric for Northland Magazine’s general readership. Friends encouraged her to persist in examining the treaty documents and their context, but a new job with the New Zealand Historic Places Trust pulled her in other directions.

Māori were meanwhile asserting their collective rights in terms that confronted the official policy of assimilation. By the time Victoria University invited Ross to speak at a public forum in 1971, her views had more resonance. She argued that the Māori text of the treaty, not the English text, should be the focus of legal and political debate because most of the chiefs’ signatures applied to a Māori translation whose English original had been signed by relatively few chiefs, at Waikato Heads.

Importantly, Ross was critical of the Māori translation, particularly on the question of whether it had adequately rendered the English term “sovereignty.” Where the missionary coinage kawanatanga had been used to refer to what Māori were being asked to cede to Queen Victoria, might the Māori concept mana have better conveyed the full significance of “sovereignty”?

The purpose of historian Bain Attwood’s latest book, “A Bloody Difficult Subject”: Ruth Ross, te Tiriti o Waitangi and the Making of History, is not only to celebrate the scholarship and character of Ross (widow, wife and mother in a milieu of male professional privilege), though he surely does that, making full use of her papers. Attwood’s larger points are that the significance of historical truths is highly dependent on the context of their reception, and that any “truth” that has public utility (as knowledge of the treaty surely has in New Zealand) may “assume the form of myth.”

Myth need not mean falsehood. Rather, Attwood means that the treaty has become a kind of blank screen on which “moral, political and legal norms” have been projected. Settler colonial nations need foundation stories, and the treaty is central to New Zealand’s. It can have a “mythic history” because stories about it “have a genuine link to a genuine past” and “at least a partial relationship to past reality and what is regarded as historically truthful.” But what the treaty does, as myth, is serve needs now.


Ross finally got her treaty research published, fourteen years after her first attempt, in the New Zealand Journal of History. According to Attwood, her overriding argument was that it is hard for us to know the intentions of those drawing up and signing the treaty. In her debunking words, the treaty was “hastily and inexpertly drawn up, ambiguous and contradictory in content, chaotic in its execution. To persist in postulating that this was a ‘sacred compact’ is sheer hypocrisy.”

Revealing the treaty to be unfit to serve as a “moral compact, let alone a legal contract” (Attwood’s words), Ross saw her truth as demystifying. But changes in Pākehā/Māori relations meant this was not the argument readers noticed and valued.

In response to Māori self-assertion, the government established the Waitangi Tribunal in 1975, making the treaty a central “constitutional” document (in a nation without a written constitution). It was not the purpose of Ross’s research to establish how the treaty could work as a morally and politically central two-language text, but this was her research’s fate. Indeed, there is irony in her most iconoclastic assertion — “the Treaty of Waitangi says whatever we want it to say” — because, by the 1970s, Pākehā and Māori were wanting the treaty to say a lot.

In fact, the treaty’s protean character was not the undoing but the making of the treaty as a focus of national life. A non-debunking reading of Ross’s scholarship — a reading that found, in Attwood’s words, “that there were substantive differences between the Māori text and the English texts, that the Māori text constituted the treaty, and that any consideration of its meanings and implications should proceed on that basis” — proved unstoppable.

The jurisprudence of the new tribunal was soon shaped by Māori judge Eddie Durie, its chair from 1980 to 2004, who saw the Māori text as fundamental. The centrality of the treaty to New Zealanders’ conceptions of themselves as a socially just people has ensured the text remains in focus as the nation debates the terms of Māori–Pākehā coexistence and the mutual honouring of their sovereignties.

Attwood conveys the disconnect between Ross’s own understanding of her paper’s primary point and others’ later understandings. Her paper is open to being read both for its major argument (a treaty botched, a nation’s veneration of it misconceived) and for its minor argument (the Māori text as the treaty). In other words, as Attwood says, “Ross’s approach to an account of the treaty resembles the treaty itself” by being available to more than one purposeful reading. The minor argument has become canonical because the nation needs it.

In his book’s final two-thirds Attwood reviews the writings of treaty scholars including Paul McHugh, Claudia Orange, Judith Binney, James Belich, Michael Belgrave, John Pocock, Andrew Sharp, Keith Sorrenson, W.H. Oliver, Lyndsay Head, D.F. McKenzie and Mark Hickford. These (Pākehā) names will be well known to anyone reading New Zealand history. Their conversation has used or generated a number of terms — Whig history, common law history, juridical history, and Māori history — that Attwood has found useful in previous publications (for example, in his critique of the “juridical” alignment of Henry Reynolds’s scholarship with the High Court’s 1992 Mabo decision).

Attwood brought these terms (apart from Whig history) from New Zealand, and he now uses them to take the reader through the New Zealanders’ work, showing us the rich soil from which he grew as a historian. The debates in New Zealand are highly relevant for Australian historians who wish to respond to the Uluru Statement’s demand for truth-telling. By making this careful exposition, he has done the discipline of history everywhere a great service.

In his final chapter Attwood returns to a theme he canvassed in Telling the Truth about Aboriginal History (2005). There, he remarked that the democratisation of the production of knowledge — accelerated by “contemporary forms of technology” — “has made it difficult to agree on what historical truth comprises.” He then presented academic history as somewhat embattled by having been drawn into the public sphere to perform political and legal service.

Since then, Attwood has read Nietzsche’s On the Advantage and Disadvantage of History for Life (1874), with its typology of histories: monumental, antiquarian and critical. He endorses Nietzsche’s view that each has advantages for living and each needs the tempering presence of the other two. This seems to have had the effect of weakening his strictures on “juridical history,” though the category remains important to him.

Thus Attwood now distances himself from those who privilege the “critical.” The New Zealand practitioners of what he calls historicism concede too little public value to what people have made the treaty mean according to the political dynamics and moral sensibilities of their times. “Many of the matters at stake in regard to the treaty concern justice and ethics and so are legal and philosophical in nature rather than historical.”

Here Attwood enters a global discussion that has featured two formidable contemporary Australian theorists of history, Ian Hunter (University of Queensland) and Anne Orford (University of Melbourne), debating how we should and should not historicise international law. (Their debate is the subject of a perceptive commentary by another Australian, Natasha Wheatley, in a 2021 issue of History and Theory.)

By the final pages of Attwood’s very fine book, the reader will be acutely aware that New Zealand has been a ground for exploring a question that Australians can’t avoid: how does historical scholarship serve a democratic reckoning with a settler colonial past? The idea that he seems to find most promising is John Pocock’s proposed “treaty between histories” (that is, mutual respect between Pākehā and Māori ways of doing history that cannot be blended). We have much to learn from what they have been talking about on the other side of the ditch. •

“A Bloody Difficult Subject”: Ruth Ross, te Tiriti o Waitangi and the Making of History
By Bain Attwood | Auckland University Press | $59.95 | 320 pages

Pandemic déjà vu | https://insidestory.org.au/pandemic-deja-vu/ | Thu, 15 Jun 2023

In the aftermath of the worst of Covid-19, what does history tell us about how best to deal with the experience?

Traumatic experiences can provoke a wide range of symptoms — not just the obvious ones like flashbacks. Trauma survivors often undergo what the DSM-5 calls a severe and persistent change of worldview and loss of self-esteem. Our ability to see the world as essentially fair, and our sense of ourselves as capable of adapting to adversity, can be powerfully challenged. These changes can be accompanied by intense grief and volcanic anger. We have all just lived through a mass traumatising experience, and we are well and truly seeing those pandemic affects — grief, blame, anger — playing out on the public stage.

Twitter in particular has witnessed some egregiously bad behaviour by “Covid Zero” advocates seeking public vengeance for what they paint as a malign conspiracy to deny the hidden truths of Covid-19 — its transmission via the airborne route, its prevention via public masking and lockdowns, the threat of Long Covid and disability, and the need to aim for elimination rather than mitigation.

I was genuinely shocked to see a senior researcher, a full professor, seeking to humiliate a PhD student with whom they disagreed by naming and shaming his supervisors, implying guilt by association. I was staggered to see a medical doctor describe people at an airport not wearing masks as “oldies, fatties and crumblies.”

It sickens me to watch as one prominent ventilation advocate launches abusive screeds targeting doctors working on the Covid frontline. I feel sorry for clinicians abused because they don’t wear the level of PPE favoured by Covid Zero advocates.

If the case for Covid Zero is strong, it shouldn’t be necessary to try to publicly shame people who disagree with it. These tactics call into question the strength of the arguments that underpin this campaign. They highlight the non-rational drivers of these positions: the traumatic affects that are being given full voice on social media platforms.

As a queer person working in HIV, I’ve lived through this before, and our present situation gives me a powerful sense of pandemic déjà vu. I can’t overstate the importance of excising these practices from socially acceptable norms of conduct, while undertaking the kind of cultural production that helps us understand where they are coming from — the traumatic affects, experiences and practices to which pandemics give rise.

This can be an exceptionally slow process, prone to sparking “history wars” and paroxysms of public rage over seemingly benign topics. The queer community fought all-in battles over the changing meanings of HIV. First, as the HIV response became professionalised rather than resting in the hands of activists and volunteers. Then, following the widespread uptake of effective antiretroviral treatments, as the meaning of HIV changed from a death sentence to a lifelong, manageable condition. Finally, as the advent of preventive medication meant condoms were no longer the only game in town.

On each occasion, community figureheads took up purist and punitive positions and strategies. The damage caused to vulnerable members of the community was incalculable; it was toxic shaming at its absolute worst. It wasn’t enough to win the argument; it wasn’t even a debate in any rational sense — chosen scapegoats had to be obliterated, if not from this earth then, at least, from public view; they were shamed into silence. Each time this happened, necessary debates over HIV prevention policy and programming were set back years, if not decades.

Traumatic affect was clearly playing out in those conflicts. In 1995, at the peak of the AIDS crisis in the United States, 50,000 people were dying each year in communities that made up, on contemporary estimates, about 1.5 per cent of the adult population. That’s 1.2 per cent of that community’s population dying each year — for reference, that’s four times the mortality rate of Covid-19 in the mainstream US population.

These dry calculations only thinly approximate the human, social, relational and emotional impacts of the epidemic. It falls to cultural products like film and television and books to account for the incalculable costs.

One film that does this especially well is Robin Campillo’s 120 BPM (2018) about ACT UP Paris. (It is available to rent for $5 on Apple TV.) The film opens in a lecture theatre, in a loud and only loosely organised collective meeting, as newcomer Nathan (Arnaud Valois) watches activists planning their next protest and, while this happens, checks out cute firebrand Sean (Nahuel Pérez Biscayart). The film tells the story of the collective, their protests against recalcitrant governments and drug companies, and the emerging relationship between Nathan and Sean, which ends in scenes of unbearable tenderness.

The film interweaves the moments of tragedy and agency that were simultaneously embodied in the queer community’s response to HIV and AIDS, and powerfully evokes its protagonists’ overwhelming perception of government and the public: “They don’t give a fuck about us.” (A perception that is no doubt familiar to the many people left behind as Australia transitions into a Covid-Normal existence.) It highlights the diversity of people and groups engaged in the battle against HIV, rather than presenting cisgender, white, educated middle-class men as the heroes of the epidemic response.

Right now, we are missing two things.

First is the community infrastructure that, in the HIV epidemic, enabled affected communities to respond effectively with prevention programs, and to care for and support people living and dying with HIV and AIDS. We have a Heart Foundation, Cancer Councils, AIDS Councils, PWDA, but no organisation dedicated to representing the people most affected by Covid-19. This gap harms people who are vulnerable to severe illness and people fighting for recognition, treatment, services and research on Long Covid. In the absence of a representative body we are only hearing the loudest voices, not voices informed by the diverse needs and experiences of this community.

Second, we urgently need an investment in public storytelling that can help us understand Covid-19 as a mass traumatising experience. There is scholarly debate over whether Covid-19 lockdowns had lasting impacts on people’s mental health, often judged using simplistic measures of depression and anxiety included in longitudinal survey research. But the whole point about trauma is that symptoms often take time to emerge, and they are often quite indirect — it’s not immediately obvious where they are coming from and what has triggered them.

Trauma also has effects that play out at the collective level, reshaping how a group of people sees itself and organises its everyday life. Public narrative is one of the most powerful therapeutic interventions for grappling with and resolving individual and collective traumatic experience.

For this to happen we need the discourse over Covid-19 to shift gears. Public rage and pathos-filled personal narratives can pay off; both are ways of building an audience. But it is possible to get hooked on rage, stuck in the black-and-white, blame-and-shame mindset it produces. There is little possibility of processing the traumatic experience when you are spending hours each day marinating your brain in other people’s digitally mediated stress hormones. It is time for the merchants of rage to take a breath or take a seat. •

The silence that makes sense of modern China | https://insidestory.org.au/the-silence-that-makes-sense-of-modern-china/ | Tue, 13 Jun 2023

Two new books excavate everyday experiences of the Cultural Revolution

It’s generally accepted that China’s ultra-left, ultra-violent Cultural Revolution ended shortly after Chairman Mao Zedong’s death in late 1976. But such extreme social, political and psychological turbulence, set in motion more than a decade earlier, doesn’t just come to an end when the powers-that-be say it does. A minority of Chinese people, outraged by contemporary inequalities and nostalgic for an idealised era of egalitarianism and ideological purity, believe it should never have ended. For many more, ongoing and intergenerational trauma has ensured that it still hasn’t.

Postwar Germany dealt openly, painfully and at length with the history of the Nazi era. After apartheid, South Africa established a Truth and Reconciliation Commission aimed at restorative justice, another harrowing but necessary process. Argentina and Chile have undergone similar processes, and so too have other countries, some with more success than others.

For the Communist Party of China, though, the most relevant, instructive — and alarming — precedent for dealing with past injustice comes from the Gorbachev years, when the Soviet leadership allowed access to historical records, including those of the Stalinist era, with its purges, labour camps, man-made famines and killings. As the Chinese communists witnessed with alarm, it was not long after glasnost and its accompanying program of political reform, perestroika, that the Eastern bloc disintegrated and the Soviet Union collapsed.

None of China’s post-Mao leaders have permitted a full and honest reckoning with the Cultural Revolution (or other inglorious episodes in the party’s past for that matter). But Xi Jinping has made it a personal mission to eliminate what he calls “historical nihilism,” which is essentially any version of history that contradicts the highly sanitised party-approved version: something something misapprehension something something counter-revolution.

This historical obfuscation has been so effective that I was once asked by a young person in Beijing whether the Cultural Revolution took place “before or after Liberation [in 1949].” Yet, as Tania Branigan puts it in Red Memory, understanding what happened in the Cultural Revolution is vital to understanding China today. It is nothing less than “a silence, a space, that [makes] sense of everything existing above or around it.”

Red Memory is one of two important new books that offer English-language readers a look at the history of that period and how it continues to affect society and politics in China today. Both Red Memory and Wang Youqin’s monumental Victims of the Cultural Revolution shift the usual focus from the pronouncements and machinations of the top leadership to the experiences of the people, inside and outside the party, who were directly affected by them.

Branigan, who reported from China for the Guardian for seven years beginning in 2008, interviews survivors, victims and perpetrators, and their children, an artist who paints them and psychoanalysts who treat them. She meets people whose actions — or failures to act — led to the torment, torture and even murder of friends and family, and who must cope with that hard truth every day, and others whose lives and families were destroyed by the violence. She shows how the trauma inflicted by the Cultural Revolution was not just national and individual but intergenerational as well.

Given that it was “an age of betrayal, of political choices fuelled by fear, idolatry, adolescent rage, marital bitterness and self-preservation,” Branigan is impressed by how many “stood firm” and refused to bend under pressure. She is taken aback by those who cling to the ideas and ideals of the period, whose phones ring to the tune of the “Internationale” and who organise trips to North Korea “to admire society as it should be.”


Among Branigan’s interviewees is the author of Victims of the Cultural Revolution. Wang Youqin was fourteen when her Red Guard classmates battered their teacher to death in what would later be seen as a pivotal moment in the movement’s violent turn. She reflected in a secret diary at the time that she was powerless to change the “bad things” that were happening all around her, but she could at least record them. Wang, who counts Anne Frank and Aleksandr Solzhenitsyn among her inspirations, has since moved to the United States, where she teaches university-level Chinese while continuing the documentation that has become her life’s work.

Conservative estimates place the number of unnatural deaths associated with the Cultural Revolution’s violence, including murders and suicides, at nearly two million. Most victims have never had their stories told or their sacrifices honoured. To date, Wang has interviewed more than a thousand survivors and witnesses, meticulously checking archives and other sources to corroborate their testimonies and fill in, or correct, details. Since 2000, she has been publishing the results on her Chinese-language website Chinese Cultural Revolution Holocaust Memorial. Unsurprisingly, the website is blocked on the mainland.

She has now produced from these materials the prodigious Victims of the Cultural Revolution. Superbly translated, annotated, edited and abridged by Stacey Mosher, it tells the stories of 659 people. They include famous writers and political figures as well as cooks, police, factory workers, farmers and sports coaches, among many others. But the majority are educators, from primary school teachers up through professors and university presidents. Educators were archetypal targets of the violence and students among the worst perpetrators.

The Chinese original was ordered alphabetically by name (according to Pinyin romanisation), which would have condemned the book to obscurity in English, a resource for specialists only. By working closely with Wang to reorganise the text with attention to chronology, theme and place, Mosher has helped craft a compelling and contextualised narrative that is essential reading for anyone with an interest in modern Chinese history.

One breakout from the text is a table that lists sixty-three victims from Peking University alone. The table has columns for names, gender, ages (ranging from twenty to seventy-seven), status (typist, professor, worker, librarian, canteen cashier, student, “father of legal department administrator,” equipment room manager), department (you name it), Communist Party or Communist Youth League membership (which twenty had) and cause of death (beatings, leaping from heights, poison and vein cutting, shot in crossfire, hanging, lying on railway line and so on).

Wang doesn’t spare the reader the details of the physical and psychological savagery experienced by the victims. The images of the Cultural Revolution encountered by most Western readers probably include pictures of Tiananmen Square crowded with Red Guards ecstatically waving Mao’s Little Red Book, and kitschy stills from Red Detachment of Women, as well as the odd photo of a struggle session with victims kneeling on a stage wearing giant dunce caps and placards around their necks, perhaps with a Red Guard gesturing above them, belt in hand. Wang tells us how heavy those caps were, and how some people were whipped so fiercely with the belts that their shredded clothing was embedded in their broken flesh.

Woven throughout these stories of terror, moral plight and violence are Wang’s astute observations and analyses, personal stories from her meetings with witnesses and survivors, and comparisons both with other repressive historical eras in Chinese history and with the Stalinist purges and the Khmer Rouge’s killing fields.

She shows that both Mao and premier Zhou Enlai (generally seen as a mitigator of the movement’s worst excesses) knew, often in significant detail, about specific acts of violence. Their enthusiastic support for the Red Guards meant that the murders, especially on campuses, “were carried out with great fanfare and were considered meritorious and honourable.” They received appeals from some victims, and occasionally intervened on their behalf; but Mao ignored a personal appeal from Li Da, the president of Wuhan University who, along with Mao, was one of the dozen or so founding members of the Communist Party. The seventy-six-year-old was “struggled” outdoors multiple times in the furnace heat of Wuhan’s summer, soon after which he collapsed and died.

Wang writes of how her immersion in these tragic stories has affected her. She admits that friends supportive of her work worry about her mental health. Yet “now that I’ve started,” she writes, “I have to continue, even if it tears at my soul like a wire brush.”

Strikingly, the longer biographies in Victims often include the victim’s role in the many political campaigns from the early 1950s onwards: some were victimised again and again. Others were former models of official thought reform, and even participated in the persecution of “class enemies” or “counter-revolutionaries,” never dreaming that they would one day find themselves so accused. “People who helped build the machinery of persecution,” Wang observes, “risked being crushed alive by that very machine.”

As for those who, under torture or threat, made false confessions or incriminated others, she comments: “It is futile to hope for people to be impervious to gun and knife; the best we can do is glean some kind of truth from history and use it to establish a system under which human flesh is no longer obliged to withstand the cold, hard steel of autocracy.” •

Red Memory: Living, Remembering and Forgetting China’s Cultural Revolution
By Tania Branigan | Faber | $32.99 | 304 pages

Victims of the Cultural Revolution: Testimonies of China’s Tragedy
By Youqin Wang | Oneworld Academic | £50 | 592 pages

The evolution of a myth | https://insidestory.org.au/the-evolution-of-a-myth/ | Mon, 29 May 2023

How William Cooper became “the man who stood up to Hitler”

As recently as the early 2000s the Aboriginal leader William Cooper (1860–1941) was barely recognised in his own country. But he has been celebrated in recent years, and this greater recognition can be attributed to a story that has come to be told about him: the story of “the man who stood up to Hitler.” The story’s origin lies in a verifiable event: in December 1938 the Australian Aborigines’ League, an organisation headed by Cooper, tried to present a petition to the German consul in Melbourne protesting against Nazi Germany’s persecution of Jewish people the previous month.

That fragment of a story began its evolution when the well-known Melbourne Aboriginal activist Gary Foley came across a brief report about the event in a newspaper of the time. In an essay published in 1997 he drew a connection between the League’s protest and the event now widely known as Kristallnacht, a Nazi-sponsored pogrom against Jewish people. Foley believed the League was the first group in Australia to try to formally protest against the German government’s persecution of Jewish people, but his main aim was to draw attention to Australia’s persecution of Aboriginal people by noting the comparison with the Nazis.

The story Foley told about the League’s protest piqued the interest of staff at Melbourne’s Jewish Holocaust Museum and Research Centre (known widely at the time as the Jewish Holocaust Centre). It no doubt struck them as a good example of a people seeking to combat racism, and especially anti-Semitism. Telling a story about it would be a means of advancing the centre’s educational goal.

In the years that followed, the thread of the story told by Jewish institutions and organisations, here and in Israel, solidified into the account we know today. The protest at the consulate was the heroic work of one man, William Cooper, rather than the political organisation he represented, let alone any broader political movement of which it was a part.

According to this account, Cooper’s was the only non-governmental protest made in Australia, or even the world, against the Nazi persecution of Jews. In raising his voice, Cooper was bearing witness to the Nazi genocide of European Jews. His act sprang from his courage, humanity and compassion, and his empathy with the Jewish people, rather than any intention to advance his own people’s interests. It was all the more remarkable and worthy because he was standing up for the rights of Jewish people despite having no rights himself.

How did the story come to take on mythological qualities in this way? When they learned of Foley’s discovery, leading figures at the Holocaust Centre may well have also been influenced by two other narratives: that Kristallnacht was a turning point in the history of Nazi Germany’s treatment of Jewish people, which culminated in the Holocaust, and that tens of millions of gentiles had stood by while the Holocaust took place.

Foley’s argument that the League had been the first Australian organisation to raise its voice against the pogrom provided a striking counterpoint to the behaviour of other bystanders, or so it was believed. Just a few years earlier, Steven Spielberg’s remarkably popular Hollywood movie Schindler’s List had told a similarly uplifting story about an unlikely figure who rescued hundreds of Jews from the Nazi genocide.

I imagine the Holocaust Centre — and Melbourne’s Jewish community more generally — would also have been attracted to the story of the League’s protest because a particular kind of politics had become increasingly influential in Australia and many other Western societies — the politics of recognition, whose key words included remembrance, rights and reparation. Many non-Aboriginal people, or at least Anglo-Australians, now felt moved to tackle what was called “the great Australian silence” about Australia’s history of Aboriginal dispossession, displacement, destruction and discrimination. Increasingly, some were characterising this history as genocide — most recently in a report for the Australian Human Rights and Equal Opportunity Commission about the stolen generations.

The Holocaust Centre’s key figures would likely have been influenced, too, by a shift in how the past was being recounted and how people were relating to it, namely the rise in both scholarly and public circles of what was called “memory,” especially in the form of testimony, which had occurred most notably in accounts of the Holocaust. Those who had experienced a historical event had come to be seen as the most authoritative bearers of the truth about the past, so much so that “memory” was increasingly regarded in the media, and even by some scholars, as a substitute for history as told by academic historians, rather than just a supplement to those accounts.

An emerging scholarly and popular discourse also encouraged dividing those present in difficult historical circumstances into perpetrators, victims, collaborators, bystanders and resisters. In the case of settler societies, Indigenous people were called on to recall the past as victims; and non-Indigenous people were urged to listen to their testimony, acknowledge the truths they uttered, recognise the pain they had suffered, repudiate a past in which European ancestors were held to have been perpetrators, collaborators or bystanders (though sometimes resisters), and make amends for its legacies.


Over many years, this story has been told repeatedly in myriad forms outside the Jewish community: in commemorations, memorials, exhibitions, re-enactments, naming ceremonies, news reports, radio programs, books, magazine articles, essays, plays, paintings, musical compositions, blogs, videos and podcasts. It has been taken up by numerous government institutions and embraced by many sympathetic Anglo-Australians.

In December 2010 the largest and most senior Australian parliamentary delegation ever to visit Israel travelled to Jerusalem to participate in a series of events commemorating the League’s protest. In November 2017 Felix Klein, the German government’s representative responsible for relations with Jewish organisations and issues relating to anti-Semitism, accepted in Berlin a document purporting to be the petition, and in December 2020 he issued a formal apology for the German consul’s refusal to accept the petition some eighty years earlier.

In Australia, government bodies decided to name places in Melbourne in Cooper’s honour — a federal electorate, a building that houses several law courts and legal tribunals, an institute at a university, and a footbridge at a train station — and in each instance reference was made to the protest to the German consulate. The Aboriginal filmmakers Rachel Perkins and Beck Cole saw no reason to discuss the protest in their 2008 documentary First Australians; in 2020 the Aboriginal radio broadcaster Daniel Browning commissioned an episode of the ABC’s AWAYE! about Cooper that was framed by the story.

Cultural institutions have followed suit. The National Museum of Australia’s website feature “Defining Moments in Australian History” includes the story. Heritage Victoria has taken an interest in two of the houses in which Cooper lived, each of which displays an account of the story. The Victorian government department of education and training has included the story in its curriculum. And a historical society in Cooper’s traditional country has created an online exhibit about Cooper that focuses on the protest to the German consulate.


In the account of the protest I give in my life history of William Cooper, key historical facts are different. For example, the deputation to the German consulate can’t be attributed to Cooper alone, for while he was the Australian Aborigines’ League’s principal figure, League members (who included a whitefella by the name of Arthur Burdeu) played a major role; and the League, let alone Cooper, was by no means the first in Australia to formally protest against the Nazi German persecution of Jews after Kristallnacht, for two left-wing organisations had already tried to deliver a protest to the consulate in Melbourne. Nor can we be sure that Cooper was present when the attempt was made to hand over the petition — indeed, it is quite likely that he was not, as his health had declined considerably by this time.

More importantly, in the story that I tell, the meaning of this event is different. My point of departure is that the League was quintessentially a political organisation — and an Aboriginal one at that — and that it consequently went about its work in a strategic fashion, always considering what might be the best possible ways to fashion a case that could persuade white Australians to support its struggle to improve the lot of its people.

In the month prior to drawing up its petition, the League had evidently been conducting much of its political work by drawing parallels between Nazi Germany’s persecution of Jewish people and more than one Australian government’s treatment of Aboriginal people. It pointed out that the persecution of Indigenous people in Australia was akin to that experienced by racial minorities in Europe, and asserted that Australia should be as concerned about the rights of its people as it was about the rights of those other minorities.

One sentence in particular in the League’s petition — or what has survived of it — provides evidence that this was the point of its protest: “Like the Jews, our people have suffered much cruelty, exploitation and misunderstanding as a minority at the hands of another race.”

Other parts of the historical record also suggest that the League’s protest sought to draw attention to similarities between the treatment of the Jewish minority by the Nazi German government and the treatment of the Aboriginal minority by Australian governments.

Barely ten days after the League left its petition at the German consulate, a letter in Cooper’s name sent by the League to a federal minister said, “We feel that while we [Australians] are all indignant over Hitler’s treatment of the Jews, we [Aboriginal people] are getting the same treatment here and we would like this fact duly considered.” Several months later, in a letter now held in the National Archives, Cooper told prime minister Robert Menzies, “I do trust that care for a suffering minority will… not allow Australia’s minority problem to be as undesirable as the European minorities of which we read so much in the press.”

Shortly afterwards, following the outbreak of war, Cooper spelled out the kind of connection he and the League were trying to make in protesting against the Nazi persecution of German Jews. “Australia is linked with the Empire in a fight for the rights of minorities…” he told Menzies. “Yet we are a minority with just as real oppression.” A year later, Cooper’s protégé Doug Nicholls posed this rhetorical question to a congregation in a Melbourne church: “Australians were raving about persecuted minorities in other parts of the world, but were they ready to voice their support for the unjustly treated Aboriginal minority in Australia?”

My interpretation of the League’s protest rests not only on these written historical records but also on a source that can be seen as the product of collective memory or tradition. About a year before the League’s deputation to the German consulate, Cooper, in the course of speaking at length to a white journalist, referred to his people’s “horror and fear of extermination,” saying: “It is in the blood, the racial memory, which recalls the terrible things done to them in years gone by.” (His most important political act, a petition to the British king, also expressed a fear of extermination by speaking of the need to “prevent the extinction of the Aboriginal race.”)

These statements give a sense that the League was drawn to make its protest to the German consulate because, consciously or unconsciously, its members identified themselves with German Jews as a result of their own people’s experience of violence.

The story I have told of the League’s protest makes clear that the story of Cooper as the man who stood up to Hitler has leached the event of the meaning it had at the time for those responsible for it, and the meaning it could have today.


In everyday parlance, “myth” refers to a statement that is widely considered to be false. In using this word to describe the story of Cooper as the man who stood up to Hitler I don’t want to exclude this connotation, but I have something more ambiguous in mind.

Most myths have a genuine link to a genuine past. To be considered plausible, an account of the past must have at least a partial relationship to past reality, and thus to what is regarded as historically truthful. In this instance, it is a historical fact that the Australian Aborigines’ League sent a deputation to the German consulate in Melbourne in December 1938 to present a petition in which it protested against the persecution of the Jewish people.

But the rest of the story is a good example of what the great British historian Eric Hobsbawm once called “the invention of tradition.” It has been created by projecting onto Cooper a purpose and a character that the storytellers wish him to have had. The historical fact of the deputation aside, none of the story has been formed on the basis of the historical record.

Like most myths, this story achieves its most powerful effects not by falsifying historical material — though one of the organisations that has played a leading role in producing the story has fabricated the petition the League presented to the German consulate — but through omission, distortion and oversimplification.

Consequently, Cooper is recognised not because of his people’s loss, pain and suffering, but because he recognised the Jewish people’s loss, pain and suffering. This is the point of the storytelling. As a result, the popular account deflects attention from the devastating impact of racism and colonialism on this country’s Aboriginal people, and their struggle to lay bare its legacies and get them redressed. Such can be the cunning of recognition. What might purport to recognise the history of Aboriginal people misrecognises it.

The degree to which this myth has distorted how Cooper should be remembered — and the costs of that distortion — is highlighted by comparing it with what Gary Foley was doing. He was practising history in keeping with the discipline’s protocols, which include the recovery of the relevant historical texts and historical contexts, and was also adopting the techniques of Aboriginal history, a subdiscipline that seeks to make sense of the past and its presence by engaging with Aboriginal historical sources, subjects, agency and perspectives. The evolving story of Cooper’s protest ignored Foley’s main aim: he was seeking to draw attention to parallels between the murderous Nazi German campaign against the Jews of Europe and what had happened and was still happening to Aboriginal people in Australia.

In Australia, as elsewhere, history as a way of knowing and understanding the past threatens to be displaced by myth and memory (or what is deemed to be memory) that make claims about the past that are seldom tested and provide little explanation for what happened and why. Yet, as historian Allan Megill has suggested, “truth and justice, or whatever simulacra of them remain to us, require at least the ghost of History if they are to have any claim on people at all. What is left otherwise is only what feels good (or satisfyingly bad) at the moment.” •

Boomer time | https://insidestory.org.au/boomer-time/ | 24 May 2023

Inside Story editor Peter Browne introduces a memoir of Australia’s fifties by contributor Robert Milliken, who died last Sunday

Since our mutual friend Hamish McDonald sent news that Inside Story contributor Robert Milliken had died on Sunday morning I’ve been thinking about how best to write a short piece — an appreciation rather than an obituary — sketching his life and career.

The task is complicated by a paradox. As well as having a great gift for friendship Robert was in many ways a very private person. So I’ll leave it mainly to the extract below — from a short family history he was working on — to give a sense of the forces that created a gifted reporter who published thousands of carefully crafted pieces over a more than fifty-year career.

Robert spent his childhood in Wingham, a NSW town on the Manning River, where his parents ran a residential hotel. Those years left him with warm memories of the character and pace of postwar country life, tempered by a growing sense that change was inevitable. More importantly, life at the Wingham Hotel — a microcosm of rural Australia — fuelled in him an intense curiosity. Journalism seems always to have been the logical end point of those early influences.

After studying politics at the University of New South Wales he took up a cadetship with the Sydney Morning Herald, where his reporting skills were soon apparent. He became known to readers outside Sydney after he moved to another Fairfax paper, the National Times, to write and edit features.

He was also contributing Australian news to the Guardian in London, and it was probably those pieces that attracted the attention of the Independent, the exciting new paper launched by a trio of journalists in London in 1986. One of his first assignments as the paper’s Australian correspondent was the legally delicate job of covering the Spycatcher trial. Reporting on this attempt by the British government to suppress the Australian publication of a controversial MI5 memoir was complicated by a ruling by the Law Lords back in London, who had declared any mention of the book’s contents off limits for the British media.

After more than a decade with the Independent Robert was appointed Australian correspondent for the Economist, to which he continued contributing — regularly then occasionally — until quite recently. Throughout those years he also contributed to Australian magazines including Australian Society, Anne Summers Reports, the Good Weekend and, from 2009, Inside Story. For a time he wrote editorials for the Sydney Morning Herald.

Somehow during these years he found time to write a history of British nuclear testing in Australia, a book about rural Australia’s social and economic upheaval and a biography (extracted here) of the pioneering rock journalist Lillian Roxon.

Among his articles for Australian Society were two on the royal commission into Aboriginal deaths in custody. That interest in Indigenous affairs carried over into two outstanding pieces for Inside Story based on visits to Bourke and Moree to see innovative justice projects in action. Among his other features for Inside Story were a profile of the maverick western Sydney Liberal Craig Laundy, an account of the migration-led revival of Dubbo, and a report on the unveiling of a new statue, also in Dubbo, of Aboriginal rights leader William Ferguson.

He was a fierce critic of Australia’s treatment of refugees and an equally fierce advocate of an Australian republic. He wrote meticulously but responded amiably to editorial meddling. His circle of friends and acquaintances was wide, and he was invariably a welcoming presence during my visits to Sydney. I am among the many who will miss him enormously.

Here, then, is a short extract from Robert’s last writing project…


On Friday 20 September 1946 the Wingham Chronicle carried a small item near the top of its “Personal” column: “Mr and Mrs Dave Milliken, of the Wingham Hotel, are being congratulated on the birth last weekend of a son and heir.”

The son and heir was me. My sister and only sibling, Sue, had been born six and a half years earlier, but no one ever called her a daughter and heiress. My birth came in the first year of the baby boomers, the post–second world war generation whose arrival presaged big social change. But old attitudes on women’s role in society, and much else, still died hard.

Heir to what? My grandfathers, Harry Cross and James Milliken, had separately built enterprises of the kind around which life on the mid-north coast of New South Wales, and in many other rural regions, revolved: a country hotel and a dairy farm. The worlds these two institutions encompassed had barely changed in at least fifty years. But they were about to do so, not least for their baby-boom grandchildren.

It was probably 1950 when the first of us boomers became aware of the world around us. Shorn of the privations of economic depression and war, we were defined by youth and renewal: the opening up of education, the postwar rebuilding, the arrival of different sorts of people from the mono-Anglo immigrants of our parents’ generation, and a new popular culture captured largely by the biggest glamour figures of all time, Marilyn Monroe and Elvis Presley. All this spelled confidence. How could we not be different?

I’ve long wanted to write about my childhood in an Australia that has largely passed, where people in New South Wales, at least, lived according to simpler patterns and precepts. Political trends and social mores seemed set in stone; few, if any, questioned them. There were no movements to advance the interests of women, immigrants, First Australians, gay people and others outside society’s masculine conformity because they barely seemed to exist.

Inevitably, my two grandfathers — whose businesses defined much about the rural Australia I entered — provided the stepping-off points. The first baby boomers were born during a crucial transition, from the tail end of the era of European expansion to the opening up of new cultural frontiers.

The Wingham Hotel, also known as Cross’s Hotel, stood confidently and invitingly at the entrance to Wingham, a town of perhaps 3000 people on the Manning River, about 320 kilometres north of Sydney. The Milliken farm, “Magheramorne,” faced the Wallamba River at Darawank, a hamlet near the Pacific Ocean about thirty-six kilometres southeast of Wingham.

Before the days of motels and licensed clubs, country hotels like ours played key roles in country life. They were the places where people stayed, ate, met, did business and, at the Wingham Hotel at least, lived. The residents weren’t people just looking for somewhere cheap to doss. They were what today would be called young professionals, for whom the hotel offered comfort and security.

In my first few years, residents included a pharmacist, a doctor, an ex–prisoner of war from Changi, and the venerable Miss Paterson, who became Wingham’s first female health inspector in 1949. They were the “permanents” who, in some ways, became part of the family.

Yet social mores kept familiarity at a distance. We called them Mr, Mrs or Miss, never by their first names (the honorific Ms hadn’t been coined). When I met her again fifty years later in her retirement in a mid-north coast beach town, Miss Paterson gave a sense of how these rigidities were starting to break down when she landed in Wingham after the war.

“There was a first-name basis largely, and I didn’t think that was right,” she told me. “You weren’t going to have a disciplined staff if they were going to call you Bill and Joe and whatever. So I was trying to educate them, but I don’t think I had any success at all. In the office itself, the girls all called one another by their first names, but maybe I just looked difficult. The town clerk always called me Miss Paterson. Some of the labourers would come in and say, ‘Is Jim in?’ meaning the town clerk. I’d give them a lecture, and say Mr-whoever-was-the-town-clerk was in.”

Social life was more relaxed, with people expressing their feelings in sayings that have largely fallen out of use. Instead of swearing, publicly at least, they said “Strike a light,” “Spare me days,” “God strewth” or just “Strewth” to convey shock or exasperation, and “God give me strength” for outright disapproval.

I didn’t inherit either the hotel or the dairy farm, but each of them has remained embedded in my imagination. That’s because the hotel in particular, and even the farm, were such vibrant places where people, not machines, computers and algorithms, were the drivers of daily life.


By the time I was born, both grandfathers were dead. My parents, Thelma (known as Thel), Harry Cross’s elder daughter, and David (known as Dave), James Milliken’s youngest son, had married in 1939 and, the following year, taken over the Wingham Hotel in partnership with Thel’s younger sister, Jennie. We lived as a family in a sprawling flat upstairs, and while Thel, Dave and Jennie were running the business downstairs Sue and I were endlessly fascinated by the colourful cast of characters — staff, patrons, diners, drinkers, travelling salesmen and visitors of all kinds — who thronged the hotel’s kitchen, dining room, bars and lounges.

In some ways, it was like living in a frontier town of the kind depicted in the Westerns that featured in Wingham’s two cinemas (then known as picture theatres) in the 1940s and 50s. One artist’s depiction of the approach to Wingham — looking across the Cedar Party Creek bridge and up the rise of Wynter Street to the Wingham Hotel — evokes the town entrance of my childhood, unchanged as it must have been for decades. I imagine coaches bringing people along the dirt road and bullock trains taking freshly sawn native cedar and eucalyptus logs from forests in the hills around Wingham, down Isabella Street to the wharf, where they were shipped to Sydney and the wider world.

Wingham’s own world was a self-sufficient one. There were no supermarkets, no clothing or hardware chain stores owned by distant conglomerates. Local families — the Moxeys, the Gleesons, the Maitlands, the Mellicks and others — owned and ran the local businesses that provided food, groceries, clothes, farm equipment and almost every provision townsfolk needed.

This self-sufficiency helped to give Wingham and its district’s tight-knit population a strong sense of identity. So did the local economy, which revolved around dairy and beef farming and timber. It belonged to a world in which most of Australia’s exports came from the bush. That, too, was about to change, as hardships from the past faded away and the new golden age, born with the baby boomers, began.

Thel, Dave, Jennie and their generation had lived through two of the worst events of the twentieth century: the Depression of the 1930s and the second world war. The war had come to the Wingham Hotel in various ways. Family friends went, or were sent, to live there, seeking sanctuary from isolation and attack. And Dave fought battles of a different sort with government authorities over the rationing of beer.


Although the war had ended just a year before I was born, through my childhood eyes it was as if it had never happened. A new world of abundance and prosperity was unfolding.

A fortnight after I was born Ben Chifley won the 1946 election for the Labor Party, claiming Australia was “about to enter upon the greatest era in her history.” The start of the baby boom fuelled demand for housing and consumer goods, and a big rise in immigration helped to underpin postwar economic expansion. As the historian Stuart Macintyre observed, “The third quarter of the twentieth century was an era of growth unmatched since the second half of the nineteenth century.”

Along with growth and prosperity, three events in 1949, three years after I was born, roughly defined the world I was entering. Mao Zedong led the Chinese Communist Party to power, founding the People’s Republic of China. The Soviet Union successfully tested its first atomic weapon, ending America’s monopoly as a nuclear power. Those two events consolidated the cold war: a strategic rivalry between the West and the Soviet Union and its allies, including the fear of nuclear war, that was a fundamental feature of the 1950s.

The third significant event of 1949 that helped fix Australia’s political world happened closer to home. Bob Menzies, founder of the Liberal Party, won the 1949 federal election, and remained Australia’s prime minister for a record seventeen years. Menzies was a consummate politician for whom the economic boom at home and the cold war’s uncertainties abroad facilitated a hold on power. The government’s anti-communist rhetoric pervaded the 1950s, with Menzies warning of Australia falling victim to a “thrust by Communist China between the Indian and Pacific Oceans.”

There was little sense of a new form of postwar Australian nationalism emerging. Another twenty years had to pass for that to happen. Menzies, the ultimate Anglophile and monarchist, folded Australia’s identity into its British colonial heritage just as that world was growing rapidly out of date. In a speech to the US House of Representatives in 1950, he declared: “The world needs the United States of America. The world needs the British peoples of the world.” He made no mention of his own land as a separate sovereign entity.

As a child at Wingham public school, opposite our family’s hotel, I attended Empire Day, a curious annual celebration of the British Empire, with bonfires and fireworks, that ceased only in 1958. The Biripi Aboriginal community, who’d lived in the Manning Valley for tens of thousands of years before the Crosses, Millikens and other settlers arrived, were not included. The empire had robbed them of their lands and much of their cultural heritage. They were not seen, and nor did the school mention their names or story. As a child, I didn’t know they existed; to my knowledge, I never saw an Aboriginal person in Wingham.

In the first years of the baby boomers, Aboriginal Australians were kept in their colonial-era places, the missions and settlements, usually in squalor. Purfleet, near the Manning town of Taree, and a settlement in Forster, at the mouth of the Wallamba River, offered my first childhood glimpses of Aboriginal people, but only as we drove past, and with no discussion of who they were or how they got there. Righting injustices was not part of Australia’s immediate postwar agenda.

Too much else was happening to redefine postwar Australia as a land of wealth, confidence and leisure. The first Sydney–Hobart yacht race was held in 1945. Australia started making cars in 1948. Construction of the most ambitious public enterprise — the Snowy Mountains Hydro-Electric Scheme — started in 1949. Many of the workers who built that project, who comprised the first wave of immigrants drawn from European countries other than Britain, were trailblazers of the multicultural profile that eventually changed the country’s human face.

The changes didn’t stop at home. Overseas, Australia was joining the American Century. To replace our old dependence on Britain, we looked across the Pacific to form security alliances with our new “great and powerful friend,” as Menzies called the United States, which had led us to victory in the Pacific war. America’s cultural influence reached a zenith during the 1950s, when the first wave of baby boomers came into childhood. The surge of popular culture from America included the birth of rock-and-roll, resonating among a new generation in an Australia that had given barely any encouragement to local voices in film, drama or music.

All this gave a young baby boomer the sense of an exciting and prosperous, yet secure world. Menzies’s reassuring tones on the radio and in newsreels (television didn’t come to Australia until 1956) helped see to that. The rhythm of life in the sheltered worlds of the Wingham Hotel and the Magheramorne farm, and elsewhere, hardly varied from one year to the next.

And yet it was about to change. In the mid 1950s, Thel and Dave sold the Wingham Hotel, bringing to an end a family ownership of three generations. We moved to Glory Vale, a beautiful farm near Gloucester, also on the Manning River. I rode a horse every day to a one-room bush school. In this unlikely place, we had a brush with Hollywood glamour when the star Anne Baxter settled incongruously for a time further along the Manning. A way of life for rural Australians would soon pass forever. •

Slicing the tide | https://insidestory.org.au/slicing-the-tide/ | 16 May 2023

English writer Alethea Hayter pioneered a new way of framing history

When the editor of Inside Story asked me to review Faber’s reprint of Alethea Hayter’s book A Sultry Month: Scenes of London Literary Life in 1846 — a book that “shifted the way I looked at history,” according to one reviewer — I happily agreed. I was drawn to the task partly because I have myself recently tried reconstructing London life in the 1840s for a family history, but mostly by a longstanding interest in historical “slicing.”

Almost fifty years ago, in 1977, the historian Ken Inglis proposed that the history profession should mark the bicentenary of the arrival of the first convict ship in Sydney Cove in 1788 by publishing a series of multi-authored volumes spanning Australia’s 200 years. Most significantly, he suggested that four of these should be “slice” volumes, each telling the history of a single year: 1788, 1838, 1888 and 1938. They would recreate past lives and understandings at a particular point in time, without historical hindsight and imposed interpretation.

His plan was not well received by the profession. Historian Graeme Davison remembers the slice idea being described as “idiosyncratic, antiquarian, monopolistic and, worst of all, anti-historical.” After the publication of the volumes in 1987, reviewers argued that the slices were not “real history.” Kinder critics like Janet McCalman acknowledged that slicing produced vivid recreations of past lives, but found it “frustrating that we do not know how these lives turned out.” Historian Jill Roe wrote that “I’ve met some interesting people” but in the end historians should “get on with making sense of our history.”

I am not a disinterested reporter here. Alan Atkinson and I organised “the push from the bush” — a group of historians from disparate disciplines, professional and non-professional — to write the 1838 slice volume. It seemed to me at the time, and it is far more evident now, that any attempt at “making sense” of Australian history was more complex and contradictory than our critics allowed.

In 2023 it is even clearer that the big questions that used to drive historians have failed us, and we are the wiser for that realisation. The acknowledgement of many voices as makers of our history has enriched our understanding even as it unsettled our certainties. And slicing hasn’t gone away: Google Scholar tells us that the bicentennial slice volumes have been regularly cited across the years, and are still useful to young scholars. This is not something one can say for most histories published in 1987.

The reprint of Hayter’s A Sultry Month is timely for another reason, too. Slicing literary history is in the news, at least in literary circles. The administrators of the annual Baillie Gifford Prize for non-fiction, the equivalent of fiction’s Booker Prize, have just celebrated its twenty-fifth anniversary with a Winner of Winners Award chosen from the past twenty-five years’ prize-holders. The uber winner is James Shapiro’s 1599: A Year in the Life of William Shakespeare, described by the judges as “an ingenious fusion of history, politics, and literary criticism.” The judges claimed a particular significance for Shapiro’s book as a creative work of narrative non-fiction, telling stories that are true in an age of fake news.

A Sultry Month might equally be described as “an ingenious fusion of history, politics, and literary criticism.” But the understanding it presents of historical truth is more complicated than the Baillie Gifford judges’ equation of “non-fiction” with “true stories.”

Hayter’s book was hailed as extraordinary and pioneering when it was first published in 1965; it still deserves these descriptors today. She begins with an impressive claim for the historical truth of her story in terms of her sources:

Nothing in this book is invented. Every incident, every sentence of dialogue, every gesture, the food, the flowers, the furniture, all are taken from the contemporary letters, diaries and reminiscences of the men and women concerned, nearly all of them professional writers with formidable memories and highly trained descriptive skills.

From these sources Hayter presents, as she says, “a set of authors… as a conversation piece of equals, existing in relationship to each other at a particular moment, encapsulated with one dramatic event in an overheated political and physical climate.”

The overheated physical climate is the heatwave that enveloped England in June 1846 — “the hottest summer month that anyone could remember.” The overheated political climate centres on the repeal in mid June of the Corn Laws that had protected English wheat growers for decades. The heroes of this story are Sir Robert Peel, under vicious attack in the Commons for betraying the landed interests at the heart of the Conservative Party, and Peel’s loyal lieutenant in the House of Lords, the old Duke of Wellington.

In Hayter’s story the two men function as the reluctant agents of change, condemned by the older generation and applauded by the young. She tells how the historian Thomas Carlyle sent a copy of his biography of Cromwell to Peel with a note hailing “the great veracity” Peel had carried off in parliament, “a strenuous, courageous and needful thing.” For Carlyle the deed was “true” because it was on the side of history.

Thomas Carlyle is a central figure in the “set of authors” whose conversation Hayter follows across this heated month. The parties hosted by Carlyle and his wife Jane bring together most of the subjects of this group biography: Robert Browning, Alfred Tennyson, and others forgotten today but esteemed by their peers.

One major player, Elizabeth Barrett, barely leaves her room in Wimpole Street during June; she is linked to the “set” by her relationship with Robert Browning. Across the month the young couple, the Barrett-Brownings, and the older couple, the Carlyles, move to moments of crisis. Jane Carlyle discovers a deep dissatisfaction with her marriage, to her husband’s astonishment. Elizabeth Barrett and Robert Browning plan their secret marriage and elopement to Italy, only achieved in September and for this reader too briefly described — a casualty of slicing.

These well-known players draw and hold the readers’ interest, but the narrative is mostly driven by someone of whom few will have heard. The central figure in Hayter’s “one dramatic event” is the painter Benjamin Haydon, an artist of mediocre ability and huge ambition. Haydon knew, in Hayter’s words, that “he had been called by God to raise his country’s tastes” to an appreciation of “High Art” — “historical painting,” that is — “the only real art was huge pictures of the heroic actions of history.”

Haydon’s large canvasses were admired in his lifetime, though more for their motivating ideas than their artistry, and they didn’t sell. Other artists were chosen to paint the historical murals he had proposed for London’s public buildings. His lasting contribution to the visual arts was his campaign to popularise the Elgin Marbles, ensuring they remained in the British Museum — a more virtuous legacy in 1965 than it seems today.

Hayter opens A Sultry Month with the moment on 18 June when Benjamin Haydon left five pictures and three trunks of papers with his “friend by correspondence,” Elizabeth Barrett, to save them from seizure for debt. The narrative keeps Haydon in view across the next few days, alongside parties at the Carlyles’, debates in the Commons, and constant letters exchanged between Robert and Elizabeth. It follows the agitated entries in Haydon’s diary, looking back to the end of May when he wrote “Oh Lord! Carry me through the next and dangerous month” and cited a series of prayers asking for the strength to finish his paintings. On 20 June he wrote nothing in his diary but “Oh God, bless us all through the evils of this day.” On Sunday 21 June he wrote: “Slept horrible. Prayed in sorrow and got up in agitation.” On the morning of Monday 22 June he committed suicide. The cleansing storm of wind and rain that swept across the country that evening came too late for Haydon.

Hayter tells us that forty years earlier Haydon had written in his diary:

I knelt down and prayed to God to bless my career, to grant me energy to create a new era in art, and to rouse the people and patrons to a just estimate of the moral value of historical painting.

She suggests that for Haydon history was “a series of historical paintings of great events.” He believed he was the divinely chosen hero to create these paintings, making public and private history in the process. History, both public and private, failed Benjamin Haydon.

Back to historical truth. The stories Hayter tells are true in every detail, down to the slippery pool of Haydon’s blood on the floor of his studio, and the leaves exchanged by Robert and Elizabeth — brown autumnal leaves from him, green wild-rose leaves from her, carrying contradictory messages about the coming end of summer and the urgency of their elopement. Hayter can sustain this level of authenticity because, as she says, her actors/informants are “professional writers with formidable memories and highly trained descriptive skills.”

But not everything that they remember and record is true. Hayter notes that Haydon did not recognise the irony with which his wife described him as forever “feeding on his own thoughts.” Elizabeth Barrett was “a not quite objective witness” in ranking Robert Browning alongside Alfred Tennyson as a “modern poet,” since in 1846 Tennyson was already famous and Browning’s reputation was “confined to a few discerning critics.” In recreating the lives and understandings of her actors, Hayter intervenes in her own voice with a gentle flow of commentary and inference.

But Hayter doesn’t add a layer of interpretation to her account, doesn’t try to “make sense” of their history. Only in the political scenes does she suggest a reading that goes beyond the events she describes: the tension between older and younger generations lies in their responses to larger historical change. And even this is located firmly in the language of her actors. Old William Wordsworth wrote from Westmorland that the passage of the repeal of the Corn Laws presaged the beginning of a bloody revolution. Thomas Carlyle praised the repeal as “a strenuous, courageous and needful thing,” though he had grave doubts about the rise of a democracy without proper heroic leadership.

Back to slicing. In retrospect, Ken Inglis’s highest praise for slicing was that it had encouraged historians “to be more self-conscious about our prose than is general among academic authors.” This may seem one of the least of things, but it may be one of the greatest, especially when coupled with self-consciousness about the language of our sources. In Inglis’s own words:

I believe that sensitivity to the actor’s own vocabulary, idiom or rhetoric gives the historian a better chance of crossing that mysterious line between just chronicling past events and beginning to recreate past lives.

Alethea Hayter’s A Sultry Month triumphantly crosses that line. •

A Sultry Month: Scenes of London Literary Life in 1846
By Alethea Hayter | Faber | $29.99 | 256 pages

Injured instincts | https://insidestory.org.au/injured-instincts/ | 12 May 2023

Writer Kapka Kassabova continues her beguiling exploration of the Balkans

Baba Acivé is a woman of indeterminate age with a limp going back to when a granary door fell and crushed her hip many years ago. Her home is high in the mountains lining the valley through which flows the Mesta, the mighty river that runs between North Macedonia and Bulgaria, joins the Nestos in Greece, and empties into the Aegean.

Acivé is a baba, or healer, arguably the most esteemed of an army of healers operating in the heavily forested area near Clear Water River, one of the Mesta’s many mountainous offshoots. All sorts make the pilgrimage to her, she tells Kapka Kassabova: “Some that can’t walk. Some that can’t talk. Some that can’t see and some that can’t hear. Some with tumours and fright. Some come for babies. Some with sick children.”

Kassabova is a poet, novelist and memoirist whose work is a beguiling mix of history, geography, family history and travel. Born in Bulgaria in 1973 when that country was still within the Soviet bloc, she has lived in various places since the bloc’s break-up in 1991. She now lives in a remote part of Scotland, but the Balkans, the scene of her birth and childhood, have had an enduring pull for her.

In two earlier books, Street Without a Name: Childhood and Other Misadventures in Bulgaria and To the Lake: A Balkan Journey of War and Peace, she interwove the vicissitudes of her family with the tumultuous history of a mountainous region that has formed the southern crossroad between Europe and Asia down the centuries. Her new book, Elixir, is also dense with information but relieved by the stories of the people she encounters on her travels.

Acivé’s healing power, we’re told, resides in her connection with a megalithic rock with a hole big enough for a person to clamber through. She travels there by bus with her supplicants, who are instructed to bring flour, salt and a length of red thread as long as they are tall. The thread is left on the railing by the ladder they climb to get to the stone; the flour and salt are for its invisible keeper. Acivé approaches the stone with an eclectic assortment of prayers, mainly from the Qur’an, but also from the Bible.

With its narrow hole symbolising rebirth, the rock is unsurprisingly called The Passage. People come from all over Europe hoping to benefit from its curative powers, or simply out of curiosity. It forms part of the tourism that has supplanted the industries that sustained this corner of the world for centuries.


Is Baba Acivé a wise woman or a charlatan? In essence, this is what Elixir is about.

Because of its mountainous inaccessibility, the Mesta basin retains old ways of healing overtaken elsewhere in Europe. From ancient times the area was rich in the medicinal herbs that formed the basis of its economy. The communist regime’s effort to capitalise on the local industry put paid to that. Attempting to transform the herbs into a cash crop, it disrupted the plant ecology and wiped out myriad useful medicinal plants.

Locals began recultivating the herbs after 1989, but large-scale farming brought a repeat of the communist-era disaster. Today, medicinal plant cultivation is a boutique enterprise, often undertaken by newcomers who have responded to the beauty and peacefulness of the region while locals take on seasonal work elsewhere in the European Union.

But plants don’t explain Baba Acivé’s allure. For that, Kassabova turns to psychology. Acivé’s powers, and those of The Passage, seem to cure a sufficient number of her supplicants to substantiate her renown. Kassabova is sympathetic: in her view, we moderns suffer from what she calls our “injured instinct,” a general condition in which the mind is so separated from the body that we’ve lost the capacity to trust our own feelings. If the mind can make us sick, she writes, then it also has the power to heal: “The psyche performs its own alchemy.”

Kassabova’s argument here is nothing new, even in pill-popping societies like ours, where an array of medicinal alternatives are popular. Even conventional medical practitioners will encourage meditation alongside modern, allopathic methods.

I’m thankful to live within walking distance of a doctor: needless suffering too often occurs when access to life-saving modern medicine isn’t available. But I’m not about to dismiss Kassabova’s views out of hand, particularly when she draws on the wisdom of Carl Jung and Joseph Campbell about the ubiquitous power of myth, or of Nicolas Culpeper and Hildegard of Bingen about the medicinal properties of plants. Nor can I discount the beauty of her prose. And if the political analysis that distinguished To the Lake takes a back seat here, it isn’t missing altogether.

The Pomaks, a people of Bulgarian Slav origin and Muslim persuasion, are prominent among the peoples of the Mesta. Elders like Baba Acivé maintain lives similar to those of their ancestors, with distinctive customs, clothes and dialect. The communists persecuted them, suppressing their language and their religion.

Just as they introduced cash crops, destroying the native forests and the plants that thrived in them, the Soviets cut traditional Pomak apparel to make it more like that of collective farm-workers. Kassabova contrasts this with how they were treated for centuries under the “laxer” Ottomans. Their empire’s collapse ushered in a wave of homogeneous nationalism that swept away many of its mixed villages. In Kassabova’s reckoning, here was another instance of monoculture; with people as with plants, multiculturalism is ever the better option.

Kassabova marshals a wealth of evidence to support her contentions, incorporating history, botany, folk wisdom, psychoanalytic and ancient philosophical insights, and ecology, each of which has the capacity to enrich our understanding of the world, not to mention ourselves. It’s been perplexing to me, then, that I found this book somewhat disappointing.

Perhaps I am less engaged because the family history that anchored her narrative in Street Without a Name and To the Lake is missing here — though, in their way, her travels up and down the Mesta are more deeply personal, powered by a search for psychic health and connection so many of us humans share. The problem might also stem from this new book’s expanse of sources and material. There’s so much to take in, and if ever a book needed an index it’s this one.

That said, I’m glad to have it sitting on my shelf. Elixir is a book to dip into whenever I want to find out something about a plant or, overcome by bouts of debilitating weltschmerz, need the inspiration and the balm for my soul that Kassabova sought in the wild mysterious mountains of the Mesta. I’m never going to make it there myself, and that in the end is what books like hers are for. •

Elixir: In the Valley at the End of Time
By Kapka Kassabova | Jonathan Cape | $45 | 380 pages

Rock, water, paper | https://insidestory.org.au/rock-water-paper/ | 24 April 2023

Newly opened and unexpectedly vulnerable, the Australian War Memorial faced its first onslaught in January 1936

In March this year the Australian War Memorial invited Canberra schoolchildren to name the two massive cranes that will tower for the next two years over the memorial’s building extensions. Visible from space no doubt, the cranes will be named “Duffy,” for one of Simpson’s donkeys, and “Teddy,” after Edward Sheean, Australia’s latest Victoria Cross recipient. “Poppy,” “Anzac” and “Biscuit” were among the names rejected.

The exercise was presumably designed to make Canberrans feel good about the controversial $550 million project. Cranes hovering overhead and massive earthworks front and rear will invite many uneasy glances at a building that has nestled for decades at the foot of Mount Ainslie as if it grew of its own accord out of the ancient earth.

Of course it did not. As Michael McKernan showed in his history of the memorial, Here Is Their Spirit (1991), between the official announcement of the site in 1923 and the opening of the building by prime minister John Curtin in 1941, hurdles and setbacks tested the faith of its most ardent supporters. Even in 1941, the building was incomplete: the exhibition galleries were opened to the public but the grounds and commemorative areas, including the Roll of Honour and the Hall of Memory, took several more decades to finish.

All those struggles might be forgotten, but the project was once regarded with such trepidation by federal authorities that it was held to a budget — £250,000 — that was manifestly inadequate even for the modest, restrained building that Charles Bean, one of the memorial’s founders, had dreamed of. He had imagined a memorial on a hilltop: “still, beautiful, gleaming white and silent.”

Politicians, though, were more interested in memorials in their local districts than a national memorial most of their constituents would never see. After a vexed and abortive architectural competition, a design for the national memorial was agreed upon in 1929, but with the onset of the Depression the project had to be shelved. Finally, in February 1934, the building contract was awarded to Simmie & Co., a firm that built many of Canberra’s early public and commercial buildings.

It’s long been a fancy of mine that the land itself tried to reject the building being raised upon it, calling up malevolent spirits to cast spells over it. For starters, the winter of 1934 was the wettest then on record. Next, the foundations took much longer to excavate than expected because the trial holes dug during the tender period had not revealed how hard and rocky the site really was. Quizzed over delays in the project, Simmie’s principals complained that they had been “grossly misled” in this regard.

The building was declared weathertight and ready for occupation in November 1935, but after all that effort the result — a long, low construction of garish red bricks from the local brickworks — was embarrassingly basic. The beautiful Hawkesbury sandstone cladding that lends so much quiet beauty to the building had not yet been applied, and influential observers complained it looked “squat” and “prison-like.” Building plans were hastily altered to raise the height of the walls, and later the dome, causing more headaches for Simmie.

Despite these inauspicious circumstances, a doughty bunch of about twenty-five staff began preparing to move themselves and their families from Melbourne to the infant capital, along with 770 tons of objects, paintings, photographs, books, archival records and stores. These had been stored and exhibited in leased premises in Sydney and Melbourne.

Staff arrived in November 1935. While deputy director Tasman (Tas) Heyes moved into a house provided in Forrest, south of the Molonglo river, director John Treloar made what he called a “private arrangement.” After several previous stints living in Canberra, his wife Clarissa had refused to move this time and remained in Melbourne with their four school-age children. Treloar set himself up in the memorial with his suitcases, a wardrobe and a single stretcher. He was not a man with elaborate personal wants; as a staff clerk on Gallipoli in 1915 Treloar had slept and worked in the same dugout and took advantage of the short commute to work punishing hours. This he now proceeded to do again. Although not a cold or humourless man, austerity suited him.

It had been a wet weekend, and from his house in Forrest late on that Sunday afternoon, 12 January 1936, Tas Heyes was keeping an uneasy eye on the sky. In those near-treeless days you could see far across Canberra, and it was obvious that a storm was gathering over Mount Ainslie. He and Treloar had inspected the memorial building on the previous Friday evening after heavy rain and found water seeping in through an unfaced brick wall on the lower-ground floor where the library would be. Cases of collection material stood nearby, ready for shelving. The water seepage had not been serious then, but now, when Heyes found that the storm had blotted out all sight of the memorial from his home, he got into his car.


In January 1936, just as everyone was settling in, those evil spirits decided as a final gesture to turn on one of Canberra’s cataclysmic summer storms. Today, staff in Canberra’s cultural institutions fully comprehend the power of these events, but in 1936 the memorial’s building was piteously vulnerable and the newly arrived Melburnians quite innocent of the harsh extremes and occasional violence of the weather on the high plains south of the Brindabellas.

John Treloar was already there, of course, along with two watchmen, Thomas Aldridge and George Wells, at their change of shift. Mount Ainslie was the centre of a terrific cloudburst, and from its slopes torrents of water were descending. The stormwater drain on its lower slopes had overflowed and water was washing silt and debris down to Ainslie, Reid and Braddon, and becoming trapped in the excavation around the memorial. The building’s lower-ground floor was below the watercourse and water was advancing into the building, sweeping down passages and up to the cases containing precious war records.

Another war: the AWM’s first director, John Treloar, shown here shortly before his secondment to the military in 1941. Ted Cranstone/Australian War Memorial

Many of the cases were raised from the floor on timber baulks, but this precaution had ceased when the building had been declared weathertight, and now several hundred cases were in immediate danger. The three men on site needed help, but none of the staff at that time had home telephones, so Aldridge drove off to gather them from their homes, leaving Treloar and Wells to scavenge timber to make platforms for the cases.

Scarcely had they begun this task when Aldridge returned, having abandoned his car where it had become bogged even before he got out of the grounds. By now, more water was sweeping into the building across a landing that had been built at a rear entrance to help bring in large objects. Treloar and Aldridge tried to dig a ditch to divert the water, but, as Treloar later reported, “the rocky ground defied the shovels which were the only tools we had.” They tried to wreck the landing but it was too well built.

Leaving his men to struggle with the records cases, Treloar phoned the fire brigade and was told that the chief fire officer could send men to pump water out of the building but only if it reached six inches, and they could not help move records or exhibits. Soon after, the telephone service broke down, leaving the three drenched and desperate men isolated. At this point, Tas Heyes finally made it through.

It was growing dark and the building in its primitive state had hardly any lights. Water was washing under doors and through unfinished sections of the roof. The waste pipes of wash basins and drinking fountains, as Treloar said later, “threw into the air jets of water several feet high.” Water was about to enter the room where the works of art were stored. It was impossible to move the cases in time, and improvised squeegees proved to be hopeless. Using chisels and their bare hands, Treloar and his staff tore up the floorboards at the entrance to the room, and the water, which was now creeping around the edges of the art cases, escaped beneath the building. Heyes set out in his car in another attempt to round up more staff to help; by 8pm about a dozen men were on the site and a few oil lamps had been obtained.

The worst was over. Manholes over drains were opened and water swept into them. Staff continued to clear the building of as much water as was possible, working in the dark with only improvised tools. By 1am Treloar decided to suspend work. The men were exhausted and most had been wet through for hours.

Treloar later told a colleague that the suit he had been wearing that day was ruined, a rare reference to himself and his personal comfort. Where he slept for the rest of that night isn’t known, but Heyes, a friend and colleague for many years, probably took him back to his house. Forrest had received no more than an ordinary shower of rain.


The next day the Canberra Times carried long reports of the flood. Six inches (more than 150 millimetres) of rain had fallen on Civic and the inner north in ninety-five minutes. The paper had rarely had such a dramatic local event to cover.

The memorial’s misfortunes were ignored at first in favour of the dramatic rescue of motorists stranded on Constitution Avenue, the many roads that were scoured or washed away, the five feet of water in the basement of Beauchamp House (a hotel in Acton), the “pitiful” state of Miss Mabbott’s frock shop in Civic, and the washed-out gardens and drowned chickens in Ainslie. These local calamities mattered more than what had happened at the memorial, of which the paper finally gave a brief report the next day. Few people really knew what went on in this strange new building anyway.

Monday 13 January at the memorial was a heavy, depressing day of mopping up, opening hundreds of cases and separating the wet from the dry. Two to three inches of water had entered the building. Some 2648 books were damaged and 719 had to be rebound. Among the most valuable was a large collection of histories of first world war German military units, which Treloar described to a newspaper reporter as “irreplaceable.” More than 700 cases of archival records were damaged, as were 10,327 photographic negatives. Thankfully paintings had been stored on their edges in crates so that only the frames were soiled, but 389 were damaged and 300 had to be remounted.

In the end, the damage was not so bad. The museum objects, stored on the upper floor, were untouched. Some of the damaged records were duplicates and, as Treloar reassured his board of management, water-stained books would not be less valuable as records, and the pictures when remounted would be “as attractive as formerly.”

Prints existed of some of the negatives, and the emulsified surfaces of the negatives had fortunately been fitted with cover glasses to protect them. Most of the records cases had been stored on timber two or three inches above the floor, although Treloar bitterly regretted his decision to abandon this practice shortly before the flood.

The salvage operation was instructive and useful in many ways. Treloar was enormously capable, but he liked to consult experts and tried to keep himself abreast of practices in museums, galleries and libraries in Australia and overseas. Here was a chance to call in some help and renew important associations. Leslie Bowles, a sculptor who often worked with the memorial, travelled from Melbourne to advise on the treatment of some battlefield models affected by the flood. Although Kodak and the Council for Scientific and Industrial Research were contacted for advice on the treatment of the negatives, Treloar soon turned to an expert from the Photographic Branch of the Department of Commerce in Melbourne.

The paper items needed the most treatment. A hot-air blower was obtained for the soaked documents, and eight local teenage girls arrived with their mothers’ electric irons on the Thursday after the flood. Treloar had been advised that the best way to fully dry and flatten the documents, newspapers and pages of books was to iron them, presumably with a piece of cloth over the paper. This was to be the job for the next few weeks for Enid, Ivy, Agnes, Betty, Jean, Thelma, Stella and Gwen.

The eight had been recruited through the Canberra YWCA, whose secretary had had many applications for the curious engagement. They were paid under the award for government-employed servants and laundresses, but surely never was a laundress entrusted with such a strange and delicate task. How anxiously Treloar must have watched them go about their work.

Some of the damp documents were part of the memorial’s collection of unit war diaries — not soldiers’ private diaries (although the memorial had a fine collection of those as well) but official records kept by each military unit. For Treloar, they were probably the most important part of the collection and he knew them intimately. They were mostly created on the battlefront, and it would have been agonising to imagine them engulfed by muddy water in the very building created to house and protect them.

Support and commiserations poured in. Arthur Bazley, assistant to official historian Charles Bean, phoned Treloar from Sydney to offer any help he could, using his Sydney contacts. Bean, on holiday in Austinmer, wrote to Treloar that he and Heyes “must have this comfort, that you know that all concerned are so aware of your carefulness and forethought, that their only feeling will be one of sympathy.” Federal interior minister Thomas Paterson, who had responsibility for the memorial, telephoned to find Treloar still lamenting the cases stacked directly on the floor; “an officer could not expect to be a prophet” was his kindly advice to the director.


After all the years of work and worry, Treloar was not present at the opening of the memorial on Armistice Day, 1941. He was in uniform again, based in Cairo managing the collecting effort for yet another war, leaving Tas Heyes to organise the ceremony.

The first Anzac Day at the memorial was held in 1942, the national ceremony having previously been held at Parliament House. With so many Australians fighting abroad and with the enemy at the nation’s doorstep, Anzac Day in the nation’s capital had never been so sombre (and wouldn’t be again until 2020, when Covid-19 restrictions forced the cancellation of traditional commemorations).

No veterans’ march was held that year, and Anzac Day sports were cancelled. The Canberra Times editorialised that the day found Australia a “battle station.” Anzacs “now stood guard on their own land” and any honour owed them was never so much due as on that day. It was to be a day “not of works but abiding faith.”

At the memorial a twenty-minute ceremony held in the commemorative courtyard was attended by a mere 600 people. Around them were bare walls: no Roll of Honour yet, and an empty Hall of Memory. A single aircraft flew low overhead. The morning was cool and overcast but there was no rain. •

Mayo Joe, son of Ballina • Sat, 15 Apr 2023 • https://insidestory.org.au/mayo-joe-son-of-ballina/

Did the American president’s deeply personal sense of Irish history meet the moment?

The timing of US president Joe Biden’s visit to Belfast and Dublin this week could not have been more delicate. The occasion was the twenty-fifth anniversary of the Good Friday Agreement, but the trip was as much an exercise in salvaging a peace process that has been teetering for some time.

The fallout of Brexit in 2016 brought an immediate souring of relations between the British and Irish governments, the likes of which had not been seen in decades. More critically, though, it rekindled communal tensions north of the border, culminating in the closure of the Northern Ireland Assembly at Stormont and the suspension of power sharing in May last year.

Much is now riding on a new British government with Rishi Sunak at the helm, signalling a more pragmatic approach to the Northern Ireland border. The recently negotiated “Windsor framework” for regulating goods crossing the Irish Sea has produced a tentative thaw in relations with the European Union, renewing hopes that Stormont might soon be reopened. Biden’s hastily planned visit was thus a calculated move to tip the balance at a crucial juncture.

His devout Irish Catholic affinities, however, risked achieving the very opposite, raising the question of whether a presidential visit was the last thing Ireland — north or south — needed at this time.

At his first and only public engagement in Belfast on Wednesday morning, Biden called on the people of Northern Ireland to leave their history behind and embrace the opportunities of a shared future. “Renewal,” “progress” and “repair” were the dominant themes in a speech that made only scant reference to the divisions of the past.

It was a big ask of a community only just emerging from the “decade of centenaries” — a rapid succession of major anniversaries marking some of the most difficult moments in modern Irish history, from the Ulster crisis of 1912, through the Great War, the struggle for independence, and the fateful partition of Ireland in 1921.

Historian Ian McBride recently said of his country’s acute sensitivity to the weight of history, “We’ve come to believe that dealing with the past or working through the past is somehow good for us.” But it’s only ten years since many people feared that the coming wave of commemorations would be anything but good for the peace process in Northern Ireland. Academics, politicians and community leaders were enlisted to find ways of ensuring that raking over the past would not spark a recrudescence of communal discord.

In the end, the decade of centenaries came and went largely without incident, heralded by a historic visit to Dublin by Queen Elizabeth II — the first of its kind since her grandfather George V in 1911. Even the bedrock enmities evoked by the iconic year 1916 — Dublin’s Easter Rising and the loyalist veneration of the Battle of the Somme — produced few complications when it came to the commemorative program.

But 2016 brought dim tidings of an entirely different kind with the Brexit vote of June that year. Ultimately, it was not the inner workings of Irish memory that tested the mettle of the peace process, but the entirely unforeseen exigencies of a crisis manufactured in England.

An extraordinary inattention to the past was arguably Brexit’s defining characteristic — and its most potent legacy in Ireland. The referendum was notable for the almost complete absence of debate about the possible effects of leaving the European Union on a key plank of the Good Friday Agreement — keeping the Northern Ireland border as invisible as possible. It was one thing to dispense with border checks while both sides were members of the EU single market, but another matter entirely once Britain’s departure raised the prospect of a hard border between the United Kingdom and the Republic of Ireland.

Even as these problems came painfully to light in the referendum’s protracted aftermath, advocates of a so-called “hard Brexit” continued to display a callous disregard for the Good Friday Agreement’s brittle compromise. “Softer” options were available to successive Tory governments, but any dilution of Britain’s freedom to chart its own course proved unacceptable to the Brexit ultras. Polling data corroborates this, suggesting that the most passionate leavers were less likely to care about the peace dividend in Northern Ireland.

Biden’s heartfelt message to his hosts — that “for too long, Ireland’s story has been told in the past tense” — somehow failed to capture the issue in all its complexity. Much of the recent turmoil might have been avoided had the past tense — not least the astonishing gains of the last twenty-five years — been given considerably more airplay.


In any event, Biden soon dispensed with his own maxim as he crossed the border into his ancestral home of Carlingford in County Louth. Suddenly, talking “in the past tense” seemed the only thing on the president’s mind in a place that felt “like I’m coming home.” It was as though he had stepped into an entirely different world. The fifty-mile drive from Belfast might as well have been fifty years.

Addressing the Irish parliament in Dublin the following day, Biden notched up eighteen references to “history” and the “past,” reflecting on the “hope and the heartbreak” of his ancestors upon “leaving their beloved homeland to begin their new lives in America.” These stories, he urged, comprised “the very heart of what binds Ireland and America together” — a story of shared “dreams,” “values,” “heritage,” “hopes,” “journeys” and, crucially for Biden, “blood.”

The remainder of his trip was an act of personal homage: he flew to County Mayo to visit a family history and genealogical centre before proceeding to the Catholic pilgrimage site of Knock Shrine (the scene of a purported holy apparition in 1879). He rounded off his visit with a major speech to a crowd of some 27,000 outside St Muredach’s Cathedral — another site of deep family significance. “I’ll tell you what,” he assured his audience, “it means the world to me and my entire family to be embraced as Mayo Joe, son of Ballina… the stories of this place have become part of my soul.”

As only the second Catholic president of the United States, Biden rivals John F. Kennedy for the sheer intensity of his identification with his Irish “soul.” As with Kennedy, his Irishness is bound inextricably to his Catholicism, which is why his equally English heritage (on his father’s side) consistently plays second fiddle. Moreover, it is a memory of Ireland rooted in a bygone age — “that fusion of ethnicity and religion,” as Fintan O’Toole puts it, “that has lost much of its grip on the homeland.”

Conspicuously, it is also an Irishness aligned with the very atavisms that the Good Friday Agreement was meant to transcend. Biden’s uncorked nostalgia for his family ties can be irresistibly endearing in its simplicity and humble authenticity. But it also carries unnerving undertones given the tragic consequences of tribal loyalties over the last fifty years.

Little wonder, then, that leading Unionist figures spent much of the week dismissing his credentials as a potential peace broker. Former first minister Arlene Foster was the most forthright in declaring that the president “hates the United Kingdom.” Other Democratic Unionist Party figures concurred that Biden was by far “the most partisan president there has ever been when dealing with Northern Ireland” — suspicions that were only compounded by Biden’s veiled criticism of a UK government that “should be working closer with Ireland” to resolve the wreckage of Brexit.

Though Biden seems largely unaware of the recidivist slant of his Irish colours, he nevertheless understands that his appeal is limited in the North. There would be no reprise of Bill Clinton’s celebrated role in brokering the 1998 Good Friday Agreement. The trip itself was cobbled together hastily, at unusually short notice, with an itinerary shrouded in secrecy.

At no stage did the president sit down for direct talks with the key stakeholders or engage directly in problem-solving of any kind. Indeed, the total length of his stay in Belfast was barely sixteen hours (much of it in bed). No press conference ensued from his brief encounter with Rishi Sunak, nor was it possible to deliver his keynote address from the symbolic chair of the Stormont Assembly.

This may have represented a form of political leverage in its own right — holding Sunak and the DUP at arm’s length until they commit fully to implementing the Windsor framework and minimising the disruptions of Brexit. In that sense, the contrasting warmth in Carlingford sent its own clear message.

But if the president had hoped that his mere presence in Northern Ireland might move the dial on a rapid restoration of power sharing, the decision to mix the political with the personal was probably ill-judged. In a week when petrol bombs were hurled at police in Derry by the “New IRA,” he might have chosen instead to prolong his stay in Northern Ireland and leave the family history tour to his retirement. •

Lifting the shadow • Tue, 28 Mar 2023 • https://insidestory.org.au/lifting-the-shadow/

What constitutes “evidence” of a queer life?

Queer history in Australia received a considerable fillip recently with the broadcast of the three-part series Queerstralia by the ABC. Timed to coincide with WorldPride in Sydney in February–March, the series treats the troubled aspects of queer history with a relatively light touch, its style upbeat and affirming. It was another demonstration that the energy in queer history tends to form around legal reform and the advancement of LGBTQIA+ rights from the 1970s onwards.

To research and write queer history before living memory — without oral testimony, that is — is to enter a much darker place. The last man hanged for sodomy in the British Empire went to the gallows in Tasmania in 1867, and in 1997 Tasmania became the last Australian jurisdiction to decriminalise male homosexuality. Relationships and life choices that are criminalised, stigmatised and pathologised are unlikely to leave much of an imprint on the public record, and surviving historical evidence is often patchy, obscure and cloaked in euphemism.

In 1990 I wrote an honours thesis in the history department of the University of Tasmania on the Tasmanian writer Roy Bridges. It wasn’t a piece of literary criticism, for that would have been a short thesis indeed. Most of Bridges’s thirty-six novels were adventure stories for boys or middle-brow historical romances and melodramas dealing with the early days of Tasmania and Victoria. Frequently he was inspired by stories his mother, Laura Wood, told of her family history on their farm near Sorell, east of Hobart, going back to the earliest days of white settlement.

Bridges was Tasmania’s most prolific novelist, successful and admired in his time, but his reputation didn’t outlast his death in 1952. I wasn’t interested in the quality of his writing so much as his interpretation of Tasmanian colonial history, and how his own deep connection with the island was refracted through his works of fiction and memoir.

Born in Hobart in 1885, Bridges started publishing in 1909, and at first wrote prolifically for the gutsy little New South Wales Bookstall Company. Time and again he sold his copyright for fifty pounds per novel, whenever he was hard up (“which was often,” he once observed), grateful for the support the Bookstall gave to new Australian writers.

In his mature period his novels were published in London by Hutchinson or Hodder and Stoughton, but during and after the second world war his output declined. The gratifying success of That Yesterday Was Home (1948) eased his final years. Part history, part family history and part memoir, the book is a passionately expressed meditation on memory and connection with place. He died in 1952.

Roy Bridges in 1937. Inscription reads, “To my friends at Robertson & Mullins. Roy Bridges. 1937.” State Library of Victoria

By the time I started work on Bridges he was remembered mainly by enthusiasts interested in the literary culture of Tasmania. As a thesis project, though, he was perfect. No one else was claiming him, and significant collections of his papers were held in libraries in Hobart, Melbourne and Canberra. Methodologically I had Bridges’s memoir as a guide, which, unreliable as any memoir always is — and I knew this — was at least a place to begin.

I bought a 1:25,000 map of the Sorell district and pinned it to my wall in the history department. I drove out to meet Bridges’s nephew and his family, who were still working the property that Bridges had named “Woods” after his mother’s family.

The town of Sorell has always been a stopping point for travellers from Hobart heading either to the east coast or to the convict ruins at Port Arthur. To get there you must first drive across Frederick Henry Bay via the Sorell causeway at Pittwater. “All my life,” Bridges wrote in 1948, “Frederick Henry Bay has sounded through my mind and imagination. Like drums… or like cannonade in storm, or in the frozen stillness of winter’s nights.”

Every time I drive across the Sorell causeway I think of him, and did so again one brilliant day in February this year while heading up to Bicheno on holiday. With the sun sparkling off the bay I shouldn’t have been brooding on old stories, but suddenly I knew that the time was right to tackle again a biographical dilemma I had evaded, all those years ago.

The few others who have written about Bridges have struggled to understand the source of the loneliness and sorrow which, towards the end, amounted to torment. His journalist friend C.E. (Ted) Sayers first met Bridges in 1922 and remembered him as a haunted, “tense little man,” a chain smoker, embarrassed in the company of women, who had allowed a streak of morbidity and violence to enter his fiction. I developed my own suspicions about this haunting, and in my thesis in 1990 I speculated, briefly and carefully (because this was Tasmania), that Roy Bridges had been a closeted and deeply repressed gay man.

I wouldn’t have thought of this except for a conversation I had with the one friend of Bridges I could still find, a well-known local historian named Basil Rait. I visited the elderly Mr Rait in a tumbledown house in north Hobart somewhere near Trinity Church. Just as I was deciding that his recollections weren’t going to be particularly useful, he astounded me with the remark that one day, Roy Bridges had been seen emerging from the Imperial Hotel on Collins Street in central Hobart, and that the Imperial was a known place for homosexual men to congregate.

When did this occur? And did Rait see this himself? I was too amazed — and too timid, I think — to ask enough questions and, rookie historian that I was, I did not record the conversation. Why was Rait so frank, and what did he think I would do with his information? Perhaps I’d gained his trust because I had arrived without a tape recorder. I don’t know.

But I did consider his revelation very carefully. The once-elegant Imperial was rather seedy by then, which seemed to lend plausibility to what Rait had said. I had gay friends and I asked if anyone knew anything about the Imperial’s reputation. No one did.

Unable to verify Rait’s assertion, I turned to the textual sources. Although I was aware of the danger of reading too much into odd snippets of evidence that might have signified nothing, I was also unwilling to ignore what I had been told, which, if true, might explain everything. To speculate about Bridges’s sexuality in the thesis, or not: my thesis supervisor left it up to me. On an early draft I can see in his handwriting: “You decide.”


Royal Tasman (Roy) Bridges came from a family of prosperous wicker manufacturers and retailers. His father Samuel and uncle James ran Bridges Brothers, in Elizabeth Street, Hobart, which had been founded in 1857 by their father, Samuel senior. After graduating with an arts degree from the University of Tasmania, Bridges joined the Tasmanian News as a cadet in December 1904. Journalism was his career for most of the next twenty-five years. He accepted a job with the Hobart Mercury in 1907 but soon became disaffected by poorly paid sixteen-hour days on what his memoir described as a “rotten sweat-rag” and headed for Sydney.

He got a job immediately on the Australian Star under its editor, Ralph Asher. Sydney was a relief from Hobart’s “superficial puritanism, social restrictions and moral repressions of human nature,” but in 1909 the chance of a job on the Age lured him to Melbourne, where he settled in happily for a decade. Then, between 1919 and 1935, when he retired permanently to the farm near Sorell, he switched between freelance writing and journalism, mostly with the Age but also, briefly and unhappily, with the Melbourne Herald in 1927.

A shy man, Bridges did love the companionship of other journalists. Keith Murdoch, future father of Rupert, was one of his early friends on the Age, although they didn’t remain close. There was Neville Ussher, of the Argus and the Age, who died during the first world war and whose photograph Bridges kept close to him for the rest of his life. And then there was Phillip Schuler, son of Frederick Schuler, editor of the Age.

High-spirited, charming, handsome: Phillip Schuler’s nickname was “Peter” because of his Peter Pan personality. Friendship “blossomed” during a bushwalk on a “golden August Sunday at Oakleigh,” then only sparsely settled, and after that the two young men spent many weekends together. They read the same books, roistered in restaurants and theatres, and tried their own hands at writing plays.

On a walking holiday in Tasmania in 1911 the two men tramped from Kangaroo Point (Bellerive, on the eastern side of the Derwent) down to Droughty Point, “the way of many of my boyhood days.” They climbed Mount Wellington to the pinnacle and spent two nights at the Springs Hotel, part way up the mountain (sadly burned to the ground in the 1967 bushfires). From an upper window they watched the “glory of the sunrise,” looking across to Sorell and Frederick Henry Bay. In 1948 Bridges wrote:

The beauty and wonder of the island rolled on me, possessed me, and possesses me yet. We were talking and talking — life, Australia, journalism, literature; always we planned; always we hoped. We were worshipping life, the island, the sun.

If you are thinking what I think you are thinking, then no. Schuler returned Bridges’s friendship, but as his biographer has made clear, Schuler was thoroughly heterosexual and Bridges knew it. This could have been one of those passionate platonic friendships between men, but in 1990 I thought, and I still think, that Bridges was absolutely in love with Schuler.

After brilliant success as the Age’s correspondent during the Gallipoli campaign in 1915, Schuler enlisted for active service but was killed in northern France in June 1917. His last letter to Bridges ended: “Keep remembering.” Schuler’s photograph was another that Bridges cherished always, and indeed he had it reproduced in his 1948 memoir, but Bridges himself was no Peter Pan. He had to carry on facing the disappointments that life inevitably brings, and he was not stoic. In his fifties, living with his sister Hilda back at Woods, he felt the loneliness deeply and became a demanding, querulous, self-pitying man who drank too much.

He did still have many friends though, and in 1938 he began corresponding with Ted Turner, an amateur painter whom he met through their membership of a Melbourne literary society known as the Bread and Cheese Club. Bridges was only a distant member because he rarely left Tasmania by then, but he took a fancy to Turner and found great entertainment in the younger man’s letters, which reminded him of his own Bohemian days in Melbourne. Bridges heaped affection and confidences on Turner, requested a photograph and was delighted with it. He was cross if Turner delayed writing and begged him to visit Tasmania (“Ted old son… I wish I had your friendship — near me!”), but Turner never did.

The two men met only once, in April 1940 when Bridges made the trip to Melbourne, but Bridges went home hungover and with a bout of influenza. He admitted to Turner that the trip had been “a series of indiscretions.” What exactly that meant I couldn’t tell, and their correspondence declined later that year.


Did I indulge in absurd speculation in my thesis about domineering mothers and emasculated fathers? No, but it was impossible to ignore the breakdown of the marriage of Samuel and Laura Bridges, Roy’s parents, in 1907 when Roy was twenty-two. Samuel was pleasure-loving and extravagant, and eventually the house in north Hobart where Roy and his sisters were brought up had to be sold. Of Laura, Samuel apparently said that she “may as well” live with Roy because “it’s plain she’ll never be happy without him.”

Laura managed the household while Hilda became her brother’s amanuensis, writing or typing all his novels from his rapidly scrawled sheets. Roy supported them all financially, although Hilda earned an income as a musician and fiction writer. Only now does it occur to me that there might have been an understanding among the three of them, tacit one would think, that Roy would never marry. Before Laura died in 1925 she begged Hilda, “Whatever happens, look after Roy,” which Hilda did. She never married.

Hilda Bridges, probably in the 1910s. State Library of Victoria

Did I mine Bridges’s writings for autobiographical clues to his sexuality? Yes, for no one warned me against mistaking writers for their characters, and anyway there was so much material to work with. Convicts, bushrangers, and the endeavours of the early colonists to establish a free and democratic society on Van Diemen’s Land: Bridges wrote obsessively on these themes for years.

Novel after novel, especially in his mature period, features a misaligned relationship between a beautiful, passionate woman and an unsuitable man. A son of the relationship will turn up as a convict in Tasmania, and the plot revolves around whether the mother’s folly can be forgiven and her son redeemed by love. Bridges despised hypocrisy and religious intolerance, and his clergyman characters are tormented by unsuitable desires and undone by having to preach Christianity to convicts who are not inherently evil but victims of an unjust society.

Symbolic of society’s condemnation of a convict were the physical scars left by flogging, for which Bridges seemed to have a horrified fascination. In his final novel, The League of the Lord (1950), the Reverend Howard France sits in his study in Sorell picturing an illicit meeting between a beautiful young local girl and her convict lover, which he knew was occurring at that moment. France is jealous of them both. “[Joan’s] eyes are deep blue… her mouth is red, her hands long and white… exquisite…” Further down the page France imagines the couple being caught, which would mean the triangles for young Martin: the “hiss and crack of the lash across strong young shoulders… red weals… red flesh… red running… red.”

Martin is deeply ashamed of being a convict and struggles to accept the love offered by his (free) family in Tasmania. He recalls his journey there on a transport ship, herded below decks with hundreds of other convicts:

The faces, the eyes, the voices, the hands; the loathsome, pawing, feeling, gliding, gripping hands… the squeaking laughter in the obscene dark… the foul perverted horde that [had] been men and boys… the brooding, breeding evil, the bestiality, lifelong contamination, incurable, malignant, cancerous.

I underlined this passage in my copy of The League of the Lord but didn’t know how to use it. Now I see it two ways. It could simply be an evocation of Marcus Clarke–inspired Tasmanian gothic. Or it could be evidence that Bridges’s many convict characters are studies of profound shame, self-hatred and alienation. In this reading, those convict characters were versions of himself, their alienation his own, and homosexuality his source of shame. Either interpretation is possible.


Roy and Hilda Bridges’s return to Woods in 1935 fulfilled a promise Bridges had made to their uncle, Valentine Wood, who’d died in 1930, to take on the old place. He knew that Woods meant more to him than Melbourne: “that I was of this land; that it was stronger than I, and that when it willed it would call me back.” Still, brother and sister missed Melbourne terribly, even though overstrain and a nervous dread of noisy neighbours had driven Roy to the brink of a breakdown.

It might have been in these years that the Imperial Hotel incident occurred. Did it? Bridges disliked Hobart, but if it was casual sex he needed, where else could he go? And yet, if the Imperial was a known place for gay men to meet, the police would surely have been there too. Put that way, the incident seems unlikely.

Bridges’s heart condition worsened in the late 1940s and he had a chronic smoker’s cough. He refused to go to Hobart for tests and hated doctors visiting from Sorell. One doctor threatened to have him certified to get him to hospital. “He implied my not liking women about me in such treatment was an abnormality,” Bridges grumbled to a friend. The burden of his care fell as usual on Hilda. Eventually he had to be rushed to hospital in Hobart anyway, and he died there in March 1952 aged sixty-six. Hilda stayed on at Woods for many years until she moved to a Hobart nursing home, where she died in 1971.

I never spoke with Bridges’s family about his possible homosexuality because I was relying on them for recollections and photographs. I drove out to Woods for a final polite visit to give them a copy of the thesis, and after that, unsurprisingly, I never heard from them again.

My research had not included any reading on the ethics of biography, so I learned it the hard way. I’d gained the trust of my subject’s family only to betray that trust in the end. However, this time — for this essay — I contacted a relative a generation younger and did have an open conversation. There is nothing new to say except that Bridges left a complex personal legacy that is still being felt.

Some people blame homosexuality among male convicts for the long shadow of repression and homophobia in Tasmania that delayed gay law reform until 1997. Perhaps. Such a thing would be hard to prove, and in any case, what is “proof”? What constitutes “evidence” of a queer life? When found, how do we assess its significance? The thing is to not shrink from the task, because with patience and honesty we might still open up some of these painful histories to the light. •

Writing history in dark places • Thu, 23 Mar 2023 • https://insidestory.org.au/writing-history-in-dark-places/

A historian tries to hear the voices of lost children

The historian Hugh Stretton liked to tell his students a story about the questions historians ask and how they find answers. In his Political Sciences he told it this way.

“A passer-by finds a drunk on his hands and knees under a street lamp, and asks ‘What are you looking for?’ ‘A dime I lost.’ ‘Where did you lose it?’ ‘Up that alley there.’ ‘Well why look for it down here?’ ‘Because there’s some light down here.’”

Stretton drew the moral that the researcher who really wants to find the dime will go, “groping but rational, up the dark alley.”

I was reminded of Stretton’s story as I read Lucy Frost’s new book, Convict Orphans. Frost chooses to search in the light, with results that are both rewarding and to this reviewer a bit frustrating.

Lucy Frost has been bringing us vivid voices from the past for almost forty years. I remember the joy of discovering her first book, No Place for a Nervous Lady (1984), and introducing my history students to the women’s “voices from the Australian bush” that she gave us there.

Her second book further enriched my teaching. The Journal of Annie Baxter Dawbin (1998) presented us with an educated, articulate woman whose voice was instantly accessible to an educated late-twentieth-century audience. That hers was a world-weary voice made Annie even more interesting to my feminist students — and to me — though I was uncomfortable with the middle-class voices that dominated my own research at the time, and keen to find ways of understanding the lives of the less articulate.

From the late 1990s Frost has been in search of less educated, less privileged voices, with more success than I could muster. She has been involved with the Founders and Survivors project, a huge enterprise digitising and linking all the available records left by and mostly about the 73,000 convicts transported to Van Diemen’s Land/Tasmania. Links to court records in Britain and Australia bring some of these convict voices to life, and a few have left letters and diaries. Frost’s publication with Hamish Maxwell-Stewart, Chain Letters: Narrating Convict Lives (2001), shows what can be achieved by history from below where the sources are sufficiently thick.

Frost continues to be especially interested in the historical experiences of women. She has served as founding president of the Female Convicts Research Centre and as an editor and contributor to its publishing arm, the Convict Women’s Press. Her research into the lives of 150 female convicts who arrived in 1838 on board the Atwick resulted in Abandoned Women: Scottish Convicts Exiled Beyond the Seas (2012). Nineteen children were landed with the Atwick women; wondering about their fate, she tells us, led her to write her latest book.

Convict Orphans sets out to tell the stories of the thousands of children who passed through the Queen’s Orphan Schools and their rebrand, the Queen’s Asylum for Destitute Children. Most were not orphans in the modern sense, having at least one parent living; they were, as Frost puts it, orphaned by the convict system. She places their stories at the head of “a long corridor of suffering”: First Nations children stolen from their families; child immigrants similarly stolen; children abused in institutions that should have protected them. It is time, she writes, that their voices “were heard — and heeded.”

Frost began her research by listing all the children admitted to the state orphanages but soon committed herself to searching where the evidential light was brightest. “Because the array of sources is thickest for children apprenticed during the period of the Queen’s Asylum,” she writes, “I decided to concentrate on ‘orphans’ indentured during the institution’s final decades, 1859–79.” With almost 1000 subjects in hand, she combed a great number of other sources — newspaper reports, court records, committee minutes, family histories. “Out of this process grew the collection of stories from which I have woven the narrative of this book.”

She begins with the story of Hannah Bennett, a dark tale with a surprisingly bright ending. Hannah was a convict’s daughter, held in an Orphan School as a toddler, “retrieved” by her mother at five years old but raped and seriously injured at nine. She gave convincing evidence in court against her assailant, and was then returned to the Orphan School. At thirteen she again gave evidence, this time to an inquiry into the management of the school. The inquiry found systematic abuse in the institution: the children’s rations stolen, their bodies mercilessly beaten. But the matron was merely reprimanded, and the girls who had given evidence against her were returned to her care.

Hannah suffered a year more in the school before again being “discharged to mother.” At this point the story changes. Fifteen-year-old Hannah married a twenty-one-year-old farm servant and lived a long and settled life, raising seven children with, as Frost says, “proudly confident names” like Hannah Georgina and Victoria Elizabeth.

Hannah’s story is emblematic of the trajectory of the book. We read stories of children whose lives were blighted by their experience of orphanage and apprenticeship, while others survived and even prospered. Some established families of their own; others — particularly the boys — seemed incapable of long-term relationships. The concluding chapter begins by laying the blame for this trauma squarely on the convict system — “transportation smashed families.” But then Frost ends with the stories of two women who raised generations of descendants and lived long enough to be remembered and memorialised by family historians.

Convict Orphans begins by asking us to “heed” the children’s voices, and Frost’s storytelling is so skilful that we won’t forget them. But it isn’t clear what lesson we can learn from these stories. Frost doesn’t weigh the successes against the failures; we are not told how many of her thousand subjects she could trace to a settled life, how many vanished, how many lives were destroyed. These are facts she could draw from the sources within her evidential pool; she chooses not to.

Another recent history of the Tasmanian convicts, Janet McCalman’s Vandemonians, sets out to put their stories in historical context. McCalman has taken a leading role in a project gathering biographical data on some 25,000 convicts: “cradle-to-grave data” intended to evaluate the effects of penal servitude. In setting their research questions, McCalman and her fellow researchers did indeed go “groping but rational, up the dark alley.”

As it happens, many of their expectations proved wrong. Life expectancy was not affected by height, nor by literacy. Flogging did not shorten lives; rather, “men’s mortality rose in proportion for every day spent in solitary confinement.” Overall McCalman found “that transportation was good for men, and compounded the risks for women.”

Frost doesn’t offer this level of certainty, but she doesn’t need to. Most of her readers will be caught up in the immediacy of her children’s voices, moved by their sufferings and resilience. That they lived is significance enough. •

Convict Orphans
By Lucy Frost | Allen & Unwin | $34.99 | 304 pages

Eastern Europe’s faultline • Mon, 20 Mar 2023 • https://insidestory.org.au/eastern-europes-faultline/

A distinguished historian uses one family’s story to illuminate the borderland between Europe and Russia

Russia’s war of aggression against its neighbour has piqued unprecedented interest in the history of Ukraine. Volumes explaining the background of the war crowd the display tables of local bookshops. Some are examples of instant scholarship; others are based on decades of thinking and writing about this region. Historian Bernard Wasserstein’s A Small Town in Ukraine is among the latter.

Wasserstein has poured an extraordinary amount of research into this book. The bibliography lists thirty-four archives in seven countries (Poland, Ukraine, Russia, Germany, Israel, Britain and the United States) alongside oral history interviews, written testimonies, websites, unpublished doctoral dissertations, official publications from Austria, Britain, the United States and the Vatican, and a long list of published books and articles. These materials were assembled, read and digested over three decades of “digging ever deeper into what turned out to be an immense historical quarry.”

During his research, the historian built up “vast data banks of official records, newspaper dispatches, census materials, registers of births, marriages and deaths, electoral results, medical reports, maps and photographs, as well as meteorological, geological, ecological, ornithological, architectural, judicial, military, ecclesiastical and every other category of information I could find.”

Wasserstein’s biographical database alone includes information about “over seventeen thousand persons” who once lived in the small Galician town of Krakowiec (pronounced Krah-KOV-yets), the place where his grandparents were born and where, together with their daughter, they were shot at the end of the second world war.

With all this material, he could have produced a turgid multi-volume history of the town of his ancestors. At the very least, he could have written one of those doorstoppers commercial publishers somehow believe “the general public” has time to read. Thankfully, however, he has instead written a short and eminently readable account.

Wasserstein’s readers might recently have encountered Krakowiec — or Krakovets, as it is called today in Ukrainian — just across the border from Poland, in reporting about the refugee crisis created by Russia’s aggression. Founded sometime in the early fourteenth century, the town started life as a frontier settlement of the Kingdom of Poland. When Poland was partitioned in 1772, it became part of the Austrian-ruled Kingdom of Galicia and Lodomeria. As it grew and became more prosperous, it turned from a Polish settlement into an increasingly Jewish town — a shtetl.

The Jewishness of the town was typical. In Galicia, landowners tended to be Polish aristocrats, the peasants were mostly Ruthenians (some of whom, by the nineteenth century, began to call themselves “Ukrainians”), and the town dwellers — tradesmen, tavern keepers, money lenders, and shop owners — were Jews. The division of labour was both functional and conflictual: its violent potential would be enhanced in the age of nationalism, racism and total war.

Wasserstein uses the history of this interaction between Poles, Ruthenians and Jews — and eventually a variety of invading military forces — to situate his own family’s history. He is not the first historian of East European Jewish heritage to embark on such a project. Shimon Redlich, in Together and Apart in Brzezany (2002), was among the earliest; most recently, that celebrated historian of the Holocaust, Omer Bartov, did something similar for another Galician town, Buczacz, in Anatomy of a Genocide (2019).

These accounts belong to a broader but relatively new genre of history writing: the transnational history of Eastern Europe. In books like Sketches from a Secret War (2005) and The Red Prince (2010), Timothy Snyder used the fate of individuals to chart new historical grounds between established national narratives. In A Biography of No Place (2005), Kate Brown presented an intimate portrait of how the borderland between Poland and Russia became a “Soviet heartland.”

At times of war, when national narratives are hardening, such books provide important correctives between and beyond national and nationalist history-telling. In each of them, the first world war plays a pivotal role.

As happened elsewhere in the region, that war came to Krakowiec as “a sudden, direct, and shattering blow.” The “unrelieved terror and carnage” it unleashed lasted not just four but seven years: it prompted the dissolution of both the Austro-Hungarian and the Romanov empires, and transformed seamlessly into a civil war and wars between successor states over real estate and the peoples of the fallen empires.

These years left “a residue of vicious collective suspicions and hatreds,” writes Wasserstein. “Ordinary human relationships collapsed into dog-eat-dog ruthlessness. The people of Krakowiec were plunged overnight into a dark realm. Their world would never be the same again.”

In this maelstrom, all sides distrusted the Jews: the Austrians no less than the Poles (who were soon in charge of their own state); the Russians of the Tsar no less than the Red Cavalry that came later from Soviet Russia to “liberate” the region from the “Polish lords” and the “capitalists” (the Jewish shopkeepers, mill owners and money lenders). Although the revolutionary Ukrainian state, formed in 1917 and declared independent in 1918, was originally committed to multi-ethnicity, the troops of the Ukrainian republic were soon engaged in pogroms just like everybody else.

Only the Germans, despite the harshness of their occupation in 1918, were not known for anti-Jewish excesses — a perverse legacy that convinced some locals two decades later that the stories of Nazi atrocities were Soviet propaganda and there was no reason to flee.

Eventually, the newly established Polish republic won out over its Ukrainian and Soviet Russian competitors. The Treaty of Riga of 1921 divided the Ukrainian state between victorious Poland and defeated Russia, and made Krakowiec Polish yet again. It would remain so until 1939, when Poland was invaded, first (on 1 September) by the Germans from the west and then (on 17 September) by the Soviets from the east. Krakowiec ended up on the Soviet side of the border and was integrated into Soviet Ukraine.

What followed would change the face of Krakowiec even more dramatically than had the first world war and the ensuing civil and inter-state wars. Stalin’s police went after political enemies of the Soviets as well as “class enemies.” Many of them were Polish, of course, but also Jewish: a shopkeeper, a factory owner, even the operator of an export business for Galician eggs (which were shipped to Germany and as far as England) were “capitalists” in Soviet eyes, particularly if they “exploited” (employed) others to do some of the work.

Many Jewish entrepreneurs were arrested and their families deported to the Soviet hinterland. Perversely, this saved many of them: life in Stalin’s concentration camp state was less lethal than being Jewish under the Nazis.

When the Germans invaded in the summer of 1941 they brought with them the genocidal Einsatzgruppen, the mobile killing units that systematically murdered Jews. They had help from Ukrainian nationalists who had become inspired by fascism, like the radical right everywhere. An increasingly bitter four-way struggle developed between these radical Ukrainians, the Polish underground Home Army, German counterinsurgency troops and Soviet partisans, with Jews caught between all fronts. When the Red Army liberated Krakowiec in May 1944, only one Jew emerged from his hiding place. Of the 104,700 Jews who had lived in Krakowiec before the war, only 1689 survived.

Wasserstein’s grandparents, Berl and Czarna, and his aunt Lotte had originally escaped deportation to a ghetto and then the ghetto’s “liquidation.” But the Ukrainian neighbour who had sheltered them for a year eventually gave them up. The Nazis shot them in April 1944, just three months before the Red Army arrived.


Why and how the Wassersteins found themselves in Krakowiec when the war broke out, and why Wasserstein’s father Abraham (“Addi”) escaped their fate, is a history in itself.

A Small Town in Ukraine begins with the deportation of Berl and Addi from Berlin in October 1938, part of a mass expulsion of Ostjuden (“eastern Jews”) from Nazi Germany. Berl had been sixteen when the first world war came to his native Krakowiec. Like many Galician Jews, he and his family fled the advancing Russian army in 1914, eventually moving to Vienna, capital of the Habsburg empire, of which they were loyal subjects.

Perhaps trying to evade military service, Berl kept moving, first to Holland, then to Germany, where he married Czarna Laub, who also hailed from Krakowiec. The couple settled first in Frankfurt and then in Berlin, where Berl built a business producing raincoats. Neither he nor his wife ever became German citizens, but their children grew up speaking German rather than Polish or Yiddish. Nevertheless, for the Nazis after 1933, they were aliens in two senses: Polish refugees and Jews. The deportation of this group in 1938 marked one step in the radicalisation of anti-Jewish policies that would culminate in genocide.

Thus, the Wassersteins were forced back to the provincial Krakowiec they had worked so hard to escape. Berl was allowed a short visit to Berlin to collect the women of the family and liquidate his assets under rules that effectively meant confiscation. Addi, equipped with false papers, managed to travel through Germany, ostensibly en route to Latin America. He arrived in time to say farewell to his sister and parents at the Eastern Railway Station in Berlin. He would never see them again.

Like the family of historian Richard Pipes, who would do so a little later and under somewhat more adventurous conditions, he then moved on to Italy. When Germany went to war with Poland shortly after Addi arrived in Rome, Mussolini’s government suspended tourist visas. Eventually he managed to reach Palestine via Turkey. His survival — the result of quick decisions and chance encounters — was little short of a miracle.

Wasserstein’s book ends with an account of his own travels to Krakowiec after the fall of the Soviet Union and his deeply ambiguous encounter with contemporary Ukraine. The once multi-ethnic Krakowiec, now Krakovets, has been transformed beyond recognition. The Nazis destroyed the Jews, and a postwar, state-led campaign of ethnic cleansing in the border regions moved Ukrainians from Poland to the Soviet Union, and Poles and the few surviving Jews in the other direction. Today, the town is a thoroughly Ukrainian settlement.

Popular memories there diverge sharply from those Wasserstein reconstructs in his book. The town was the birthplace not only of Wasserstein’s grandfather but also of Roman Shukhevych, a controversial Ukrainian national hero. He served under the Germans during the second world war before deserting to fight his own war once it became clear the Nazis would lose. Among other deeds, he commanded a German-controlled unit that “shot all the Jews we encountered” in at least two villages, according to one of his subordinates. In the postwar years he fought a guerilla war against the Soviet occupiers until his death in battle in 1950.

Today’s Krakovets not only has a monument to its questionable hero; the school Berl Wasserstein attended is named after Shukhevych as well, as is a street.


Wasserstein completed A Small Town in Ukraine just as Russia attacked the country early last year. At a time when shades of grey seem to have vanished, when intellectuals are called on to unequivocally condemn “NATO expansion” as the source of the war or throw their lot in behind Ukraine, defender of freedom and democracy, he carves out a third position.

His feelings, he writes, are “mixed.” He shares “the general abhorrence at Russian aggression and brutality” and notes that “Russian claims about ‘Nazis’ in Ukraine are outrageous black propaganda.” Ukraine today, he notes correctly, “is a democracy, albeit a fragile one.” At the same time, he is filled with “unease” at the prospect of a Ukrainian victory parade “past the garlanded statue of Roman Shukhevych” on the square in which the town’s Jews were assembled for deportation.

The glorification of Shukhevych and his comrades from the second world war, Wasserstein warns, is not “harmless exuberance.” Collective identities based on false history “are inherently contaminated and potentially dangerous.” His book is the very opposite of such mythologies: a thoughtful exploration of a painful past that lives on in the present. •

A Small Town in Ukraine: The Place We Came From, the Place We Went Back To
By Bernard Wasserstein | Allen Lane | $35 | 320 pages

With Edith Berry in Geneva • Tue, 21 Feb 2023 • https://insidestory.org.au/with-edith-berry-in-geneva/

The real-world backdrop of Frank Moorhouse’s celebrated trilogy was alive with idealistic characters

International politics over the past year has taken on a distinctly 1930s flavour: a focus on Europe; an authoritarian leader seeking to bring real and imaginary kin into his fold; debates over economic sanctions; a war involving ground troops, artillery and tanks. The minds of Australian readers might well have turned to Frank Moorhouse’s Grand Days trilogy, much of which is set at the Geneva headquarters of the interwar League of Nations.

Canberra scholar James Cotton takes us back to those years in his new book, The Australians at Geneva, introducing us to a real-life group of talented people who worked in Geneva with the League and the associated International Labour Organization. He shows us how Australia’s role in world affairs evolved from the crass nationalism of prime minister Billy Hughes at the Versailles peace talks and in the years immediately after the first world war (“Who cares what the world thinks!”) to the ambitious internationalism of H.V. Evatt at the San Francisco conference after the second.

And what an array of Australian talent it was, working as League and ILO officials, support staff, visiting delegates to assembly sessions, and members of interest groups. As Cotton writes, “Geneva was Australia’s school for internationalism.”

Whether Hughes liked it or not, Australia had no choice but to engage with the League. At the insistence of American president Woodrow Wilson, colonies and territories captured from the Germans and the Ottomans during the war had not been annexed to the victors, but were instead granted as League of Nations mandates, with sometimes tough conditions attached.

Australia was let off lightly, gaining relatively strings-free control of German New Guinea and Nauru. The government in Canberra wasn’t expected to grant early independence to either territory, or to open up two-way migration and trade.

But the Australian mandates didn’t altogether escape scrutiny. Why had a surplus in New Guinea’s public accounts been sent to Canberra rather than invested locally, came a query from Geneva. Why were workers in Rabaul on strike? How come explorer Mick Leahy was happily writing about shooting thirty tribesmen on his trek into the New Guinea highlands?

Canberra had other reasons for engaging with the League. It felt that a strong presence in Geneva could help ward off pressure on Australia to water down the White Australia policy, dismantle its tariff wall and submit to compulsory arbitration of disputes.

Despite the awkward queries, Australia came to see the League as a benign theatre, compatible with the emerging British community that came to be known as the British Commonwealth of Nations. Australia’s presence in the League — overseen by the high commission in London, which former prime minister Stanley Bruce headed from 1933 until 1939 — also helped extend its contacts to less familiar nations.

Former Rhodes scholar William Caldwell, a veteran twice wounded in France, was among the impressive Australians who took a senior role in Geneva, and he went on to feed the wider vision he gained there into the public debate back home. Having joined the ILO in 1921, he visited Australia a few years later to try to persuade sceptical state governments and employer groups to improve labour entitlements — though he received little help from trades halls convinced the ILO was a capitalist plot to divert workers from the revolution.

With states holding much of the responsibility for labour issues, Caldwell and another Australian with ILO experience, Joseph Starke, were among the first to suggest that the Commonwealth could use its treaty-making powers to intervene. The pair’s arguments were first tested in the High Court in 1936 and finally used successfully in 1983 to stop Tasmania’s Franklin Dam.

As his remit in Geneva extended to “native and colonial labour,” Caldwell also kept a file on the treatment of Aboriginal employees. In a 1932 letter found by Cotton, he wrote that “Australia is shamefully neglecting her obligations towards that dispossessed race.” Progressives like Mary Montgomerie Bennett were welcomed at the ILO, where they were able to raise the treatment of Aboriginal Australians.

Raymond Kershaw, another Rhodes scholar with a distinguished war record, joined the League at the end of 1923. As an officer in its minorities section he helped deal with the grievances of subnational ethnic, linguistic and religious groups stranded in other countries after the contraction of Germany and the break-up of the Russian, Austro-Hungarian and Ottoman empires. It was an insoluble problem and a thankless task.

Kershaw left the League in 1929 to join the Bank of England. In 1930 he accompanied the bank’s Sir Otto Niemeyer on his mission to crack the fiscal whip over Australia’s “feckless” Depression-era state governments. Kershaw at least wrote papers arguing that the pain of Niemeyer’s prescriptions should be shared rather than fall mainly on the unemployed and poor.

Another Australian, Duncan Hall, had studied at Oxford during the war and completed the equivalent of a doctorate on the future of the Commonwealth. With Fabian socialists Sidney Webb, Beatrice Webb and Leonard Woolf, he had done much to popularise the idea of a grouping of like-minded nations that could be a model for — or the nucleus of — the League. Failing to get an academic position back in Sydney, he joined the Australian delegation to a new regional body, the Institute of Pacific Relations, run from Honolulu.

From there he took up a professorship at Syracuse University in New York state, where he set up a model League of Nations among his students, a teaching simulation widely copied. On a visit to Geneva in 1927 he was invited to join the League’s opium section, which was trying to regulate trade in narcotics, as well as investigating child welfare and the trafficking of women.

Hall’s inspections and conferences took him out of Europe, to Iran, India and Siam (as Thailand was then known). In Calcutta he encountered erudite Indians who pointed out the double standard in the British Commonwealth: a higher status for the white dominions, a much lower one for the Asian and African colonies. The perspective he gained during the trip contributed to the emergence of an “Australian School” of international relations that looked beyond the Atlantic.

In the 1930s, Hall turned to psychology to explain the rise of mass movements supporting authoritarian regimes across Europe. He drew on the work of scholars, including the Austrian psychoanalyst Robert Waelder, who applied Freud’s concept of the “group mind” to the trend. Hall worried that advances in technology like radio were intensifying group consciousness, bringing people closer, extending mob oratory to entire nations and creating a collective psychosis.

At the League, Hall urged that a “realist” view of this threat should prevail over “Utopian pacifism.” His warnings were not welcomed by League secretary-general Joseph Avenol, who was ready to compromise with dictatorships. He barred Hall from making public addresses or broadcasts on his trips.

Another Australian in Geneva with a fascinating backstory was C.H. “Dick” Ellis, who arrived as a correspondent for a popular London newspaper in the late 1920s, and improbably wrote a heavyweight book on the League that became a standard reference. Ellis was also an MI6 officer, having served in the British army during the war, taken part in the anti-Bolshevik intervention in Central Asia, and studied languages at Oxford. He later became a friend of Australian external affairs minister Dick Casey, and in the early 1950s, by then very senior in MI6, he advised Casey on the formation of the Australian Secret Intelligence Service.


As Frank Moorhouse shows in his trilogy, the mood in Geneva shifted from the optimism of Grand Days, set in the twenties, to the gloom of Dark Palace, set in the thirties. The hopes of world disarmament foundered at a failed League conference in 1929. Japan annexed Manchuria in 1931 and withdrew from the grouping. Germany quit after Hitler took power in 1933, and other nations began pulling out. Italy’s invasion of Ethiopia in 1935 was met with ineffective sanctions. Despite Wilson’s early enthusiasm, isolationism kept the United States out.

With the failure of peacekeeping, diplomats including Australia’s Stanley Bruce tried to keep the League useful by expanding its roles — not without a touch of national self-interest — in fields like health and nutrition. As Cotton remarks, “Bruce also considered that a world thus organised would be more receptive to exports of Australian agricultural commodities.”

In May 1939, with war becoming more likely, Avenol turned to Bruce to help rescue the League from its dire straits and the impasse on collective security. He asked the Australian to chair a committee to report on extending the League’s role in encouraging cooperation on health, social matters, economic affairs and financial regulation. By the time Bruce reported in August, his proposals had been overtaken by war. Avenol resigned to join the Vichy government in France.

An Irish deputy-secretary and a handful of staff kept a skeleton office going. After the war, however, Geneva was no more than an annex to the new, New York–based United Nations, though the city also became home to some of the agencies that took up the Bruce committee’s ideas, and the ILO continued to run from there.

Moorhouse’s novels picked up on a pioneering aspect of the League that Cotton also explores: the role of women. From the outset, the League declared all its positions open to female recruitment, though few women made it into positions more senior than the typing pool. Surprisingly, the otherwise conservative Hughes government decided in 1922 that Australia’s delegation to the League’s annual assembly should include at least one woman.

The League gave women like Emilia Hernya a role, albeit subordinate, and a voice.

In practice, the Australian women were only appointed as substitute delegates, not full members. When one woman asked her delegation leader what her role was, he responded, “Your business is to hold your tongue.” In 1927, women’s rights campaigner Alice Moss did stand in for delegate T.J. Ley, a federal MP and former NSW justice minister — as well as a confidence man and, later, convicted murderer — who had other things to do in Europe rather than attend a League committee in dreary Geneva.

Many of the women delegates had well-off spouses who could provide the funds for them to travel and agitate. Back in Australia they gave speeches and wrote articles about the League, often through the League of Nations Union branches that sprang up around Australia and across the world.

Bessie Rischbieth, the wife of a wealthy wool dealer in Perth, was an outstanding advocate of the League. In articles that still read well, writes Cotton, she pointed out how the effects of the Versailles treaty could be seen in the rise of the Nazis, and argued for a stronger League covenant, lower trade barriers, the abolition of exchange controls, and an open door to goods from colonies and mandates. As Cotton notes, “Seen in its context, Rischbieth’s assessment of the times was as comprehensive and insightful as any offered in Australia in the later 1930s.”

Melbourne woman Janet Mitchell was in Shanghai as a delegate to an Institute of Pacific Relations meeting in 1931 when the Manchurian crisis erupted. An Australian newspaper correspondent, W.H. Donald, persuaded her to visit Mukden (now Shenyang) and see for herself. She stayed a year in Harbin, a city teeming with Japanese occupiers and White Russian refugees.

Back in Australia Mitchell became a frequent broadcaster for the League of Nations Union, and in 1935 she was invited to join the League information section (probably by Duncan Hall, whom she had met at the 1925 meeting in Honolulu). She arrived in time to be disappointed by the League’s handling of the Ethiopia crisis. As she left Geneva, and the League, she wrote, “I began to see it, not as it was conceived by its founders as an effective force for peace, but as a little world born before its time, bound to fail.”

Ella Doyle, an Australian shorthand typist, moved to the League from the Australian Imperial Force staff in London. In 1937 she visited Germany and wrote an article about the Nazi crackdown on “decadent art,” pointing out that the works under attack were by the cream of modern German artists, one of whom had designed the stained-glass windows in the League’s new building. After 1945 Doyle had several UN engagements, and much later, in 1978–81, she was Australia’s first female ambassador in Dublin.

Alas, no one on Cotton’s list of Australians fits the character of Frank Moorhouse’s Edith Campbell Berry. Emilia Hernya perhaps comes closest, though her parents were Dutch and British and she was schooled in France. But she moved to Australia in 1913 and was naturalised in 1917 before returning to Europe and joining the League in 1920. Hernya’s childhood had been colourful: her parents worked near Paris’s Moulin Rouge nightclub where as a toddler — Cotton informed me by email — she is said to have sat on Toulouse-Lautrec’s knee and pulled his beard.

The model for Edith, Moorhouse confessed, was actually a Canadian, Mary McGeachy, who worked in the League’s information section. McGeachy left a huge volume of writing about the League and its time, which Moorhouse transferred into his character’s thoughts and speeches. She also appears in the novels as one of the real-life characters around Edith.

Though Hernya and McGeachy socialised in places like Geneva’s International Club, and no doubt had their love affairs, the model for Edith’s adventures on the boundaries of sexual identity was Moorhouse himself. Geneva provided neutral ground in more ways than one.

Moorhouse’s trilogy, an Australian classic unlikely to be added to a school reading list except by the bravest principal, is yet to make it to the screen as the film or series many of its fans have long anticipated. Two production groups have started projects but let them lapse. Meantime, I can attest that the trilogy is just as enjoyable on a second reading, and Cotton’s fascinating book amplifies the factual thread of Australian involvement. •

The Australians at Geneva: Internationalist Diplomacy in the Interwar Years
By James Cotton | Melbourne University Press | $39.99 | 246 pages

Harry, Meghan and the republic | https://insidestory.org.au/harry-meghan-and-the-republic/ | 7 February 2023

On Netflix and in print, the couple’s story has been informed by a historical perspective with implications for Australia

The conflict between the British media and Prince Harry and Meghan Markle has gripped — and split — the English-speaking world in recent months. There are those who have eagerly watched the Netflix series Harry and Meghan, released in early December, and/or read Harry’s autobiography, Spare, released last month. And there are those who believe Harry and Meghan’s actions are ruled by a desire for money and refuse to watch the series or read the memoir.

We find ourselves in the former group. We were deeply moved by the Netflix series, directed by the critically acclaimed American documentary film-maker Liz Garbus, and were absorbed by the book. It isn’t simply the human drama that gripped us, or our sympathy for Harry and Meghan. We also see significant implications for Australia in the way the debate over their actions has played out.

Any account of these recent events must begin with Princess Diana, for it is increasingly apparent that her rebelliousness lives on strongly in Prince Harry and is evident in Meghan’s attitudes and behaviour. When Diana was alive, many people saw her as the best thing going for a stodgy and rapidly fading royal family. What’s often forgotten is that before her death in August 1997 she had become a prominent social activist.

We were particularly struck by footage in the BBC documentary, Heart of the Matter, showing her walking in protective clothing through a recently cleared minefield in Angola earlier in 1997. “I’d read the statistics that Angola has the highest percentage of amputees anywhere in the world,” she explained to the camera. “That one person in every 333 had lost a limb, most of them through landmine explosions. But that hadn’t prepared me for the reality.”

We were also struck by another TV image: Diana sitting by the bedside of an HIV/AIDS sufferer in a hospital. During a visit to Cape Town to see her brother, Earl Spencer, in 1997 Diana had met with Nelson Mandela, who praised her dedication to helping those infected with HIV/AIDS. “We saw her sitting on the beds of AIDS patients and shaking hands with them, and that changed perceptions dramatically with regards to AIDS,” Mandela recalled. He also expressed his appreciation for Diana’s visit to children in Angola crippled by landmines, observing that she had helped inspire the campaign to destroy South African landmines.

An important feature of Diana’s social activism was its internationalism. As well as AIDS awareness and prevention, she supported charities and organisations committed to battling poverty and homelessness, visited leprosy charities in Nigeria, Zimbabwe, Nepal, India and other countries, and opposed the stigma surrounding mental illness.

In the last year of her life, Diana began dating Dodi Fayed, an Egyptian producer whose well-known films included Chariots of Fire. Perhaps her attraction to an Egyptian man partly reflected a desire to extend her consciousness beyond England with an act of love that was also a rebellious act. After all, Egypt had been the scene of perfidies and infamies characteristic of the British Empire, especially the crushing (with the help of Australian soldiers) of the gathering movement for Egyptian independence in 1919.

The open grief of the British public after Diana’s death led us to believe that the tabloids had learned their lesson and would no longer harass, intrude on and exploit the royal family. We are astonished by our naivety.


Despite his decade-long career in the British army, Harry undoubtedly carries on his mother’s tradition of rebelliousness and internationalism. He is patron of a leading landmine-clearance charity, the Halo Trust, and has called for the world to become free of those weapons by 2025. Twenty-two years after Diana, he retraced his mother’s footsteps in Angola.

After walking along the suburban street that was once filled with explosives, he said it was “quite emotional” to retrace Diana’s steps “and to see the transformation that has taken place, from an unsafe and desolate place into a vibrant community of local businesses and colleges… I’m incredibly proud of what she’s been able to do and meet these kids here who were born on this street.”

A news agency photo shows Harry sitting beneath the Diana Tree, which marks the spot where Diana was pictured in the minefield. “Landmines,” he said, “are an unhealed scar of war.” In 2014 he had established the Invictus Games to support soldiers permanently injured in combat.

Harry and Meghan have also taken a leading role in drawing attention to the needs of people with mental illness. In an interview with Oprah Winfrey in 2021, Harry revealed his own difficulties with mental distress while Meghan discussed her depression, experience of a suicidal state and the shocking refusal of the Palace to offer mental health support when she asked for it during her time in England. Under royal protocol, Meghan was compelled to give up her keys, passport and driver’s licence and only got them back when she returned to the United States.

In the same year, 2021, Harry and Oprah made a series of educational programs entitled The Me You Can’t See exploring mental illness and suggesting ways of alleviating it. In Spare, Harry provides considerably more detail about his struggle with mental illness over several years and how, in therapy, he finally came to terms with his mother’s death.


Throughout these years, the tabloid scrutiny of the couple was intense. In his interview with Oprah, Harry compared his relationship with Meghan to the hounding of his mother “while she was in a relationship with someone who wasn’t white.” He feared that history would repeat itself, that like Diana they would be “followed, photographed, chased, harassed” relentlessly. This fear, and the extent of the persecution of Meghan, is described in much more depth in both Harry and Meghan and, especially, Spare.

Among the key points to emerge in the Netflix series is the relationship between the tabloid press and “the Firm.” Harry’s explanation of how the London tabloids work with the royal family’s media staff to produce stories for the front page is dynamite; in his view it was the Firm as much as the tabloids who sought to destroy the Duchess of Sussex. The underlying racism of the tabloids and the royal family are laid bare.

Spare follows up with a great deal more detail on the toxic interdependence of the Firm and the tabloid media. We learn how the relationship between Meghan and William and Kate seemed to start well enough (William and Kate had loved Meghan in Suits) but soon deteriorated, going from one small conflict to the next.

For Harry, the problem of the British media and the royal family goes back a long way, to his mother’s death and the events preceding it. He is horrified that the paparazzi who chased her until her car crashed stood around photographing her, rather than trying to help, as she lay dying. He is shocked that no attempt was made to arrest the paparazzi involved, a failure he believes has only encouraged the tabloids to intrude into his own and his family’s private life.

Spare is, in fact, a great autobiography, a j’accuse that accumulates damning details to intensifying, almost unbearable effect until Harry and Meghan escape.


As historians, we were surprised by Harry and Meghan, which we hadn’t expected to be so thoroughly informed by recent historical scholarship. But the two people chosen as key commentators give a clue to its quality. David Olusoga, a professor of public history at the University of Manchester, has written, produced, directed or appeared in a string of TV documentaries, including Black and British: A Forgotten History and Britain’s Forgotten Slave Owners. Afua Hirsch is a journalist with the Guardian and author of Brit(ish): On Race, Identity and Belonging. Between them, supplemented by archival footage and narrative commentary, they bring the British and world historical context to life.

In episode three of the series, Olusoga comments that “this little island off the coast of Europe was at the centre of the biggest empire the world has ever seen” and goes on to ask at whose cost, pointing towards Britain’s history of slavery. Hirsch comments that “Britain had a ‘deep south’ that was just as brutal, that actually enslaved more Africans than the United States of America did.” Britain’s deep south was the Caribbean, overseas, far away, “out of sight and out of mind.”

After an unseen narrator points out that slavery fuelled the early British Empire in North America, Hirsch says that the first-ever “commercial slave voyage conducted by Britain was personally financed by Queen Elizabeth I. And it continued to be financed by kings and queens, right up until its abolition.” Even in its abolition in the 1830s, Britain sided with the slave owners, many of whom were also members of the British parliament, by compensating them at huge cost.

Olusoga and Hirsch are drawing here on the scholarship of the Legacies of Slavery Project, based at University College London and led by historians Catherine Hall, Nicholas Draper and Keith McClelland. The project’s extensive research has helped change British public awareness and understanding, and stimulated among historians a greater interest in the consequences of the end of slavery in the British Empire. Jane Lydon, Zoë Laidlaw, Emma Christopher and others have been tracing how, after abolition, people, ideas, and finance were transferred from the Caribbean to Britain’s settler colonies.  Australia was obviously among them, as recent research by Christopher and Lydon highlights.

Harry and Meghan also considers the more recent historical context. Olusoga draws attention to the migration of many Black and Brown people to Britain from the mid twentieth century — so much so that London “began to look, for the only time in its history, like it actually was the centre of an empire that was mainly made up of non-white people.” When Harry and Meghan became engaged, he says, the royal family seemed at last to have begun catching up with modern British society.

We see Harry and Meghan at a memorial service to mark the twenty-fifth anniversary of the death of Stephen Lawrence, the eighteen-year-old boy killed by white racists. Only two of his attackers were ever brought to justice. Hirsch says that Harry and Meghan’s attendance was highly significant, speaking to “the pain that many people still feel as a result of the murder of Stephen Lawrence.”

Olusoga and Hirsch reappear in episode five to argue that the failure of the Palace to defend Meghan from press persecution was a huge disaster for the future of the monarchy. “Here was a woman,” says Olusoga, “who just looked like most of the people in the Commonwealth, and they somehow, for some reason, couldn’t find the capacity to protect her, to represent her, to stand by her, to take on vested power in her name, to fight for her.” For Hirsch, the departure of Harry and Meghan “felt like the death of a dream” that a truly inclusive Britain could form and flourish.


In Australia, coverage of the series and the memoir gradually shifted from a kind of can’t-watch-it, won’t-read-it scorn to a very mixed but more earnest consideration of the issues the series and the book raise. One of those issues is the future of the monarchy in Australia.

In Spare, Harry reveals a continuing interest in the Commonwealth, and especially the countries that still regard the British monarch as also their own. He writes about the outstanding success of his and Meghan’s royal tour of South Africa in September 2019, the first since that country returned to the Commonwealth in 1994. They were welcomed there as representing a new direction for the royal family and for the Commonwealth, and they both felt that in this shift they had an important role to play.

Yet the role of the monarchy in the Commonwealth has come into increasing question. The final episode of Harry and Meghan shows the monarchy in trouble in the Caribbean, as member nations continue to reject a past shaped by slavery within the British Empire. With reparations increasingly on the agenda, and aware of the royal family’s historical role in the system of slavery, some Commonwealth nations no longer want the British monarch as their head of state. Barbados declared itself a republic in November 2021 and Jamaica has declared its intention to become a republic by 2025.

What about Australia? What should our future relationship be with this dysfunctional British family? Does the Harry and Meghan story have any implications for us?

While the Australian republican movement has so far said little about the couple, commentary on their significance for an Australian republic has been growing. We agree with Jenny Hocking when she writes, “This now openly feuding family provides our head of state, imposed on us and fourteen other Commonwealth nations by dynastic succession and inherited title alone, in which we have no say and no relevance. It inevitably reignites questions about why Australia is still a constitutional monarchy.”

Apart from the difficulty in imagining a popular and workable alternative, one of the main obstacles to the move to a republic in Australia has been the popularity of the royal family. We grew up in that environment. John Docker remembers his English mother listening to the coronation of Queen Elizabeth II on radio. Ann Curthoys recalls keeping a scrapbook in 1953 of the coronation, as most schoolchildren did, and being one of the 50,000 schoolchildren marshalled in the Newcastle Showground to spell out Welcome (she was in the W) when the Queen and Prince Philip visited Australia the following year.

Lyndall Ryan remembers that the biggest event in her life until she started high school in 1955 was the Queen’s first visit to Australia in 1954. The Australian Women’s Weekly then kept her up to date on the royal family, and in particular their tours to other parts of the Commonwealth. She didn’t seriously consider becoming a republican until after the Whitlam government was dismissed by the governor-general on 11 November 1975, and until Jenny Hocking published The Palace Letters in 2021 she was convinced that the governor-general’s action had nothing to do with the Queen.

But republicanism has had a chequered history in Australia. It gathered increased support after Whitlam’s dismissal, reached a peak during the 1990s and subsided after the defeat of a referendum on the question in 1999. It has been undergoing something of a revival in recent years, especially as Queen Elizabeth’s reign was drawing to a close. Our prime minister is in fact a republican, though he is insisting right now that the matter of the Voice to Parliament, and indeed the Uluru Statement from the Heart generally, must take priority.

Alongside the essential debates over the Voice and a Treaty, it is time to step up public debate about Australia’s becoming a republic. Indeed, the question of the republic is not entirely separate from those debates: they are all part of a necessary reshaping of modern Australia. While Indigenous commentators have focused on the Uluru statement and its proposals, support has been evident for an Australian republic that truly recognises Indigenous sovereignty.

Harry and Meghan and Spare demonstrate with great clarity how the monarchy continues to be shaped by British history, British concerns and British symbolism, and not at all by Australian or indeed Commonwealth ones. The evolution of the monarchy as an institution is clearly outside our control and always will be. The tabloid British media have deeply compromised the monarchy and the royal family, and sections of the Australian media, especially those that are Murdoch-controlled, have too often joined in. With several Caribbean nations forging new republican paths for themselves, surely it is time for Australia to do the same. •

Arthur Stace’s single mighty word | https://insidestory.org.au/arthur-staces-single-mighty-word/ | 1 February 2023

Why did this shy Sydneysider dot his city with a one-word poem?

In my part of the world, fewer and fewer people seem to remember Arthur Stace. Younger friends and colleagues will frown awkwardly at the mention of someone they think they should know about, but really don’t. “The Eternity man,” I prompt. That bloke who wrote the word “Eternity” in chalk thousands of times on footpaths in Sydney. Remember when “Eternity” was illuminated on the Sydney Harbour Bridge on 1 January 2000?

Perhaps it’s understandable: this is a Sydney story, and I live in Canberra. In Sydney his memory seems to be still strong, although, since Stace died in 1967, fewer people will remember having discovered an “Eternity” inscribed by the man himself in his famous elaborate copperplate. It would be even rarer to find someone who actually glimpsed him at work in the pre-dawn, head bowed, kneeling to leave his one-word message in chalk or crayon.

I became curious about Stace during trips to Sydney in the 1990s, when a highlight was to call in at the Remo store in Darlinghurst to browse all manner of cool stuff you probably didn’t need but was fun to own, including t-shirts, prints and other merch emblazoned with Stace’s “Eternity” in a design by artist Martin Sharp.

Sharp had been incorporating Stace’s “Eternity” into his work for years, and a five-metre rendition on canvas adorned Remo’s Crown Street window — stopping traffic, according to proprietor Remo Giuffré. From beyond the grave, Stace was very good to Remo. “We were never ones to miss a good merchandising opportunity,” he recalled.

Arthur Stace posing for photographer Trevor Dallen on 3 July 1963. Sydney Morning Herald 

Stace’s fame peaked in the 1990s, but Sydney had always been fascinated by him. From the 1930s onwards the discovery of an “Eternity” on pavement or wall was a unique and unifying experience for Sydney residents. Graffiti was still uncommon, and the letters were so perfectly formed, the meaning so tantalising. Who was this Eternity man? No one knew.

By the 1950s there were so many rumours, so much press speculation, and increasing numbers of false claims by impostors that in 1956 Stace allowed his identity to be revealed. By 1965 he estimated he had written “Eternity” 500,000 times all over Sydney.


Stace’s story, as he told it, was that he was born into poverty. His parents, two brothers and two sisters (actually he had three sisters) were all alcoholics, he said, and he himself was a drifter, a petty criminal and an alcoholic for decades before he converted to Christianity. That happened after he joined, by chance, a service in August 1930 at St Barnabas’s Anglican Church, Broadway, on the promise of tea and cake afterwards. The preacher was the Reverend R.B.S. Hammond, famous in Sydney as a “mender of broken men” — men like Arthur Stace who believed themselves beyond help. Years later Stace was fond of saying that he went in for rock cake and came out with “the Rock of Ages.”

He gave up alcohol and was befriended by Hammond, who gave him a job at the Hammond Hotel, a hostel he ran in Chippendale. Stace worked in its emergency depot helping men in need of a wash and a shave, or repairs to their clothes and boots.

Spiritually, though, Stace was more drawn to the services at the Burton Street Tabernacle, a Baptist church in Darlinghurst. There, in 1932, he heard a sermon by a famous evangelical preacher of the day, John Ridley. “Eternity! Eternity!” Ridley cried. “I wish I could sound, or shout, that word to everyone in the streets of Sydney. Eternity! You have got to meet it. Where will you spend eternity?”

Stace was profoundly moved. Leaving the church, he discovered a piece of chalk in his pocket and bent down and wrote “Eternity” there and then on the ground. He joined the community at Burton Street, and that was the beginning of his new life as a reformed alcoholic and self-described “missioner” seeking to convert others.

Arthur Stace (seated) as emergency depot manager at the Hammond Hotel in the 1930s. Courtesy of HammondCare

When an energetic new pastor, Lisle Thompson, arrived at Burton Street in 1951 the two men immediately became friends. One day, after an outdoor service, Thompson spotted Stace at work with his chalk: “So you’re Mr Eternity, Arthur,” he queried. “Guilty, your honour” was the reply. Thompson wanted to share Stace’s story and eventually he persuaded Stace that an account of his conversion, written as a “tract,” would be a good evangelistic tool, an exemplar for others. Titled The Crooked Made Straight, Thompson’s eight-page account briefly noted that Stace was the Eternity man. “This one-word sermon has challenged thousands and thousands,” he added.

The tract circulating quietly among churches was not enough for Thompson, and finally Stace let him take “Mr Eternity” to the press. The scoop went to Tom Farrell at the Daily Telegraph and the story covered six columns in the Sunday edition on 24 June 1956. The mystery was solved, and overnight Arthur Stace, living modestly in Pyrmont with his wife Pearl (they met through church activities and married in 1942), became one of Sydney’s most famous citizens.


In the ensuing years Stace was happy to grant a further press interview now and then. In 1965, two years before his death, he told a journalist that he had tried a few different slogans — “Obey God” for instance — but that “I think eternity gets the message across, makes people stop and think.” It certainly did, but what Stace might not have realised was that with increasing material prosperity, secularism and multiculturalism in Australia, younger people were becoming baffled.

Martin Sharp first spotted an “Eternity” in 1953 when he was just eleven, and was captivated. “What does it mean? Why is it there? Who wrote it?” He didn’t learn the full story of Arthur Stace until 1983 when he was given a copy of Keith Dunstan’s book Ratbags, published in 1979. Along with Percy Grainger, Barry Humphries, Frank Thring and others of that ilk, Arthur Stace was one of Dunstan’s “ratbags.”

In 1958 journalist Gavin Souter had compared Stace to bohemian rebel Bea Miles and other Sydney “characters,” including a man who sat perfectly still on a bench in Hyde Park with an open packet of peanuts in his lap, covered head to foot in pigeons. In 2001 Peter Carey declared that Sydney didn’t love Stace because he was “saved” but because he was “a drunk, a ratbag, an outcast… a slave to no one on this earth.” Clive James in 2003 simply called him a “lonely madman.”

Christians, on the other hand, had little difficulty in interpreting Stace’s message the way he meant it — that there is a life after this one and we need to be prepared for it. For them there was nothing peculiar about a devout Christian wanting to spread such a message. In 1994 the Reverend Bernard Judd, an Anglican rector and long-time friend of Stace, declared emphatically in a filmed interview that Arthur was not a fanatic, not obsessed, and rejected the association with Sydney’s eccentrics. Stace was “a thoroughgoing reasonable rational Christian.”

When a full biography of Stace, Mr Eternity: The Story of Arthur Stace, appeared in 2017 it was published by the Bible Society. The avowedly Christian co-authors, Roy Williams and Elizabeth Meyers (the latter a daughter of Stace’s friend Lisle Thompson), reiterated Judd’s assessment to again counter the idea that Stace was a “weirdo,” or mentally ill. He was unusual, they conceded, but that could be said of any “prodigious achiever” in human history.

Poet Douglas Stewart could embrace the sublime and transcendent in Stace while avoiding the preachy context, and in so doing helped propel Stace’s work into our modern, secular age. The oft-quoted first stanza of his poem “Arthur Stace,” first published in 1969, runs thus:

That shy mysterious poet Arthur Stace
Whose work was just one single mighty word
Walked in the utmost depths of time and space
And there his word was spoken and he heard
Eternity, Eternity, it banged him like a bell
Dulcet from heaven sounding, sombre from hell.

Stewart’s poem helped inspire Lawrence Johnston’s documentary film Eternity in 1994. Parts of the poem are read during the film, and its beautiful cinematography encompassed a similar sense of light, shadow and mystery. The soundtrack was adapted from Ross Edwards’s haunting orchestral work Symphony Da pacem Domine.

Like Stewart, Johnston wanted to explore Stace’s part in the biography of Sydney, and black-and-white recreations feature an actor (Les Foxcroft) as a silent, lonely, Stace-figure in an overcoat and felt hat, head bowed, walking, kneeling and chalking. An assembly of people, including Bernard Judd, Martin Sharp and artist George Gittoes, describe how Stace had inspired or influenced them.

Gittoes was one of few to have witnessed Stace at work. He had been staring idly into a shop window early one morning in 1964 when he became aware of the image of a man reflected from across the street. As he watched, silently, unwilling to interrupt, the man knelt “almost as if in prayer” and wrote something on the ground. Gittoes had never heard of Arthur Stace then and thought that “Eternity” had been written just for him. As a fifteen-year-old boy “looking for signs,” he said, that one word seemed to be “like a whole book of words,” and the experience had remained “like a tattoo on [his] soul” ever since.

There was something else that struck the artist’s eye, and he remembered it even after thirty years: Stace’s shoes were too big. As the man knelt, Gittoes could see clearly into the gap between his sock and his shoe.

I too am fascinated by this detail. Everyone remembers Stace as a carefully dressed man, always in a suit, tie and felt hat, and an overcoat for winter. The few photographs of him attest to this. But Gittoes noticed that his shoes were much too big, clearly not originally his own. This could have been because as a small man, only five foot three, Stace had trouble finding shoes to fit. Or because even in the relative comfort of his later years Stace was still too frugal to buy new shoes. Whatever the case, my imagination gets to work in that gap between the actual man and how he presented himself to the world.

Stace’s adeptness at controlling his own story for public consumption leads me to wonder: what was in the gap? What would drive someone to write “Eternity” on the footpath every day for thirty-seven years? Half a million times. Was Stace “unusual”? Obviously. A “madman”? Unhelpful. “Obsessed”? Yes, I think so.


Stace was always a poor man, but the dimensions and impact of that poverty have until recently been under-appreciated. The trauma that would afflict his life began even before he was born. His biographers have shown that his mother, Laura Lewis, had two children with an unknown father, or fathers, while she was a teenager living at home with her parents in Windsor, New South Wales. The first baby died, and the second, Clara, born in 1876, nearly did too.

One day Laura left the three-month-old with her own mother, Margaret Lewis, while she visited a neighbour. Ten minutes later Laura was called home to find Clara pitifully unwell. Margaret claimed to have given her granddaughter a drink of tea, but a doctor was called and, as he later testified, found the baby suffering from “gastric irritation of the stomach and bowels,” retching and crying incessantly. Soon it was discovered that what Margaret had actually given Clara was carbolic acid, a common disinfectant. A local chemist confirmed that he had sold it, diluted in oil, to Laura during her pregnancy to treat an abscess.

Remarkably, Clara survived, and although Margaret appeared before a magistrate, a murder charge seems to have been dropped. Why? Perhaps because, even if Margaret had made a sudden wild attempt to eliminate an unwanted mouth to feed, the evidence was insufficient for a conviction. Stace’s biographers, Williams and Meyers, offer the simpler conclusion that it was a genuine accident in a chaotic household.

Four years later, in 1880, Margaret’s husband John was imprisoned for assaulting her, and on his release in 1882 immediately sought her out and assaulted her again. Evidence suggests that the whole family lived in fear of this man. Laura escaped to Sydney with Clara but found no real refuge there. She took up with William Stace, an Englishman from a modestly prosperous background, and together they had six children; Arthur was the second youngest, born in 1885.

But William was feckless and, in the deepening depression of the 1890s, could not hold down a job. The family moved frequently among Sydney’s cramped inner-city suburbs, sliding into poverty. By 1892 they were accepting charity, and in November William Stace deserted the family, leaving Laura destitute. Her only option was the Benevolent Society Asylum, a huge institution located near where Central Station is today. After a Christmas spent within those grim walls, Arthur, his older brother William and younger brother Samuel were fostered. Arthur was seven.

Williams and Meyers mention that later in life Arthur would not speak of his time in foster care. The years he blanked out were spent with a family in Goulburn. Later he was placed with families in Wollongong, and as a teenager he found employment in the coalmines there. (With his first pay, he said, he bought a drink: the first step towards decades of alcohol addiction.)

The Stace family was scattered. William and Laura reconciled, but their lives were marred by alcohol and violence, and all their children were fostered or left of their own accord. By the time Arthur returned to Sydney in about 1905, when he was twenty, William had become a chronic and violent alcoholic, and Laura appears to have taken to prostitution. William died in the Parramatta Hospital for the Insane in November 1908, aged about fifty-two. Laura died of cancer in 1912.

Considering all this, the gaps and inconsistencies in Stace’s account of his life are unsurprising. I think he exaggerated his and his family’s criminal associations a little, probably to make his conversion in 1930 seem more powerful. Especially interesting is his claim that as a child he had very little schooling, and that he couldn’t account for his ability to write “Eternity,” and only that word, in perfect copperplate.

Although he implies it was done by some kind of divine guidance, there is ample evidence that he could write quite well and had obviously received some primary education. When Stace said he couldn’t write, the deprivation he was describing was perhaps not education — it was love.


The gap between the shoe and the sock turns out to be vast. It’s not just the gap between Stace the man and what he said about himself, but the gap between the historical sources (newly digitised, many of them) and how we interpret those sources in our own times.

In fact, the gap is so vast I hesitate to approach it because I respect Stace’s telling us only what he wanted us to know. But also, he invited us to wonder, as he wondered, about unfathomable existences beyond our small selves: he walked in the “utmost depths of time and space,” as Douglas Stewart put it.

So let me suggest that what was in that gap was intergenerational poverty, violence, substance abuse and trauma. The twin pillars of Stace’s trauma may have been, first, the poisoning of his half-sister Clara in 1876, and later, his separation from his parents and siblings when he was seven.

What happened on that day in Windsor in 1876? There was baby Clara, “retching and crying.” There was the shock, the panic, the tears and outrage, and the smell of carbolic around Clara’s mouth. All the witnesses would have had their versions of events, but only the baby’s grandmother really knew, and how could Margaret have explained her actions if she herself was a victim of earlier circumstances impossible to describe? How did Laura cope with the memory of that day, and how could she establish a future for herself and her children? Whom could she trust? Not William Stace, as it turned out.

Arthur’s early childhood memories were of sleeping on bags under the house when his parents were “on the drink,” and having to steal milk from verandas and food from bakers’ carts and shops. Fostering might have given him and his brothers proper beds, food and clothing, but at the cost of everything and everyone they had ever known.

Did they know what had happened in Windsor? From under the house they might have picked up bits of it from abusive arguments between their parents, or perhaps it was never mentioned. Either way, spoken of or not, the story was surely always there, impossible to un-remember.

Our increasing knowledge of trauma and its effects on mind and body may offer new insights into Stace’s behaviour as an adult. For years, both before and after his identity as “Mr Eternity” became known, he told his story many times in Protestant services and prayer meetings: how he had been brought up in a “vile” environment, how he had lived a “slothful drunken life,” going from job to job, jail to jail, and how finally, at his lowest point, he had been “plucked from the fires of hell” at St Barnabas’s when “the spirit of the living God” entered his life.

Conversion gave Stace not just a community to belong to — probably for the first time ever — but an audience for his story. It was common in Baptist services for people to give “testimony” by describing their lives before and after conversion, and so here was an accepted language and a template for Stace to craft a narrative of his own. He could stand outside his story and gain comforting distance from it, always with a group of the faithful to take it and hold it for him without judgement.

So then, those half-million eternities could have been another form of repetitious behaviour, born of trauma. His message could have been not only a one-word sermon or a one-word poem, but a one-word trauma narrative. Mightily told, over and over. In those daily pre-dawn excursions around Sydney, the act of kneeling to write “Eternity” every few hundred yards might have put Stace into a meditative state that kept him separated from his past, eternally in the present. Hoping, with his chalk-dry fingers, to convert his suffering into something redemptive for other people. •

One-man intelligence network | https://insidestory.org.au/one-man-intelligence-network/ | 1 February 2023

For a remarkable quarter-century, Tony Eggleton was the power behind the Liberal throne

Picture a country road in the bush outside Canberra. It’s 1965. A black Bentley saloon purrs to a halt by the side of the road. Bob Menzies alights, holding a can of fly spray. A younger man gets out of the back seat and the prime minister hands him the can. The young man squirts a generous burst onto the prime minister’s back. They climb back into the car and drive on.

Menzies, aged seventy, is about to open a new telescope in the Canberra hinterland. Long experience of public speaking in the open air has given him an aversion to flies, and he has hit on the deterrent of shrouding himself in insecticide.

The young man is Tony Eggleton, thirty-three. Just hired as Menzies’s press secretary, he is ambitious, conservative and diligent. If spraying the prime ministerial personage is part of the job, he’ll do it obligingly and he’ll do it thoroughly. And later that day he will type up the incident in a note for his private file.

Here is a puzzle worth unravelling. Aren’t nice guys supposed to come last in politics? Yet that obliging young man ended up as top dog in the Liberal Party organisation. “Neither belligerence nor assertiveness were part of his persona,” according to biographer Tom Frame in his new book, A Very Proper Man; yet he became a prominent player in every twist and turn of the Liberal saga over twenty-five years from Menzies to Hewson: Holt’s disappearance, Gorton’s chaos, Whitlam’s dismissal, Fraser’s supremacy, Howard’s and Peacock’s failures, the Joh-for-Canberra fizzer. He was there through eleven federal elections, including a still-unbeaten record of seven as the Liberals’ national campaign director. And he went on to work at a high level in international affairs, in the Commonwealth during the Whitlam years and in the development assistance organisation CARE International.

Along the way, Eggleton practised a lifelong discipline of typing up notes recording his immediate impressions of events he was involved in. The result, says Frame, is “thousands of documents, memoranda, letters, newspaper clippings and photographs” in thirteen boxes, as well as a “personal chronicle” written by Eggleton for his family.

This remarkable trove of contemporaneous firsthand records sees the light of day for the first time in Frame’s biography. A Very Proper Man contains no startling revelation that reshapes our understanding of Liberal politics; but its deep detail, long span and central perspective will make it a very valuable resource for future historians of the party.

Frame declares himself a friend of Eggleton, and this is a friendly biography. But while it is thorough and substantial in tracing Eggleton’s progress, I don’t think it fully succeeds in explaining his success and longevity.


Born into a middle-class family in Swindon, England, in 1932, Tony (not Anthony) Eggleton left school at fifteen to become a reporter with the local newspaper. Rapid promotion led to an invitation in 1950 to cross the globe to join the Bendigo Advertiser. Supportive parents paid his passage; the adventure became a career. He joined the ABC in Melbourne the following year; by the end of 1954 he was an “A” grade journalist responsible for morning bulletins of radio news. Then along came TV, and Eggleton was included in the ABC’s first training courses — truly, as Frame notes, a “career-enhancing opportunity.”

When the ABC’s Melbourne office began a TV news service shortly before the opening ceremony of the Melbourne Olympics, Eggleton was chief of staff. In his new role, his working life involved “identifying good news stories, assigning reporters and cameramen, supervising newsroom management and logistics, and assessing the film ‘rushes’ in the viewing room. With his office in a prominent corner of the newsroom, he was close to all the drafting, editing and production.”

And then he joined the navy, as its coordinator of public relations. Why? He had reached the top of the ladder in journalism at the age of twenty-seven; perhaps he saw a path into government, to a life among the news makers rather than the news reporters. If so it was an inspired gamble.

The navy minister was John Gorton, whom Eggleton had profiled for the Bendigo newspaper as a newly elected senator from Victoria. Gorton remembered him and liked his work — not least, perhaps, an opinion piece in which Eggleton had declared his support for Menzies’s proposed Communist Party dissolution bill. (“The local branch of the Communist Party is… an active tentacle of the Kremlin octopus… We must ensure the reds are prevented from infiltrating further.”) Gorton, the most junior minister in the government and not entitled to a staff press secretary, was hungry for profile and looking for someone experienced in the new medium of television.

Gorton overruled his department and offered Eggleton the job, and in March 1960 Eggleton moved to Canberra and into the Liberal orbit. They made a complementary pair: Eggleton initiated the now-standard practice of issuing ministerial announcements on Sundays, typically quiet news days; Gorton got increased coverage and was delighted. Eggleton also set up a navy film unit to produce professional newsreels of ships and sailors, and distribute them to TV stations. This innovation, too, has continued.

Frame, who has written extensively on Australian naval history, suggests Eggleton was perhaps too good at his job, insofar as his “effective promotion” of the navy may have obscured the problems that would manifest in a series of collisions and other fatal mishaps. These incidents culminated on the evening of 10 February 1964 when the aircraft carrier Melbourne collided with the destroyer Voyager. Eighty-two men were killed in the navy’s worst peacetime disaster.

Frame provides a terrific description of how Eggleton battled the bureaucracy to ensure “a continuing flow of accurate information” to the public, for which he received the respect of the media and, as it turned out, the prime minister. Menzies appointed him press secretary in late 1965 and allowed him to organise a live broadcast of the press conference in early 1966 at which the prime minister announced his retirement.

Eggleton was passed down, like a piece of valuable china, to the incoming prime minister Harold Holt. If Voyager was Eggleton’s trial run in crisis management, Holt’s disappearance in the surf off Portsea in December 1967 triggered his supreme test.

Thanks to his press gallery contacts, Eggleton appears to have been the first of Holt’s people to hear rumours of something amiss. He was the first to get to Portsea, travelling with Holt’s wife Zara. While the military and police conducted their fruitless search, Eggleton took control of the external story, filling the leadership vacuum and managing the maelstrom of media and public anxiety by personally conducting six televised press conferences over three days. He also communicated with the governor-general, the Liberal Party and US president Lyndon Johnson. In the process he became famous.

When the Liberal Party met in Canberra in January to elect Holt’s replacement, it was naturally Eggleton who announced to the media that the new prime minister was John Gorton. Gorton’s trainwreck prime ministership provides Frame’s most entertaining and astonishing chapter, informed by Eggleton’s contemporaneous file notes covering Gorton’s divisive and conspiratorial relationship with his staffer Ainsley Gotto, his hatred of the media, and his numerous domestic and international faux pas.

The highlight, deservedly, is the late-night drinks party at the residence of the US ambassador Bill Crook on 1 November 1968 — surely the most infamous and embarrassing incident ever in the Australia–US relationship.

Earlier that day, Crook had met with Gorton to confirm LBJ’s announced suspension of bombing of North Vietnam. The advice was tardy, annoying Gorton, who kept the ambassador waiting. That evening Gotto attended a dinner with others at Crook’s residence, and pressured Eggleton to persuade the prime minister to pay a visit to smooth things over. Gorton went to a press gallery dinner instead, and it was only late at night, well lubricated and in the company of a young journalist, Geraldine Willesee, that he agreed to do so. What could possibly go wrong?

In what now reads like soap opera, Gorton was miffed to see Gotto with another guest and Gotto was appalled to see Gorton with Willesee. Eggleton thought it was “incredible… unreal.” While music and dancing continued, Gorton at some point divulged that he wanted to withdraw Australian troops from South Vietnam but was prevented by Liberal Party policy. Crook invited Eggleton into the study for a private talk about Vietnam. Eggleton finally extracted Gorton “between 2am and 3am.”

Frame asserts that Gorton had “fallen short of every standard of acceptable behaviour,” and that when the story came out months later it was Eggleton’s personal reputation that helped save the PM. This seems fair. The Liberals were spending their inherited political capital like drunken sailors — or ex–navy ministers — and Eggleton proved himself the only adult in the room.

When Gorton was finally replaced by William McMahon in 1971, Eggleton opted to join the Commonwealth secretariat in London. He was lured back to Canberra in 1974 to help the Liberals, now in opposition, as the party’s federal director. In this role he worked very closely with Malcolm Fraser as PM, winning three elections, only to then lose four in a row to Labor’s Bob Hawke and retire in 1990.


So what does explain Eggleton’s longevity and prominence? Part of the answer is his loyalty to the cause. Hardworking, methodical, unflappable, an early riser and a non-drinker, he started out as useful and became indispensable.

Eggleton himself told a press gallery farewell dinner that as press secretary he had been “valet, chauffeur, decoy, bag carrier, sounding board and whipping boy.” He protests too much; he also brought exceptional skills in managing the news flow to suit his political masters, while also retaining the confidence of the working press. Veteran journo Alan Reid (providing Frame with his title) described him as “a very proper man.”

A further part of the answer lies with the old adage that proximity is power. Menzies disliked talking on the phone; he let Eggleton answer his calls. Gorton hated briefing the media; he let Eggleton do it for him. When Fraser campaigned, Eggleton travelled with him on the plane. Eggleton spent his career “in the room,” listening and learning and becoming, in the admiring description of another veteran scribe, Max Walsh, a “one-man intelligence network.”

Importantly, he didn’t seek to wield power or advise on policy outside his area of responsibility. He didn’t take sides and he didn’t blab. (A later Liberal press secretary, David Barnett, described Eggleton as like a built-in wardrobe — invisible and discreet.) Tact and discretion earned him the trust of those he dealt with and extended his influence.

At the same time, as he grew in experience and influence, he didn’t fail to perceive the benefits of centralised coordination of the government’s and the party’s communications. While still press secretary, he suggested the prime minister’s department create an office of public affairs and information to monitor and coordinate media units within the various departments and ministerial offices. In opposition, under Billy Snedden and later Andrew Peacock, he expanded the remit of the party office at the expense of the leader’s office.

Similarly, and more significantly and permanently, he secured appointment, under Fraser, as the Liberals’ first national campaign director, with effective (though often porous and conditional) control over the campaign activities of the nominally autonomous state divisions. Frame’s narrative is a bit light on here and could have devoted more space to the internal workings of the Liberal organisation and the personnel under Eggleton’s long regime.


As noted, this is a friendly biography. Frame’s criticisms, muted and elliptical, are largely confined to the introduction. He suggests that Eggleton should at times have “taken a stronger stand against bad behaviour” without specifying which incidents he is referring to. It seems clear that Eggleton’s tolerance of Gorton, especially his appalling behaviour at the US residence, is one of those occasions.

By today’s less forgiving standards, senior advisers become complicit if they put political or personal loyalty ahead of a higher responsibility to the nation or the government — especially if they are public servants, as Eggleton was at this stage. They have the option of calling it out, or walking away. Eggleton did neither.

Likewise, when Fraser blocked supply to the Whitlam government, Eggleton’s predecessor Tim Pascoe opposed the strategy. He even presented a memo to Fraser in October 1975 arguing that forcing an election for short-term gain would deprive Fraser of long-term moral authority. (Fraser burned the memo and never forgave Pascoe.) But Eggleton had no such qualms. In his own personal file note on 10 November 1975, he wrote that the governor-general would surely soon feel compelled to intervene; meanwhile, Liberal fundraising was ahead of target.

Such are the dilemmas and tensions inherent in the concept of political professionalism, which requires primary devotion to the client but also adherence to objective standards of conduct. It is only easy with hindsight. (For the record, I should note Eggleton’s generous consideration in giving me a lengthy interview for my doctoral research into the Liberal and Labor campaign professionals; he is indeed a very proper man.)

After he retired in 1990, feted and honoured, Eggleton worked in the aid sector with development assistance organisation CARE. Fraser, now chair of the global body, had invited him to apply to become its secretary-general. They travelled extensively and were an effective team, which suggests their close political relationship was based on solid personal sympathies.

Picture this, then. A light plane touches down on a tiny airstrip somewhere in Somalia during the civil war in the early 1990s. Malcolm Fraser alights and, with him, a dapper and still obliging Eggleton. They climb aboard a convoy of jeeps, with a machine gunner for protection. Fraser, however, urgently needs to pee. There is no toilet, not even a tree. While Fraser unzips, Eggleton is, in Frame’s words, “assigned the task of acting as a tree to afford the very tall prime minister a little dignity.” One can’t help admiring the man. •

A Very Proper Man: The Life of Tony Eggleton
By Tom Frame | Connor Court | $49.95 | 320 pages

The war for the soul of America https://insidestory.org.au/the-war-for-the-soul-of-america/ https://insidestory.org.au/the-war-for-the-soul-of-america/#respond Thu, 26 Jan 2023 22:19:18 +0000 https://insidestory.org.au/?p=72741

The dire state of the Republican Party has decades-old roots

When American president Richard Nixon declared “We are all Keynesians now” in 1971 he was summing up a growing consensus among the main political parties in English-speaking democracies. After a generation of sustained economic growth and the spread of home ownership and consumer goods, a Republican leader was endorsing the idea of the mixed-economy welfare state.

The big parties would continue to differ about government’s role in areas like education and health, and would keep disagreeing about desirable levels of taxation, but neither of them was threatening either a large-scale winding back of the welfare state or the widespread nationalisation of industry.

This convergence was fuelled by the increasing priority given to electoral pragmatism over ideology, and not just in the United States. In Australia, observers increasingly argued that winning elections was all about appealing to the middle ground rather than promoting longstanding ideologies.

In the event, though, politics in the English-speaking democracies moved in the opposite direction. Rather than more convergence, polarisation grew. Rather than more moderation, confrontation intensified. Rather than consensus, the lines of conflict became more numerous, deeper and increasingly rigid.

These trends were most pronounced in the United States, where political competition has become increasingly aggressive and coarse. And they were driven, above all, by the Republican Party.

In her new book, Partisans, Nicole Hemmer argues with much fresh and convincing detail that the forces that created present-day Republicanism began shaping the party in the 1990s. They set the course that culminated in Republican attempts to deny the result of the 2020 election and the wide reluctance within the party to criticise the mob that attacked Congress on 6 January 2021.


A large ensemble of strange and often unprincipled characters pass through Hemmer’s pages. But three individuals stand out, each embodying key ways in which the Republican Party has changed.

Interestingly, Ronald Reagan isn’t one of them. Although the Republicans who followed Reagan pay ritual homage to him, their appeal and style are the antithesis of his. He was certainly the apostle of small government they celebrate (“Government is not the solution; it is the problem,” he once said), but where he exuded confidence and optimism, today’s party leaders cultivate resentment and fear. When Reagan was campaigning for his second term in 1984 his best-known TV ad proclaimed, “It’s morning again in America.” It’s impossible to imagine a similar appeal to shared hope from Trump and his imitators.

This is where the first of the book’s key characters, Pat Buchanan, comes in. Buchanan, the first to break from the Reagan gospel, was very much Donald Trump 1.0. He had worked for presidents Nixon, Gerald Ford and Reagan, writing speeches advocating free trade and other mainstream Republican policies. But Hemmer describes how, when he sought the Republican nomination against Reagan’s successor, George H.W. Bush, in 1992, he electrified delegates by declaring that year’s presidential election to be “a war for the soul of America.”

Claiming to speak for the “forgotten man,” Buchanan said the United States needed a revolution to reverse its decline. He expressed white grievances about racial quotas and forced integration of schools and neighbourhoods, and called for a Trump-like security fence along America’s southern border.

He ran again for the Republican nomination in 1996, and for a time seemed to have a serious chance of winning. But even as his prospects faded it was clear that what Hemmer calls his “harsh, outrageous and uncompromising” style had evoked a strong reaction in the Republican base. He remained a prominent commentator well into this century, his comments sliding further towards outright racism.

Buchanan was the first figure to appeal to a new Republican base, with its sense of decline and displacement, its multiple resentments, its feeling of living on the wrong side of the country’s deepening regional inequalities.

During this period of disruption the most important Republican in Congress was Newt Gingrich. He was determined to develop a more militant and polarising strategy against the Democrats, and believed that waging political battles, even unsuccessfully, gained publicity, shaped agendas and created loyalties.

Gingrich’s high point came at the 1994 midterm elections, when his market research–driven platform, “Contract with America,” helped produce not only the first Republican majority in the House for forty years but also what Hemmer calls the most conservative Congress in American history.

As she observes, Gingrich pursued a mix of careful legislative compromise, over-the-top rhetoric and extreme procedural obstructionism. His willingness to compromise with Bill Clinton’s Democratic administration was largely, and consciously, hidden by his inflammatory rhetoric and legislative intransigence.

Eventually Gingrich was devoured by his own revolution. He was the first to confront the conflict between animating the Republican base and winning elections. He soon discovered he couldn’t turn the passions of the base and his hardline followers on and off at will. Congress’s shutdowns of the government in 1995 and 1996, popular with hardliners but not with the public, helped Clinton’s sweeping re-election in 1996.

Two years later, in the lead-up to the 1998 midterm elections, national politics was dominated by the drive to impeach Clinton for perjury and obstruction of justice following his affair with White House intern Monica Lewinsky. But the Republican stress on polarisation and scandal didn’t bring electoral success. Unusually, the president’s party gained ground, and Gingrich resigned.

Outside the political system another figure, broadcaster Rush Limbaugh — a third key character — had gathered a large following. The abolition of the fairness doctrine in US broadcasting regulation in 1987 had brought an upsurge in right-wing radio shock jocks offering a new kind of political entertainment combining punditry and humour. By the early 1990s Limbaugh, by far the most successful of them, had hit “the sweet spot between outrage and disgust,” Hemmer writes. He neatly avoided outright bigotry, targeting “militant” homosexuals, for example, and “loving” Black people but not the Black leaders or the Democrats taking advantage of them.

As with Gingrich, Limbaugh’s high point was the 1994 congressional election, which he called “Operation Restore Democracy.” After the victory, the Republican caucus christened him “the majority maker” or, for some, the leader of the opposition.

Limbaugh was important not just as an individual but also as a type. Through the 1990s, programs like the television show Politically Incorrect brought a new kind of political entertainment, producing an array of right-wing pundits with little experience in journalism, politics or the academy but an ability to score political points in ways that entertained and infuriated.

Eventually the right wing established a more important media beachhead than shock jock radio. In 1996 a friend of Limbaugh’s, Roger Ailes, teamed up with Rupert Murdoch to establish Fox News. Within five years it was the top-rating cable news channel and an important player in Republican politics, with prominent commentators feeding the base “red meat.”

No one had foreseen the media fragmentation that began in the 1990s, or the accompanying decline in professional standards. Reinforcing prejudices became more important than weighing evidence; and falsehoods, deliberate or otherwise, went unpunished.

Nor had anyone predicted how the political agenda would change. The affluence of the 1960s had expanded the range of political issues to take in feminism, civil rights, consumer protection and anti-war sentiments. Although a backlash from the right was probably inevitable, what was most notable over the subsequent decades was that different issues excited different constituencies.

Perhaps the most surprising change among Republicans was in attitudes to immigration. Back in 1992, despite Buchanan’s efforts, opposition to immigration had little traction as a political issue; just four years later, as Hemmer shows, even “moderate” Republican Bob Dole was claiming that up to 10 per cent of immigrants might be criminals.

While the coarseness and crudity of Trump’s anti-immigrant rhetoric brought a new low, hostility has not faded with Trump’s defeat. Indeed a new extremist conspiracy theory, the Great Replacement, has been imported from the French far right. Its adherents argue that elites are using mass immigration to force whites into minority status.

Underlying the changing agenda of the right-wing Republican base has been a new absolutism. Abortion, for example, had long been a matter of contention in America, but with the rise of the religious right it became a litmus test of conservative credentials. As late as 1993–94, legislation to control the availability of guns had been supported by former presidents Reagan, Ford and Carter, and the overwhelming majority of the public; the National Rifle Association, captured by militant elements in 1977, responded by giving large donations to pro-gun candidates.

Crucial to the growing power of far-right groups was their access to large amounts of money combined with threats to challenge moderate Republicans’ candidacies with more conservative and better-financed rivals.

The absolutist tendency also encouraged the Republican base to see their opponents as fundamentally illegitimate. In 1992 and 1996, as Hemmer recounts, Clinton was attacked not for his policies but for his alleged character. He was “Slick Willie,” a fraudulent, corrupt politician. By the time of the 1996 election a video mixing plausible and clearly false accusations, The Clinton Chronicles, had been widely distributed. Buchanan’s biggest applause came when he declared that, if elected, his first move would be to place Bill Clinton under arrest. Twenty years later, when Trump ran against Hillary, his crowds chanted “Lock her up.”

After Vince Foster, a White House lawyer and long-time friend of the Clintons, committed suicide, unsubstantiated rumours proliferated. By effectively accusing the Clintons of having Foster murdered, and suffering no consequences for doing so, Limbaugh and others showed just how far boundaries had shifted.


America’s first Black president was also considered illegitimate not because of what he had done but because of who he was. After Barack Obama’s election, Limbaugh simply declared, “I hope he fails.” Fox News’s star recruit Glenn Beck declared that Obama had a “deep-seated hatred of white people.” “We are really, truly stepping beyond socialism and we’ve started to look at fascism,” Beck said when the Obama presidency was fully two weeks old. Obama had a Kenyan anticolonial view of the world, contributed Newt Gingrich.

The longest-running story about Obama’s unfitness to be president was the so-called birther controversy. Fox News was its greatest promoter, and its coverage often featured Donald Trump. The lack of any factual basis for this controversy was no bar to its longevity.

Obama’s first term saw the creation of the Tea Party, which took its name from the Boston Tea Party of 1773, a significant event in the lead-up to the American war of independence. The Tea Party called on “American patriots” to “take back” their country. The talk of patriotism obscures the fact that its targets were other Americans — Americans whose political legitimacy they refused to recognise.

This kind of anger also dates back to the 1990s. In 1993, when America was still riding high globally, Irving Kristol, dubbed the godfather of neoconservatism, declared:

There is no “after the Cold War” for me. So far from having ended, my cold war has increased in intensity, as sector after sector of American life has been ruthlessly corrupted by the liberal ethos… We have, I do believe, reached a turning point in American democracy. Now that the other Cold War is over, the real cold war has begun.

It is hard to know what social or political developments were driving such a dramatic rhetorical escalation.

The view that political opponents are illegitimate and that compromise is weakness has had a profound impact on how policies are debated and decided. Obama was elected president just after the American economy collapsed in late 2008. By the time he was inaugurated the stock market had lost half its value. But when the new administration launched an urgent and necessary stimulus package, not a single Republican member of the House of Representatives voted for it.

The economic record of recent Republican presidencies is a triumph of ideology and short-sighted expedience. Reagan cut taxes but was unwilling to cut popular programs, so he failed to deliver the balanced budget he’d promised. In fact, as Hemmer recounts, the budget deficit rose so sharply that his budget director, David Stockman, resigned. Reagan quietly introduced some tax increases late in his term.

Reagan’s successor, George H.W. Bush, sought a campaign advantage over his Democrat opponent, Michaek Dukakis, in 1988 by declaring, “Read my lips. No new taxes.” Faced by the large-scale debt once he was in government, he raised taxes, only to be denounced by his own side, led by Gingrich, for betraying Reagan’s vision.

Bush’s son, George W. Bush, cut taxes and then embarked on two military exercises, in Afghanistan and Iraq, destined to become America’s longest wars. Rising budget deficits inevitably followed. Donald Trump also cut taxes drastically without parallel spending cuts, deepening government debt even before Covid struck.

When Democrats are in office, by contrast, Republicans are animated by the urgency of the debt problem, which must always be tackled by cutting government spending rather than raising taxes. A chance to dramatise their view comes each year when Congress engages in the peculiar American ritual of approving an increase in the debt ceiling. In their classic book It’s Even Worse Than It Looks, Thomas E. Mann and Norman Ornstein document how the Republicans, seemingly willing to force the government to default, sought to hold the Obama administration hostage in 2011. (Another round of debt-ceiling negotiations began this month.)


The most obvious characteristic of the Republicans’ actions is hypocrisy. But their behaviour also directly affects the quality of governance, which they appear not to consider important. The strain on public sector infrastructure and public services in education and health is affecting Americans’ quality of life. Government has also been left without the institutional strength or the political will to deal aggressively with global warming, the Covid pandemic and other major new challenges.

Equally worrying is the effect of the Republicans’ institutional vandalism on democratic processes themselves. America is unique among established democracies in having such highly politicised courts. It is also unique in having one major party whose electoral strategy involves making it more difficult for likely supporters of the other side to vote. After the 2020 presidential election we saw a further escalation: when Trump dismissed the vote as rigged, he created a precedent that could well become a common tactic. If the public loses faith in the integrity of the electoral process, the consequences could be drastic.

Let us hope that in the 2030s Nicole Hemmer doesn’t need to write an equally insightful book about the destruction of American democratic institutions in the 2020s. •

Partisans: The Conservative Revolutionaries Who Remade American Politics in the 1990s
By Nicole Hemmer | Basic Books | $57.25 | 368 pages

Double-sided mirror https://insidestory.org.au/double-sided-mirror/ https://insidestory.org.au/double-sided-mirror/#respond Wed, 25 Jan 2023 06:56:52 +0000 https://insidestory.org.au/?p=72733

How anthropology flourished as colonialism began its decline

Anthropology’s association with colonialism has generated debate, guilt, self-justification and intellectual crises within the discipline for decades. Critics have emphasised how colonial interests facilitated and perhaps directed research.

During a crucial period, though, from the 1880s to the 1930s, imperialism was foundering while the study of its subjects flourished. In her new book, In Search of Us, historian Lucy Moore traces the evolution of anthropology during that time by profiling leading American and European practitioners.

Franz Boas, her opening subject, is often characterised as the “Father of American Anthropology,” and his 1883 stint in the Canadian Arctic with Inuit people is considered the first real anthropological fieldwork. Proud and ambitious, he had studied physics, philosophy and geography — his thesis was on colour perception — and fought duels with fellow students who insulted him.

As a German Jew, Boas had limited prospects of an academic appointment in his own country. Having collected and curated items for museum display in Germany, he emigrated to the United States in 1889 to take up an appointment at Clark University. But his research into children’s growth offended local sensibilities and his stay there proved short-lived. Anxious to remain in America, he worked for a time in museums.

Boas’s early works reflect the nineteenth-century concern with material culture and biological difference. He rejected the notion that physiological variation between human populations indicated racial characteristics that could be hierarchically ranked. He viewed physical racial differences as environmentally determined and modified by cultural factors.

Back in the academy and established as a professor at Columbia University, his focus on culture was more marked, and his students were encouraged to study language, myths, rituals and social behaviour rather than material culture. But the association between museology, ethnology and anthropology endured.

Moore documents the “moral murkiness” of anthropology’s subject matter in this pioneering period. Skulls and skeletons, along with sacred objects, clothing, tools, canoes and weapons, were routinely collected — often by ethically dubious means — for museums in Europe and America.

Among the expeditions was a British voyage to the Torres Strait, led by Alfred Cort Haddon, another of Moore’s main characters, which collected many artefacts now residing in the British Museum. Haddon’s group included the polymath William Rivers, whose training in medicine and psychology was gradually integrated into his research. Rivers’s ideas about anthropology were strongly influenced by his scientific studies and he sought to establish methodologies that were rigorous and holistic in scope.

Unlike many of his successors in British anthropology, Rivers was extremely sensitive to the impact wrought by colonialism, including the spread of venereal disease and alcoholism. He surmised, too, that its psychological effects would render people hopeless and could lead to “racial suicide.”

The idea of salvage ethnography, whereby researchers sought to describe and interpret cultures before they were contaminated by colonial intervention or faded away, dominated anthropological research in the early twentieth century. Most of the anthropologists Moore writes about coupled this concern with a concept of culture as a bounded entity, encompassing kinship systems, languages, cosmologies, rituals and myths.

Just what counted as elements of a “culture” was based on Western ideas of human capacities, though, and the teleological assumption that “civilisation” was a historical culmination. Progress from barbarism necessitated the ideal of a primitive society and the gradual evolution of social and cultural formations that constituted a civilised state.


Edvard Westermarck, a Finn who was as much a philosopher as an anthropologist, produced a book on the history of human marriage before going into the field. His work was inspired by questions of universality and cross-cultural comparison — of customs, moral values and understandings of identity.

In spite of his grandiose intellectual aspirations, Westermarck emerges from Moore’s book as a modest, tolerant and sympathetic scholar. He referred to informants as “teachers,” prefiguring the shifts in nomenclature adopted decades later by a generation of researchers eager to acknowledge their sources as at least their equals. But his academic ambitions were to some extent thwarted by his fieldwork experiences in Morocco. Entranced by the country, the people and the way of life, he might have been the first anthropologist to have “gone troppo.”

Certainly he was the first influential scholar to offer criticisms of disciplinary ethnocentrism after coming to the “somewhat disappointing but not altogether unwholesome conclusion that the belief in extreme superiority of our Western civilisation really only exists in the Western mind itself.”

Moore explores the tensions and contradictions inherent in ethnographic research, constantly alluding to the ambiguities in fieldwork experiences. She invokes the image of a “double-sided mirror,” describing how the anthropologist “sees her own society reflected darkly back at her when she looks at another society and the society she observes begins to see itself through her eyes.”

Participant observation and immersion in others’ daily lives were the research processes that distinguished true ethnographic research from the armchair ethnology of predecessors, which had relied on reports by colonial officials, missionaries and travellers. Just what the people who had an anthropologist living among them thought remains mysterious, although a rich literature of memoirs documents diverse reactions, including amusement, astonishment and grudging tolerance.


All the anthropologists Moore writes about believed they were establishing a scientific discipline with strong theoretical underpinnings and rigorous methodologies. Observing communities and interpreting social and cultural practices were construed as work that would produce a “science of society” based on human universals.

Alfred Radcliffe-Brown and Bronisław Malinowski both used functionalism as the theoretical basis for collecting data and making anthropological claims. But where Radcliffe-Brown’s functionalism followed Durkheim’s theories about social solidarity, Malinowski focused on humans’ biological needs, arguing that all social and cultural behaviour could be understood as rational means of fulfilling them.

Both men demanded admiration and each attracted dedicated acolytes (and enemies). Radcliffe-Brown’s fieldwork in Australia between 1910 and 1912 was conducted with Daisy Bates, an enthusiastic amateur ethnologist who spoke several Aboriginal languages and had spent years collecting and recording linguistic and cultural data on many Indigenous groups. She expected a collaborative research relationship, but Radcliffe-Brown clearly considered her a mere “informant.”

Bates publicly denounced Radcliffe-Brown for plagiarising a manuscript she had given him. Her interest in Aboriginal culture was passionate and untutored, her methods those of the meticulous collector rather than the scientist. She saw herself as an advocate. For a woman with no tertiary education, her achievements were remarkable, but they earned only disdain from Radcliffe-Brown.

Not that his academic peers fared any better. He viewed the work of most American anthropologists, especially women, as intellectually inferior. Thin-skinned when criticised, he was described as “impenetrably wrapped in his own conceit” by Ruth Benedict, the first woman to be president of the American Anthropological Association.

Malinowski inspired similar devotion from his students but emerges as a far more charismatic figure. More significantly for his long-term influence on anthropology, he supported female students, several of whom had illustrious careers. Moore dubs him “The Hero,” and his anthropological writings indicate that he too saw himself in this light — adventurous, trailblazing, courageous.

His posthumously published diaries and letters reveal that he was also hypochondriacal, self-pitying and not quite so enmeshed in the daily life of Trobrianders as he would have had his readers believe. He took with him into the field an arsenal of medicines, including quinine, cocaine, arsenic, purgatives and emetics, as well as an extraordinary quantity of tinned food.

His influence on British anthropology was immense. A century later, his Argonauts of the Western Pacific continues to inspire debate. His postgraduate students included Raymond Firth, Edmund Leach, Hortense Powdermaker, Phyllis Kaberry and Jomo Kenyatta — each of whom went on to become a leader in their field of study (and, in Kenyatta’s case, prime minister of Kenya).


This was a time when science was revered as the vehicle of social and economic progress. Anthropologists, struggling to present their work as “useful,” argued that their clear and detailed analyses of specific societies could inform government policies and assist in maintaining peaceful relations and promoting economic development.

It was these claims that contributed to anthropology’s reputation as “the handmaid of colonialism.” But, as Moore and many others have pointed out, there is very little evidence that colonial officials drew on anthropologists’ knowledge or insights. More often, they were considered disruptive of racial boundaries and ignored.

Audrey Richards, one of Malinowski’s postgraduate students, was an advocate of utilitarian, applied research who believed Africans should be trained in all disciplines. She saw the need for sustenance as the primary human function, making food provision, understanding of nutrition and methods of cultivation crucial areas of social inquiry.

Richards worked in several African colonies and in the Colonial Office, including a period as founding director of the Makerere Institute of Social Research in Uganda and time at the African Studies Centre in Cambridge. Richards fostered “a unique period of discourse between high government and intellectuals black and white,” reports Moore, and her reputation as a superb teacher and fieldworker remains unsurpassed.

Although Moore calls her “The Bluestocking,” she emerges as far more down-to-earth, pragmatic and funny than any of the book’s other subjects — perhaps because she was more committed to improving the lives of others than pursuing an illustrious academic career.

Of the three American female scholars included in In Search of Us (the other two are Margaret Mead and Ruth Benedict), Zora Neale Hurston is undoubtedly the most lively, creative and intriguing. Extroverted, charming and, after a considerable struggle, highly educated, she is now better known for her books about the lives of poor Black Americans in the south, including the novel Their Eyes Were Watching God and the posthumously published Barracoon.

Assisted by Franz Boas and his colleague Melville Herskovits, Hurston obtained grants to work in Florida “to gather materials dealing with the traditional beliefs, legends, sayings and customs of blacks and, implicitly, to demonstrate their richness and beauty.” Notwithstanding the obvious prejudice she faced as a Black woman finding academic employment, she distanced herself from the civil rights movement. She was an individualist, libertarian and often contrarian, and opposed all forms of discrimination, affirmative or negative. She aspired to live in a world in which racial distinction was irrelevant.

If Malinowski’s hypochondria and heroism seemed self-dramatising, Claude Lévi-Strauss’s experiences in the interior of Brazil are hair-raisingly dramatic. Like his contemporaries, he sought to study — via survey rather than immersive observation — a tribe “untainted” by colonial interventions. His first expedition was almost anachronistic, as he intended to collect artefacts for the Musée de l’Homme while documenting the culture of Indigenous Amerindians. He traversed the Mato Grosso with “a caravan of fifteen mules, thirty oxen and an unreliable truck… as well as twenty local youths of Portuguese ancestry,” accompanied by his anthropologist wife Dina, a French biologist and a museologist from the Brazilian national museum, whose interests were more archaeological.

The expedition was besieged by insects, including a minuscule bee, the vector for an eye infection that affected everyone except Claude. His wife returned to Paris while he continued his journey westwards for several months. He studied the Nambikwara people, who had “one of the most rudimentary forms of social and political organisation that could be imagined,” and eventually encountered his untainted tribe, only to find that he could not understand them at all and could “make no use” of his observations.

Despondent and disillusioned, he wrote Tristes Tropiques, an account of the expedition and a personal, melancholic reflection on the human condition. It was acclaimed as a literary work. His influential anthropological writings came later, after he had developed his theory of structuralism.


Moore maintains that she is interested in the motivations — personal and intellectual — of the anthropologists about whom she writes, rather than in the critique of their colonial connections. But each individual’s biography necessarily includes descriptions of their political views, and their fieldwork experiences required interactions with, and responses to, colonialism and racial discrimination.

These twelve people emerge as having liberal, sometimes radical attitudes to controversial contemporary issues. All were humanists who emphasised the dignity of the people they described and sought to represent their cultures as complex but comprehensible to Western sensibilities. They embraced a cultural relativism that emphasised human equality. Lévi-Strauss insisted that “civilisation impoverished humanity as much as it enriched it; anthropology might just as easily be termed entropology. All one could hope to do was to spread humanism to all humanity.” Claims of scientific authority were abandoned.

Humanist anthropology insisted on the dignity and value of human beings, regardless of race, gender, colour or national status. Moore argues that the architects of the Universal Declaration of Human Rights in 1948 “were fundamentally influenced by the work done in anthropology over the previous decades.” Cultural relativism ushered in worldviews that confronted old hierarchies and celebrated difference. In the wake of the second world war, the universal “human family” that had been the foundational assumption of anthropology became the basis of the international legal recognition of “equal and inalienable rights.”

Now, however, as Moore acknowledges in her conclusion, this anthropocentric ideal of “the supreme value of the human person” is being challenged by the problems of climate change, environmental destruction and the extinction of animal and plant species, all of which threaten human survival. •

In Search of Us: Adventures in Anthropology
By Lucy Moore | Atlantic Books | $34.99 | 320 pages

China’s forgotten reformer https://insidestory.org.au/chinas-forgotten-reformer/ https://insidestory.org.au/chinas-forgotten-reformer/#comments Tue, 13 Dec 2022 22:18:28 +0000 https://insidestory.org.au/?p=72229

A historian rescues a former leader from the party’s airbrushers

The post China’s forgotten reformer appeared first on Inside Story.

]]>
In Milan Kundera’s novel The Book of Laughter and Forgetting, a Czech communist hero and Politburo member, Vladimír Clementis, appears in a famous photograph of the top brass reviewing a military parade on a winter’s day in 1948. It was a cold day, and Clementis had lent the party leader his fur hat.

Not long after the photo is taken, Clementis commits a grave ideological error. He is arrested as a traitor and hanged. Party hacks meticulously erase every mention of him from the historical record and airbrush him from every photo. And yet his hat, now part of the party leader’s image, remains in that famous photo. Like the sight of the bare wall where Clementis once stood, the hat is an obscure but indelible reminder of the person who has been struck out.

Historian Julian Gewirtz examines another blank space in his new book, Never Turn Back: China and the Forbidden History of the 1980s. This is the stretch of wall where former premier and party general secretary Zhao Ziyang once stood. Gewirtz restores to Zhao’s head the hat of the “architect of reform and openness” that has sat on Deng Xiaoping’s head in the decades since. The result is a landmark work of historical scholarship with profound significance for understanding China today.

When Mao Zedong died in 1976, the Chinese economy was in tatters. Mao’s “continuous revolution” and factional infighting had broken China’s politics, and more than two decades of violently waged ideological campaigns — which had culminated in the 1966–76 Cultural Revolution — had left society scarred, cultural heritage decimated, and educational institutions and intellectual life eviscerated.

The new leadership that coalesced under Deng Xiaoping in 1978 needed to tackle political, economic and social dysfunction. The key, they agreed, lay in modernising agriculture, industry, defence, and science and technology: the “four modernisations.” The central, burning question of the decade was how exactly China would achieve those modernisations.

Deng was prepared to tolerate a certain amount of systemic change to make government more efficient and less top-heavy, to encourage local initiative, to promote professionalism, and to create the legal and other structures under which the economy could grow. He also recognised the need to prevent the emergence of another leader like Mao who would concentrate power in his own hands and rule for life.

He left the details to others. Those others included Hu Yaobang, who helmed the party from 1981 to 1987, and Zhao Ziyang, who was premier from 1980 to 1987 and then, following Hu’s purge from the leadership, party general secretary from 1987 to 1989, when he himself was purged.

In Never Turn Back, Gewirtz focuses on the contribution of Zhao and the advisers he gathered around him. He notes that the assiduity with which the party has sought to erase Zhao from the record only underlines his importance. (Hu Yaobang leaves a similar patch of blank wall, but it’ll require another book to fully restore him to view.)

Fixing the economy was among the party’s most urgent priorities. Extreme poverty was widespread, especially in the countryside, and industry was almost uniformly heavy, with consumer goods from clothing to radios in scandalously short supply. Zhao was a chief proponent of the “coastal development theory,” which promoted light industry in coastal regions, close to ports, with access to cheap labour. The last would be of advantage in encouraging foreign investment, with an emphasis on investment involving technology transfer.

The United States was among the many advanced countries eager to exploit the economic opportunities that came with China’s economic reform and opening up. Export-oriented coastal development helped not only to build China’s foreign reserves but also to drive the country’s transformative economic growth during the 1980s and beyond.

Another driver was the decision to invest heavily in science and technology, in particular information technology and biotech: a plan inspired by Zhao’s reading of Alvin Toffler’s Third Wave that chimed with Deng’s interest in modernisation.

But Zhao and his circle were also interested in systemic change. They saw political and economic reform as two sides of the same coin. Among the ideas they discussed, Gewirtz writes, were methods for “separating the party and the government, building up more independent institutions of the media, the judiciary, and the legislature, and increasing transparency, accountability, and even freedom of speech and debate.”

They even discussed how to adapt the notion of checks and balances to Communist Party rule. Thinkers in Zhao’s circle, including Bao Tong, who described democracy as “a kind of mechanism that can correct its own mistakes,” sought a balance between democracy and the “dictatorship” baked into the definition of governance in the People’s Republic.

Gewirtz cautions against concluding that Zhao or Hu were “inborn liberal democrats.” He stresses that Zhao explicitly and frequently endorsed “methods of dictatorship” and in no way advocated for a democratic free-for-all or an end to party rule. There were obvious tensions and contradictions in the two men’s thinking, and it also brought them into conflict with others in the party.

The 1980s were marked by incredibly vibrant cultural ferment, and ideological and intellectual contestation extending far beyond the relatively closed circles of the policymakers. From artists and poets asserting their right to creative expression to students inhaling Sartre, Nietzsche and rock’n’roll, there was a sense of infinite possibility.

That didn’t mean that anything was possible: when a young electrician called Wei Jingsheng had the temerity to suggest in 1979 that without the “fifth modernisation” of democracy there was no guarantee that Deng himself wouldn’t become a tyrant like Mao, he was slapped in prison with a fifteen-year sentence.

The danger for Zhao and those around him was that conservatives, already hyperventilating over “the sight of people drinking Coca-Cola in the streets of Shanghai,” would associate the reformists with genuine radicals like Wei Jingsheng.

When Wang Ruoshui, deputy editor-in-chief of the party mouthpiece People’s Daily, asked why such tragedies as the Cultural Revolution “happen repeatedly in socialist societies” and suggested that the Communist Party embrace the previously taboo notion of “Marxist humanism,” the hardcore ideologues lost their minds.

The subsequent 1983 campaign against “spiritual pollution,” initially endorsed by Deng, gave expression to the conservatives’ fears that the changes set in train by modernisation and economic reform could bring down the party itself. Like “bourgeois liberalisation,” “spiritual pollution” was a catch-all term that could signify anything from street crime to obscure poetry, high-heeled shoes, “erroneous trends of thought” and the generally “foul smell” of new ideas from the West.

Deng put the brakes on the campaign when it appeared to threaten modernisation itself. The 1980s were riven by many such ideological tugs-of-war, and Gewirtz argues that they should be taken seriously.

Today, the Communist Party promotes the line that economic development thrives best under authoritarianism. It touts the “China model” of autocratic rule, social stability and economic growth as a logical and historical inevitability. Any other choice by the post-Mao leadership, the line goes, would have led to collapse and chaos — the crumbling of the European socialist bloc and Soviet Union from 1989 being instructive and, for party leaders, frightening examples. To think otherwise is to be guilty of “bourgeois liberalism” or labelled an agent of global capitalism’s “peaceful evolution” plot to dupe the Chinese people into spurning communism.

Yet Gewirtz’s “forbidden history” shows that the party leadership once seriously contemplated comprehensive structural, political reform. He contends that many of the persistent contradictions within and obstacles to China’s economic development in the decades since — including the persistence of corruption — are directly linked to a refusal to contemplate more systemic reforms. The 1980s was a genuine sliding-doors moment; there were other conceivable futures, albeit still under party rule.


Zhao was purged from the leadership in 1989 and spent the rest of his life under house arrest. The immediate trigger for his downfall was his visit to protesters in Tiananmen Square, where he spoke sympathetically and emotionally to hunger strikers. But this was just the final straw as far as his enemies within the party were concerned.

After crushing the student-led movement on 3–4 June that year, the party decoupled economic reform from political reform. Zhao disappeared from the record and Deng got to wear the former general secretary’s hat as the “architect of reform and openness.”

There is much rich detail in Never Turn Back, for which Gewirtz delved deep into archives, papers, official accounts, diaries and memoirs, some of which have only recently become available. The result is a provocative counter-narrative to the Communist Party’s account of that era and the reforms that turned China into the economic great power it is today.

“What would happen,” Gewirtz asks, “if the Chinese people were allowed to know this history? Would it produce the terrible chaos so feared by the Communist Party?” In his conclusion, he sounds a note of hope: “Just as greater openness can be found in China’s past, it might well be found again in China’s future.” •

Never Turn Back: China and the Forbidden History of the 1980s
By Julian Gewirtz | Harvard University Press | $55.95 | 432 pages

The post China’s forgotten reformer appeared first on Inside Story.

]]>
https://insidestory.org.au/chinas-forgotten-reformer/feed/ 6
Ambivalent in Arnhem Land https://insidestory.org.au/ambivalent-in-arnhem-land/ https://insidestory.org.au/ambivalent-in-arnhem-land/#comments Mon, 12 Dec 2022 22:02:51 +0000 https://insidestory.org.au/?p=72171

Have a determined anthropologist and a gifted writer come to terms with how differently Yolngu do things?

The title of Don Watson’s new book, The Passion of Private White, doesn’t do justice to its dense texture — and I may not either. My niggling discomfort and occasional indignation peppered its pages with question marks.

Full disclosure: like Private Neville White, I began PhD fieldwork as a mature age anthropology student in a remote Arnhem Land community in the 1970s. Unlike White, I was studying social rather than physical anthropology. White gave up his research in favour of providing financial and practical help to a Yolngu community at Donydji. I continued research with Rembarrnga people around Bulman and in the archives.

What keeps me connected to Arnhem Land are friendships that are indistinguishable from family ties, as well as an ongoing interest in the way conditions there are being “developed” by the peculiar, problematic governmental processes that also intrigue the author of this rather baffling book. Arnhem Land is indeed Another Country, as David Gulpilil says in the film of that name.

The Passion of Private White opens in 2005 with Watson’s first arrival at Nhulunbuy, a town on the Gove Peninsula, where he is met by his old friend Neville White. This, he says, is “mining, hunting, and fishing country, and Aboriginal country, so it is also Toyota HiLux country.” The cafe–store is named Captain Cook, a hint of the casual racism that pervades Australia’s remote mining towns.

Watson introduces Tom, Neville White’s main Yolngu friend, as “the senior man” at Donydji, one of many scattered outstations in the Arnhem Land bush and the main site of White’s efforts. He smiles at Watson and climbs into the HiLux beside him. Tom has “only a few words of English,” which made me wonder whether either Watson or White spoke Kriol. After several hours’ drive on the dirt track that is also the Central Arnhem Highway, they reach the outstation.

From the start Watson interweaves his own experiences with accounts of White’s past and present projects, and with fragments of history — Yolngu contact with Makassans and missionaries, the depredations of pastoralists and police — as well as more recent events and scattered quotations from anthropologists (on one page, fifteen are named). The narration is confident, as if the past is settled and known and the present readily understood, but Watson’s diary-like depiction of events and his speculations and evaluations reflect balanda (whitefella) common sense. His stories of White’s efforts are those of a surprised stranger revealing Australians’ colonising passions. The practices and priorities of the colonised remain obscure.

Chapter two provides harrowing details of Neville White’s Vietnam experiences. While critical of the war, he didn’t refuse the call-up and in Vietnam found himself engaged in hideous combat and moral dilemmas that haunt him to this day. Bitter experiences upon his return added damage to body and soul, now evident in PTSD.

White attended university after his tour of duty, eventually undertaking doctoral fieldwork using biological methods and oral histories to ascertain facts about population flows in the deep past. He collected fingerprints and blood samples from 2360 people, and “walked the country,” covering large areas with local guides and informants. His research was applauded in the academy and published extensively. Now, decades later, local rangers are able to rely on his maps.

Gradually, his academic work gave way to a passion for helping. Decades after receiving his doctorate he continues to visit Donydji, building houses, toilets, a school and a workshop, installing water pumps and solar panels and providing equipment, often using money he raises in Melbourne. We are not told of requests from the Donydji residents nor of any negotiation about what is built, or where or when.

White found the work therapeutic and recruited Vietnam vet mates who make short visits, camping separately from the community and working efficiently from dawn to dusk. When Watson first visits, a contractor is building a school with the help of Neville’s volunteers from Melbourne. A few Yolngu participate but most appear as passive observers of White’s projects. There is no mention of whether these strangers, or Watson himself, are properly introduced to Country.

When they arrive at Donydji, Watson sees half a dozen houses, an airstrip, and about eighty Yolngu with whom he cannot communicate. They unload supplies at the various camps. With Rotary funds White has built a workshop that houses vehicles and equipment to enable young men to “learn the trade and make an independent living” as mechanics, although there is no working economy here. Watson later describes the amazing skill of one Yolngu bush mechanic, but as shown in the Warlpiri film Bush Mechanics, such skills are usually deployed locally and voluntarily. Donydji women garner sustenance from the bush, men shoot and butcher buffalo, and fish are caught in a distant river. Demanding and dangerous bush expeditions reveal beautiful, dramatic country, but its meaning to Yolngu is touched on only briefly.

Tom, Neville White’s close friend and trusted informant at Donydji, authorises White’s activities. He tells White that his own authority and plans are being threatened by a Yolngu man known here as Cowboy, who wants to establish a cattle station in the area. We are told that Cowboy is an illegitimate interloper whose plan is a threat to traditions rather than a possibly viable enterprise that could provide an income. Later, though, Cowboy’s presence at Donydji is treated as legitimate. White and the author appear unaware that lengthy negotiations over competing claims and plans are typical of Yolngu politics. The anthropologist Les Hiatt’s film Waiting for Harry illustrates such a process.

Arnhem Land, an area of 97,000 square kilometres, was designated an Aboriginal Reserve in 1931 after a century of intermittent, violent intrusions. Several pastoral and mining ventures had failed, and many and varied missions had been set up. When bauxite was discovered at Gove in the 1960s, the government ignored the reserve’s status and Yolngu protests and allowed a large mine, refinery and town to be established.

Since the 1970s the federal government has supplied modest funding for outstations to enable a “return to Country” from missions and towns. Outstations like Donydji rely on government support and services that are often appallingly unreliable and inadequate, as we see in Watson’s later chapters. One small but telling example involves the Donydji teacher, who is employed three days a week but spends two of those days travelling.

It is government incompetence that energises White’s work, along with his determination to provide what he sees as necessities. Yolngu live largely outside during the dry season, but houses have become necessary as they have accumulated possessions. In balanda eyes the buildings are the very core of community life, more important than the service and shelter they supply.

Neville White is engaged in a sort of “borderwork,” a term anthropologist Barry Morris coined for work at the interface of the cultural worlds of Indigenous people and Europeans — also known as Yolngu and balanda, natives and settlers, blackfellas and whitefellas, or “them” and “us.” Neville White is intent on improving things in this “remote Indigenous community,” a concept that was cruelly pathologised in the national imagination when the NT Intervention was launched in 2007.


I admired Don Watson’s Recollections of a Bleeding Heart (2002), a kind of ethnography of the cultural realm that politicians create and inhabit in Canberra. Watson understood that setting well and could assume readers’ knowledge of how Australia’s political system works.

He is less comfortable in Arnhem Land. His broader erudition is clear from a multitude of quotations and aphorisms from dozens of local and international anthropologists, along with citations of intellectuals from Lucretius to Simone Weil and Sontag, Sophocles to Hume and Camus, and of the Bible. I wondered if he was offering readers alternatives to his own ambivalence about White’s quixotic efforts. Or was he excusing his own bafflement, which is partly a consequence of his inability to communicate with Yolngu residents?

Watson makes several more visits to Donydji with White, describing several individual veterans and relating numerous exciting, surprising or humorous events. White’s work includes training Yolngu men in building and mechanical skills, but his efforts are up against floods, droughts, distances and wild animals. A septic tank floods, a badly installed generator fails, solar panels cease to work. Government stuff-ups cause further frustrations.

Cooperation seems to be lacking from those White believes he is helping, as evident in an apparent carelessness with the things he has supplied for community use. Expensive new tools are locked in the workshop, but returning from Melbourne White finds the lock broken and tools scattered or lost. He describes this as a break-in, ransacking and stealing. But the workshop was intended for community use, and it turns out that there was a desperate need to repair a vehicle. Most of the equipment is retrieved.

When a house is damaged White tries to shame the owner, saying the house was bought with money he collected in Melbourne. The offended owner responds, “Your money; my house.” White and Watson seem oblivious to this clue to a different system of ownership, responsibility and authority.

Bafflement is not surprising in the face of a traditional society organised in ways fundamentally at odds with those of Europeans. There are other clues to these balandas’ misunderstanding of Yolngu social structure. When Watson says White was “granted the name Balang” and, later, that balang means brother, he shows a common confusion about the everyday language of kinship. Personal names are private and not used as English names are. A “skin name” like balang refers to one of the eight categories that position everyone within a system of relationships Yolngu absorb in infancy. Because I became ngaritjan, my husband became balang, and my children gamarang and gamayn.

Everyone is enfolded within this system and everyone is family. Some are close, others distant, and their roles carry specific but not necessarily strict obligations and expectations. English terms such as brother, mother and cousin mean quite different things here. Concerns are all personal but not individualistic, meaning that an impersonal “community interest” is often absent. Moreover, one does not interfere with others, something “we” balanda do constantly with our opinions and judgements.

Watson is not to blame for misunderstanding Yolngu naming practices and interpersonal manners and protocols that are quite different from those of English speakers. Like an unfamiliar language, they can be learned only with experience.

Similarly, the frequent mention of clans, language groups and owners of Country shows understandable confusion; these are matters of multiple, cross-cutting and often disputed rights and obligations, making the term owner inappropriate. Attachment and responsibility for Country are more useful terms, and these are linked to positions in the kinship system. Shared and competing obligations — to mothers’ country or to fathers’ country — are expressed in the roles taken in the major ceremonies. These are negotiated over extensive periods, and particular individuals’ responsibilities are never settled once and for all. I claim no special expertise in these matters, but their significance and meaning in everyday social interaction becomes apparent in any sustained participation in Yolngu community life.

It was not inaccuracies in Watson’s account but the constantly implied sense of “our” normality that kept me on edge. Watson’s own view of what he is observing is both elusive and intrusive. Early in the book, while trying to describe rather than analyse what is going on, Watson notes White’s zealotry, commenting on his “characteristic short, rapid strides: the driven soul’s indifference to anyone else’s capacities or inclinations.” I was reminded of the self-important way the manager hurried around the Bulman community and the concealed mirth it evoked among Rembarrnga women. They recognised his pedagogic intent as they sat around the fire, but were far too kind to let him know. Women’s “domestic” and “social” work was invisible to him. Private White, too, appears to largely ignore the women, at least until one throws a spear at him, skilfully missing.

Watson’s comments on Yolngu character and behaviour can seem presumptuous. As such descriptions unquestioningly accrue, Watson inadvertently endorses the familiar view that traditional Yolngu practices are no longer appropriate in contemporary conditions.

Yet we can thank him for illustrating the profound contradiction within Australian public discourses; we are urged to celebrate “the oldest culture in the world” while refusing respect to those who carry its original form. In the name of “equal human rights,” Indigenous peoples are being induced to accept the authority of outsiders with their mysterious access to apparently limitless resources. The sophisticated Yolngu system of order and authority, achieved through everyday interpersonal negotiations between people related in embedded, normative ways, is invisible to colonisers.

Later in the book White announces plans for an elected council, with a general manager, an administrator and other positions. Tom accepts these strange ideas and asks White to write an agreement. Tom doesn’t speak or read English, but recognises that writing carries authority. When Watson and White return, though, they find “grass and weeds… halfway up the red, yellow, and blue plastic slide in front of the new school” and Watson says that Donydji is “slipping between Chaos and Eden.”

“Weeds”! Here we see a link between aesthetic, moral and political judgements. Watson’s renderings are morally ambiguous, but his comments on Yolngu attitudes often struck me as misperceptions, perhaps based on White’s understandable frustrations. It is difficult for balanda to respect people who appear to resist the hard work we do for them. But look closer and we see that our insistent concern interferes with Yolngu’s own ways of adjusting to colonised conditions and the strange, intrusive habits of outsiders.

Watson’s dry wit and clever, often ironic phrasing are born of his interest in Private White’s passion rather than his own experiences in Arnhem Land. He wisely limits his explanations and judgements of Yolngu; those he does offer can be disturbing. In the wake of the workshop destruction, he says: “Nothing so grieves a balanda — especially, perhaps, a balanda army veteran — as the casual anarchy and selfishness the [Yolngu] philosophy allows” (my emphasis).

What Watson and presumably White judge as “casual anarchy and selfishness” is better understood as a deeply held belief in individual autonomy, often expressed as “I am boss for myself.” Yolngu people don’t moralise, instruct or interfere with each other in ways familiar and normal among balanda. Nor do they expect interference and instruction, especially from people from elsewhere. Yolngu communal enterprises require careful suggestions and inducement rather than taken-for-granted cooperation or attention to “time constraints.”

The anthropologist Kenneth Maddock described the Aboriginal polity as “a kind of anarchy, in which it was open to active and enterprising men to obtain some degree of influence with age, but in which none were sovereign.” And Hiatt wrote that “few peoples can have placed higher value on altruism and mutual aid than the Aborigines of Australia. The genius of the Australian polity lay in its deployment of the goodwill inherent in kinship as a central principle… Government in these circumstances is otiose.”

Thus, the affront to Yolngu is profound when balanda take it upon themselves to assume authority in Yolngu country. Even as Private White tries to rectify government incompetence he embodies the common sense of the Australian state. Yolngu’s slow and subtle ways of practising politics are frequently interrupted by urgent and arrogant balanda intrusions. But balanda are unavoidable and Yolngu are dependent. The uncomfortable modus vivendi can be seen as an ongoing struggle between cultural norms.

Don Watson is a gifted writer, but his casual wit, irony and poetic style in telling of White’s heroic efforts fail to recognise that Yolngu do things very differently. Their different language and different conceptual framework are up against implacable, pervasive change that some try to embrace and others resist. Even their practical, everyday knowledge of the bush is challenged by balanda equipment and desire for comfort, arrogantly displayed as if unambiguously superior.

An anthropological maxim is relevant here: we are particular, not universal human beings. Our impulsive judgements as well as our deepest convictions are context-bound, cultural, shaped by the social world we assume to be normal, even natural. In other human worlds a different normality exists. The perceptive reader will find much to ponder in Watson’s book. •

The Passion of Private White
By Don Watson | Scribner Australia | $49.99 | 336 pages

Science and uncertainty: China’s Covid dilemma | 6 December 2022 | https://insidestory.org.au/science-and-uncertainty-chinas-covid-dilemma/

Behind the hardline policy is a quest for perfection that dates back to the Communist Party’s founding

Covid anti-vax conspirators offer a thriving line in coffee and cookies on the east coast of Australia, running alternative cafes from Cairns to Nimbin and down the spine of the Great Dividing Range to Katoomba and points further south. Local customers complain about big government, big capital and (intriguingly) the New York–based Council on Foreign Relations.

A sign in the window of one anti-vaxxer hangout, across from Katoomba’s railway station, reads “We stand united against government tyranny!!” Bill Gates smiles threateningly from a silk-screen print, vaccination needle in hand, alongside an advertisement for Chakra group-healing sessions. In this part of town, conspiracy theories and alternative healing are served with coffee and cake on what appears to be a sustainable anti-vax business model.

Falun Gong pamphlets can be picked up nearby. Adherents of Falun Dafa (as they call their faith) promote healing through religious chants and breathing exercises and work to expose the brutality the Chinese government inflicts on believers back in China. They certainly are persistent, and they were in the news again recently when former Communist Party general secretary Jiang Zemin died in Shanghai.

More than two decades earlier, Jiang had taken fright when a peaceful phalanx of Falun Dafa practitioners queued outside his official residence in the old imperial palace district of Beijing petitioning for recognition of their faith as a religious community. Infuriated by their presumption, he branded them a superstitious cult and banned them outright. Thousands perished in the persecution that followed and others languished in prison awaiting forced organ harvesting, if adherents’ claims are to be believed.

There’s no viable business model for superstitious belief in China, where science and rational planning carry the day and the apparently irrational desires of common people count for little.

But here’s the thing. China is opening up again after three years of intermittent but severe Covid lockdowns. Over that period it managed to keep the virus in check but failed to prepare the country’s people for a timely transition from epidemic lockdowns to a more flexible model of pandemic management. Why this neglect, if China’s government is as rational, capable and forward-thinking as it claims to be? Why lift lockdowns heading into winter rather than in the warmer months earlier this year, when the virus was less active? As a result of this series of policy failures, China could be heading for a health crisis on a scale the world has not seen since the catastrophe that shook India in 2021.

Some analysts say the party erred in abandoning its long-held commitment to science by succumbing to an anti-vax syndrome of its own devising. The historian Adam Tooze recently argued that China’s government undermined public confidence in readily available Sinovac and Sinopharm vaccines by allowing vaccine conspiracy theories to flourish online. It’s true that the government tolerated efforts to counter criticism from abroad that Chinese vaccines were inferior to Western ones (they are reported to work reasonably well after three doses), but the effect of that campaign was more likely to have enhanced confidence in Chinese vaccines than to have undermined it.

If anything, China’s problem is that its system of government is obsessed with science and rational planning to the point that it fails to take account of what people want. Doubt is essential to science. The science scepticism of Falun Dafa practitioners and anti-vaxxers may be over the top, but the lack of scepticism in China is no less troubling.

Democracy wears a lab coat

Big Whites, as they are called, are the neighbourhood cadres, volunteers, medics and enforcers who carry out the central government’s Covid lockdown policies clad in bulky white hazmat coveralls. The term refers to the robot in the Disney film Big Hero 6, programmed to perform medical procedures using instruments built into its white-clad body. It’s a neat comparison. China’s Big Whites administer Covid tests, police entry and exit to residential compounds, patrol streets, distribute food and detain anyone who gets in their way — by force if necessary.

The way Big Whites methodically patrol neighbourhoods in their white hazmat uniforms captures something missing from the Disney film. This is the role of science as a strong arm of politics and public policy in China and in the everyday directives of China’s Communist Party. The party does not just believe in science, it embodies and enforces it.

As a party, the communists trace their ideological foundations to scientific socialism, which they place on the same plane as the science that underpins maths, physics and chemistry. More than that, Marxism is considered foundational to all other sciences: “The intellectual foundation of science is the scientific theory of Marxism,” Xi Jinping told national educators in December 2016. The party’s ideological commitment to science lends science a place in the governance of China matched only by its place in Stalin’s Soviet Union. This is an early-twentieth-century pre-quantum kind of science in which everything is certain and quantifiable and open to precise and predictable explanation.  The connection of science with Marxism can be read to mean that whatever keeps a Marxist–Leninist party in power has to be good science.

This science of political certainty can be traced to the founding of the Communist Party in the early 1920s, when “Mr Science and Mr Democracy” (as they were called at the time) first entered the country. They came smartly dressed, faddishly foreign and unabashedly modern, and seemed destined to set the country aright. Mr Science didn’t arrive lumbered with the doubt and questioning that comes with scientific method but brought instead a modern kind of certainty to displace the older certainties of the Confucian canon. Any democracy modelled on such a science had to be certain too.

One outcome was that the new leaders of modern China expected more of democracy than it could possibly deliver. Winston Churchill’s complacent remark about the imperfections of democracy — “It has been said that democracy is the worst form of government except all those other forms that have been tried from time to time” — is widely read among China’s communists not as a concession to the uncertainties of the human condition but as a shameful acknowledgement of liberal democracy’s failure to achieve perfection.

As the editor of one of China’s leading journals of international relations remarked of Churchill’s comment, “for a statesman of the capitalist class, it must have been really difficult to express this opinion.” That was a revealing comment: only a party committed to scientific socialism would imagine it could achieve perfection or deny it had a problem dealing with the uncertainties of everyday social and political life.

The Communist Party and its state bureaucracy are structured hierarchically to achieve a certain kind of perfection in relaying messages and commands and assigning responsibility up and down a many-layered command structure grounded in scientific socialism. Atop the structure, the central leadership embodies scientific rationality and sits beyond criticism or reproach. Beneath the leadership sit the cadres.

In this system, the rituals surrounding appointment of a supreme leader such as Xi Jinping are performed with such precision that nothing can be allowed to spoil them. The capital all but grinds to a halt ahead of a five-yearly national party congress as industry is silenced, streets closed off and other meetings and events cancelled. This is not the time to announce a national policy shift that could spell uncertainty. So the summer leading into the party’s twentieth national congress in October, when Xi was appointed supreme leader for as long as he liked, was no time for modifying his “dynamic zero” Covid policy.

On this model of scientific government, there’s room for cautious policy experimentation at the margins — on local finance for example — but where experiments succeed all credit is attributed to the farsightedness of the leadership, and where they fail, local cadres are left wondering what happened and picking up the pieces.

With few regulatory or legal instruments to guide them, those cadres often end up making arbitrary decisions about who was at fault, who should be put away, whose property should be confiscated and what should be done next. When things go terribly awry, they themselves are targeted for punishment. Overall, the structure makes little provision for heeding the wishes of common people or holding cadres accountable to the communities they govern.

Science and people

Churchill’s defence of democracy was a comment on human nature as much as it was on democracy: he prefaced it with a comment on the difficulty of governing “in this world of sin and woe.” Scientific socialism has no patience for such a world. Much has been written about elite attempts to remake China’s people or, in party terms, to elevate the “quality” (suzhi) of commoners to match the expectations the leadership places in them. In the meantime, the party makes up for the shortcomings of ordinary people by creating a quasi-nation out of its own cadres, as I argue in my recent book Cadre Country, to be perfected in place of the people as instruments of party rule.

Here’s where things get interesting in Covid-afflicted China. If the party’s tens of millions of cadres are to carry out their leaders’ instructions in every corner of the land — north, south, east, west and centre as Xi Jinping is fond of telling them — then China’s cadres need to be as finely attuned to the commands of the leadership as a lab technician to the instructions of a lead scientist.

The words “science” and “precision” pepper Xi Jinping’s speeches on his cadre force. Still, the cadres constantly disappoint him. When things don’t work out as the leadership plans, and communities take their anger and frustration onto the streets, it’s not the fault of the leaders who drafted scientific policies and issued clear instructions. It’s the fault of the cadres for failing to follow instructions with the precision science demands.

That’s the way communist cadre systems work. The historian Moshe Lewin traced the habit of communist leaders turning on and blaming the cadres beneath them to Stalin’s 1925 lecture to trainee cadres at the Sverdlov Party Institute. “The only problem is cadres,” Stalin told the trainees. “If things are not progressing, or if they go wrong, the cause is not to be sought in any objective conditions: it is the fault of the cadres.” It is this Stalinist science of government that has landed China in the mess in which it finds itself.

Science and Covid policy

Stalin’s management science has been playing out in China’s recent pandemic policy dilemma. On 10 November, not long after the close of the twentieth national party congress, Xi Jinping presided over a meeting of his new Politburo Standing Committee to receive a report on “Twenty Measures” for the prevention and control of the most recent coronavirus epidemic. The following day, the State Council issued a statement on the Twenty Measures, explaining that they called for “scientific and precise” prevention and control at the local level. The new formula was not a relaxation, the statement continued, but an unswerving affirmation of Xi Jinping’s standing policy of dynamic-zero control.

Alongside the State Council’s statement, the official news agency Xinhua reported that China’s experience managing Covid had shown the leaders’ policies to be “completely correct” and that all measures taken were “scientific and effective.” The new policy encouraged local officials to calibrate risks according to the needs of their neighbourhoods, and to adjust their behaviour by abandoning “unscientific practices” such as over-prescribing Covid tests. Cadres should strive to “overcome formalism and bureaucratism and put an end to incrementalism and ‘one-size-fits-all’ approaches,” they were told. Even so, this was not a time for relaxation of prevention and control.

What was a provincial leading cadre to make of this central directive? A record available online shows the Heilongjiang provincial party group receiving the directive and disseminating it to cities, districts, counties and townships across the province. On 18 November, provincial governor Hu Changsheng convened the province’s Leading Small Group on Covid Work and urged his subordinates to “resolutely implement the spirit of the important instructions of General Secretary Xi Jinping’s scientific, precise and uncompromising implementation of epidemic prevention and control measures.”

Hu instructed the provincial government to “conscientiously implement the Twenty Measures of the State Council… and adhere to scientific and precise prevention and control.” Local party and government officials were told to adhere unswervingly to the national “dynamic zero” Covid policy and, in case anyone had missed it, to avoid formalism and bureaucratism.

Judging from the published record of the meeting, Heilongjiang provincial officials had no idea what central authorities expected of them under the new Covid management regime, other than to repeat its vague formulas and implement them as bureaucratically as possible.

Some local governments thought they knew better. The city government of Shijiazhuang, in Hebei Province, tried to escape formalism and bureaucratism by lifting restrictions on local residential compounds, only to find itself in trouble when local citizens objected to its “scientific, precise and uncompromising” measures for implementing the new policy. The city’s party secretary and mayor were reportedly dismissed and lower-level officials were left baffled.

At the neighbourhood level, mixed signals from on high were paired with disincentives for local officials trying to implement the new policy with something like scientific precision. “We were told to relax the overly strict Covid prevention rules [but] could still get fired for not stamping out cases on time,” the Financial Times reported a county-level cadre as saying.

If local cadres were hoping to find sympathy or support from on high, they were disappointed. It was all their fault. On 1 December, Xinhua issued a detailed explanatory note on what central authorities had intended with the announcements coming out of the Politburo Standing Committee meeting of 10 November. “The correct meaning of scientifically precise prevention and control,” according to Xinhua, could be summed up in the phrase “quickly seal and unseal — and unseal wherever possible” (快封快解 应解尽解). For good measure, it added that any confusion over the centre’s directive was the fault of local cadres, not the fault of the policy or the leadership, since implementation of the “quickly seal and unseal” policy was the “responsibility of cadres.” It all came down to the quality of cadres and cadre management in Xinhua’s authoritative account.

Around the same time, the recently appointed head of the party’s Central Organisation Department, Chen Xi, published a major statement in People’s Daily extolling Xi Jinping’s visionary leadership and calling for the recruitment of a higher calibre of local cadres — “loyal, clean, and responsible” — than those Xi found himself commanding at present. Again, the solution lay in science: what the country now needed was a “scientific path [科学路径] for cadre management.” Heading into winter, the situation was set up for failure, with cadres trying to implement a central policy for which no one was prepared to take responsibility (certainly not Xi Jinping), while the party leadership doubled down on its search for a more scientific path for cadre management.

To achieve the precision his policies require, Xi Jinping has to replace the cadre force he actually commands with a body of more faithful and responsive cadres. For the past three years he has been railing against the forty million cadres under his direction, complaining that they are prone to “kneeling before their leaders to flatter and fawn” while mistreating their subordinates; sitting on their hands rather than doing anything useful; and complaining that their workload is too onerous and higher management too demanding.

By 2019 Xi had endured enough of their complaints. He called for a thorough cleansing of cadre ranks and their replacement by a team of “loyal, clean and responsible cadres” who could be relied upon to follow his directions unswervingly.

China has an enduring problem if its supreme leaders go on pretending that they are all wise and that their system of government can attain perfection. A little more doubt and a lot more democracy would probably go some way to fixing it. •

Before it was time | 1 December 2022 | https://insidestory.org.au/before-it-was-time/

A young Western Australian catches a glimpse of Gough in 1969

The fiftieth anniversary of the election of the Whitlam government prompts me to recall my first sighting of Gough Whitlam in action. Seeing the Labor leader speak during the April 1969 Curtin by-election campaign didn’t require much effort on my part: the event was at the Subiaco Civic Centre, a five-minute stroll from my home on what was probably a balmy Perth autumn’s night.

The by-election had been brought on by the resignation of the sitting Liberal member, external affairs minister Paul Hasluck, to become governor-general. At any other time, Labor would probably not have bothered to run in this very safe Liberal seat. Indeed, Labor had not run a candidate for Curtin even in the 1963 general election.

Such a cop-out would have been anathema to Whitlam. He had campaigned impressively in two by-elections in 1967, his first year of leadership, and regarded such events as opportunities to spread the party message to a citizenry that had not elected a federal Labor government since 1946.

Nineteen sixty-nine was also a federal election year. Having narrowly won a self-inflicted caucus ballot to reassert his leadership the previous year, Whitlam needed to perform strongly and pull off a decent swing at the election. While a Labor victory was almost in the realm of fantasy, winning just a few seats here and there was unlikely to cut it: too many enemies in his own party were ready to use a weak result as a good reason to turn up the heat on Whitlam.

Australia’s involvement in the Vietnam war, and the use of conscripts to fight there, remained major issues, and while it is almost certain that Whitlam referred to them that night in 1969, my only abiding memory of his address was his criticism of the inequities and inefficiencies of Australia’s federal system. What especially stuck in my mind was his scathing description of how different state governments ordered different railway rolling stock from different countries when some coordination and cooperation would make more economic and practical sense. It didn’t exactly bring the (sparsely populated) house down, but it wasn’t without impact either.

Whitlam is so strongly associated with emotion and passion (especially after 1975) that it is easy to forget that in opposition he spent much more time criticising the government for its inefficiency and ineptitude than decrying its moral failings (although sometimes it was both) — or that his enduring critique of Australian federalism’s shortcomings was something of a magnificent obsession. Even on conscription, his criticism was often as much about its inherent inefficiency (a view traditionally shared by many in the military) as about its violation of liberty and its cruel impact on those whose lives it took or damaged beyond repair.

What of the Curtin by-election? The seat was retained by the Liberals’ Vic Garland, who would go on to achieve ministerial office in the governments of William McMahon and Malcolm Fraser. But Labor achieved an estimated two-party-preferred swing of 7.9 per cent, closely matching the national swing of 7.1 per cent that Whitlam secured later that year in the general election.

That result set the stage for victory in 1972, although to regard it as inevitable is to ignore the risks Whitlam had to take, the best examples being the decision to launch a federal intervention into Labor’s left-controlled Victorian branch in 1970 and his visit to the People’s Republic of China in 1971, when Australia still recognised Taiwan as the real China.

The “inevitable” tag also ignores the modest nine-seat majority Labor achieved in 1972: the win was no landslide, and it is near certain that only Whitlam within federal Labor’s parliamentary ranks could have brought the conservative domination to an end.

That night in April 1969, I walked home reasonably impressed. But my impression would have been of little use to Whitlam: the voting age was twenty-one and I was too young to vote in 1969 — and indeed, even in 1972. •

A party for the people | 1 December 2022 | https://insidestory.org.au/a-party-for-the-people/

Beer and scuffles open The Making of an Australian Prime Minister, the classic account of the 1972 election

There are close to 500 people in the back garden, and it seems all of them must be chanting. “We want Gough! We want Gough!” The noise is deafening. The crush is at its worst near the sunroom door, where the new prime minister is expected to appear any minute to make a victory statement. Radio and television reporters and newspaper photographers are scuffling among themselves and with party guests to get close to the doorway. A huge, bearded man from the ABC is trying unsuccessfully to move the crowd aside to clear the area in front of the camera which will take the event live across Australia.

“Get your hands off me,” an angry photographer in a pink shirt snarls at a television reporter. Punches are thrown. Blood spurts from the nose of a radio journalist. “Come on, simmer down,” people shout. Someone warns the pink-shirted troublemaker: “This is going all over the country, you know.” More punches are thrown. “Go to buggery, punk,” the photographer screams at a member of the ABC crew. “This is not the ABC studio.”

One of the Labor Party’s public relations team, David White, is pleading with the mob to “ease back, make room for the camera.” Tony Whitlam, the six foot five inch son of the Labor leader, moves in to try to break up the scuffles. He is patient at first, then he flushes angrily and clenches a fist. A Whitlam aide, Richard Hall, places a hand on his shoulder and says, “Easy, Tony.” David White motions to a rather large member of the Canberra press corps and whispers, “Stand in the doorway and look imposing while I get some policemen.”

Inside the house, oblivious to the violence on the patio, Gough Whitlam and his wife, Margaret, are cutting a victory cake. In the icing are the words “Congratulations Gough Whitlam, Prime Minister, 1972.” The party workers sing “For He’s a Jolly Good Fellow,” but the noise from the garden drowns them out.

“We should have thought of barricades,” mutters Richard Hall, as he and other members of the Whitlam staff hurriedly make new arrangements for the prime minister–elect to face the television cameras inside, away from the mob. Party guests are cleared from the sunroom. The big ABC camera is lifted through the door. Lights are set up. Mrs Whitlam appears and is questioned by radio and TV men, but her answers are inaudible more than a few feet away. Whitlam’s driver, Bob Miller, fights his way through the crush with a white piano stool for his boss to sit on.

About forty media people are packed into a room that measures no more than twenty feet by fifteen feet — together with the TV camera, the lights, the microphones. The heat is overwhelming. Television reporters sweat under their make-up. Then, at 11.27pm, Gough Whitlam squeezes along the passage and takes his place on the stool.

Radio reporters lunge at him with microphones as he begins to speak. “All I want to say at this stage is that it is clear that the majority given by New South Wales, Victoria and Tasmania is so substantial that the government will have a very good mandate to carry out all its policies. These are the policies which we have put in the last parliament, and throughout the campaign we did not divert from them, we were not distracted from them, and we are very much reassured by the response the public gave to our program… We are, of course, very much aware of the responsibility with which the people have now entrusted us.”

The TV and radio men begin to fire questions about the actions he plans as prime minister, but he stops them. “I can’t go on answering questions like this… I have to wait for a call from the governor-general.” But it is enough. He has claimed victory, and now he moves out into the garden to mingle with Labor supporters, friends and neighbours who have attended similar parties at the unimposing Whitlam house in Albert Street, Cabramatta, every election night since he moved there in 1957.


Twenty-five miles away, at Drumalbyn Road, Bellevue Hill, a far grander residence in a far grander suburb, William McMahon has watched the Whitlam performance on television. He had been about to go outside to face the cameras himself, but now he must wait another ten minutes or so. Early that afternoon one of his press officers, Phillip Davis, anticipating a Labor win, had drafted a statement conceding defeat. At 10pm he and speechwriter Jonathon Gaul had retired to the family room in the McMahon home to dictate a final version to a stenographer.

Now McMahon reads it over, scribbling in a note at one point to “thank government supporters.” Then he says, “All right, let’s get it over with.” A staff member asks if he is sure he knows what he is going to say, and he nods. Davis asks Mrs Sonia McMahon if she minds accompanying her husband. “Nothing would stop me going out with him,” she says.

Outside the door are the cameras and a tunnel of pressmen. To the right nearly 200 well-wishers — neighbours, party supporters, curious sightseers — are gathered. McMahon walks out. His wife, looking strained but dry-eyed, follows. “Mr Whitlam has obviously won and won handsomely,” says the politician who has led the Liberal–Country Party Coalition to its first defeat for twenty-three years.

“There can be no doubt about the trend in New South Wales and Victoria, and they show a decisive majority for him. I congratulate him, and I congratulate his party, too. For my own part, I accept the verdict of the people as I always would do… Mr Whitlam must also accept the fact that we are an opposition that will stick to our Liberal principles and will give him vigorous opposition whenever we feel that he is taking action which is contrary to the interests of the Australian people.”

He thanks those who voted for the Coalition, and then adds, “Above all, I want to thank my own staff who have been driven relentlessly over the last few months and have stuck with me, they’ve helped me, and they’ve never wilted under the most heavy and severe oppression.” Finally: “The election is gone, it is over, and Mr Whitlam is entitled to be called upon by the governor-general to form a government.”

It has been a dignified statement, delivered with scarcely a tremor in his voice. The man who has gone through an election campaign reading speeches from an autocue has departed from his prepared text, and improved on it. He has been more generous in his references to his opponents than Davis and Gaul had been. The appreciative remarks about his staff are totally unscripted, coming as a shock to people who in the past have felt themselves to be little more than numbers to their employer.

McMahon refuses to answer questions on the reasons for the Coalition’s defeat. “That’s something for deep consideration,” he tells the reporters. Then he turns away and, with Mrs McMahon, plunges into the crowd clustering around the wrought-iron double gates and across the gravel driveway. For several minutes his diminutive figure is lost from sight as he moves among the well-wishers, shaking hands and accepting condolences.


Gough Whitlam and William McMahon spent polling day, Saturday 2 December 1972, touring booths in their electorates. There are forty-one booths in the sprawling electorate of Werriwa in Sydney’s outer-western suburbs, and Whitlam, accompanied by his wife and the Labor Party’s radio and television expert, Peter Martin, visited all of them.

Mr McMahon, too, visited all thirty-three booths in his seat of Lowe, not far away but closer to the city. On his way home he dropped in on several booths in Evans, one of the marginal seats the Liberals feared they would lose. The sitting Liberal member, the navy minister Dr Malcolm Mackay, was one of his closest supporters, and McMahon wanted to help him if he could, even at that late stage. Then McMahon returned to Bellevue Hill, had a swim in his pool and settled down to wait with his staff and a few friends. Whitlam went back to Cabramatta to prepare for the party.

The Whitlam election night party is by now a tradition in Werriwa. For several months before the 1972 election, members of Whitlam’s staff — particularly his press secretary and speechwriter Graham Freudenberg — had been trying to persuade him to change the venue, to hold it at a club or a hotel. With a Labor victory likely, they foresaw security problems.

The crowd, they warned, would be too big for the small cottage and its pocket-handkerchief garden. But Whitlam insisted the function would be held at the house as usual. The party workers in the electorate expected it, he said, and that was that. But he made one concession. He agreed that, while the figures were coming in and he was studying the count, he would retire to the Sunnybrook Motel two blocks away.

Mrs Whitlam supervised the arrangements for the party. A bar was set up in a corner of the back garden. In another corner was a makeshift toilet labelled “gents.” She explained proudly to early arrivals, “It’s a two-holer. Have a look at it.” At various places in the back garden were five television sets, their cords snaking among the shrubs to power points inside the house. On the roof, television technicians set up a microwave link dish, giving the house a science fiction appearance. There were three television outside broadcast vans in the street near the front gate.

On the patio, the television men had placed a ten foot high microphone to pick up the sounds of the party. It produced considerable amusement. “Have you seen it?” Peter Martin kept asking people. “It’s the Gough Whitlam microphone stand, the first one we’ve found that’s tall enough for him.” There was one television camera set up high, near the bar, which could sweep the whole garden. The other, on the patio, was to record whatever Whitlam might say in either victory or defeat. One of the bedrooms had been taken over as a press room, with half a dozen telephones on a long table.

Preparations in Bellevue Hill were more modest. At the insistence of Phil Davis the Liberal Party provided a tent for the press beside the swimming pool, with a few tables and chairs. Davis had stocked it with a car fridge and $30 worth of beer. There were no television sets until the TV men themselves set up monitors, but Davis left his transistor radio with the journalists mounting the vigil which, as the night wore on, they dubbed the “death watch.”

In the lounge room were two telephones and two portable television sets for McMahon and his close advisers. In the family room another set had been provided for the stenographers and Commonwealth car drivers on his staff.

McMahon appeared briefly to talk to the press and the cameras before the figures began coming in. He was confident of victory for the Coalition, he told them, and had no worries about his own position in Lowe. No one could be sure whether he believed it, but he appeared jaunty enough, immaculate in his freshly pressed blue suit, white shirt, crimson tie and carefully polished shoes. With a wave he disappeared into the house, not to be seen again for more than three hours except in silhouette through the lounge room windows.

It was a strange atmosphere inside the house, tense but not emotional. McMahon settled down at one telephone. John Howard, a vice-president of the NSW Liberal Party, remained glued to the other. Davis, Gaul, McMahon’s private secretary Ian Grigg, Mrs McMahon, and several friends of the family watched the results on the television sets. Little was said.

McMahon received frequent reports on his own seat from scrutineers, and remained outwardly calm even when it looked as though he might lose it. Only once did he snap at a party worker when conflicting figures were phoned in from Lowe. He was in constant contact with electoral officials in Canberra, and with the federal director of the Liberal Party, Bede Hartcher, who was in the national tally room. From time to time Howard handed him figures. Davis and Gaul kept him up to date with the figures coming up on television. He scribbled on a notepad, calculating the government’s position and appreciating it far better than anyone else in the room.

McMahon has the ability to “feel” a political situation before most other people. It is one of the reasons he was able to survive so many crises in his turbulent career, the talent that earned him a reputation as a political Houdini. He got the “gut” feeling that told him the government was heading for defeat almost as soon as the early figures began to come in. He was ready to concede by 10pm, but wanted to make sure Lowe was safe before he faced the questions of the press.

Hartcher and other Liberal officials urged him to wait, telling him there was still a chance the government could scrape back, but he knew better. Then [Liberal frontbencher] John Gorton appeared on television, admitting that Labor had won. The customs minister, Don Chipp, also conceded. And the treasurer, Billy Snedden. McMahon knew he had to go out on the lawn, where the cameras and the journalists were waiting like vultures. But first he had a cup of tea. Mrs McMahon handed it to him, and those in the room watched as he spooned in the sugar. His hand was steady.

The ordeal of the statement over, McMahon returned to the small group in the lounge room and sat quietly for a time. Then he perked up. “Oh well, that’s it,” he said. “We’ve got some champagne. Let’s open it.” From then on the mood was almost one of relief that it was all over. Party workers from Lowe dropped in, and some NSW Liberal Party officials. Outside, their work over for the night, journalists and TV men were drinking in the tent. Davis and Gaul joined them.

At about 1am a young woman broke through the security screen around the McMahon house by clambering over a fence from next door. She joined the press group, and gushed over McMahon and his wife when they emerged soon after for a final, off-the-record chat. Only once during the night did McMahon lapse into introspection and ask rhetorically, “Where did we go wrong?” He did not offer an answer to the question. Later he said, “At least we didn’t lose as many seats as in 1969.”


Whitlam’s staff spirited him away from his home to the motel soon after 8pm. Very few people knew where he had gone. It was well over an hour before a group of journalists and photographers tracked him down, and they were kept locked out of the room where he was studying the results.

Around the room were four television sets tuned to different channels. At one end was a table with a bank of seven phones. Richard Hall was constantly on the phone talking with scrutineers and candidates round the country, getting figures before they were posted in the tally room. Clem Lloyd, Lance Barnard’s press secretary, phoned through figures from the national tally room at regular intervals. David White was also manning phones.

Mungo MacCallum, the Nation Review journalist, had been co-opted to work a calculating machine. Whitlam sat in an armchair facing the television set tuned in to the ABC, but frequently he screwed himself around to watch the other sets as they showed new figures. Peter Martin was there. Graham Freudenberg sat on the bed listening intently to the analysis of British psephologist David Butler on Channel 7. Another of Whitlam’s press aides, Warwick Cooper, hovered in the background. His private secretary, Jim Spigelman, was making calculations on a notepad.

Bob Miller poured glasses of beer and orange juice for the workers. Also present was Ian Baker, press secretary of the Victorian opposition leader, Clyde Holding. There was whispered conversation. “It’s starting to look as though the DLP vote is down in Victoria,” said Martin at 8.45. “A trend is developing to us in the outer suburbs,” Hall told Whitlam a few minutes later. “We’ve got Phillip,” Freudenberg announced at 8.50. Placing his hand over the mouthpiece of his phone, Hall read out the first figures for MacArthur and told Whitlam, “It looks like Bate is going to poll well.” Less than a minute later he interrupted another phone conversation to tell Whitlam, “There’s no doubt about it, the DLP is ratshit in Victoria. They’re going down.”

But Whitlam remained cautious. When the ABC showed figures for Mitchell and compere Robert Moore told viewers that Liberal member Les Irwin was trailing his Labor opponent, Whitlam commented, “It’s still not marvellous.” Hall announced, “There’s a clear absolute majority to us in Hume,” but Whitlam replied, “Later figures always go against us. We’d have to have a very good lead.” One of the TV screens showed Liberal Alan Jarman trailing in Deakin, but Whitlam said, “He’ll still get in, though.”

Whitlam showed little emotion as he stared intently at the television screen, until a little after 9pm when Hall told him, “I reckon we’ve won Casey, Holt, Latrobe, Diamond Valley and Denison.” On the TV set tuned to Channel 9, [journalist] Alan Reid was saying, “If this trend continues I’d say Labor is home and hosed.” Then Whitlam allowed himself a smile, and sprawled back in his chair clearly more relaxed.

At that point he knew he had almost certainly won. There was irony when one of the channels rescreened McMahon’s earlier interview, showing him saying, “I feel more confident than I did this morning.” But there was bad news, too. At 9.05 MacCallum looked up from his calculator and remarked, “In Bendigo David Kennedy is only on 48 per cent. He’ll go to preferences.” Whitlam became sombre again as he said, “But will he get them?” At 9.15 Whitlam gave Hall permission to phone the Labor candidate in Denison, John Coates, to congratulate him on a certain win.

Then Hall reported that Labor scrutineers had no doubt the party would win Evans. At 9.18 one of the staff let out a cry of “Jesus!” as figures for Flinders on one of the TV sets showed the labour and national service minister, Phillip Lynch, fighting to hold the seat. Then at 9.20 MacCallum performed some more calculations and announced to the assembled company, “I think we can send the white smoke up the chimney now.”

From then on, the mood in the room was one of elation. “Welcome home Victoria!” said Spigelman as one of the television computers came up with a printout showing a swing of 6 per cent to Labor there. NSW party officials had told Whitlam there was a good chance of a Labor win in the Country Party–held seat of Paterson, but he had not believed them. At 9.26, when Freudenberg said, “They were right about Paterson,” he sprang out of his chair with an astounded cry of “What?”

He rubbed his hands together gleefully when Freudenberg told him a few minutes later, “Look. Race is in.” Race Mathews, his former private secretary, had a clear lead over the minister for the environment, Aborigines and the arts, Peter Howson, in the Victorian seat of Casey. At that point, Bob Miller was sent to fetch Mrs Whitlam, and as soon as she arrived Hall popped the cork from the first champagne bottle. Glasses were clinked all round. “Many happy returns,” said Mrs Whitlam.

Only the news from the South Australian seat of Sturt, where Labor’s Norm Foster had been defeated, interrupted the celebratory atmosphere. “We can’t really do without Norm,” said Mrs Whitlam. “We need someone with that sort of tenacity and ferocity.” Her husband was quickly on the phone to Foster, offering commiserations and promising to find a job for him.

But Whitlam was possibly more upset by the bad result for Labor in Bendigo, and he phoned David Kennedy too. The Labor leader has what his staff describe as “a thing” about by-elections. They have played an important part in his political career. It was his role in the Dawson by-election in Queensland which saved him from expulsion over his fight with the ALP machine on State Aid in 1966. In 1967 the by-election victory in Corio in Victoria was his first triumph as party leader, and gave him the leverage to secure reforms to the structure of the federal ALP conference and executive. In the same year the Capricornia by-election success helped him to “break” Harold Holt. In 1969 a by-election in Bendigo had shown his mastery over the then prime minister, Gorton. The possibility of losing one of the seats to which he had devoted such time and effort in a by-election campaign appeared to affect him deeply.


Whitlam had been hoping McMahon would go on television first to concede. But soon after 10.30 he decided further delay would be fruitless, and prepared to return to the house and the waiting cameras and pressmen. But first he and Mrs Whitlam posed for the photographers who were gathered outside the motel room. In typical fashion, they hammed it up. “This is my best side,” said Whitlam. “Well, my nose is too big on this side,” replied his wife, “but I’ll do it for you, dear.” Their eighteen-year-old daughter turned up and gave her father a hug. “Are you happy now, Dad?” she asked. “Yes, Cathy,” he said. “I hope you are.”

Back at the house a British journalist was phoning a story to his paper in London. “Australia has a new prime minister,” he dictated. “Yes, I’m quite serious.” In the back garden the party guests were milling around the television sets, sending up loud cheers as each new set of figures confirmed the Labor victory.

The NSW ALP president, John Ducker, wandering through the crowd beer in hand, did not seem to quite believe it. “There’s no doubt, is there?” he kept asking people. “Billy McMahon’s going to lose his seat,” a gloriously drunk party worker shouted at the top of his voice. Laughter rippled from one end of the garden to the other.

Then the word was passed excitedly through the crowd: “Gough’s coming. He’s here.” Whitlam’s tall figure could be seen slowly forcing its way through the crush as people tried to shake his hand or simply touch him.

Photographers held their cameras above their heads, trying to get shots. “Good on yer, Gough,” people shouted. And then the chanting started. “We want Gough! We want Gough!” Slowly he made his way to the sunroom door, stood there a moment smiling, and then disappeared inside.

Sometime later, when he had made his television appearance and done the right thing by his party guests, Whitlam returned to the motel and the stock of champagne for a quieter celebration. And there, away from the cameras and the crush, he was more expansive in his comments to journalists. The Liberals would have lost under any leader, he said, adding, “It’s just too silly for them to blame or for us to thank Bill McMahon. The whole show was running out of steam.” Then, a little wearily, “It’s been a long, hard road.” •

This is an extract from The Making of an Australian Prime Minister, published by Cheshire in 1973.

“God save us all!” | 1 December 2022 | https://insidestory.org.au/god-save-us-all/

Doomed to defeat in 1972, did prime minister William McMahon show more initiative than he’s given credit for?

In May 1972, six months before that year’s election, the editor of the Melbourne Age enjoyed a surreal lunch with prime minister Billy McMahon. Describing him as “really dazzling company,” Graham Perkin was nonetheless staggered by the prime minister’s summary of the political scene and his government’s future.

“The funny little man,” Perkin told a colleague, “has convinced himself that he is a brilliant success and sees himself winning handsomely in November and remaking the nation in the following three years; leading them” — the Coalition — “to victory in 1975, and then retiring with honours thick upon him. God save us all!”

To modern readers, McMahon’s hopes seem as preposterous as they did to Perkin. Most accounts of his government use the same adjectives — incompetent, reactive, hapless, embarrassing — and follow the same line: nothing of consequence was achieved between 1969 and 1972, and the election of the Gough Whitlam–led Labor Party was never in question.

This view has several effects. One is to diminish Labor’s genuine achievement in 1972, when a party scarred by twenty years of discord and electoral failure convinced voters that the vision, policies and leadership Australia needed were to be found among its MPs. Another is to render the years from 1969 to 1972 as a shapeless interregnum between the going of prime minister Robert Menzies and the coming of Gough, an antipodean Dark Ages during which nothing really happened. The last is to leave our understanding of those years profoundly incomplete by failing to take seriously the efforts of the Coalition government to govern during a period of immense change.

While confident of victory, Whitlam always insisted the 1972 campaign was a live contest. And while he was never backward in adducing McMahon’s flaws, he also perceived an opponent more wily than popularly imagined. As prime minister, Whitlam argued, McMahon had tried to “bestride two horses”: “He claimed to be the real heir to Menzies, yet he also claimed to recognise and accept the need for change in a changing world.” And the result? “This balancing act he did with some skill.”

A “balancing act” is one useful way of understanding the Coalition government’s actions during 1969–72, of seeing how it tried relentlessly, first under John Gorton and then under McMahon, to manoeuvre itself into a position where another election victory might be possible.


At a distance, the events following the 1969 election are confounding: the leader of the victorious party was immediately challenged by two of his own ministers.

With the benefit of hindsight, the 1969 election result — which cost the Coalition sixteen seats — confirmed the waning fortunes of a government in office for two decades. At the time, though, it seemed more like a stern rebuke to prime minister John Gorton. Gorton’s colleagues had vaulted him from the Senate into the prime ministership after the unexpected death of Harold Holt in the belief that he possessed sound and sorely needed political judgement, and that his ability to perform on television would be compelling to voters.

The two years that followed brought both beliefs into question. Gorton’s ambivalence towards some of his colleagues and his tendency to unilateral decision-making antagonised many within the government and increasingly alarmed those outside it. Strong-willed and confident, he rarely backed down: “John Grey Gorton,” he rounded on one impertinent senator, “will bloody well behave precisely as John Grey Gorton bloody well decides he wants to behave!”

After a strong start, moreover, Gorton’s abilities as a public speaker seemed to desert him over the course of 1968–69. Tortuously convoluted prime ministerial statements became so much the norm that Whitlam took to ridiculing Gorton simply by quoting him verbatim. As one famous example ran, “On the other hand, the AMA agrees with us, or, I believe, will agree with us, that it is its policy, and it will be its policy, to inform patients who ask what the common fee is, and what our own fee is, so that a patient will know whether he is to be operated on, if that’s what it is, on the basis of the common fee or not.”

Amid these personal shortcomings were more serious policy disagreements. During the 1969 campaign, Gorton had gestured towards traditional Coalition strengths as well as “new horizons”: alongside hawkish statements on national security and tax cuts, he promised increased spending on education, a new Australian film school, and reforms to healthcare. But his statements about defence did little to assuage suspicious hardliners in his own party and in the avowedly anti-communist Democratic Labor Party, which generally backed the government. And his moves to withdraw Australian troops from Vietnam failed to mollify the anti-war protesters who took to the streets in successive moratorium marches.

Gorton’s domestic policies, meanwhile, many of which involved an empowered Commonwealth reaching into matters traditionally the purview of the states, antagonised state premiers and colleagues whose fidelity to federalism was a matter of faith.

All this fed into the leadership challenge launched less than two weeks after the election. While treasurer McMahon and national development minister David Fairbairn failed in their bid to displace Gorton, the fissures their challenge exposed didn’t close over. A ministerial reshuffle to blood a younger generation of MPs — including Malcolm Fraser, Billy Snedden and Andrew Peacock — spurred suggestions of cronyism. Backbenchers attacked government legislation in the privacy of the government party room and the public spaces of the House and Senate.

A poor showing at the half-Senate election, late in 1970, was followed by an unsuccessful party-room motion for Gorton’s resignation; then a murky series of press reports in March 1971 spurred Fraser to resign as defence minister and savage Gorton in the House. A confidence vote on Gorton’s leadership tied; Gorton resigned as prime minister; McMahon was elevated to the top job; and — farcically — Gorton was elected, if only for a short time, to the deputy party leadership. As one reporter exclaimed after the last of these events, “You must be joking.”

The bitterness engendered by these developments lingered. Trust was non-existent, whispers of further leadership spills continued, and policy disagreements were so pronounced that the break-up of the Coalition was even broached. In McMahon, the government had a leader who had done much to sow the seeds of this turmoil and who, in office, would sow more still; but, again in McMahon, it had a politician with twenty years of experience at the highest levels of government who was willing to do all he could to stay in office. As governor-general Paul Hasluck wryly remarked, McMahon would “not be cumbered either by ideals or principles” in pursuing that goal.


McMahon’s at-all-costs attitude surfaced conspicuously when he began shifting and tacking on the question of whether Australia should extend diplomatic recognition to the People’s Republic of China, abandoning its long recognition of the Taiwan-based Republic of China and the fiction that the latter remained the sole, legitimate government of China.

In 1958, as a relatively lowly minister, McMahon had argued that the People’s Republic should not be admitted into the international community until it had renounced the use of violence; as minister for external affairs, in 1970, he agreed that the country could not forever remain on the periphery but insisted on putting conditions on any kind of recognition or engagement. His view was influenced more by domestic political circumstances than any moral or strategic factor: “Remember, please, that we have a DLP,” McMahon told deputy secretary Mick Shann, “and that its reaction must be considered!”

By the time the Gorton cabinet reviewed its relationship with the People’s Republic, in February 1971, its resolution was similarly timid: it accepted that the government in Peking (as Beijing was known) was engaging with the international community and that Australia’s policy of diplomatic recognition would have to be reappraised — but decided that it would, for the moment, follow the lead of the United States.

The consequences of this hesitant ambivalence began to play out a month after McMahon became prime minister, when Whitlam sought an invitation to visit Peking. McMahon attacked him on grounds of naivety for engaging with a government that had not yet renounced violence; then, when Whitlam’s invitation to visit was granted, announced that his government would “explore the possibilities of establishing a dialogue” with Peking.

In the space of a month, McMahon had put his government astride two horses, of opposition and of engagement. He still believed the government to be riding high when Whitlam visited China in July. Criticising the Labor leader for his “instant coffee diplomacy,” he told a gathering of Liberal Party members that China “has been a political asset to the Liberal Party in the past and is likely to remain one in the future.”

That future was terribly short-lived. Within days of Whitlam’s visit, US president Richard Nixon announced he would visit China the following year. McMahon sputtered. He told the press that “normalising relations with China,” as Nixon was doing, had been his government’s policy all along, but in private he was angry and embarrassed, aghast that he had been so publicly undercut. Lashing out, he sacked his foreign minister and criticised Nixon. In the eyes of the Americans he was “on edge and almost frenzied in trying to stay on top of his job”; to the British, McMahon knew already that he was “not much good in the part” of prime minister.

McMahon eventually conceded that his government had failed on China. He was aware that Whitlam had won considerable plaudits and that he himself had looked a fool. Yet he continued to try to ride the two horses. He explored accompanying Nixon to Peking; he tried to find a halfway point between complete aversion and the diplomatic recognition Whitlam had promised. Rebuffed by the Chinese, he was then rebuked by DLP leader Vince Gair, who denounced the contest over who was more “ahead” on the issue of China. Stung, McMahon refused an invitation for army minister Andrew Peacock to visit China as part of an unofficial business party.

When the People’s Republic was admitted to the UN General Assembly and took a seat on the Security Council late in 1971, McMahon’s attempt to reconcile opposing pressures finally came to an end. Resiling from engagement with China was no longer an option, and yet China would not accept anything less than diplomatic recognition. The horses had bolted.


Another attempted balancing act came in the middle of 1971 when the South African government sent an all-white Springboks rugby team to Australia. Foreshadowing an October tour by South Africa’s cricket side, the Springboks became a barometer of how fast public opinion could turn on an issue. A Gallup poll taken in March 1971 had found that almost 85 per cent of Australians thought the South Africans should come, and most members of McMahon’s government believed, as Menzies did, that the cancellation of a South African tour of England in 1970 had been a surrender to the “threats of a noisy minority” and were not willing to do likewise.

McMahon genuflected to respectable opinion by making much of his disappointment that South Africa had sent a whites-only team, but he baulked at any real response. “We believe that the [all-white] policy in respect of teams is unfortunate, but it is nevertheless a South African matter, and not our matter,” he said privately during what happened to be the UN International Year for Action to Combat Racism and Race Discrimination.

Having effectively condoned a racially selected team, McMahon’s government then directed that Australia abstain from voting on a UN resolution condemning the application of apartheid in sport. It then helped sustain the tour by making available an RAAF aircraft to ferry the Springboks around the country after the ACTU and its president Bob Hawke promised to impose a “black ban” on the tour. “We are not going to be beaten here,” McMahon said privately.

Disruptive protests met with furious responses from Liberal–Country state governments. Victorian premier Henry Bolte called the demonstrators “louts and larrikins”; Queensland’s Bjelke-Petersen government declared a state of emergency so as to more easily crack heads. Amid the barbed-wire barricades, smoke bombs and police batons, McMahon mused about calling an election with a law-and-order theme.

By the time the South Africans left, the weight of public opinion had shifted completely. McMahon’s own ministers were against an early election and dreaded the prospect of a repetition of the controversy when the South African cricketers arrived in summer. Not willing to admit defeat, the government refused to decide whether that tour should take place. It threw the ball to Sir Donald Bradman, chair of the Australian Board of Control for International Cricket, leaving it to him to make the necessary decision to call off the tour.

Yet another example of McMahon’s balancing act emerged at the end of 1971, when he made clear to a cabinet committee that he supported applications from Indigenous communities in the Northern Territory for leases on consolidated lands, provided they could satisfy criteria related to their association with the land. Had this been translated into government policy, it would have been an acknowledgement that a traditional association with the land should be a basis for land rights claims. But his view diverged from those of the cabinet committee members considering the government’s approach to Indigenous issues, and his subsequent wavering failed to bring them around, as their decision in late December 1971 made plain.

When McMahon issued a statement on Aboriginal policy on 26 January, it featured a gaping hole. The new objectives, though laudable, were overshadowed by the government’s failure on land rights. McMahon announced the creation of a new form of lease but ruled out land claims made on the basis of traditional association. The reason? To do so would introduce a “new, probably confusing component, the implications of which could not clearly be foreseen, and which could lead to uncertainty and possible challenge in relation to land titles elsewhere in Australia which are at present unquestioned and secure.”

The attempt to hew to a conservative course — rejecting a traditional association with the land — and simultaneously announce updated objectives for government policy fell flat. The timing hardly helped: McMahon’s statement came on a day traditionally considered a day of mourning by Indigenous peoples. The statement spurred one of the striking images of that year: four Indigenous men sitting beneath an umbrella as the sun rose on the lawns outside Parliament House the following day, a sign strung up beside them reading “Aboriginal Embassy.”


Failures like these left the government far from the “first, fine, careless rapture” that Menzies had suggested was necessary to stay in office. “There is an imminent feeling of decay about the place,” recorded Liberal MP Bert Kelly when parliament resumed late in February 1972.

Blame for the government’s woes fell almost entirely on McMahon. As Kelly asked his diary, “What the devil do we do next? We’ve got Billy McMahon elected as our leader and obviously he is not doing it at all well and everybody knows this. What we can’t think of is, how do we get rid of him? I suppose the only hope we have is that he suddenly drops dead one day.”

The unrest stirred by dire polling, as well as whispers that John Gorton might try to supplant him, didn’t bring out the best in McMahon. “Christ, he must be mad,” said one MP, after one blundering parliamentary debate by the prime minister. “What is wrong with him?” asked another.

Everything the government and its prime minister did seemed to end in disaster. McMahon’s late-1971 trips to the United States and Britain had been memorable for a mangled toast to his hosts, his wife’s revealing dress and Richard Nixon’s inability to remember his name. A swing through Southeast Asia early in 1972 became an “excursion to blunderland,” declared a Canberra News journalist, extinguishing any hopes of making defence and foreign affairs a centrepiece of a re-election campaign.

But ministers also shared in the blame, with no small number of blunders and public spats occupying headlines. Some ministers dithered; others were disengaged. David Fairbairn regarded the five months he spent as education minister, in 1971, as hard and unrewarding, and departed the portfolio admitting he had not achieved anything.

Environment minister Peter Howson, meanwhile, citing the lack of an explicit directive, did himself no favours when he refused to lend Australia’s support to New Zealand’s criticism of renewed French nuclear testing in the Pacific in 1972, putting the government at odds with public opinion. (The government’s belated move mainly suggested it was going along with the public for craven reasons.)

The economic outlook also proved difficult for the government. The Coalition had been nearly broken by a currency revaluation forced upon it when the Smithsonian Agreement — which pegged currencies to the US dollar — came into operation in December 1971. Slowing economic growth and rising inflation spooked treasurer Billy Snedden and McMahon, who were soon at loggerheads over how to get the economy moving in time for the election. The government was caught between the competing objectives of economic rigour and voter-attractive spending.

After a tough budget in 1971, the increased pensions and reduced personal income taxes in the government’s April 1972 mini-budget suggested a new focus on the pending election. As deputy prime minister Doug Anthony admitted, “I wouldn’t be very honest if I said that this [the election] isn’t in the back of our minds.” The budget proper, issued in August, was even more electorally focused: “Taxes down; pensions up; and growth decidedly strengthened,” as Billy Snedden remarked.

The attempt to find a way between change and stasis brought halting progress. Under customs minister Don Chipp, the government liberalised censorship policy yet also refused to authorise the publication of Philip Roth’s controversial novel Portnoy’s Complaint in Australia — only for a monied publisher to embarrass the government by evading its jurisdiction and publishing the book anyway. The government was ignominiously forced to remove its ban on Portnoy in 1971, and the following year an attempt to hold the line on the banned Little Red Schoolbook foundered when activists smuggled it into the country and began distributing free copies. Chipp insisted that the government remove its ineffective ban, but Malcolm Fraser and other ministers continued to protest that the book “undermined family and society.”

Other initiatives came too, on an unexpectedly broad front. Writing a decade later, Donald Horne wondered whether McMahon was too busy “plucking policy out of passing straws” to know what he was doing. But in terms of results, Horne conceded, the government modernised the political agenda in a significant number of ways.

Although the government resiled from passing a wholly new Trade Practices Act, it did initiate new laws preventing foreign takeovers. It withdrew the last Australian combat troops from Vietnam, leaving only 128 members of the Australian Army Training Team in the country. It joined the Five Powers Defence Arrangements and the OECD. It passed the Childcare Act, which allowed the Commonwealth to intervene in the childcare sector and helped transform it into a profession supported by research and grants. It increased education spending and the number of scholarship places at universities and TAFEs.

The government also adopted the “polluter pays” principle for environmental protection, and began giving the Commonwealth the capacity to intervene in environmental matters. Howson, for all his grumbles that he had been given responsibility for “trees, boongs and poofters” as minister for the environment, Aborigines, and the arts, was nonetheless the first person to be appointed with explicit responsibility for these policy areas.

Notably, too, the government released its own urban and regional development policy. This was partly in response to Whitlam’s well-established interest in this area, but also a recognition of public demand for Commonwealth action. Meeting that demand required the government to overcome its longstanding aversion to Commonwealth intervention in state responsibilities.

Housing minister Kevin Cairns’s priority was “to seek agreement at all levels that an urban policy is needed” — rather than to actually devise a policy — but McMahon pushed for both the agreement and a policy. He reserved to his authority and his department responsibilities traditionally held by state governments, and then, in September, pushed cabinet to create the National Urban and Regional Development Authority to foster a “better balance of population distribution and regional development in Australia.”

When he introduced the legislation, McMahon stressed the significance of the change that was now manifest: “It marks our recognition that there is a direct contribution that the Commonwealth government can make in national urban and regional development.” It also showed that the government had an answer to Labor’s policies in this area.


“We should be able to tell people where we stand and where we are heading,” McMahon had written in August 1971. Here, perhaps, was the government’s approach in a single phrase: stasis and movement. When McMahon went to Government House to seek a dissolution of parliament, he felt sufficiently confident that his two-pronged approach would be enough to see the government returned. To Paul Hasluck, he predicted the Coalition would pick up two seats in Western Australia and two seats in New South Wales — and perhaps even three in Victoria. He didn’t envision losing any seats except, perhaps, that of Evans, held by Malcolm Mackay.

That prediction was somewhat redeemed: the Coalition picked up two seats in Western Australia and one seat apiece in Victoria and South Australia. But it lost six seats in New South Wales, four in Victoria, and one apiece in Tasmania and Queensland, with the result that Labor took office with a nine-seat majority.

It was a closer result than many would like to think. The rural gerrymander meant that around 2000 votes distributed across five seats could have allowed the government to cling to office. In such an event, the first steps in McMahon’s forecast to Perkin may well have been vindicated.

Why the close result? Some have pointed to the electorate’s innate conservatism, especially after twenty-three years of Coalition rule. Few have suggested that McMahon might have been a factor in limiting the swing — but one of them was his successor as prime minister. Without McMahon’s skill, tenacity, and resourcefulness, Whitlam later wrote, Labor’s victory in 1972 would have been “more convincing than it was.” •

Funding for this article from the Copyright Agency’s Cultural Fund is gratefully acknowledged.

The post “God save us all!” appeared first on Inside Story.

The matriarchs https://insidestory.org.au/the-matriarchs/ https://insidestory.org.au/the-matriarchs/#comments Wed, 30 Nov 2022 02:13:49 +0000 https://insidestory.org.au/?p=72035

How three extraordinary Tasmanian Aboriginal women fought for their people

The post The matriarchs appeared first on Inside Story.

To be a Tasmanian Aboriginal person is to know ourselves from the words of others. Over the past 200 years thousands of books, papers, journals and diaries have been written by those who peer at, gaze through and dissect our minds, bodies and country from knowledge traditions that write about us as aliens in our own lands.

It is a brave act, then, to see ourselves as Indigenous authors and researchers responsible for telling histories from a first-person perspective, and to radically decolonise that writing by others. Linda Tuhiwai Smith, the Ngāti Awa and Ngāti Porou godmother of Indigenous methodologies, forged both of these powerful methodologies and taught us that our voices need to “talk back to” and “talk up to” research. The academic drive to speak for and about us through a Western cultural lens is starting to be deeply interrogated and increasingly found unhealthy.

In Tasmania, the need for local Aboriginal voices to “talk back to” those who profit from our dead, our dispossession and our trauma is imperative. When the strangers in the boats appeared on our trouwunna shores in 1803 to claim our country — by any means — in the name of a faraway British Empire, they really meant it.

In less than thirty years our plentiful peoples were reduced to a handful by planned massacres and declared war, theft and slavery of women and children, and the impact of being treated as less than human. By 1876, with the death of our countrywoman Trucanini, we were subject to mythical extinction, extirpation and elimination.

Yet we did survive after 1876. We survived in the pockets and shadows of colonial Tasmania to raise families and communities. We survived in out-of-the-way places like Flinders Island in the Bass Strait, and in the middle of townships like Latrobe in the northwest.

Among these survivors, one amazing family was never far from the colonial capital city of Hobart — the family of the matriarch Fanny Cochrane Smith. Fanny, who died in 1905, was the ultimate survivor of the abuse that the colonisers so freely gave in return for taking our lands. Now, one of her great-great-grandchildren, Joel Birnie, has decided to tell her history, and his family story, of surviving colonisation.

Joel’s reclamation of important ancestral and familial women in My People’s Songs: How an Indigenous Family Survived Colonial Tasmania is a shockingly rare example of a Tasmanian Aboriginal history told through the research of a Tasmanian Aboriginal person. From the narratives of others, he retrieves Fanny, her sister Mary Ann Arthur and their mother Tarenootairer — women who shaped colonial moments and spoke their own truths even while captive and exploited — and returns them to Country and power.

Joel offers readers hospitality to join him on his family’s journey. His introductory section regenerates the “Song of Welcome” that Fanny recorded on a phonograph in 1903. From there he returns to the first years of the damnation wrought on our people, when Joel’s many-times grandmother Tarenootairer was stolen as a young girl for slavery, her life thereafter shadowed by the evil that men did to her as a chattel. By creating her own family, Tarenootairer is one of the few women to have created generations that survived the colonial genocide.

Joel restores one daughter, Mary Ann, from the periphery to her rightful place as a central figure in the first Australian call for recognition of land and sovereign rights in 1846. While her death in 1871 was a solitary and degraded affair, her afterlife now becomes rich with heroic dimensions of meaning and survivorship.

Mary Ann was able to use her “educational instruction” to talk back to the colonisers through her letters, and to leave a legacy of women’s advocacy as a natural way of being Tasmanian Aboriginal. Joel follows in her footsteps, for in his writing he too talks up and back and recovers the space to write and speak, in reshaping both Mary Ann’s legacy and his own as an Indigenous academic.

Tarenootairer’s younger daughter, Fanny Cochrane Smith, while suffering the same kinds of degradation as her sister in early childhood — stripped, whipped and tied to a kitchen table, locked in a crate, separated from her mother to be housed in an “orphan” school — veers away from Mary Ann’s organised marriage and childlessness to become an eminent member of colonial Tasmanian society.

Fanny’s reputation and standing as an “industrious” Christian and an Aboriginal woman meant she lived apart as one of the first Tasmanian women to be given a land grant south of Hobart. Parliament debated the paradox of granting land to a recognised Aboriginal woman when the colonial government, vociferously defending their extermination of Tasmanian Aboriginal people, could not acknowledge what she was known for.

All three women lived lives of hardship and poverty. They were all under an intense colonial gaze, though neglected in every other way, but Joel is able to emblazon the spaces where they resisted colonisation. He shows us Tarenootairer laughing, smoking and sharing her life with a group of kinship women in the mission gulags of Wybalena and Oyster Cove, privately communicating in ways that cannot be known by the colonisers, and yet out in the open as a resistance taunt.

He shows us Mary Ann’s feverish writing at her desk, her contributions to political discussions led by the men, and her absolute care for her younger sister Fanny, all hard fought for, on her terms and in defence of her right to belong to Country as a free woman. He shows us Fanny and her husband, newly married, opening a boarding house in Hobart that became a refuge for her family and others, a place where she gave privacy and comfort to our peoples outside the colonial gaze.

In every way, these three women, while subject to deprivations, survived on their own terms to poke back just a little bit.

Joel’s work is an exceptional piece of accessible and vivid writing that smashes the colonial, racist depictions and brings to the surface stubborn, vital Tasmanian Aboriginal women. He has given back to Tasmanian Aboriginal communities a story of ourselves and a template of how we might proceed to think of other men and women who need to be reclaimed.

A small quibble is that the book is written as a third-person narrative of “them” and “theirs,” even when these women are his family. It is disconcerting to read — a reminder of the unconscious adoption of the academy’s conventions, which displace Indigenous peoples into the “you” not the “us.” But this speaks to the infancy of an industry of Indigenous authors and academics within Western spaces rather than a deficit within us. Our minds were conditioned all that time ago, when the boats came and the strangers took our lands, a reckoning that Linda Tuhiwai Smith suggests is still ongoing; but at least we now have a way to proceed to untangle the colonial from the Indigenous and to tell our stories.

My People’s Songs is a book that should evoke pride in Tasmanian Aboriginal people, helping us see ourselves and speak to the courage of our survival. It may not be joyous to read of the horrors of what happened to us, but in Joel’s decolonising of the old narratives we find a space to simply be, to breathe easy in having confirmed what we already knew — we come from warriors, we protect fiercely what we love, and we will always be strong Tasmanian Aboriginal peoples. •

My People’s Songs: How an Indigenous Family Survived Colonial Tasmania
By Joel Stephen Birnie | Monash University Publishing | $34.95 | 256 pages

The post The matriarchs appeared first on Inside Story.

Inside the wire https://insidestory.org.au/inside-the-wire/ https://insidestory.org.au/inside-the-wire/#comments Wed, 16 Nov 2022 23:24:02 +0000 https://insidestory.org.au/?p=71741

Eighty years apart, a private diary from the Tatura internment camp and dispatches from the Manus detention centre recount the experiences of refugees held prisoner by Australia

The post Inside the wire appeared first on Inside Story.

It’s not uncommon for publishers to try to cash in on an author’s sudden fame by following up an award-winning bestseller with the (re)publication of older writing, particularly if that author isn’t ready to produce a sequel. Freedom, Only Freedom seems to fit that bill: this collection of Behrouz Boochani’s dispatches from the Manus detention centre will presumably be marketed — and read — as a companion to his No Friend But the Mountains, published to much acclaim in 2018.

It is clearly more than that. That’s because Boochani’s journalistic pieces, which first appeared mainly in the Guardian and the Saturday Paper, are accompanied by essays written by his translators and other “invited researchers, writers and confidants.” It’s also because Boochani’s journalism — a frightening and detailed view of what was officially called the Manus Regional Processing Centre — deserves to be republished. At the same time, the journalism allows the reader to understand why it was possible to survive detention on Manus: namely because of “love, friendship and brotherhood.”

Some five or six years ago, Boochani’s dispatches would have been read mainly for the information they contained about what went on in the Manus detention centre at a particular moment in time. Now that’s of interest mainly for historians. But it’s possible to appreciate these texts for their careful, indeed delicate and poetic, portrayals of other detainees.

My favourite is “The Man Who Loves Ducks,” an article about Boochani’s animal-loving fellow prisoner Mansour Shoushtari that first appeared in the Guardian in 2017. “Getting to know Shoushtari has been a blessing and an inspiration,” Boochani writes, and such is the power of his writing that I too feel privileged to have met Shoushtari. Shoushtari — and, I guess, writing about Shoushtari — helped Boochani to survive:

For the short time I was in his presence I forgot about all the violence and hardship associated with this prison; my love for life increased after I spent time with him. I was reassured by the fact that there were warm people like Shoushtari in our close company. I think I’ll keep these memories of him with me for years to come.

Some of the book’s other essays help the reader to contextualise Boochani’s texts. Among them are articles by Moones Mansoubi, who arrived in Sydney as a student from Iran the same year Boochani was deported from Christmas Island to Manus, and went on to become one of his translators, and by Ben Doherty, who has reported on Australia’s detention archipelago for the Guardian. For Doherty, Boochani morphed from a source into a fellow journalist who filed articles from inside the Manus camp, written on his mobile phone and transmitted to the Guardian in WhatsApp messages. This process took a while: it was only in late 2015 that Doherty published one of Boochani’s poems, and it took another three months for his first article to appear in the Guardian.

For Doherty, Boochani’s ability to survive had less to do with his relationships with fellow detainees and more to do with his professional ethos. He “was, and saw himself as, a working journalist on Manus,” Doherty writes. “He was a man with a mission, every day, a reason and a rationale in that place. Journalism kept him busy, kept him focused, gave him a resolution and a cause… [J]ournalism — a sense of mission to bear witness, an unshakeable belief in his ‘duty to history’ — gave him a purpose that many others held in that place were denied.”

I was puzzled by the inclusion of some of the other essays, particularly those written by academics. Their affirmation of Boochani’s views seemed to me to be unwarranted, and I found the at times gushing tone embarrassing. If the editors were afraid that a republication of Boochani’s journalistic pieces on their own could not have been justified, then their concerns were groundless. If the intention was to invite authors to critically engage with Boochani’s ideas and prose, then some of the invitees were not up to the task. Maybe they thought that critique necessarily amounted to criticism?

A critical engagement with Boochani’s writings, rather than hero worshipping, could be for another book project. Its editors ought to draw not only on fellow refugees, forced migration scholars and refugee activists, but also on writers who could give voice to the people of Manus.

In Australia, too often, their voices haven’t been heard. On the few occasions when they were, they weren’t properly listened to. Note to future editors: if you found it difficult to identify somebody able to give voice to Manus Islanders, you could do worse than to republish Michelle Nayahamui Rooney’s wonderful 2018 essay “The Chauka Bird and Morality on Our Manus Island Home.”


The pieces by Boochani assembled in Freedom, Only Freedom were written for publication. They were designed to let the public, particularly in Australia, know about the Australian government’s disregard for the human rights of those whose imprisonment on Manus it had authorised. The diary of Uwe Radok, which has now been edited by his daughter Jacquie Houlden and the historian Seumas Spark, was not meant for publication. In fact, it was probably not meant to be read by anybody except its writer.

What links Freedom, Only Freedom with Houlden and Spark’s Shadowline is the fact that Boochani and Radok were refugees held prisoner indefinitely without having been charged with a crime. Only the historical circumstances were different: one was detained by Australian authorities and imprisoned in Papua New Guinea, the other detained by British authorities and held captive in Australia.

Uwe Radok, born in 1916 in Königsberg (today’s Kaliningrad) to a non-Jewish mother and Jewish father, left Germany in 1938 for Britain, where he worked as a mechanical engineer. On the direction of MI5, he and his brothers Jobst and Rainer were arrested in September 1939. Unlike the majority of German refugees living in Britain at the time, the three brothers were considered threats — not on account of their political views (they had no sympathy for Nazi Germany) but presumably because their brother Christoph was serving in the German airforce.

On 30 June 1940, together with more than 1200 other internees and some German POWs, the three brothers were put on the SS Arandora Star, which was to take them to Canada to be interned there. On 2 July, a German U-boat torpedoed and sank the ship. Some 800 of those on board died.

The brothers survived and were taken back to Britain by their rescuers, and on 10 July 1940, together with more than 2000 other internees, embarked on the HMT Dunera. This time the destination was Australia. On arrival there, the Radoks were interned in Tatura in northern Victoria and, unlike the majority of the Dunera internees, who had not been regarded as an immediate threat to Britain, they were only released in May 1942.

The published diary covers the period from Uwe Radok’s embarkation on the Arandora Star until 12 February 1943. Half of it is about his time as an internee, the other half deals with his life as a member of the Australian army’s 8th Employment Company. It’s not made clear why the published text ends several months before Radok stopped writing a diary. It’s also not evident to me why it has been necessary to omit Radok’s notes on books he read — they might have told us more about him.

The two books also have in common that Boochani’s texts and some of Radok’s were translated, and that all were copyedited. We can only wonder how much of Boochani’s voice got lost or altered when his WhatsApp messages were turned into English and then attended to by a Guardian subeditor. I am curious to see the first book written by him in English.

Radok wrote some of his diary in English, some of it in German, and some of it in a mixture of both languages. The editors opted for a smooth English text, with some German expressions and misspellings deliberately left in place “to retain the ring of the original.” It’s to their credit that they illustrated the text with images of parts of the original diary. But rather than mere illustrations, they actually show how many liberties were taken with the original.

Radok’s diary says much about one man’s experience of internment (and more about his infatuation with a fellow internee) but much less about camp life. It is hard to warm to its author, who comes across as self-obsessed and arrogant. While Boochani’s portraits of his fellow detainees are remarkably generous, Radok’s depictions of others (including the man with whom he was infatuated and even more so the woman who was to become his wife) are often mean-spirited. Unlike Boochani, Radok didn’t assume the role of a witness. But since he probably didn’t intend others to read his diary, he can’t be blamed for that.

In their introduction, Spark and his Monash colleague Christina Twomey write that “The stereotypes that now envelop the Dunera boys and their place in the history of post-war Australia have conflated individual stories into an increasingly homogenous narrative, a singular triumph of good citizenship and material success.” They cite films and a book from the 1970s and 1980s in support of their argument. But Spark himself was involved in two recent books about the Dunera that attempted to open up that narrative, and Uwe Radok’s diary shouldn’t have been needed to warn us “against the comforts and conceits of generalisation and mythology.”

At least as problematic as the conventional narrative about the contribution of the “Dunera boys” to postwar Australia is the idea that their internment experience was somehow emblematic of civilian internment in wartime Australia. More than half of those interned in what were sometimes labelled, at least initially, as “concentration camps” by the Australian authorities were interned by Australia (rather than by Britain, as in this case) and not just in Australia. They included German and Austrian refugees who, unlike the Radok brothers, were sometimes interned with committed Nazis, and sometimes for far longer than most of the Dunera internees.

Also unlike the Dunera internees who were in a position to talk publicly after the war about the injustices suffered at the hands of the British (particularly aboard the Dunera), the refugees interned by Australia had no ready audience for their stories. Their postwar lives, which were sometimes marked by the traumas of internment, were not the subject of celebratory books and films.

Boochani’s experience wasn’t typical either. There were no women and children in the Manus camp when he was there. He was eventually able to leave and has since been granted refugee status in New Zealand. Others who were with him on Manus weren’t so lucky.

We must also keep in mind that Boochani’s dispatches don’t describe a phenomenon that’s now in the past. The current Labor government has been as wedded as previous Coalition and Labor governments to the punitive treatment of refugees and the wretched system of preventing people seeking Australia’s protection from submitting their claims in Australia. In fact, the recent federal budget included a $150 million increase in funding for off-shore processing. And the British government is determined to emulate the Australian example by deporting asylum seekers to Rwanda. But I am confident Behrouz Boochani will keep reminding Australians from his exile in Aotearoa New Zealand that he was describing unfinished business. •

Freedom, Only Freedom: The Prison Writings of Behrouz Boochani
Edited by Omid Tofighian and Moones Mansoubi | Bloomsbury | $32.99 | 344 pages

Shadowline: The Dunera Diaries of Uwe Radok
Edited by Jacquie Houlden and Seumas Spark | Monash University Publishing | $34.99 | 181 pages

The post Inside the wire appeared first on Inside Story.

Do leaders matter? https://insidestory.org.au/do-leaders-matter/ https://insidestory.org.au/do-leaders-matter/#comments Mon, 14 Nov 2022 23:11:41 +0000 https://insidestory.org.au/?p=71733

It depends, says historian Ian Kershaw

The post Do leaders matter? appeared first on Inside Story.

Do individuals make history or are they compelled by forces beyond their control? This is the question Ian Kershaw tries to answer in Personality and Power, his new study of twelve twentieth-century European leaders. This is not just an academic exercise: understanding agency, or lack thereof, is crucially important to those who want to shape the world as politicians, businesspeople, intellectuals, soldiers or “influencers.” And, of course, it should matter to the rest of us, who have to live with the consequences of decisions taken by those in power.

Despite the misgivings of professional historians, this quest for understanding — the hope that we can learn from history — is indeed one of the reasons why historical works continue to attract readers well beyond academe.

Philosophers have long debated the question of agency and constraint, and Kershaw’s introduction is a clear-sighted guide. For determinists, leaders are driven by passions and hindered by their location and time, the consequences of their actions building up into processes and systems no individual controls. For romantics, at the other extreme, history is made by “great people” (or, mostly, “great men”) who shape events with vision and willpower.

In any polarised debate, there’s a golden middle. Remarkably, here it was formulated by a radical: Karl Marx. From a determinist starting point, he tried to make sense of how, at certain historical moments, individuals can prove decisive. He distilled this insight in a statement that has become something of a platitude: people make history, but they make it under conditions they neither created nor fully control.

Kershaw is a historian, not a philosopher. And historians can’t say in the abstract whether a person can “make history.” It depends on context. Certain situations and certain institutional arrangements lend themselves more to romantic-looking action, while others might constrain even the most headstrong and visionary individual. Power, of course, plays a role here: it’s the ability to make others do as you please. If you are in a position of power, your decisions — good or bad — tend to have more of an impact than if you are not.

Even among political leaders, though, circumstances matter: the domestic and international context, the problems needing to be solved and the means available to do so, and whether or not they are lucky. Some contexts will allow the leader’s personality to have a massive impact; others will not. And a strong personality’s impact on the course of events can be altogether different from their long-term legacy.

The two extreme cases in Kershaw’s sample are Yugoslav dictator Josip Broz Tito and German chancellor Helmut Kohl. Tito clearly impressed Kershaw. He appears larger than life on these pages: a man of strong intellect, physical and political bravery, expansive sexual appetites, brutality, callousness and charm. Able to suffer stoically during the war, he embraced the good life once he could: this was partisan leader turned head of a communist nation who liked to cruise the world on his private yacht.

Contradictory as Tito was, he was the ultimate example of a charismatic leader: without him, the Yugoslav partisans could not have done the incredible and defeated the German occupiers with only minor outside help, a unique achievement during the second world war. Without Tito, likewise, communist Yugoslavia could not have stayed out of the Soviet orbit after 1945, taking a leading role in the movement of non-aligned states during the cold war. And without Tito, Yugoslavia, that somewhat artificial creation of the aftermath of the first world war — neither a nation-state nor an empire — would have disintegrated long before it did.

At the other end of the spectrum is Kohl, German chancellor for an incredible decade and a half, 1982–98. Something of an unimaginative nonentity, he was often clumsy on the world stage and entirely lacking in vision beyond what he could see from his reserved table in his local pub, where he would talk to his cronies and eat the most ghastly of all German dishes, Saumagen. The butt of jokes by the German intelligentsia throughout his years in power, Kohl nonetheless attracted many voters precisely because of his ordinariness: this was a dull country longing for stability rather than excitement.

To his credit, Kohl was a far cry from the neoliberal disruptors in Britain, Australia and the United States. His government reacted to the same crisis of postwar welfare capitalism that brought Margaret Thatcher to power by adjusting the welfare state rather than destroying it.

Had Kohl resigned in 1988, he would have been no more than a footnote in the history books: in Kershaw’s words, “an entirely unexceptional democratic leader.” But then came 1989, the year the crisis erupted in Eastern Europe and the Soviet empire imploded. The wall dividing East and West Germany, that concrete symbol of the cold war division of Europe, came down and Germans celebrated on its ruins the reunification with their cousins on the other side.

Kohl had little to do with these stunning developments. Propelled by events, though, he took the initiative. His somewhat dull personality proved useful in establishing good relations with world leaders: he was the personification of a Germany that was too interested in local cuisine and a quiet life in the provinces to threaten Europe ever again. He became the “chancellor of unity,” a major figure in the history books with a legacy even his later entanglement in a corruption scandal could not blemish.

Ironically, then, the type of person the romantics would champion as a genius, Tito, has left no legacy. Not only does communism no longer exist; Yugoslavia has gone too. Meanwhile, a nonentity interested in little beyond maintaining power left a united Germany within a still relatively united Europe.


Tito and Kohl are only two of the twelve leaders Kershaw explores. He begins with Vladimir Lenin, Benito Mussolini, Adolf Hitler and Joseph Stalin, then shifts to Winston Churchill, Charles de Gaulle, and Konrad Adenauer before exploring the lives of Francisco Franco, Margaret Thatcher and Mikhail Gorbachev.

Each chapter is informed by the latest historiography and provides a sure-footed and balanced interpretation. Anybody who wants a quick, readable and well-informed introduction to the life and legacy of any of Kershaw’s twelve protagonists can reach for Personality and Power. Anybody who wants a well-constructed political history of Europe in the twentieth century can read the book cover to cover, as Kershaw skilfully links his narratives with a thread of themes and findings.

As a history of Europe, this is “history from above,” the kind of history social and cultural historians have rebelled against since the 1960s. These critics have condemned the writing of history as a chronicle of the actions of “pale males” in positions of power. Is Kershaw suggesting we resume ignoring women, people of colour, and common folk? No, he is not. His focus is certainly on those in power, and most of his case studies are of men. And all of them would be regarded as “white.”

But this is not a book about “great men who make history.” Kershaw is dismissive of the entire idea of “greatness,” an analytically vacuous and politically dubious concept. The reason most of his actors are male and white is that, in twentieth-century Europe, Margaret Thatcher was one of very few women who made it to the top of a country’s power structure, and only recently could people of colour make political careers in that part of the world.

Moreover, Kershaw’s analysis is rooted in a deeply understood social and cultural history of Europe. He explores the lives of leaders who emerged from and remained embedded in the societies they led. This history of (not so) great men thus combines social and political history in a genre Kershaw helped shape with his pioneering two-volume study of Hitler, published in 1998 and 2000.

Some of the social history that shapes Personality and Power is couched in somewhat old-fashioned language, particularly the hazy concept of “modernisation,” a shorthand for all those larger social forces beyond the control of any one individual. Lenin’s family, we learn, supported “modernising reforms that would make Russia more like the more enlightened societies of western Europe.” Did they mean the same thing as Lenin’s successors, who fought over “the question of how quickly the Soviet Union should modernise”? Or Adenauer, who as lord mayor of Cologne during the Weimar years “introduced modernising improvements to the amenities of Cologne”?

A little later, Mussolini struggled with the “inadequate modernisation” of Italy’s armed forces, while Hitler oversaw “economic modernisation” only in the armaments industry. De Gaulle was unable “to bend to his brand of authoritarianism the forces of modernisation… which were coursing through France and the rest of Europe in the 1960s.” Likewise, in Spain, “modernisation increasingly showed Franco and his regime to be obsolete leftovers from a bygone age” and, later, “the forces of modernisation were overtaking the outmoded authoritarianism of the system.”

In 1970s Britain, meanwhile, “the spread of global modernisation” meant that other countries had caught up with Britain’s level of development, leading to a perception of decline. In the Soviet Union “the economy had to be modernised,” a fate that also befell “German conservatism” under Kohl.

What does any of this mean? It’s clear enough what “technological modernisation” entails — the embrace of newer and better cars, tanks, can openers and telephones. But what is political, social or cultural modernisation? Here the word simply stands for “change,” often with the implicit meaning of “change for the better.”

“Modernisation theory,” once popular among non-Marxist social scientists and historians, was a way to talk about directed change without having to embrace Marxian principles. But the premise of this framework has undergone rather severe critique since the 1970s and it’s somewhat baffling to find it in the writings of a historian as sophisticated as Kershaw.

The absurdity of the concept can be illustrated by applying it to recent political developments. Has Donald Trump “modernised” US politics, for example? His popularity was founded on his status as a TV celebrity — “modern,” no doubt — and his campaigning and even governing (if that is the right word) relied heavily on Twitter — also very “modern.” Hence, should we call him a “moderniser”? Or was his undermining of democratic governance and even the very concept of truth a “de-modernisation”?

The other somewhat old-fashioned aspect of this book is its Eurocentrism. This is a history of personality and power in twentieth-century Europe rather than the wider world. This constraint is regrettable because readers would have benefited if Kershaw had more widely applied his judgements and ability to carefully synthesise the historiography. A chapter on Mao, for example, would have been enlightening, as would one on Indira Gandhi. This would also have made the book more useful for the undergraduate classroom, as political history today is more likely to be taught in a world-historical frame than a European one.


What, then, can we “learn” from this history? Kershaw tries to distil his conclusions into a series of seven propositions. They appear a bit clumsy, a somewhat pedestrian form of social science, and stand in sharp contrast to the sure-footed and well-written interpretations of the twelve individuals.

More powerful is his own personal preference, stated early in the book: “I would be happy to avoid ‘charismatic’ personalities altogether in favour of leaders who, if less colourful, can offer competent, effective governance based on collective deliberation and well-founded, rational decisions aimed at improving the lives of all citizens.” But, as his book shows, circumstance can throw such leaders into situations where they will have to act in less deliberative, quasi-dictatorial ways. And it is in these circumstances that their personal qualities will matter most. Hence, we should all worry about those who lead us.

In ordinary circumstances, personal qualities might not matter much; in extraordinary circumstances (pandemics, wars, natural disasters), they do. •

Personality and Power: Builders and Destroyers of Modern Europe
By Ian Kershaw | Allen Lane | $55 | 512 pages

The post Do leaders matter? appeared first on Inside Story.

Ticking like a bomb https://insidestory.org.au/ticking-like-a-bomb/ https://insidestory.org.au/ticking-like-a-bomb/#comments Sat, 12 Nov 2022 06:05:24 +0000 https://insidestory.org.au/?p=71722

Two new books show what Australia’s involvement in the Vietnam war left in its wake

The post Ticking like a bomb appeared first on Inside Story.

What are the odds? Two books about the war in Vietnam landing on my desk in as many weeks. Curious that they’ve appeared when the median age of Australians is 38.4, which means that a sizeable chunk of us weren’t even born when the war was being fought. Vietnam has become a country Australians visit, not the site of brutal devastation.

As is often noted, this small Asian country — a country that most Australian conscripts had never heard of before they were drafted — was the target of three times the tonnage of bombs dropped during the second world war. The exact figure varies, though not significantly, and the bombing was just a part of it. The land was heavily drenched and its people poisoned by the chemical defoliant Agent Orange.

Note, too, the word “conscript.” After Vietnam there were no more conscripts, and both the books, each in its way, tell us why. Though they come to the subject from different angles, reading them together is like entering a long-overdue discussion about an ugly yet largely forgotten war, the painful reverberations of which extend to this day.

Bronwyn Rennex is the daughter of a man she scarcely knew, a man who was sent to Vietnam in 1965 when she was only a year old. Fifteen when he died in 1980, she noted in her schoolgirl diary: “Everyones really upset. I was crying all night. He died of a heart attack about 1/4 past 12 last night.” Now a woman in her fifties, an artist, curator, former part-owner of a Sydney gallery, she documents her search for the cause of his death at fifty-two and her inability to reach him when he was alive.

Life with Birds: A Suburban Lyric is also a record, as its subtitle suggests, of a kind of suburban life that has passed from contemporary reckoning. There’s the modest house set on a reasonable-sized block on the city fringes. There’s the male breadwinner and the mother who looks after the house and three daughters.

But there’s also an embroidered green silk coat that Bronwyn, the youngest, and her two older sisters think is a kimono, though the father had sent it from Vietnam. “Coming from the suburb of North Ryde — a land of wood panelling, beige carpets and wall units filled with clown statues and crystal trinkets — the coat didn’t just seem from another place, it was from another planet.”

By the time her father returned, Bronwyn was four, old enough to pretend to be a dog, “growling and tearing at the bottom of his trousers with my teeth, while he tried to hug Mum.” She didn’t think much of this stranger, and a stranger he largely remained.

John Rennex’s early death was not just a shock; it also placed a great strain on his widow. At one level, though, she must also have been relieved. In the letter Elsie Rennex wrote to the veteran affairs department in pursuit of a war widow’s pension, she claims that her husband had once been “a loving outgoing type of person” but after his service “the close communicative relationship we enjoyed before his departure had changed considerably, he became quite withdrawn, he refused absolutely to speak of his term in Vietnam, and our everyday problems were left for me to resolve.”

He couldn’t sleep, took to smoking heavily and developed a persistent cough. Despite a doctor’s warning, he kept smoking. He had pains in his arms and was treated for rheumatism instead of the heart condition ticking away like a bomb. Elsie leaves it to her last paragraph to explain the reason for her request for help: “I still have two daughters dependent on me.”

She never got her pension. The repatriation board found no direct connection between John Rennex’s service and his death. He was overweight and smoked too much: end of story. Her pursuit of a pension is a biting illustration of the Kafkaesque maze she had to contend with, as her daughters did later.

Years down the track, after finally locating the relevant section of the veteran affairs department, Bronwyn requested a photocopy of her mother’s letter, eventually receiving one with the bottom chopped off. Asking about the missing lines, she was informed by a department officer that the original was foolscap and they only had an A4 printer. She then asked why they couldn’t print the letter out on two pages. The second page would only have two lines on it, came the reply. Was she exasperated? Was she angry? Of course she was. My head spun just reading about it.

Yet Life with Birds is a beautiful poem of a book. I’ve given you the bones but little of its spirit, which is lyrical and quirky, if laced with piercing irony. Accompanying the text we have the author’s photographs, mostly underexposed and blurred, mimetic of Rennex’s defective memory of her early years and her long, slow awakening to her father’s story.

Like a Greek chorus, they offer a running commentary on the action. With few exceptions, only the reproduced documents are sharp enough to make out: the girlish diary pages, some army report sheets, a curious photo of a delayed christening with the tops of the heads missing (perhaps reiterating the missing lines in the departmental photocopy?).

So why the birds — creatures already so pregnant with symbolic import that they resist simplistic interpretation here? Birds crop up in songs and poems quoted, and they are resonant in the author’s own flying back and forth overseas. Home again after her mother’s death, she finds a myna bird trapped in the garage while hunting for her things stored there. The dehydrated bird is given water and released, but takes its time remembering how to fly. “Do birds have knees?” is the kind of question Rennex is prone to ask.

Life with Birds is both an idiosyncratic and a resolutely personal book. Its focus is on one family but the circle of its light spreads further.


As we 1970s feminists repeatedly insisted, the personal is political. These three explosive words became the movement’s central tenet as well as its most effective slogan. Nothing illustrates this more than Biff Ward’s The Third Chopstick, a book of great breadth and depth that answers many of Rennex’s questions yet is every bit as personal.

Here I must mention that Biff is a friend of mine, and that we met through Canberra Women’s Liberation. We both appear in Brazen Hussies, the award-winning documentary of the 1970s women’s movement released in 2020 and later screened on ABC TV.

All this is to say that we go well back, and one of the many things that struck me when reading Biff’s book alongside Life with Birds is that both of us are old enough now to be Bronwyn Rennex’s mother. Indeed we both have daughters around her age, neither of whom were left in the dark about Vietnam.

Though they had little choice in the matter, our girls were exposed to the radical movements of the day, for the war John Rennex went to was the one we vehemently protested against. Our vocal, passionate opposition was the crucible in which grievances that had simmered through the fifties reached boiling point in the sixties.

The Third Chopstick’s first chapter vividly describes such a protest. It’s 1965. The woman who would come to write this book is walking past the Commonwealth Offices in Sydney’s Martin Place, handing out leaflets and crying, “Get Out of Vietnam.” To begin with, only a gaggle of protesters had met there every Friday, but their numbers have steadily grown, and on this particular Friday a group of 200 starts marching along Pitt Street until it meets a police blockade at the King Street intersection.

Prevented from going further, dozens of them sit down on the road, blocking the traffic. The police try to move them; the protesters resist. It is a rough confrontation; some are injured. Protesters are pushed against a jewellery store window; the demonstration has entered a dangerous new phase.

“I hadn’t seen this before,” Ward writes. “Australia had not seen this. My eyes raced from sitters to police to the onlookers collecting on the pavements. The tone of the surround sound had changed, the car horns now cut through with screams and voices barking on police walkie-talkies.” The next morning the papers were filled with reports of the incident. Forty-seven protesters had been arrested. A doctor, who’d gone to the jail to bail out his son, was arrested as well.

What I’ve left out in this summary is the extraordinary immediacy of Ward’s depiction, so skilfully sustained. For The Third Chopstick is a perfect blend of memoir, history and biography, beautifully and sensitively written. In its short introductory chapter we have the beginning, the veil of smug suburban complacency irreparably torn. What followed were the draft resisters, an easing of censorship, women’s liberation, the freedom rides and the Aboriginal Tent Embassy, the Whitlam government and its dismissal.

But the question to ask is why, after so many years have passed and events have erased the intensity of that day, Ward’s visceral connection with Vietnam remains. She has had a busy life teaching, running a practice for facilitating dialogue, attending other demonstrations and protests, and publishing poetry and two other books of non-fiction. But as soon as she could she visited Vietnam, and it wasn’t long before she was conducting tours through the country herself.

Occasionally I was tempted to sign up for one of those tours but circumstance and other priorities stopped me. I knew she was working on a book about Vietnam and was interviewing veterans for it, but I watched, intrigued, from the sidelines, never understanding the importance of what she was attempting.

In our time of niche politics, deepening polarisation and plummeting trust in governments, it’s easy to forget that we have been here before. We may find comfort in imagining that things were different back then, but The Third Chopstick reminds us just how divided, and how angry, we have been.

It also shows us what it takes to walk across that divide, to establish genuine connections with people whose experiences and views on life are radically different from our own. It takes imagination, compassion, commitment and, admittedly, the special skills Ward acquired in her years of mediation work. The book is as much about that process as it is about the veterans she meets, her reactions to them and theirs to her.

Bit by bit, she stepped forward. She approached a speaker at a conference of former protesters, veterans and Vietnamese Australians. He introduced her to another man, and so her involvement grew. She learned about Granville, where a federation of Vietnam veterans had its headquarters in a rundown community centre that reminded her of the women’s refuges she had worked in.

Gradually this one-time Vietnam protester turned radical feminist turned professional mediator earned the trust of many men, who not only allowed her to record their stories but, in most cases, welcomed the opportunity. Even now, I’ll be damned if I know how she did it. All I do know is that The Third Chopstick is a wonderful achievement, a book unlike any other, though I can understand why it took so long to bring to fruition. The stories the men told are painful. Not all of the connections went smoothly; one of the most important was arguably the most difficult.

Wars are hell for humans, whether they are the bombed civilians, those who are conscripted to fight or choose to enlist, or the families left behind to suffer the long-lasting consequences. To acknowledge this in the face of war’s glorification and its industrial-scale infrastructure is essential. Australian governments have conscripted no one since Vietnam, the practice having proved too politically risky, yet they have eagerly signed up for so-called “coalitions of the willing,” and there’s a serious risk we’re heading for a war on a scale much larger than any we’ve participated in so far, one with the potential to strike, quite literally, home.

Ironically, Ward’s deep involvement with Vietnam, the country itself and its people, has made her question whether she’s a pacifist after all. What were the odds that a small Asian country would bring America and its allies, for all their might, to their knees? In their place, or a similar one, wouldn’t she have fought the invaders as the Vietnamese did?

That may be so for any of us. But I see it another way. These two books have convinced me, if I needed any convincing, that nothing is more important, or more conducive to peace, than suspending judgements in our search for understanding. •

Life with Birds: A Suburban Lyric
By Bronwyn Rennex | Upswell | $29.99 | 204 pages

The Third Chopstick: Tracks through the Vietnam War
By Biff Ward | IndieMosh | $42.95 | 315 pages

