29/11/2019

Media Coverage of Violence Against Women





Last Friday, a New Zealand jury returned a guilty verdict in the Grace Millane murder trial. The 21-year-old British woman was killed by a 26-year-old man in his Auckland apartment in December 2018.
The high-profile trial drew criticism for its attention to Millane’s sexual history and its attempt to argue that she had consented to the violent act that caused her death.

I attended the trial and argue that basic misunderstandings about gender, power, sex and violence were allowed to shape a defence case in ways that disrespected Millane, likely harmed courtroom witnesses and may have come close to thwarting justice.

The murder trial of Grace Millane’s killer poignantly demonstrated that it is still possible to put a woman’s behaviour in question without understanding the specifically gendered nature of violence against women. I observed many ways this happened, as defence lawyers crafted a case that was heavy on the myth of egalitarian sexual adventure and light on the reality of gendered power that still shapes the contemporary heterosexual landscape.

Contrary to the picture they painted, study after study shows that young men and young women alike still expect that men will pressure women into having sex when they don’t want it, and into sexual acts they are not keen on.

Failing to understand these dynamics of men’s violence against women makes the path to courtroom justice unnecessarily rocky. It makes it potentially traumatic for surviving women who take the witness stand, and in a high-profile case like this it feeds into social attitudes that misrepresent and minimise gendered violence.

A month before he killed Grace Millane, the man convicted of her murder non-fatally suffocated another woman. With his knees on the side of the bed, facing her feet, he pinned down her forearms with all his weight and sat on her face with so much force that she could not breathe. She was terrified she was going to die.

This woman was cross-examined relentlessly by the defence lawyer, who dissected her behaviour during and immediately after the attack, and in the month that followed. He told her she had “exaggerated” what happened and asked her repeatedly why she did not leave earlier. He interrogated her about why she maintained regular text message contact with her attacker over the following month.

The woman on the witness stand was impressive, articulate and bold. At one point, after being told yet again that she was not a reliable witness to her own experience, she told the lawyer, “You can’t minimise what happened to me. It happened.”

As the lawyer persevered with this line of questioning, it became clear just how little he understood about the dynamics of men’s violence against women. He judged her experience from his own masculine point of view, perhaps imagining what he would have done in that situation.

He failed to understand how threat manifests differently for a woman who has been violently assaulted by a man. He appeared not to comprehend that a woman might behave differently in assessing danger, safety and risk.

As this woman pointed out through her evidence, the defendant knew a lot about her and her movements. She knew he would be able to find her, and she worried that if she cut him off cold he would show up in her life. As a New Zealand Law Commission report notes, strangulation or suffocation “is a uniquely effective form of intimidation, coercion and control”, as it demonstrates “he can kill”.

Encountering a violent man on a Tinder date is a modern form of risk. We need to better understand what survival skills would look like in that particular kind of relationship, formed in isolation from off-line social networks and through communication technologies that can leave a person exposed through the trails of their online activity.

By later working to placate the man who attacked her, keeping him at a safe distance while managing to avoid seeing him, this woman was able to defuse or delay the risk of him stalking her or turning up in person. These actions make perfect sense as a modern form of self-defence for our technology-mediated social world.

What other choices did she have, keeping in mind that, as we now know, she had accurately assessed the defendant as a tinderbox of a man, prone to unpredictable, explosive and, as it turned out, deadly violence?

Moving forward, we have the opportunity to learn from this case and work for improvements. Many voices are calling for a rethink of the rules of evidence that place no limits on publicly airing deeply personal private information about the victim in a murder trial.

We also need to insist that lawyers and judges get up to date with the current state of knowledge about the nature and impact of men’s violence against women.

While defence lawyers have a duty to zealously defend their clients’ legal rights, we must debate the ethical questions about how far that can reasonably be pushed, and at what cost to justice for innocent and victimised others.


Grace Millane’s Murder Trial Shows Social Attitudes Continue to Minimise Gendered Violence. By Nicola Gavey. The Conversation, November 27, 2019.






In the first 48 hours after the guilty verdict in the Grace Millane murder case, Fiona Mackenzie received 50 interview requests from the media in the UK, US and New Zealand. In the previous week, as Millane’s killer claimed she had died in a “sex game gone wrong”, Mackenzie, founder of We Can’t Consent to This, which campaigns against the “rough sex” defence, had managed to recruit a voluntary press officer to help with the deluge. Between them, they had juggled BBC Breakfast, ITN, Sky, Channel 5, local TV and radio stations, with Mackenzie also covering her full-time corporate job as an actuary.


“My employers have been very accommodating,” she says. “I’m still claiming that this doesn’t take up much time – though I’ve had to give up computer games and reading novels.” She’s not complaining about the extra work. “For the first half of this year, it was a real struggle to get much interest at all. I think the Grace Millane murder has really changed things.”

It wasn’t just the murder that sparked rage – though the details are horrifying. Millane was a British graduate who arrived in New Zealand on a round-the-world trip; she agreed to a date with someone she matched with on the dating app Tinder, then was strangled. Later she was contorted into a suitcase and buried in the woods – though not before her killer had photographed her naked body, watched some porn and gone on another date. Despite these horrifying facts, the trial focused on Millane herself, her sexual history and her use of dating apps such as Tinder and Whiplr, shifting responsibility away from the murderer and on to the victim.

Newspapers ran headlines such as “Naïve and trusting”, “Strangled tourist liked being choked” and “Grace Millane ‘encouraged date to choke her during sex and apply more force’”. “After the murder, the trial and the way it was reported,” says Mackenzie, “I think there’s a new understanding of the ‘rough sex’ defence and a level of anger that wasn’t there before.”

What Mackenzie calls the “rough sex” defence is the claim in murder cases that the victim consented to violent sex, which led to their death.

Mackenzie launched We Can’t Consent to This last December, building the website over Christmas. “To be honest,” she says, “I’d gone on holiday on my own and was quite bored.” She was soon joined by a handful of volunteers – an old school friend, some fellow Lib Dem campaigners and some women she had met on Mumsnet: “I don’t have children but had originally gone on looking for running tips and found an incredible group of feminists.”

The trigger for Mackenzie was the case of Natalie Connolly who was brutally killed in December 2016 by John Broadhurst. Broadhurst, her partner of a few months, left the 26-year-old bleeding to death at the bottom of the stairs in the home they shared with her eight-year-old daughter. Connolly suffered multiple blunt-force injuries but Broadhurst claimed it was the result of “rough sex”, and was found guilty of manslaughter. He was sentenced to three years, eight months. Earlier this month, he appealed to have his jail time cut but was unsuccessful. “Some people raged against the verdict but in the main, there seemed to be astonishing levels of acceptance,” says Mackenzie. “People somehow believed it, saw it as a weird one-off and thought the victim must be to blame for the riskiness of her behaviour.”

For Mackenzie, this stirred difficult memories of a case that took place in 2000 near Aberdeen, where she was at university studying maths. The body of Mandy Barclay, a 32-year-old mother of two, had been found in the local woods. She had died of asphyxia and severe rectal injuries, and her husband, Niall McDonald, was charged with murder. In court, the defence claimed that Barclay had died while practising the kind of sex “that narrow-minded people would call kinky”. As a student, Mackenzie followed the details offered up by McDonald (whip, French maid outfit, rooftop sex) and when he was found guilty of the lesser charge of culpable homicide and sentenced to seven years, Mackenzie admits that she believed on some level the victim was somehow culpable.

“With the Broadhurst trial, I felt people were reacting the way I had,” she says. “I wondered how many other cases there were and wanted to bring them together to show they weren’t isolated incidents.”

This wasn’t easy. There are no official statistics. Cases aren’t collected or categorised under the defence, so Mackenzie has trawled through media reports and legal archives to find them. By the end of that first Christmas holiday, she had 35. She is now aware of 59, although the true figure is probably higher – and doesn’t include the vast number of non-fatal alleged assaults in which “rough sex” was argued as a defence. The homicides listed on We Can’t Consent to This stretch back to 1972, each one a terrible testament to lessons not learned. “Though pleading consent has no status in English law, juries seem to accept it,” says Mackenzie. “We’ve found that it has resulted in a lesser charge, a lighter sentence or an acquittal in 45% of the cases.”

In 1979, to pick one example, 19-year-old Vivien Scott was strangled by 21-year-old DJ John Dudgeon (aka John Taylor) in what he claimed was “slap and tickle”. Dudgeon’s defence argued that his “one mistake” had been not seeking help immediately. He was found guilty of manslaughter and sentenced to four years. After serving just 17 months he was out, and attempted to rape and murder a woman in her home. When released a second time, he murdered 32-year-old Susan McNamara.

In another case, from 1991, Stuart Williamson received a three-year sentence for the manslaughter of his girlfriend Honor Matthews, 20. Despite his previous convictions for violence, his defence claimed that the “deeply attached” couple had been engaged in “pseudo-masochistic neck compression” aimed at “giving pleasure”. After his release, Williamson abused his new partner and killed his mother.

One of the earliest recorded “rough sex” defences dates from 1961, and even this carries overtones of the Grace Millane murder. The case took place in Kenya and involved a man called Sharmpal Singh, who claimed his pregnant wife Ajeet had been killed during a “sexual embrace” – she had died from internal injuries to the neck and chest. He had then embarked on an elaborate deception, taking Ajeet’s body to the courtyard and stabbing her to make it resemble a robbery. As in the Millane case, the prosecution argued that the behaviour after the killing was the action of a murderer, while the defence countered that it was done in fear, shock and panic. Singh was believed and the verdict was manslaughter. Rather like Broadhurst, he made an unsuccessful appeal to be acquitted of this.

Seeing parallels and repeats through the decades, it’s clear that the “rough sex” defence often sits within a long history of blaming women for their own killings, of looking away from perpetrators to see how victims brought it on themselves. Claiming they actually consented is the logical endpoint.

Hallie Rubenhold, social historian and author of The Five: The Untold Lives of the Women Killed by Jack the Ripper, has no doubt there’s a pattern here that has been going on for centuries. In her book, which last week won the Baillie Gifford prize, Rubenhold re-examines the lives of the Ripper’s victims. She finds no credible evidence that three had ever worked as prostitutes, but shows instead how all, born female and working class, lived and died with the cards stacked against them.

The press coverage at the time was often wildly inaccurate, and the entire mythology around “ripperology” since has painted the victims as vulnerable prostitutes, faceless “fallen women” who lived recklessly. After one of the murders, a letter to the Times from a senior civil servant even thanked the “unknown surgical genius” for clearing the East End of its “vicious inhabitants”.

“What it came down to, just as it did with Grace Millane, is victim-blaming,” says Rubenhold. “It’s so deeply ingrained, it’s almost in society’s DNA. Instead of looking at what happened and why, we’re geared to look at the women instead. Just as rape victims are asked about their sexual history, what they were wearing, what they had been drinking, the knee-jerk reaction with Grace Millane was, ‘Oh well, she was far away, on Tinder, looking for a hook-up – what kind of woman does that?’ The whole point is it doesn’t matter. They don’t deserve to die.”

Why, finally, are we waking up? Mackenzie believes it is partly the rising tide of cases. There has been a 90 per cent increase in the last decade.

More than this, though, is the growing sense among women that this could happen to them – we might all be one “bad date”, one “Tinder match” away from dying like this. While the Millane trial was dominated by male voices – the judge, defence, prosecution, former boyfriend – the fury on social media came from women. A typical tweet with the #gracemillane hashtag read: “If I’m ever murdered by someone I’m having sex with, I’d like it on record that I will not have consented to being choked so hard that I DIE.”

In the last fortnight, Mackenzie has added an “our stories” section to her site with women sharing their own experiences of assault and “rough sex”. Over 100 have contributed. One describes falling in love with an “educated, gentle, kind professional gent” who could only climax with his hands round her neck. Another only managed to free herself from a choke-hold by hitting the man with a lamp: “He was really upset – had thought we were having ‘great sex’,” she writes.

Change is coming, however. The MP Harriet Harman is confident that her two amendments to the domestic abuse bill – designed to reinforce the fact that consent can be no defence for death – will be seen through in the new year, despite the unlawful suspension of parliament in September and the election causing frustrating delays.

Meanwhile, Mackenzie is chasing the Crown Prosecution Service to properly log and track these cases and enforce Harman’s amendments when they are passed. “Changing the law is the easy bit,” she says. “The hard part is making sure it works in practice. I also want to keep raising awareness and pushing for policy responses to the appalling normalisation of unbidden violence against women during sex.” Next year, she thinks she will formalise her campaign with official charitable status, because relying on a handful of volunteers is becoming tricky. “I’m not a professional campaigner,” she says. “I’m learning as I go.” How does it feel to know people are now listening? “I’m just really relieved,” she replies.

This article was amended on 28 November 2019 to remove an incorrect assertion that the founder of We Can’t Consent to This uncovered “two so-called ‘rough sex’ killings from 1996. A decade on, she has found 20 a year.” Those figures related to cases involving deaths and injuries to women, not just deaths, in 1996 and 2016.





'There's a New Level of Anger': the Women Fighting to End the 'Rough Sex' Defence. By Anna Moore. The Guardian, November 27, 2019.




We Can’t Consent to This. website








The murder of Grace Millane in 2018 seized front pages of media outlets worldwide, with article after article fixated on details of her personal history. These details implied that the sexually violent nature of Millane’s death was somehow a product of her own actions, and this treatment is itself part of a much larger media trend in how violence against women is represented.

From the day that her family reported her missing on December 5 to the discovery of her body on December 9, media outlets reported unremittingly on the circumstances surrounding Millane’s disappearance, in a manner that should no longer be acceptable.

On November 4 2019, the trial began and the man charged with her death entered a “not guilty” plea. The media latched on to the trial, covering the defence’s attempt to form an alternative narrative that could throw doubt on just “what kind of girl” Millane had been.

They argued their client had not intended to kill Millane. Instead, he had simply followed her instructions and it was she who had initiated violent sex, as she was “a fan of 50 Shades of Grey” and had learned from an ex-partner.

From this moment, the media coverage of Millane’s murder trial focused almost exclusively on details of her sexual preferences, previous partners and alleged proclivity for BDSM.

News outlets broadcast interviews with Millane’s friends and ex-boyfriends who had been questioned by the defence. Articles were preoccupied with her “kinky fetishes” and “other sexual partners”, and described her as a “naive and trusting girl”.

By sensationalising Millane’s murder and directing focus towards her intimate preferences and not her killing, media outlets continue to contribute to and perpetuate societal attitudes of victim blaming.

Unfortunately, this kind of journalism isn’t new. It was seen in the coverage of the murder of Mary Nichols, the first victim of Jack the Ripper, more than 100 years ago. Papers included extensive descriptions of Nichols’ injuries, even mentioning that her body was “warm” when discovered, and focused on her alleged prostitution and alcoholism. And it continues today: a 2018 listicle of the most horrific Valentine’s Day crimes featured, almost without exception, the murder of a woman by a male partner.




Journalistic efforts to responsibly cover stories of violence against women have consistently failed. Sadly, profit is and always has been the solitary pursuit of any given news outlet, and cultural appetites for stories featuring details of violence against women are seemingly insatiable.

It is known that the media’s treatment of female victims of violence differs radically from its treatment of male victims. For instance, research by Marian Meyers in 1996 demonstrated that women are more likely to be infantilised and referred to in articles by their first names (a journalistic practice predominantly reserved for pets and children).

Articles also rarely present violence against women as a systemic societal issue. Instead, the focus is largely episodic, which implies that violence against women occurs within individual situations, instead of as part of women’s everyday lives. Data published in 2018 by the Office for National Statistics found that the majority of victims are female, with about 560,000 female victims and 140,000 male victims in 2017.

As in the case of Millane, there is a tendency within the media to sensationalise and make abstract the bodies of abused, assaulted, and murdered women. These women are dehumanised by reporting that decontextualises them from their day-to-day lives as loving daughters, dedicated students, and loyal friends. This often means that they are remembered through the limited lens that the media places on them after death.

These women, who are physically and metaphorically voiceless, cannot defend themselves, and so myths around women who “ask for it” are subsequently upheld. By focusing on her sexual preferences, and the way in which she “naively” messaged men on fetish dating apps, articles surrounding Millane’s murder framed her as partially responsible for her own brutalisation.

The implicit journalistic messaging throughout the coverage of the trial was that “women should be more careful” as opposed to “men should not murder.” This is evident within headlines surrounding Millane’s murder which prioritise the defence’s argument that she encouraged her killer to apply more force and that her “ex-lover” knew she liked to be choked. This is also seen within the publication of photographs of her in supposedly “revealing” clothing alongside the suitcase her body was buried in.

It is also unmistakable in the kind of linguistic model used to describe her murder. Instead of cultivating phrases that refer to the actions of the perpetrator who killed Millane, articles repeatedly use passive syntax to describe Millane as having “been killed”. By employing this kind of language, women themselves are consistently framed as inactive victims with no individualisation or agency. Again and again, this kind of media coverage shifts the blame back to the victim instead of the perpetrator.

Between 2000 and 2015, the number of articles concerning violence against women rose from 2,000 to 25,000. As the frequency of this kind of reporting has increased, groups have formed to highlight the injustice and indignity of this kind of coverage.

We Can’t Consent to This is a campaign striving to dispel myths around “the increasing numbers of women and girls killed in violence claimed to be consensual”, while We Level Up focuses on lobbying the media for more responsible reporting of domestic abuse cases. Such reports often depict male perpetrators as “perfect family men” despite their histories of violence and their crimes.

In fact, a 2015 study found that media outlets not only distort depictions of domestic violence cases, foregrounding the most provocative details, but are also more likely to report on female perpetrators of domestic violence. This is despite the fact that two women per week are murdered by male partners in the UK.

As women’s charity Our Watch states: newspapers focus on the method of the murder rather than histories of violence, as if it is “more important for readers to know how but not why men kill their partners”.

Despite arguing that her death was a “sex game gone wrong”, Millane’s murderer was found unanimously guilty on November 22 2019. As Brian Dickey, crown prosecutor for the trial, concluded: “You can’t consent to your own murder.”

These women also cannot consent to the ways in which their intimate histories are manipulated by press coverage. The inclusion of irrelevant details within stories of women who have been murdered by men preserves the idea that these women are not real people. Instead, it paints them as a composite of body parts designed to be ogled, and a series of private decisions designed to be scrutinised.

Women like Millane are wrongly immortalised by news outlets obsessed with sexualising and objectifying their every move even after death, while articles focused on male victims somehow manage to offer the deceased a relative degree of respect. The name of Millane’s killer was suppressed (some outlets have gone on to leak his name) while hers has been tarnished. This is wholly and unashamedly wrong. Millane deserved better, and the media must do better.

Grace Millane’s Trial Exposes a Dark Trend in Media Coverage of Violence Against Women. By Daisy Richards. The Conversation, November 26, 2019.




Violence against women is one of New Zealand’s most significant and pressing social issues. Every day police respond to hundreds of family violence incidents, and women continue to die as a result of men’s violence. In December 2018 New Zealand recognised the severity of a specific offence – strangulation – and implemented legislative reform to address its pervasiveness. Five arrests a day for strangulation were reported in February 2019. I mention all of this because of the Grace Millane murder trial.

In early December 2018 she was strangled to death while visiting New Zealand. Her body was later found in a suitcase, buried, in the Waitakere Ranges in Auckland. The man accused of her murder claimed her death was the result of consensual rough sex that had “gone wrong”. After a three-week trial, a jury of five men and seven women found him guilty of murder after less than six hours of deliberation.

While a guilty verdict has been established, this does not detract from the distressing nature of this murder trial – distressing for myriad reasons: distressing because a young woman lost her life in a country where she should have been safe; because it quickly became a trial about a young woman’s sexual history and interests instead of the actions of a violent man; because while the defence said Millane was not to blame for what happened that night, the case it built suggested she was somehow blameworthy.

Millane was violently murdered by a man who strangled her, took intimate photographs of her, lied to police and buried her in the bush. It is these actions that should have been the focus of the trial. Unfortunately his despicable actions seemed to get lost in a sea of interest in her sexual history. None of this should have happened because the issue this case is actually about is men’s violence against women. Women’s sexual histories have no place in a murder trial.

As the trial progressed, the world became privy to some of the most intimate aspects of Millane’s life in a way that no woman would ever wish to experience. There were many aspects that felt like a rape trial. In rape trials victims report feeling blamed and shamed for their experiences of trauma and violence due to gruelling cross-examinations by defence lawyers. Women’s sexual histories are carefully dissected in ways that allude to the type of woman she might be. Women are asked about the clothes they wore, who they had previously had sex with, and why they allowed themselves to get drunk.

The experience in a rape trial is so traumatic that some women report it feeling like a second rape. Is it any wonder that only a minority of sexual violence cases are reported to police, fewer result in charges, fewer again make it to court and a dismal number result in a conviction?

The same issues of blame, shame and harrowing cross-examinations were present in Millane’s murder trial. They should not have been. The defence’s cross-examination of a brave young woman who had previously had sex with Millane’s murderer was shocking. She was accused of being “melodramatic” and it was suggested she was making up her disclosure of his sexually violent behaviour. All of this was to discredit her evidence and to help shape a version of Millane’s murderer as a good guy who just “panicked”.

The treatment of Millane’s sexual life was just as disturbing. The court heard evidence from Millane’s long-term ex-boyfriend, a previous sexual partner, and men she had connected with – but never met – on dating sites. Millane’s sexual history was dissected in public for everyone to see, but she had no ability to respond. The jury heard about Millane’s supposed interest in “rough sex” or BDSM. The jury was told about Millane’s presence on dating sites such as Tinder, Fetlife and Whiplr, as if to suggest her presence on the latter two was evidence of her rough-sex interests.

Millane’s sexual history, her dating life and her use of dating apps should not, and will not, be used against her. Whether Millane liked rough sex or not is irrelevant because nobody can consent to murder. The jury confirmed this in their delivery of a guilty verdict. Yet Millane’s sexual history remains deeply embedded in online and print media coverage, serving as a painful reminder of the problems involved in the “rough sex gone wrong” defence. While removing Millane’s digital footprint is now impossible, what we can ensure is we remember that women are not the ones responsible for violence they experience at the hands of men.


Women’s Sexual Histories Have No Place in a Murder Trial, as Grace Millane Case Shows. By Samantha Keene. The Guardian, November 24, 2019.




The winner of the UK’s most prestigious literary award for non-fiction has hit out at the media, singling out their “appalling” coverage of Anastasia Yeshchenko’s recent murder.

Hallie Rubenhold, who won the Baillie Gifford prize for her study of the women murdered by Jack the Ripper, said that today’s media continue to focus on murderers and the gory details of their crimes, disregarding their victims’ lives. “We can see this pattern occurring over and over again,” said Rubenhold.

Speaking the morning after her book The Five won the £50,000 award, the author suggested that the media were almost “going for the comical side” in their coverage of Yeshchenko’s killing in Russia.

“Here’s this crazy mad professor of Napoleonic studies who dresses like Napoleon, who was found wading in a frozen river in St Petersburg trying to dispose of a bag of body parts of his lover,” she said. “Then it goes on and on about who he is – I’m not even going to mention his name. I was literally just going through everything I could find in the English language press and I found nothing about her and I couldn’t believe it. I thought: ‘Here it is again, this is what’s happening again.’”

The historian said her next book would look at Dr Crippen’s notorious 1910 murder of his wife Cora from the latter’s point of view. “Especially with women, reports go in for the salacious angle – how was she carved up, how was she disposed of, how did he kill her?” she said. “What about: ‘Who was she? What effect did her death have on her family and her community?’ That’s a much more important story. We have to start asking those questions. At the moment we’re just following the narrative of the murderer; it’s a whodunnit rather than a whydunnit.”

In The Five, Rubenhold shows how, despite popular belief to the contrary, “there is no hard evidence to suggest that three of [the Ripper’s] five victims were prostitutes at all”. The chair of the Baillie Gifford judges, Stig Abell, hailed the book as a “great moral act, reclaiming the voices of these women”.

Rubenhold said her approach has led to a swathe of attacks on her and her book from people interested in the Ripper murders.

“In spite of the fact my book is footnoted,” she said, “in spite of the fact there are so many Victorianists and experts in the history of prostitution and women’s history who have read my book and said it absolutely stands up, there are a hardcore group … who say what I have done is to doctor documents … They say I lied, that I suppressed evidence, redacted evidence.”

People who are interested in the murders care about the women’s occupations, she explained, “because they have built their egos on the back of trying to figure out who Jack the Ripper was.”

“The one thing the Ripperology community can agree on among themselves is that he killed prostitutes,” she said. “I’m pulling a thread out of something and everything comes unravelled. In effect what I’m doing is saying Ripperology is unviable because we will never solve these murders … if you look at it from a historian’s point of view, there isn’t evidence.”

Rubenhold added that to have experts and academics validate her work with the prize felt like vindication, after so much abuse. “I’m sure [my critics] are going to be going absolutely mental when they find out,” she said.


Jack the Ripper Historian Says Media Still Disregard Murder Victims. By Alison Flood. The Guardian, November 20, 2019. 



Russian Historian Found with Body Parts Accused of Murder. By Sarah Rainsford. BBC, November 11, 2019.







In July 2013 a woman called Tracy Connelly was murdered in Melbourne. It made headlines in most of the national newspapers, all of which used some variation of the phrase “St Kilda prostitute killed”. The story dropped off the front page of every website within a day.

Less than a year before Tracy’s murder, another woman, Jill Meagher, was murdered. Remember Jill? How could we forget her? So young, so beautiful, so beloved, so normal. The reporting reached saturation with a gorgeous photo of her happy, smiling face. We saw footage of her poor, heartbroken husband and heard the shocked and trembling voices of her colleagues at the ABC. The coverage went on for weeks. It made her a real person to everyone who read about her murder.

But what about the “St Kilda prostitute”? Was she not just as much a person as Jill? Tracy Connelly was 40 years old when she was murdered. She lived just a few streets from my house. Tracy was real; she was a person, she had a community who valued her and a boyfriend who loved her. Why was she so dehumanised in the coverage of her murder?

The answer is, of course, all too obvious. She wasn’t a person, she was a “prostitute”. The reporting on the man charged with murdering Michaela Dunn mostly avoided the pejorative term “prostitute” and replaced it with “sex worker”, but the dehumanising of another woman killed by a violent man showed nothing much has changed in seven years. News stories sensationalised the work both women did and ignored the parts of their lives that made them a person; the coverage of Jill Meagher’s murder, on the other hand, humanised her thoroughly.

The way the media chose to frame Tracy’s story, by constantly referring to her as a “prostitute”, suggested that in some way she deserved what happened to her. That she should have known better, and that she was asking for trouble by doing the work she did.

The opening lines of a story about her murder in the Age in July 2013 read:

   “Tracy Connelly had walked St Kilda’s red light district for at least a decade and knew her work was dangerous. In 2005, her minder was run over by a man who was angry that she refused to get in his car, Ms Connelly once told a court.”

What if, instead of “St Kilda prostitute brutally murdered”, the headline had been: “Tracy Connelly brutally murdered in her home”?

What if they’d led with a photogenic image of Tracy’s beautiful, pale, smiling face and this paragraph:

   “Tracy Connelly’s traumatised boyfriend discovered her body in their home yesterday afternoon. There was no sign of forced entry and police believe she may have been killed by someone she knew. Tearful friends talked about what a caring, loving person Tracy was, and how devastated their community is by this horrible crime.”

Would Tracy be a person to us then? Would she be so easily forgotten? Or would we have to wait until her killer, like Adrian Bayley, attacked a white, middle-class woman for the world to remember that a murdered woman is a person? That no person asks for or deserves murder, or any other form of attack. That blaming the victim is never acceptable, regardless of their profession, clothing, activities or housing circumstances.

Being a sex worker is dangerous. But it’s not as though sex workers are surrounded by dangerous chemicals or heavy machinery or wild animals. It’s dangerous because they are working with men. Their work makes them vulnerable to the sort of men who want to be violent to women who have little means of defending themselves. It’s not the people who do sex work who cause the danger. It’s the men who take advantage of their circumstances to commit violence. But the underlying assumption, that sex workers are responsible for the violence done to them, is reproduced and exacerbated by news media reports.

The media (both mainstream and social) are easy to blame because they are our only source of information about what happens in the wider world, beyond our immediate circle. The way they frame that information – the words they use, the level of coverage and importance given to a story, the type of details that might be emphasised or omitted – influences how we think of it. Jill Meagher, ABC staffer, was one of their own. They reported her death as the tragic event it was. Tracy was not one of their own – her life was very different from the vast majority of mainstream journalists’ lives – so she was reduced to a stereotype.




The simplicity of the Fixed It project, where I take a red pen to headlines, hadn’t occurred to me back then, but I wrote for the King’s Tribune about Tracy and how her murder had been portrayed in the media. The response was overwhelming. People in her community got in touch to tell me about the rage they felt seeing Tracy dehumanised by every newspaper in the country. Women from all over the world sent me headlines from their local news outlets, saying journalists turned murdered women into salacious, sensationalised clickbait. The most heartbreaking were the people who knew and loved a murdered woman, and had to watch as the media blamed her for her own murder and made excuses for the man who killed her.

“Why don’t journalists think women are people?” read the subject line of one email.

After Tracy’s murder and the contemptible way it was reported in the press, I started seeing it everywhere. Not just in crime reports but also in political reporting, sports reporting, even articles about musicians and artists. Women are not people in the eyes of the news, at least not the way men are. Women are tits and arse, they’re glamorous or fat, they’re wives or mothers or stupid or demanding or nagging or annoying or sweet or pretty. Men, on the other hand, are fully-rounded, complex people – as long as they’re not too “womanlike”.

After responding to the treatment of Tracy in the media, I continued writing articles and blogposts about it, but nothing ever really cut through. Then, in September 2015, one of the major news sites in Australia published an article about a man who murdered his ex-girlfriend under the headline: “Townsville police say selfie could have led to alleged stabbing murder”. I pulled out my phone, fixed the headline and snapped it back on Twitter. Fixed It was born.

Within a few months I was regularly scanning news sites and by the beginning of 2016 I had set up daily Google alerts for any news story about men’s violence against women. I was making the fixes on a daily basis and posting them on a website I had originally set up as a focal point for my freelance writing. Over the next two years Fixed It gained a strong social media following and I found myself regularly speaking at public events and writing articles about the way the media reports men’s violence against women. I incorporated it into my master’s degree and spent hundreds of hours researching the cause and effect of this kind of reporting. My book of the same name is the culmination of all that work.

There have been hundreds of headlines in Fixed It over that time. Drunk teenagers getting themselves raped, lying sex workers, houses committing rape, brooms beating women, loving fathers killing their children, Susan Sarandon being old in public, broken hearts causing murder, women too stupid to understand superannuation, Bill Clinton’s wife running for president, 40-year-old men in “sexual relationships” with 12-year-old girls, the prime minister of England’s legs, women too old to be hot while playing football, domestic violence “stunts”, “sex romps” killing MPs, countless invisible murderers and endless victim blaming.

In all that time, only one editor has ever got in touch to ask how they could write stories about women differently. Otherwise, journalists and editors don’t engage. And I understand. No one likes to be publicly smacked down, and it can be frustrating and humiliating to be accused of sexism or victim blaming – particularly if the journalists involved think I don’t understand the pressures they’re under or the legal restraints they must work within. This was certainly true in the early days of Fixed It. But now I have a much better understanding of and more sympathy for court reporters and editors of online news who have to do far too much with far too little. I also recognise the limitations of reporting on crimes that have not been through the court. You can’t call someone a rapist if he has not been convicted of rape.

But even taking into account these limitations, there are ways to rethink how we report such crimes. For example, you can (and should) call it an “alleged rape” instead of “sex”. The presumption of innocence does not prevent someone describing an alleged crime. No reporter would ever write that an accused car thief was driving their own car home because it hadn’t yet been proven in court that he stole it.

Sexual violence, however, appears to present reporters and editors with difficulties that don’t occur with any other criminal act. Far too often, alleged rape is reported as “sex”, which is not a crime. Rape is not sex and no one has ever been charged with having consensual sex. “Kris Kafoops faces court over sex claim” is as inaccurate in reporting on an alleged rape as “Kris Kafoops faces court over driving a car claim” would be in reporting on an alleged car theft.

The rules of sub judice contempt mean that journalists cannot report that someone is guilty of a crime before they are convicted, which is why the word “alleged” is so ubiquitous in crime reporting. This does not explain or excuse the way rape is so often described as sex, as if the words are interchangeable. They’re not. It happens because all the myths about violence are so deeply embedded in our culture, and further entrenched by journalism.

There was a vast difference in how Tracy Connelly and Jill Meagher were treated. Tracy was dehumanised, Jill was not, and this is not unique to Australia or even to modern reporting. For millennia, women have been divided into “good women” – wives and mothers, sweetly pretty, conservatively dressed nice girls – and “bad women” – sirens and sex workers, drug addicts, page three models and drunken, promiscuous sluts. Good women are helpless victims but bad women ask for trouble. The reality is that there is no type of woman who could conceivably deserve violence but this entrenched division of good and bad women still strongly influences how traditional media report on complex issues, and reduces women to these arbitrary categories.

Journalism needs more voices and more faces. It needs to widen its perception of the world and understand that women, people of colour, people with disabilities and people of different genders and sexualities are all news consumers. And they are not interested in news that ignores their existence or dismisses them as archaic stereotypes.

Rape is Not 'Sex', and 'Broken Hearts' don't Cause Murder. Women are Dying – and Language Matters. By Jane Gilmore. The Guardian, August 31, 2019.



Fixed It: fixing media reports of male violence against women. website















27/11/2019

Nikhil Pal Singh on The Settler Mindset and The Cold War






In the spring of 1774, two members of the Shawnee tribe allegedly robbed and murdered a Virginia settler. As Thomas Jefferson recounts in Notes on the State of Virginia (1787), “The neighboring whites, according to their custom, undertook to punish this outrage in a summary way.”

In their quest for vengeance, the white settlers ambushed the first canoe they saw coming up the river, killing the one, unarmed man as well as all of the women and children inside. This happened to be the family of Logan, a Mingo chief, Jefferson says, “who had long been distinguished as a friend of the whites,” but who now took sides in the war that ensued. The Mingos fought—and lost—alongside the Shawnees and Delawares against the Virginia militia that fall, and Logan’s letter to Lord Dunmore after the decisive battle is, according to Jefferson, a speech superior to “the whole orations of Demosthenes and Cicero.”

“There runs not a drop of my blood in the veins of any living creature,” Logan says of his decision to fight the white men. “This called on me for revenge. I have sought it: I have killed many: I have fully glutted my vengeance. For my country, I rejoice at the beams of peace. But do not harbor a thought that mine is the joy of fear. Logan never felt fear. He will not turn on his heel to save his life. Who is there to mourn for Logan? Not one.”

Logan’s speech went viral by eighteenth-century standards; it was reprinted in newspapers across the country and admired for its tragic eloquence. Its popularity and resonance among white colonialists illustrate a defining aspect of settler storytelling: an acknowledgement of the injustice of Indian killing alongside an affirmation of its inevitability and salience as a guide to action. In their authenticity, Logan’s words validated a structuring precept of the white settler colony: that those who are violently displaced and eliminated are distinct from kin, whose passing should be mourned, and also opaque to posterity because they are sundered from webs of social relatedness.

Through this sleight of hand, the settlers achieved a unique perspective—one that justified violence because it afforded them a certain freedom, the productive freedom of a blank slate. As historian Patrick Wolfe famously described it, settler colonialism is thus a “structure, not an event.” Its mindset is not backward but forward looking as it consciously blurs the lines between preemption and self-defense, allegation and retribution, dispossession and property right.

Consider Jefferson’s words in the Declaration of Independence. A defining feature of life in the “free and independent” states, he wrote, was constant warfare with the denizens of a vast territorial frontier, “the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction, of all ages, sexes and conditions.” From its inauguration, then, American freedom was founded on this unrelenting vision of a frontier populated by unjust enemies. Jefferson’s founding brief for continuous expansionary warfare in the name of collective freedom has animated the country’s sense of itself ever since. It is, as political theorist Aziz Rana has noted, a foundational yet unexamined precept within U.S. accounts of political liberty—one that continues to define practices, institutions, and American ways of living that exact a violent toll.



Not least, the Indian wars bequeathed a lasting military orientation—one that extended and codified ethical, legal, and vernacular distinctions between civilized and savage war as a core national experience and conceit. Settler militias invested expansive police power in ordinary citizens as a corollary of collective security, and justified practices of extirpative war focused on populations and infrastructures, without distinction between combatants and civilians. Through the more than two centuries of frontier and counter-insurgency wars that the United States has fought (and continues to fight) the world over, the elimination or sequestration of “savages” has been represented, Jodi Byrd argues, as integral to the transit and development of American security, power, and prosperity.

At the end of the Civil War, Abraham Lincoln’s Secretary of State, William Seward, claimed that “control of this continent is to be, in a very few years, the controlling influence in the world.” Following the last Indian wars and the closing of the territorial frontier at the end of the nineteenth century, President Theodore Roosevelt romanticized “the winning of the west” as the arc of progressive history. He ridiculed the anti-imperialists of his day who criticized brutal U.S. counter-insurgencies in the Philippines and Cuba as sentimental dreamers who would give Arizona back to the Apaches. The settlers’ outlook and its understanding of freedom poses the question “would you want to give it back?” to demonstrate an absurd proposition. Within this outlook, the idea that there could be such a thing as settler decolonization is not only impossible, but also unthinkable.

By connecting the concept of democratic self-rule with a continual project of expansion, the settler narrative shaped collective institutions, ways of war, visions of growth and prosperity, and conceptions of political membership that still run deep. Indeed, our own period has not been immune. Describing the supposedly unmatched achievements of liberal-democratic society at “the end of history” in 1992, Francis Fukuyama reached back to the frontier allegory: “mankind will come to seem like a long wagon train strung out along a road. . . . Several wagons, attacked by Indians, will have been set aflame and abandoned along the way. . . . But the great majority of wagons will be making the slow journey into town, and most will eventually arrive there.” A decade later, following 9/11, the mood had shifted, but not the narrative reflex. As George W. Bush put it on October 6, 2001, “Our nation is still somewhat sad, but we’re angry. There’s a certain level of bloodlust, but we won’t let it drive our reaction. We’re steady, clear-eyed, and patient, but pretty soon we’ll have to start displaying scalps.”

Defending the launching of the global War on Terror, U.S. diplomatic historian John Gaddis gave scholarly imprimatur to the settler idiom: the borders of global civil society were menaced by non-state actors in a manner similar to the “native Americans, pirates and other marauders” that once menaced the boundaries of an expanding U.S. nation-state. Foreign affairs writer Robert Kaplan concurred: “The War on Terrorism was really about taming the frontier,” as he heard U.S. troops in Afghanistan and Iraq repeat the refrain, “Welcome to Injun Country.”




The reference, Kaplan insists, echoing Jefferson’s homage to Logan, “was never meant as a slight against Native North Americans.” It was merely a “fascination,” or an allusion to history—indeed, one that fits nicely with our aptly named Tomahawk missiles and Apache helicopters. But these comments and this history reflect a deeper, more sinister truth about the American dependence upon expansionary warfare as a measure of collective security and economic well-being.

The history of the American frontier is one of mounting casualties and ambiguous boundaries, of lives and fortunes gained and lost. In the settler narrative, “collective security” never meant just the existential kind of safety, that is, situations where material survival and self-defense were mainly at stake. Freedom is essential to the equation, and freedom in this conception is built once again upon dreams of a blank slate—this time cheap, empty, exploitable lands and resources that must be cleared of any competing presence. Indeed, the settlers’ conception of freedom belies the commercial interests in protecting an investment prospectus: the speculative value of the land itself—what surrounds it and what lies beneath it—is of paramount importance. 

The main colonial enterprise, after all, was risky and speculative land merchandising. Early American governance was arguably more preoccupied with mundane simplifications of deed and title, mapping, parceling, and recordkeeping than it was with Indian fighting. From inception, the U.S. founders envisioned the land west of the Alleghenies as a great commercial estuary, one that was gradually emptied of any other human claimant. As George Washington, the land speculator turned general, wrote upon resigning his command of the victorious continental army in 1783 (which included organizing a campaign of ethnic cleansing against the Iroquois), “The Citizens of America, placed in the most enviable condition, as the sole Lords and Proprietors of a vast Tract of Continent, comprehending all the various soils and climates of the World, and abounding with all the necessaries and conveniences of life, are now by the late satisfactory pacification, acknowledged to be possessed of absolute freedom and Independency.”

Geographer Thomas Hutchins echoed Washington’s sense of America as a world brimming with valuable resources and directing human enterprise toward uncertain boundaries. In An Historical Narrative and Topographical Description of Louisiana and West Florida (1785), he takes stock of the land’s bounty: grapes, oranges, lemons, cotton, sassafras, saffron, rhubarb, hemp, flax, tobacco, and indigo. Although enslaved Africans, the producers of most of these agricultural commodities, go unmentioned, Hutchins pauses impassively every few pages to observe a curious feature he also attributes to the landscape: this or that “once considerable” nation of Indians “reduced to about twenty-five warriors,” or “only about a dozen warriors.” Indigenous expiry is thus quietly inscribed as necessary to the continent’s supposedly inexhaustible riches.

U.S. military pacification was only one tool for the diminution of Indian sovereignty and the subsequent sequestration and marginalization of tribal remnants. Extensions of federal plenary power, the legal recasting of Indian political life as a peculiar subordinated status of domestic dependency, and redefinitions of indigenous resistance and counter-violence as crime were also central. Woven throughout was the settlers’ forward-looking framework: there is no alternative. In his 1835 letter to the Cherokee people, for example, President Andrew Jackson framed Indian removal as an essential by-product of commercial growth. “Circumstances that cannot be controlled and which are beyond the reach of human laws render it impossible that you can flourish in the midst of a civilized community.” The true nature of those circumstances was revealed five years prior in an address Jackson made to Congress: “what good man would prefer a country covered with forests and ranged by a few thousand savages to our extensive Republic, studded with cities, towns, and prosperous farms, embellished with all the improvements which art can devise or industry execute, occupied by more than 12,000,000 happy people?”

As the Indian wars began drawing to a close in the late nineteenth century, the North American territorial frontiers closed as well. Understood to be an event of world significance, this closing forced settler thinking to confront new challenges. Diplomat Paul Reinsch, who was a student of Frederick Jackson Turner and an early theorist of U.S. global reach, observed that expansion through overseas colonialism would be uniquely difficult: “we have to deal with a fixed element, the native population, long settled in certain localities and exhibiting deeply engrained characteristics; a population . . . that cannot be swept away before the advancing tide of Caucasian immigration as were the North American Indians.”

In the ensuing decades, then, a host of morbid symptoms arose from similar perceptions that while expansion was necessary, it would never again be so easy and unproblematic. For thinkers such as Madison Grant and Lothrop Stoddard (both prominent eugenicists) the limitation of territories for future white settlement and a “rising tide of color” threatened the supremacy, even survival, of Western civilization. This meant that the United States itself needed to seal its borders against unwanted detritus from the outer world. The Chinese Exclusion Act of 1882, along with the subsequent Immigration Act of 1917 (which established an “Asiatic barred zone”), conjured fears of a “yellow peril” that threatened to reverse the ordering virtue of white settlement in western lands. Ruling on Chinese exclusion in 1909, the U.S. Supreme Court was explicit, describing “foreigners of a different race” as “potentially dangerous to peace” even in the absence of “actual hostilities with the nation of which the foreigners are subject.”

In the lead-up to World War I, the challenge of how to continue a dynamic of economic expansion in the wider world without becoming corrupted politically by proximity to savage and inferior, non-white subjects was a central preoccupation of U.S. thinkers. John Carter Vincent, a confidant of the Roosevelt family and later a U.S. foreign service officer, suggested a vision of the western hemisphere as the model for a “painless imperialism,” where nominal sovereignty and separation from mestizo populations was underwritten by strategically placed Marine barracks. This would ensure the smooth passage of commerce and security for propertied interests and what Woodrow Wilson called the election of “good men.”







In May of 1942, after the United States had entered World War II, the editors of Fortune, Time, and Life magazines published a joint statement titled “An American Proposal” that echoed Vincent and Wilson. They observed that the United States was not “afraid to help build up industrial rivals,” which they saw as a virtue: “American ‘imperialism,’ if it is to be called that” is “very abstemious and high minded . . . because friendship, not food, is what we need most from the rest of the world.”  These leading business ideologues laid their cards on the table: an age governed by aviation and “the logic of the air,” Fortune’s editors observed, would need an extensive network of strategic bases and technical facilities similar to “the colonies and dominions” that supported imperial Britain during its age of maritime power. “In the world-to-be,” they warned, “a dozen or more equivalents of Pearl Harbor may be simultaneously possible. . . . Our problem, therefore, is not to restore the status quo ante, but to break out.”

Settler colonial narratives thus needed to be rewritten to suit extra-territorial and global purposes. To be clear, rising U.S. globalism and imperialism were not simply an extension of settler freedom, but neither should we lose sight of how they were intertwined with it. As Fortune’s writers insisted: “The U.S. economy has never proved that it can operate without the periodic injection of new and real wealth. The whole frontier saga, indeed, centered around this economic imperative.” As such, “The analogy between the domestic frontier in 1787 when the Constitution was formed and the present international frontier is perhaps not an idle one.” Franklin Delano Roosevelt himself viewed the 1940 “destroyers for bases” agreement with Great Britain—which saw the exchange of U.S. naval ships for land rights on British possessions—as the most important action in “the reinforcement of our national defense . . . since the Louisiana Purchase.”

A decade later, as historian Megan Black has recently shown, engineers from the U.S. Department of the Interior—with longstanding expertise charting Indian reservation lands for hidden energy and mineral resources—were dispatched the world over to survey sources of strategic minerals required to defend “the free world.” In short order, U.S. military forces were calling Vietnam “Indian Country,” forcibly sequestering its peasants on reservations, while fighting to ensure its reserves of tungsten and tin didn’t fall to the red tide of international communism.

U.S. imperialism abroad, however, did not erase the influence of settler ethics and practices closer to home. As Time magazine magnate Henry Luce suggested, even as non-interventionist sentiment ran high in the run-up to World War II, “Americans had to learn how to hate Germans, but hating Japs comes natural—as natural as fighting Indians once was.” In turn, few events evoked the Indian removal of the 1830s more than the 1940s herding of 100,000 Japanese and Japanese-Americans into camps in the Western interior while many of their white neighbors avidly claimed their farmlands and possessions.

An expansive and celebratory vision of white settlement also retained its purchase: by the 1950s, Andrew Jackson’s studded republic was remade through the promise of homeownership on “the crabgrass frontier.” Working in conjunction with the real estate and banking industries, federal housing authorities drew up “residential security maps” that identified in stark red lines where valued property, credit, and people needed to go—and where untrustworthy denizens should remain fixed.

By the late 1960s, as sharply racialized contests over public space and civic belonging gave way to the “wars” on crime and drugs, sociologist Sidney Willhelm foresaw that urban blacks in particular, who were no longer required for industrial labor, were “going the way of the American Indian” into carceral warehouses. It is hardly incidental that Michigan’s Oakland County Executive Brooks Patterson thought it apt, quite recently, to characterize inner city Detroit as a “reservation, where we herd all the Indians into the city, build a fence around it, and then throw in the blankets and the corn.”

This push and pull of U.S. settler ethics, narratives, and corollary institutions of violence in the name of freedom has yielded a distinctive and multi-layered carceral history and geography, at once domestic and transnational: a global archipelago of prisons, internment camps, and detention centers. In recent years at Standing Rock, its raw circuitry of indigenous sequestration and citizen protection was once again laid bare as state police and U.S. military forces engaged in tense stand-offs with thousands of Sioux and supporters who were blocking construction of the Dakota Access oil pipeline through Indian reservation lands.

Here, we might observe how settler ethics and practices continue to create liberated citizens and subordinated subjects together; the former are defined by democratic, formally egalitarian claims to nationhood, legal status, consumer choice and protection, and the latter defined as atavistic, backward, passively disappearing, slated for elimination, subject to sequestration, or bound by what is thought to be permanent inferior status. “Savagery,” in short, has been a fungible and centrifugal construct, with fears of the native fueling racism as well as nativism, while a recursive, blank-slate conception of settler primacy and preeminence animates movements, programs, and policies for eliminating or warding off alien or foreign presence.

The inceptive structuring of indigenous elimination as a condition of the settlers’ freedom has yielded an enduring tendency among American officials, and among the publics they conscript, to think of democratic self-rule as interdependent with expansive and coercive rule over alien subjects. After 9/11, this historical subtext returned to the foreground as Americans were told not only that fighting terrorists overseas meant not having to fight them at home, but also that continuing to shop and spend at home was no less the duty of a civilized and prosperous people. The term “enemy combatant” itself was a neologism invented for “unlawful” fighters: those deserving no legal standing or status, those who could be detained (and tortured) with impunity, those subject to an unlimited deprivation of freedom whose avowed legal precedent once again referred back to the Indian wars.

As inhabitants of a finite and ecologically stressed planet, we face the challenge of undoing settler ethics—its ways of war, its presumptions about a need for limitless growth, its hostile vision of blank-slate autonomy without dependency, and its delimitations of social and political membership—and the stakes have never been higher. More than simple racism or discrimination, the destructive premise at the core of the settler narrative is that freedom itself must be built upon eliminationism, and that growth therefore requires expiry.

And it is this temptation—to remain on the right side of might that makes right—that stalks the future of a planet in the grips of climate destruction, secular stagnation, and unevenly distributed misery. Earthly co-existence, material subsistence, and ecological sustainability demand nothing less than a new dispensation of human freedom. Otherwise, there truly will be none left to mourn.

The Pervasive Power of the Settler Mindset. By Nikhil Pal Singh. Boston Review, November 26, 2019.






On September 21, 1945—five months after Franklin Roosevelt’s death—President Harry Truman assembled his cabinet for a meeting that one historian has called “a turning point in the American century.” The purpose of the meeting was to discuss Secretary of War Henry Stimson’s proposal to share atomic bomb information with the Soviets. Stimson, who had directed the Manhattan Project, maintained that the only way to make the Soviets trustworthy was to trust them. In his proposal to Truman, he wrote that not sharing the bomb with the Soviets would “almost certainly stimulate feverish activity on the part of the Soviets . . . in what will in effect be a secret armament race of a rather desperate character.”

Henry Wallace, the secretary of commerce and former vice president, agreed with Stimson, as did Undersecretary of State Dean Acheson (though he later changed his position), but Secretary of the Navy James Forrestal laid down the definitive opposition. “The Russians, like the Japanese,” he argued, “are essentially Oriental in their thinking, and until we have a longer record of experience with them . . . it seems doubtful that we should endeavor to buy their understanding and sympathy. We tried that once with Hitler. There are no returns on appeasement.” Forrestal, a skilled bureaucratic infighter, had made his fortune on Wall Street and frequently framed his arguments in economic terms. The bomb and the knowledge that produced it, Forrestal argued, was “the property of the American people”—control over it, like the U.S. seizure of Japan’s former Pacific Island bases, needed to be governed by the concept of “sole Trusteeship.”

Truman sided with Forrestal. Stimson retired that very same day, his swan song ignored, and Wallace, soon to be forced out of the Truman administration for his left-wing views, described the meeting as “one of the most dramatic of all cabinet meetings in my fourteen years of Washington experience.” Forrestal, meanwhile, went on to become the country’s first secretary of defense in 1947, and he illustrates perhaps more than anyone else how Cold War militarism achieved its own coherence and legitimacy by adopting economic logic and criteria—that is, by envisioning military power as an independent domain of capital expenditure in the service of a political economy of freedom. From his pivotal work in logistics and procurement during World War II, to his assiduously cultivated relationships with anti–New Deal congressmen and regional business leaders sympathetic to the military, Forrestal both helped to fashion and occupied the nexus of an emerging corporate-military order. He served as defense secretary for only eighteen months (he committed suicide under suspicious circumstances in 1949), but on the day of that fateful cabinet meeting, he won the decisive battle, advocating for what he once called a state of ongoing “semi-war.” The post–World War II rise of a U.S. military-industrial complex is well understood, but it still remains hidden in plain sight. Today warnings about Donald Trump’s assault on the “liberal international order” are commonplace, while less examined is how we arrived at a point where democratic and “peacetime” governance entails a global military infrastructure of 800 U.S. military bases in more than 70 countries.

Moreover, this infrastructure is under the command of one person, supported by a labor force numbering in the millions, and oriented to a more-or-less permanent state of war. If a politics of threat inflation and fear is one part of the answer, the other, more prosaic component is that the system itself is modeled after the scope of business and finance. By managing a diverse portfolio of assets and liabilities and identifying investment opportunities, it envisions a preeminently destructive enterprise as a series of returns calibrated to discretionary assessment of threats and a preponderance of force. This was Forrestal’s bailiwick.

A little-known anecdote about Truman’s 1947 call to Congress for decisive intervention in the Greek civil war—generally viewed as the official declaration of the Cold War—illustrates this point. Truman’s speech is famous for its emphasis on political freedom, particularly the idea of protecting peoples’ rights to self-determination against “armed minorities”—“the terrorist activities of several thousand armed men, led by communists.” “One of the primary objectives of the foreign policy of the United States,” Truman said, establishing the characteristic linkage between World War II and the Cold War, “is the creation of conditions in which we and other nations will be able to work out a way of life free from coercion. Our victory was won over countries which sought to impose their will, and their way of life, upon other nations.”

The moral and rhetorical heightening of the opposition between democracy and communism (and, incipiently, terrorism) was a conscious choice. Truman was famously advised by Republican senator Arthur Vandenberg that securing public and congressional support for unprecedented and costly peacetime intervention into European affairs entailed “scaring the hell out of the American people.” Another, less visible choice, however, was to downplay the role of the accountant’s ledger, which was more overt in an early draft of Truman’s speech. That draft argued that emergency financial support for Greece (and Turkey) was now a requirement of world capitalism: “Two great wars and an intervening world depression have weakened the [capitalist] system almost everywhere except in the United States. If, by default, we permit free enterprise to disappear in other countries of the world, the very existence of our democracy will be gravely threatened.” Acknowledging the less-than-compelling purchase of this argument, Undersecretary of State Dean Acheson remarked derisively that it made “the whole thing sound like an investment prospectus.”

Truman’s delivered address, by contrast, made use of the words “free” and “freedom” twenty-four times in a few minutes, as if talismanic repetition were enough to yoke the defense of private capital accumulation to the maintenance of popular democracy the world over. Yet, despite the inflated rhetoric, economic considerations remained the skeletal core of the Truman Doctrine. Buried inside the address was the acknowledged collapse of British imperial policy in the region, along with an “invitation” from a dubiously democratic, right-wing Greek government for “financial and other assistance” in support of “better public administration.” The imperatives of democracy and self-government—preeminent political values understood by the U.S. public—were subordinated to building “an economy in which a healthy democracy can flourish.” In a final nod to the bean counters, Truman noted that the amount he was requesting was a mere fraction of what the United States had spent during World War II, and no less justified as “an investment in world freedom and world peace.”

The challenge for U.S. policy makers going forward was to reconcile a lofty rhetorical and moral emphasis upon the principle of political self-determination with the necessity of investing in military force (i.e., “other assistance”) whose paramount end was securing the market freedoms of national and international capitalists. The teleological (and tautological) proposition that a substratum of properly capitalist economic relations organically yielded a democratic harvest would become the farmer’s almanac of a rising generation of modernization theorists. But the reality on the ground—in a world where the main province of self-determination was defined by the bloody rearguard defense of colonial prerogatives on the part of the United States’ most important allies and industrial partners—was bitter, and far less susceptible to universalizing nostrums. Straight-talking U.S. policy makers, particularly those at the center of the military apparatus, knew it.

The following year, for example, George Kennan, author of the “containment” doctrine, a protégé of Forrestal, and the single most influential strategic foreign policy thinker of the moment, offered a strikingly candid version of the task at hand in a classified memo that consciously punctured the universalist ambit of the Truman Doctrine:

     “We have about 50% of the world’s wealth but only 6.3% of its population. This disparity is particularly great as between ourselves and the peoples of Asia. In this situation, we cannot fail to be the object of envy and resentment. Our real task in the coming period is to devise a pattern of relationships which will permit us to maintain this position of disparity without positive detriment to our security. To do so, we will have to dispense with all sentimentality and day-dreaming; and our attention will have to be concentrated everywhere on our immediate national objectives. We need not deceive ourselves that we can afford today the luxury of altruism and world-benefaction.” (emphasis added)

When thinking about nations and peoples, particularly those outside of Europe, Kennan again foregrounded a logic of investment and risk management, and he advised restraint and limitation of liability, especially with respect to “the peoples of Asia . . . [who] are going to go ahead, whatever we do, with the development of their political forms and mutual interrelationships in their own way.” Kennan warned that the coming period would be neither “liberal” nor “peaceful,” and that such countries were likely to “fall, for varying periods, under the influence of Moscow, whose ideology has a greater lure for such peoples, and probably greater reality, than anything we could oppose to it . . . [or that] our people would ever willingly concede to such a purpose.” In this light, he concluded that the United States needed to dispense with commitments, rhetorical and otherwise, to “unreal objectives such as human rights, the raising of living standards, and democratization. The day is not far off when we are going to have to deal in straight power concepts.”

This view is sometimes depicted as an exemplary instance of realism—wiser and more in tune with the messy, uneven world that emerged from World War II—and a point of view that, had it been heeded, might have prevented the costly overreach of global cold war, especially “blunders” such as the Vietnam War (which Kennan, long retired to academia, opposed). The concept of realism, however, fails to grasp the functional logic of risk and threat assessment—the insistent and anxious hedging and speculation that made the careers and fortunes of Kennan, Forrestal, and many who followed them. Forrestal fretted obsessively in his diary along these lines: “I am more impressed than ever as things develop in the world today that policy may be frequently shaped by events unless someone has a strong and clear mental grasp of events; strong enough and clear enough so that he is able to shape policy rather than letting it be developed by accidents.” This recurrent epistemic anxiety initiated an insistent demand for anticipatory policy, abiding mistrust, and the maintenance of a preponderance of force. As Forrestal bluntly put it, “Power is needed until we are sure of the reign of law.”

Despite his long period of service within a New Deal liberal political milieu, Forrestal (like Kennan) was uninterested in universalizing the scope of political self-determination overseas, recognizing as more pressing the preservation of a capitalist economy built on uneven development and asymmetric military power at a world scale. Electrified upon reading Kennan’s “Long Telegram” (1946), Forrestal viewed his fellow Princeton man as a kindred soul, one who had intuited similar grounds of Orientalist menace, inscrutability, and immunity to anything but the language of force in Soviet conduct. It was Forrestal who brought Kennan to Washington, D.C., from Moscow and into the policy-making apparatus; both men were solicitous toward the value of rank and privilege, tolerant of authoritarian deviations from liberal standards, and assured that freedom from coercion was the province of those who, in Kennan’s words, were already imbued with “Anglo-Saxon traditions of compromise.”

Forrestal framed his own deference to hierarchy in terms of the prerogatives of corporate capitalism—the idea that practical men of business, rather than reformers and intellectuals, had won World War II and needed to be running the world going forward. Among his more forceful conclusions was that liberal globalism would be disastrous if it were not steeled with counterrevolutionary animus. As he confided to diplomat Stanton Griffis:

   “Between Hitler, your friends to the east, and the intellectual muddlers who have had the throttle for the last ten years, the practical people are going to have a hell of a time getting the world out of receivership, and when the miracles are not produced the crackpots may demand another chance in which to really finish the job. At that time, it will be of greatest importance that the Democratic Party speaks for the liberals, but not for the revolutionaries.”

For these realists, even more than the woolly moralists they sometimes ridiculed, it was the credibility of U.S. threats of force that ensured the freedom and mobility of productive capital and supported its resource needs and allied interests across an ever-widening sphere. Of a more aristocratic and consciously anti-democratic mien, Kennan likewise recognized that the animating logic was not strictly anti-communist but counterrevolutionary—indeed even racial. The inevitable dissolution of the colonial system meant that the challenge of U.S. policy in the coming period was broader than the struggle with Soviet communism, as “all persons with grievances, whether economic or racial will be urged to seek redress not in mediation and compromise, but in defiant, violent struggle.” Inspired by communist appeals, “poor will be set against rich, black against white, young against old, newcomers against established residents.”

By conflating Soviet designs with those of heterogeneous movements demanding effective sovereignty and challenging material deprivation, Forrestal and his colleagues contributed to a perverse recasting of the dynamic of European colonial disintegration as the field of Soviet imperial expansion. This rhetorical and ideological frame practically demanded the militarization of U.S. foreign policy, with U.S. “counterforce” the only alternative to a world ruled by force. As such, along with Arthur Radford, Forrestal was instrumental in developing the Central Intelligence Agency (CIA), and that agency’s work soon echoed his. In 1948, for instance, a CIA document entitled “The Break-Up of Colonial Empires and its Implications for US Security” defined expressions of “economic nationalism” and “racial antagonism” as primary sources of “friction between the colonial powers and the US on the one hand, and the states of the Near and Far East on the other.”

The CIA’s analysts suggested that poverty and a legacy of anti-colonial grievances rendered colonized and formerly colonized peoples “peculiarly susceptible to Soviet penetration” and warned that the “gravest danger” facing the United States was that decolonizing nations might fall into alignment with the USSR. At the same time, they faulted Europe’s colonial powers for their failure to satisfy “the aspirations of their dependent areas” and advised them to “devise formulae that will retain their good will as emergent or independent states.” Envisioning U.S. responsibility to author such formulae in the future, the classified brief concluded that the United States should adopt “a more positive and sympathetic attitude toward the national aspirations of these areas,” including policy that “at least partially meets their demands for economic assistance.” Otherwise “it will risk their becoming actively antagonistic toward the US,” including loss of access to previously “assured sources of raw materials, markets, and military bases.”

While the emerging U.S. foreign policy clearly accepted the irresolvable antagonism toward the Soviet Union, the challenge of the future, as the CIA argued, was how the United States should address the “increasing fragmentation of the non-Soviet world,” or, in a word, decolonization. The means for assessing risk and reward in this expansive and heterogeneous terrain of imperial disintegration were by no means clear. But it is revealing that the possibility of potential alignments between decolonizing nations and Soviet power was far less concrete and worrisome to the United States than the more definite and delineated material losses faced by the United States and the colonial powers with which it had aligned itself—namely, being deprived of access to formerly “assured sources of raw materials, markets and military bases.” In other words, the challenge of the future, as Kennan had underlined, was to devise “formulae” to buttress the forms of political authority that sustained economic inequality (at a world scale) in the face of inevitable revolt and revolution against such authority and the social conditions it supported.

Despite his later misgivings, Kennan had authored the concept whose rhetorical elasticity and ideological indeterminacy proved crucial to fashioning a nemesis that suited this consciously expansionist vision of U.S. economic and military power. With the creation of the CIA, the National Security Council, and Forrestal’s own new position of secretary of defense, these years saw the growth of a national security bureaucracy that was divorced from meaningful oversight and public accountability for its actions, including myriad moral failures and calamities. A covert anti-Soviet destabilization campaign in Eastern Europe, for example, greenlit by Forrestal and Kennan, enlisted Ukrainian partisans who had worked with the Nazis. This type of activity would become routine in Latin America, Asia, and Africa, where Kennan derided respect for the “delicate fiction of sovereignty” that undeserving, “unprepared peoples” had been allowed to extend over the resources of the earth.

Over the next quarter century, fewer than 400 individuals operated the national security bureaucracy, some of them enjoying decades of influence. That the top tier was dominated by white men who were Ivy League–educated lawyers, bankers, and corporate executives (often with ties to armament-related industries) lends irony to official fearmongering about armed conspiracies mounted by small groups, let alone the idea that the role of the United States was to defend free choice against coercion imposed by nonrepresentative minorities. This fact, perhaps more than any other, suggests that, as much as the Cold War represented a competition between incompatible, if by no means coeval or equally powerful, systems of rule (i.e., communist and capitalist), it was marked by convergences too. The Soviet “empire of justice” and the U.S. “empire of liberty” engaged in mimetic, cross-national interventions, clandestine, counter-subversive maneuvers, and forms of clientelism that were all dictated by elite, ideologically cohesive national security bureaucracies immune from popular scrutiny and democratic oversight.

Those charged with governing the controlling seat of U.S. globalism consistently doubted the compatibility of normative democratic requirements and the security challenges they envisioned, harboring distrust that often bordered on contempt for the publics in whose name they claimed to act. “We are today in the midst of a cold war, our enemies are to be found abroad and at home,” remarked Bernard Baruch, coining the term that names this era. In this context, “the survival of the state is not a matter of law,” Acheson famously declared, an argument similar to one being advanced by former Nazi jurist Carl Schmitt. Vandenberg, echoing defenders of Roosevelt’s accretive accumulation of war powers, was positively wistful in lamenting “the heavy handicap” that the United States faced “when imperiled by an autocracy like Russia where decisions require nothing but a narrow Executive mandate.” For Forrestal, “the most dangerous spot is our own country because the people are so eager for peace and have such a distaste for war that they will grasp for any sign of a solution of a problem that has had them deeply worried.”

Forrestal felt that the danger at home manifested itself most frustratingly in the threat that congressional budgeting posed to military requirements. The preservation of a state of peace was a costly proposition when it revolved around open-ended threat prevention the world over. Upholding the permanent preponderance of U.S. military power at a global scale required a new type of fiscal imagination, one that had to be funded by the future promise of tax receipts. During his final year in office, Forrestal’s diary records in mind-numbing detail his worries about acquiring Pentagon funding adequate to his projections for global military reach. In Forrestal’s view, budgetary considerations were captive to the wrong baseline of “peak of war danger” and combatting “aggression” rather than to “maintenance of a permanent state of adequate military preparation.”

A fascinating aspect of these budget wrangles is Forrestal’s manic effort to translate future-oriented geostrategic needs into precise dollar values. Just months before his forced retirement and eventual suicide, he confided to Walter G. Andrews:

   “Our biggest headache at the moment, of course, is the budget. The President has set the ceiling at 14 billion 4 against the pared down requirements that we put in of 16 billion 9. I am frank to say, however, I have the greatest sympathy with him because he is determined not to spend more than we take in in taxes. He is a hard-money man if ever I saw one.”

Despite his grudging admiration for the stolid Truman, Forrestal, with his Wall Street background, was at ease in a more speculative or liquid universe; at that precise moment, he was devising accounting gimmicks to offset the near billion-dollar costs of stockpiling raw materials as a “capital item” that could be “removed from the budget.” The important point to emphasize is the relationship between two interrelated forms of speculation and accounting—economic and military—in which an absolute inflation of threats tempted a final break with lingering hard-money orthodoxies and a turn to deficit spending. Forrestal did not live to see the breakthrough, but his work paid off.

As Acheson described it, the Korean War—the first hot war of the Cold War era—“saved” the fledgling national security state. With its outbreak, the dream of eternal military liquidity was realized when Leon Keyserling, the liberal economist serving as chairman of Truman’s Council of Economic Advisers, argued that military expenditures functioned as an economic growth engine. That theory then underpinned NSC 68, the document that justified massive U.S. defense outlays for the foreseeable future and which was authored by another Forrestal protégé, Paul Nitze. By yoking dramatically increased federal spending to security prerogatives, military Keynesianism thus achieved a permanent augmentation of U.S. state capacity no longer achievable under appeals to Keynesianism alone.

The embedding of the global priorities of a national security state, which sometimes appears inevitable in retrospect, was by no means assured in the years leading up to the Korean War. It was challenged by uncooperative allies, a war-weary or recalcitrant U.S. public, and politicians who were willing to cede U.S. military primacy and security prerogatives in the name of international cooperation. But by 1947, men such as Forrestal had laid the groundwork for rejecting the Rooseveltian internationalist inheritance, arguing it was necessary to “accept the fact that the concept of one world upon which the United Nations was based is no longer valid and that we are in political fact facing a division into two worlds.” Although the militarization of U.S. policy is often understood to have been reactive and conditioned by threats from the outside, his ruminations illustrate how militarized globalism was actively conceived as anticipatory policy (in advance of direct confrontations with the Soviet Union) by just a few architects and defense intellectuals—men under whose sway we continue to live and die.

Ultimately, the declaration of the Cold War says more about how these U.S. elites represented and imagined their “freedom” and envisioned the wider world as a domain for their own discretionary action and accumulation than it does about enabling other people to be free, let alone shaping the terms of a durable and peaceful international order. As early as 1946, Forrestal began taking important businessmen on tours of the wreckage of Pacific Island battles, which also happened to be future sites for U.S. nuclear testing. Forrestal described these ventures as “an effort to provide long-term insurance against the disarmament wave, the shadows of which I can already see peeping over the horizon.” The future of the bomb and the empire of bases were already on his mind.

Forrestal recognized that force and threat are always fungible things to be leveraged in the service of the reality that truly interested him: the reality made by men who own the future. For those of his cast of mind, “international order” was never more than the fig leaf of wealth and power. As he noted in a 1948 letter to Hanson Baldwin of the New York Times: “It has long been one of my strongly held beliefs that the word ‘security’ ought to be stricken from the language, and the word ‘risk’ substituted. I came to that conclusion out of my own business experience.” It was the job, after all, of these East Coast lawyers and moneymen to make sure all bets were hedged, and Forrestal knew that speculation could turn into “an investment gone bad.” As a leading investor in the Cold War project, he wanted a guaranteed return, even if the rule of law never arrived and even when the price was ruin.

Banking on the Cold War. By Nikhil Pal Singh. Boston Review, March 14, 2019.


Nikhil Pal Singh is Professor of Social and Cultural Analysis and History at New York University and Faculty Director of the NYU Prison Education Program.