True crime writing isn’t famous for its impeccable gender politics. Think of how male criminals (e.g. the late Mark “Chopper” Read) have been glorified and women law-breakers demonised. Or how women who are victims of crime can be stereotyped as either virgins or vamps.
Two new books, Mark Morri’s Remembering Anita Cobby and Martin McKenzie-Murray’s A Murder without Motive, offer a fresh approach to the true crime genre. Both were published in early 2016. Both have been penned by male journalists. Both focus on men who find themselves involved (albeit in different ways) in murder cases where the victims are women.
The “Anita” in Morri’s title is Anita Cobby, a Sydney nurse who was gang-raped and murdered in January 1986. The book discusses the experiences of her husband, John Cobby, who was estranged from his wife at the time of her death, and who has (until now) purposely eluded media attention.
Morri met John around the time of the murder, and the two men developed a rapport. In conversation with the author, John describes the grief and horror that overwhelmed him in the wake of Anita’s death. He tells of trying to escape through alcohol and overseas travel and the homicidal fantasies he continues to harbour about taking revenge on his wife’s killers.
In A Murder without Motive, McKenzie-Murray addresses the murder of young Perth woman, Rebecca Ryle. In May 2004, Ryle was strangled to death by James Duggan, a man she had just met at a local pub. In the ensuing trial, no motive could be established for his actions (hence the book’s title).
The author was tenuously connected to the victim. He grew up in the same suburb as her, and his brother once personally knew Duggan.
McKenzie-Murray reflects on the “strains of misogyny” that could be detected in the milieu in which they lived. This was a world where young men were encouraged to flaunt their “virility”, and women existed “for sex, acquisition, bluster”.
Both books cast a critical eye on a toxic strand of masculinity.
It’s an eye that has been missing from many true crime books. Two examples are Blood Stain (2002) and The Vampire Killer (1992), both of which reproduced crude, misogynist stereotypes of women.
In both Morri and McKenzie-Murray’s books, the male protagonists are constrained by prevailing codes of masculinity. In A Murder without Motive, for instance, McKenzie-Murray recalls his teenage participation in a blokey, boozy culture.
Still, throughout the book, he demonstrates the ability to stand back and evaluate this harmful culture. The book’s broader aim is to provide a nuanced perspective on the Ryle case. McKenzie-Murray explicitly distances himself from “popular treatments of criminality”, which (he says) are “salacious and vampiric” – and, I would add, frequently sexist.
Yet John Cobby and McKenzie-Murray confront the excesses of toxic masculinity, seeing it as the lethal social construct it is – not something glamorous or natural.
Morri’s book is less overtly concerned with gender politics. Nevertheless, he does quote “Miss X” (the unnamed woman who obtained a confession from one of the killers, John Travers) as saying that she reported Travers to police because of his “behaviour towards women”. “Miss X” was married to Travers’ uncle at the time of Anita’s murder. Morri never specifies what exactly Travers’ “behaviour towards women” entailed, though we can assume it was derogatory.
Of course, it should not take a dead woman for men to recognise that masculinised brutality is unacceptable.
But Remembering Anita Cobby and A Murder without Motive are important because they depict men who confront and abhor a culture of misogyny. Hopefully, their work will influence other true crime writers, resulting in more nuanced gender perspectives.
Several years ago, Oxford professor and Shakespeare scholar Jonathan Bate decided to write a biography of the British Poet Laureate Ted Hughes. Initially it seemed he had the support of Hughes’ widow, Carol Hughes – who had inherited copyright of her deceased husband’s writings, along with those of his more famous first wife, Sylvia Plath, who committed suicide in 1963.
Jonathan Bate embarked on his biography with great seriousness. Yet somewhere along the way, Carol Hughes became worried he was going to chronicle her late husband’s personal life, in addition to his poetic one. The result? In order to avoid a lawsuit, Bate was forced to give up all hope of being allowed to quote more than a token number of words from Hughes’ – or Plath’s – diaries, letters, manuscripts or jottings. He ended up contorting his original vision into a pretzel.
Bate recently published “Ted Hughes: The Unauthorised Life.” Now Janet Malcolm, the venerable journalist and essayist of the New Yorker, denounces Professor Bate in The New York Review of Books for daring to write openly about Hughes’ private and public life.
Malcolm’s review is full of insult and a kind of Victorian outrage in defense of Hughes’ second wife Carol, a nurse whom Hughes married in 1970. It’s meant to wound not just Bate, but all those who attempt to write about the private lives of major figures.
In fact, Malcolm adds to a rich tradition of censorship by those who have deemed themselves the arbiters of what can and can’t be written in biographies – even those of the dead.
Tastelessness or truth?
Malcolm’s review is titled “A Very Sadistic Man” – a reference to the accusations of a distinctly sadistic, often violent and rapacious approach to adulterous sex that some of Hughes’ mistresses have detailed in recent years. Malcolm argues that Bate, by including these previously published anecdotes, has blown Hughes up “into a kind of extra-large sex maniac.”
Beyond Bate’s “tastelessness,” there is, she writes, “Bate’s cluelessness about what you can and cannot do if you want to be regarded as an honest and serious writer.”
Malcolm excoriates his “squalid findings about Hughes’ sex life,” and his “priggish theories about his [Hughes’] psychology.”
Carol Hughes greets Prince Charles in a 2002 photograph. Reuters
Moreover, she declares that it is “excruciating for spouses and offspring to read what they know to be untrue and not to be able to do anything about it except issue complaints that fall upon uninterested ears.” After having read only 16 pages of the 662-page biography, Carol Hughes put the book down and released a statement through her lawyer, saying she found the tome “offensive” – and demanded that Professor Bate apologize.
Malcolm claims that biographers should simply not be permitted to address the private lives of their subjects.
“If anything is our own business,” she declares, it is privacy – “our pathetic native self. Biographers in their pride, think otherwise. Readers, in their curiosity, encourage them in their impertinence. Surely Hughes’ family, if not his shade, deserves better.”
The beautiful and the base
Impertinence? Biography has been here before. For thousands of years, the genre – like great fiction – has been contested.
And dating back to Suetonius and Plutarch, there have been almost endless examples of its antithesis: anti-biography, and attempts at censorship.
The Roman historian Suetonius was, it is believed, exiled from Rome for daring to research and write his “De Vita Caesarum,” or Twelve Caesars. British explorer Sir Walter Raleigh was executed, in part for having annoyed King James I by his impudence in his “History of the World.”
Lady Bird Johnson took exception to Robert Caro’s series on LBJ, refusing to speak to him for decades after Caro portrayed Johnson as something of a sexual and political monster in his first volume. As a result, Caro was not allowed to speak at the presidential library, a federal archive – and the papers he wished to see were withheld until 2003.
We should not be surprised, however, that Malcolm has chosen to attack Hughes’ posthumous biographer – for her review of Bate’s book reprises the infamous attack on biography she made while Ted Hughes was still alive.
There, she openly challenged biographers and readers of biography with the argument that private life should henceforth be off-limits.
“The biographer at work,” she wrote in 1993, “is like the professional burglar, breaking into a house, rifling through certain drawers that he has good reason to think contain the jewelry and money, and triumphantly bearing his loot away.”
She refused to accept that there was more to biography than a pretense “of scholarship.” In her view, biography was simply about scandal, with biographers no more than peeping toms “listening to backstairs gossip and reading other people’s mail.”
Those of us who knew anything of the history of biography were appalled, even then, that Malcolm would so disregard the words of the great 18th-century English writer Samuel Johnson, the father of modern biography.
Johnson had decried the stilted approaches to life writing of his own time by mocking whitewashed accounts that failed to get behind the public facade. As he put it, “more knowledge may be gained of a man’s real character, by a short conversation with one of his servants, than from a formal and studied narrative.”
The greatness of biography, according to Johnson, was in tackling “the beautiful and the base,” and in embracing “vice and virtue,” rather than relying on the “sober sages of the schools.”
His most famous put-down of the puritanical approach to biography was to his own biographer, James Boswell. If a man wants to indulge in a spotless eulogy or “Panegyrick,” he told Boswell, “he may keep vices out of sight, but if he professes to write A Life he must represent it really as it was.”
Is the journalist’s goal to protect or reveal?
Why, then, has Malcolm spent more than 20 years crusading against serious biography – biography that embraces both the beautiful and the base?
Malcolm claimed she had spent years interviewing and corresponding with serious biographers for her Plath project, “The Silent Woman.” Why, as a professional journalist, was she content not to interview Hughes himself, or even speak to those men and women who actually knew the real Ted Hughes? What kind of a journalist is that?
In her new review, Malcolm pours scorn on Professor Bate, but she fails to reveal that in her earlier book, she’d defended Ted Hughes against the many biographers attempting to reveal the truth about him, and about the tragic story of Plath’s suicide.
In Malcolm’s view, Hughes had every right to use libel, property and copyright laws to protect his reputation as a husband and a poet by threatening legal action against anyone who snooped – or threatened to spill the beans – about his louche, often manic private behavior.
Though the law of libel ceased to protect Hughes upon his death 18 years ago, Professor Bate’s book has aroused Malcolm to new fury. Now she is determined to defend the second Mrs. Hughes: no snooping, revelation or even literary criticism of her late husband without her inherited copyright authority – and certainly no revelations of what Hughes was doing on the night of Sylvia Plath’s suicide.
As in her “Silent Woman” articles and book, Malcolm once again declines to question this utter misuse of copyright. (The world’s first copyright act was originally passed in 1710 to protect income, not reputation, for a maximum of 14 years – and especially not to protect posthumous reputation.)
With continuous, almost annual lawsuits and moves to amend copyright law, the battle between “authorized” and “unauthorized” biographies will thus go on, more than half a century after Plath’s death and almost two decades after Ted Hughes’. Any “unauthorized” biographer of Plath, Hughes or both must continue to write with one arm tied behind the back, unable to quote more than a few authentic words without Carol Hughes’ express permission.
Samuel Johnson would be appalled. And it would be a sad day for biography if Malcolm’s injunction were followed, given the major contributions to critical interpretation, history and memory that the genre has made in the many centuries since Suetonius.
A friend – both close and a little odd – gave me a novel a few days before I left on a long-haul flight.
In the Unlikely Event.
Judy Blume’s first novel in fifteen years. I like to believe that the gift was based on a debate we’d had about Michael’s name for his penis in Blume’s Forever. As opposed to her thinking I’d want to read about plane crashes. While on a bloody plane.
(Ralph. For the record. Although I maintain, stubbornly, that naming a penis Roger just makes more sense).
I hadn’t read any Blume since devouring her back catalogue in primary school, twenty-odd years ago. Are You There God? It’s Me, Margaret – with its brow-furrowing menstrual pining and complex feminine hygiene apparatus – and the title character in Deenie rubbing her “special spot” in the shower: Blume provided me with my first taste of everything salacious. While I’ve never really had idols – nor, for that matter, even a mentor – reading those Blume books likely did set me on a lifelong journey of skewed discovery.
Prior to opening In the Unlikely Event I read an interview with Blume where she claimed not to be a good writer but a good storyteller.
I’ve been stuck on this idea. About whether there is a distinction. About whether, in fact, it matters.
While I’ve read hundreds of books since novels like Tiger Eyes and Then Again, Maybe I Won’t, it’s the Blume books that have stayed with me. Not the most interesting or beautifully written ones I’ve read, but memorable. They spoke to an information-ravenous nine-year-old in an era before the Internet, and offered a gentle introduction to the possibility of carving a career from writing about the taboo.
A good writer or a good storyteller?
At nine years old I suspect I had no real clue about good writing. Those Blume books likely lingered because they were doing something that the Baby-Sitters Club, Enid Blyton, Hunter Davies and Sheila Lavelle books I’d been reading hadn’t. Because we have a tendency to attach disproportionate acclaim to the material we enjoyed in our formative years. Because we remember with excessive fondness our earliest – even if merely vicarious – forays into adulthood.
In the Unlikely Event.
At 35 I’d like to think I’m a better judge of good writing than I was in primary school. This assertion, however, gets challenged daily when I read gushing praise for books I thought thoroughly wretched, or when books I adored get reviewed no further than Amazon.
Equally, when I look at my own writing, some of the pieces I’ve been happiest with are the ones that are least read, and those written in much haste and probably without much heart got devoured. (And don’t get me started about the slew of bizarre (read: bullshit) “good writing” lessons gleaned from too many semesters of Creative Writing at university).
In the Unlikely Event.
In one scene good Greek girl Christina describes first-time sex with her beau, Jack, culminating in him ejaculating on her stomach.
“Like a pool of hot sauce.”
Good writing? Uh, no. Good storytelling? A trickier question.
Something that irritated me throughout the novel was the constant qualifiers: “She looked out the window and saw a moonscape. Or what she thought a moonscape would look like.” Invariably these were the thoughts of her teenage characters. Is it fair, then, to think teenagers would actually liken the feel of semen to, say, a good splash of béchamel on the belly? Mornay? Velouté? Is it good writing if we’re inside the head of a character who isn’t a very good scribe themselves?
When I finished it, I teared up in the way that I do whenever a TV show/book/film dares flash forward decades to show who lived, died, thrived. In the Unlikely Event may not be a beautiful piece of writing but it’s a solid read, an enjoyable story and perhaps, if you ask me in a few years, it might even be memorable.
Maybe that’s what matters most in a world where agreeing on “good” is thoroughly fraught.
Amid the many calls for scientists to engage with the general public, there are some who feel that scientists ought to remain aloof and disconnected.
They believe academics shouldn’t even attempt to communicate their research to common folk. And many scientists oblige them, by writing in a turgid manner that is highly effective at keeping the public (and their peers) at bay.
So, here are a few of the tricks that scientists use to produce such turgid science writing. These methods restrict science to the smallest and most specialist audience possible.
But writers beware! Stray from these methods and you risk finding an audience for your writing.
What was done by whom?
Keeping yourself out of the picture is an old-fashioned way of reducing interest in science. Windell Oskay/flickr
You probably already know of journalists’ penchant for “who, what, where, when, why and how”. These are the essentials for creating a captivating story (at least according to journalists). But for scientists who want to remain in the ivory tower, a good start is dropping the “who.”
Hence the passive “it was found that…” rather than the active “I found…” or “scientists discovered…”. Excessive use of such passive voice can easily drain the agency and sparkle from science writing.
This depopulated style was once the norm in many academic journals but even bastions of science such as Nature prefer the active voice. No longer should scientists write themselves out of their own manuscripts.
That said, a few funding agencies and journals still encourage the old style of science writing. For example, in hundreds of ARC Discovery Project summaries the word “we” occurs a mere 30 times. I’ve even seen guides for students encouraging the use of the passive voice. Nice to see that universities’ devotion to old traditions isn’t limited to dull lectures and silly graduation garments.
What’s a picture worth?
A scientist writing about science may well be forced to use images and plots. This obviously presents the risk of clear and concise communication. A picture is worth a thousand words? Wrong!
The key to unlocking a science image or plot is often in the caption. I can show you a plot of supernovae distances and velocities, but if you are unfamiliar with the plot and its conclusions it may tell you nothing. Its Nobel Prize-winning significance can remain hidden from view.
A caption can tell you what to look for, warn you about subtleties in the image, or just tell you what the axes represent. A poorly worded caption can guarantee that a picture tells far less than a thousand words. Alternatively, an overly long caption can bury key points in a wall of text.
And there are even more ways of keeping science out of the limelight with images and plots. Some scientists choose font sizes, symbols and colours that don’t work well when viewed on a screen. More than a dash of clutter can stymie insight too, ensuring images are understood by an ever smaller audience.
This image could tell you a lot about galaxies, but not with this perfunctory caption. Michael Brown / SDSS
Language
There are all sorts of ways scientists can hinder communication by misusing language. Unnecessary jargon and acronyms (UJAA) are an obvious starting point. Indeed, a recent study found that scientists committing fraud use more jargon than other scientists, presumably to obscure true understanding of their “research”.
Scientists can also water down the impact of their work with excessively cautious language. Or perhaps, it is possible they might potentially water down any likely impact of their preliminary study with language that could in some circumstances be consistent with excessive caution.
Scientists can antagonise their audiences too. Stating something is “obvious” or “clear” without any quantitative analysis is a good start. They may even want to ignore their data, so the text doesn’t match the analysis. Scientists may be pleasantly surprised at how often they can get away with this.
What I did on my science
An incredible labour-saving device is a slavish devotion to chronology. Some science writers don’t organise and synthesise, but just doggedly follow the time line. You may be familiar with this writing style from primary school essays, such as the timeless classic “what I did on my holiday”.
The pursuit of science is not particularly linear. There are methodological dead ends, repeated analyses, new questions and the random arrival of genuine insights. With the benefit of hindsight, a researcher would invariably do things differently, but they don’t need to share that hindsight with others.
Rather than summarising methodological dead ends, pages can be devoted to them, despite their marginal benefit to others. A slavish devotion to chronology allows scientists to get bogged down in method, rather than distractions such as motivations and findings.
Scientists can scatter the fundamental questions and key insights throughout their writing (ideally in the middle of paragraphs), which will then be overlooked by all but the most dedicated readers.
By being slavishly chronological, you can get bogged down in method and reduce the organisation of your science writing. J Mark Dodds/flickr
With these simple techniques scientists can resist the siren call of public engagement. Interest and insight can be avoided, keeping the public at arm’s length.
Indeed, with sufficient devotion to this turgid and disorganised writing style, scientists may even keep interest and insight hidden from themselves.
Clearing out my office in preparation for a faculty move, I am faced with the dilemma of what books to retain and what to discard. With non-fiction it is easy: keep any reference books that might prove useful in later life, such as the Oxford Guide to Philosophy or Primates of the World. But with fiction, particularly Australian fiction, it is harder to decide.
What lasts, I ask myself, what writing survives? The Guardian critic Jonathan Jones bemoaned in August that a middlebrow cult of the popular was holding literature to ransom. My colleague Ivor Indyk in the Sydney Review of Books added in September that it was in the giving of literary prizes that:
the cult of the middlebrow seems now to have established itself.
The academic Beth Driscoll entered the debate, with a recent, wide-ranging article on the middlebrow, with particular focus on three recent Australian novels, by Susan Johnson, Stephanie Bishop and Antonia Hayes. To which the authors in question last week published their responses, in part taking umbrage at the description of their work as middlebrow because, in Hayes’ words:
it implies an aesthetic pecking order, and is more often than not used in a derogatory way.
The distinctions between highbrow and middlebrow fiction are as old as literature itself.
In the 18th century, novel-reading was regarded as frivolous and morally suspicious. Real literature was to be found in religious tracts, epic poetry and mannered letters written by the nobility. It was the duty of learned men to uphold literary standards against the rising tide of middle-class tastes.
Even Dickens was considered by many of his contemporaries to be too middlebrow to be a serious writer, and Edmund Wilson wrote of Raymond Chandler that he “remained a long way below Graham Greene”.
“Literature is bunk,” Chandler replied, “written by fancy boys, clever-clever darlings, stream-of-consciousness ladies and gents and editorial novelists.”
Would I ever read Graham Greene again? Probably not, I decide – all that Catholic angst – but I keep two novels by Raymond Chandler. If we can’t trust our literary academics and critics, to whom, then, should we entrust the judgement of literary quality?
The only answer is the passage of time. What is valorised today might not be read in 50 years. Quintus Servinton, the first novel published in Australia, was written by a convict in 1830, but no one would ever describe it as literature. It survives for its historical value alone.
In a 2002 poll by the Norwegian Nobel Institute, 100 international authors, including Nobel prize-winners, chose Don Quixote (1605) as the most important book of all time, ahead of novels by Tolstoy and Dostoevsky. (No Australian writer made their list.)
Dostoevsky called it:

the final and the greatest expression of human thought, the bitterest irony that a human is capable of expressing …
And yet Don Quixote is neither highbrow nor middlebrow; if anything it is a satire on literary pretensions, on those genres — the epic and romance of chivalry — that preceded it. Yet for a long time Don Quixote was regarded as light literature and not worthy of serious study.
To talk about literature, therefore, we must ask what is literature? A common response is that it’s a force for change, or morally instructive, but these vague motherhood statements would exclude Rabelais, Henry Miller or the Marquis de Sade.
In a wonderful 2002 essay, Negotiating with the Dead, Margaret Atwood suggested there is only one question to be asked about any work — is it alive, or is it dead?
This is a far better measure of a novel or story’s worth than whether it is highbrow or lowbrow. The best fiction transcends brows. The playwright David Mamet suggested the purpose of literature is to delight.
“To create or endorse the Scholastic is a craven desire,” he wrote in 2000. “It may yield a low-level self-satisfaction, but how can this compare with our joy at great, generous writing.”
Great, generous writing that is alive. Now we are getting closer to answering the question: What lasts, what writing survives? And what books should I keep?
The only way a text can survive is through its interaction with a reader – “no matter how far away that reader may be from the writer in time and space,” Atwood wrote. Miguel de Cervantes died in 1616, yet his creation Don Quixote de la Mancha lives on four centuries later, and no-one today reads a pastoral romance.
We all know what dead writing is, for we encounter it every day in managerial speak, in those densely worded, multi-page documents about course intended learning outcomes, quality assurance mechanisms and international benchmarking activities. All around us dead sentences are falling on the living.
But the writing that survives, great literature, reminds us of our existence in this world, and our connection with other living things.
Tolstoy’s Anna Karenina (1877) was written 140 years ago and yet most of it remains alive and vivid. There is something presumptuous, if not pretentious, about the term “literary fiction”. One can’t imagine Tolstoy telling Chekhov at his country estate that he was writing “literary fiction” or “highbrow literature”.
On the contrary, Tolstoy held an aesthetic that required fiction to be morally improving and accessible to the widest public. It is not the morally improving aspects of Tolstoy’s prose that we appreciate today; rather, it is passages such as the scene of Konstantin Levin travelling through the Russian countryside, during which:
the tall grass softly twined around the wheels and the horse’s legs, leaving its seeds on the wet spokes and hubs.
This is what lasts, this is what survives, our joy of discovery at something simple and straightforward that connects us to the world. If the best fiction is a way of dealing with death, then it is also a way of learning about the inter-related nature of life.
One of the most interesting and entertaining parts of following my favourite authors on Twitter is witnessing a little bit of the writing process.
Getting a peek into how my favourite books are written is like watching a real-time behind-the-scenes DVD featurette. But not every update is a positive one. There’s something that haunts all writers, be they professional or amateur: writer’s block.
Writer’s block can be difficult to define, because no two people share the same experience of it. Probably the simplest and most straightforward definition comes from Dr. Patricia Huston:
a distinctly uncomfortable inability to write.
But what could be the cause of this vaguely described problem? Has a writer’s Muse simply deserted them, or can we find an explanation hidden somewhere in the brain?
The location of language
While there haven’t been any published scientific studies on people with writer’s block, we can take a few different avenues to try to determine what parts of the brain may be affected. One of those is looking at where words come from in the first place.
Language has traditionally been thought to be one of the few skills found in a very specific location in the brain: on the left side of the front part of the brain, fittingly called the frontal lobe.
This is called Broca’s area, named after the scientist who first reported that damage to this area led to the inability to form words, called aphasia. Since writer’s block is, fundamentally, an inability to write down words, this makes the frontal lobe an excellent place to start in researching the underpinnings of writer’s block.
The lateral view of Broca’s area. Database Center for Life Science, CC BY-ND
We can also look at writer’s block as an inability to come up with a story, be it fiction, non-fiction, or the story of how to program your remote. Most who experience writer’s block aren’t having trouble producing words – they simply can’t figure out what should happen next.
A small number of studies have looked at the concept of “story creation” and what areas of the brain might be involved. In one study from 2005, participants were presented with a set of three words and asked to create a story based around them.
On some trials, they were asked to “be creative” and on others to “be uncreative”.
When this task was done in an fMRI scanner, which measures blood flow to different regions of the brain as an indicator of increased or decreased activity, there was a significant increase in activity in the prefrontal cortex.
This increased activity was seen not just on the left side, where Broca’s area is located, but also in the right prefrontal cortex. Some of these areas, such as the anterior cingulate cortex, are associated with making associations between unrelated concepts – a critical skill for a great writer.
In another study, from 2013, participants were asked to actually write a story while in the fMRI scanner. They were given the first 30 words of a familiar text, asked to brainstorm a continuation of that text, and then given two minutes to physically write out their story. These stories were then scored based on creativity and measured against the brain activity data generated while in the scanner.
Both the “brainstorming” and “creative writing” portions of the experiment showed strong increases in brain activity in the frontal lobe, particularly in the language areas.
In the “brainstorming” condition, the subregions involved included those associated with planning and control, whereas many of those regions involved in the “creative writing” condition were involved with memory and the motor areas related to the physical act of writing.
So when we speak of writer’s block, we may actually be talking about a “creation block” – the inability to make the connections and the plans that allow creative writing to occur.
So we’ve got an idea of where writer’s block is happening – but what can you do to fight against it? There’s no pill you can take to make it go away, but there are some simple things that you can try to loosen up your frontal lobe, all recommended by Dr. Huston in 1998:
Read someone else’s writing. Studies have shown that people are more creative when they’re exposed to the creative ideas of others. Just make sure you’re only inspired by their writing and not copying from it.
Break the work down into pieces. If you can’t get the introduction to flow the way you want it to, try something in the middle. Check off each part as you finish so you can get an accurate sense of how much you’ve completed.
Write without stopping. Try writing a whole draft without going back and re-reading what you’ve written. Some of it may not be great, but I bet a lot of it will be usable. At the very least, it will give you a place to start.
Plan breaks into your writing schedule. Many swear by the pomodoro technique, but find a rhythm that works for you. Go for a walk or grab a meal with friends or watch that video of the puppy that can’t roll over (a personal favorite). Relaxing will make it easier to get back into the writing spirit.
Don’t procrastinate. The more you put off what you have to write, the more anxiety you’ll feel. This is always my stumbling block (and why I’ve watched half of the second season of Fringe while writing this).
Ultimately, be kind to yourself. You’re not the first to go through this and you’re not the last. Being stuck doesn’t make you a bad writer or a bad person. It makes you a human being with a flawed (but marvellous) brain.