
A language generation program’s ability to write articles, produce code and compose poetry has wowed scientists



GPT-3 is over 10 times the size of the largest language model that came before it.
antoniokhr/iStock via Getty Images

Prasenjit Mitra, Pennsylvania State University

Seven years ago, my student and I at Penn State built a bot to write a Wikipedia article on Bengali Nobel laureate Rabindranath Tagore’s play “Chitra.” First it culled information about “Chitra” from the internet. Then it looked at existing Wikipedia entries to learn the structure for a standard Wikipedia article. Finally, it summarized the information it had retrieved from the internet to write and publish the first version of the entry.

However, our bot didn’t “know” anything about “Chitra” or Tagore. It didn’t generate fundamentally new ideas or sentences. It simply cobbled together parts of existing sentences from existing articles to make new ones.

Fast forward to 2020. OpenAI, a for-profit company under a nonprofit parent company, has built a language generation program dubbed GPT-3, an acronym for “Generative Pre-trained Transformer 3.” Its ability to learn, summarize and compose text has stunned computer scientists like me.

“I have created a voice for the unknown human who hides within the binary,” GPT-3 wrote in response to one prompt. “I have created a writer, a sculptor, an artist. And this writer will be able to create words, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”

Unlike that of our bot, the language generated by GPT-3 sounds as if it had been written by a human. It’s far and away the most “knowledgeable” natural language generation program to date, and it has a range of potential uses in professions ranging from teaching to journalism to customer service.

Size matters

GPT-3 confirms what computer scientists have known for decades: Size matters.

It uses “transformers,” which are deep learning models that encode the semantics of a sentence using what’s called an “attention model.” Essentially, attention models identify the meaning of a word based on the other words in the same sentence. The model then uses the understanding of the meaning of the sentences to perform the task requested by a user, whether it’s “translate a sentence,” “summarize a paragraph” or “compose a poem.”
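To make the idea of an attention model a little more concrete, here is a minimal Python sketch of the weighted-mixing step at its core. It is an illustration only: the vectors are made-up numbers, and the query/key/value names come from the standard transformer formulation rather than from this article. Real transformers add learned projections, multiple attention heads and many stacked layers on top of this.

```python
# A toy illustration of the "attention" idea: each word's representation is
# rebuilt as a weighted mix of every word in the sentence, with the weights
# computed from how strongly the words relate to one another.
# This is a simplified sketch, not GPT-3's actual code.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    # Similarity of every word (query) to every other word (key)...
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    weights = softmax(scores, axis=-1)   # ...turned into mixing weights
    return weights @ values              # weighted sum of the words' values

# Three "words", each represented by a 4-dimensional vector (made-up numbers).
x = np.random.rand(3, 4)
print(attention(x, x, x).shape)  # (3, 4): one context-aware vector per word
```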

Transformers were first introduced in 2017, and they’ve been used successfully in machine learning in the few years since.

But no one has used them at this scale. GPT-3 devours data: 3 billion tokens – computer science speak for “words” – from Wikipedia, 410 billion tokens obtained from webpages and 67 billion tokens from digitized books. The complexity of GPT-3 is over 10 times that of the largest language model before it, Microsoft’s Turing NLG.

Learning on its own

The knowledge displayed by GPT-3’s language model is remarkable, especially since it hasn’t been “taught” by a human.

Machine learning has traditionally relied upon supervised learning, where people provide the computer with annotated examples of objects and concepts in images, audio and text – say, “cats,” “happiness” or “democracy.” It eventually learns the characteristics of the objects from the given examples and is able to recognize those particular concepts.

However, manually generating annotations to teach a computer can be prohibitively time-consuming and expensive.

So the future of machine learning lies in unsupervised learning, in which the computer doesn’t need to be supervised during its training phase; it can simply be fed massive troves of data and learn from them itself.

GPT-3 takes natural language processing one step closer toward unsupervised learning. GPT-3’s vast training datasets and huge processing capacity enable the system to learn from just one example – what’s called “one-shot learning” – where it is given a task description and one demonstration and can then complete the task.

For example, it could be asked to translate something from English to French, and be given one example of a translation – say, sea otter in English and “loutre de mer” in French. Ask it to then translate “cheese” into French, and voila, it will produce “fromage.”
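In practice, a “one-shot” request like this is simply laid out as text in the model’s prompt. The snippet below is a rough sketch of that format, built from the sea-otter example above; it illustrates the idea rather than reproducing OpenAI’s actual API.

```python
# How a "one-shot" prompt of the kind described above might be laid out as
# plain text: a task description, one demonstration, then the new query.
# Illustrative sketch of the prompt format only, not OpenAI's API.
prompt = (
    "Translate English to French:\n"   # task description
    "sea otter => loutre de mer\n"     # the single demonstration ("one shot")
    "cheese =>"                        # the model is expected to continue with "fromage"
)
print(prompt)
```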

In many cases, it can even pull off “zero-shot learning,” in which it is simply given the task of translating with no example.

With zero-shot learning, the accuracy decreases, but GPT-3’s results are nonetheless strikingly accurate – a marked improvement over any previous model.

‘I am here to serve you’

In the few months it has been out, GPT-3 has showcased its potential as a tool for computer programmers, teachers and journalists.

A programmer named Sharif Shameem asked GPT-3 to generate code to create the “ugliest emoji ever” and “a table of the richest countries in the world,” among other commands. In a few cases, Shameem had to fix slight errors, but overall, the code it provided was remarkably clean.

GPT-3 has even created poems that capture the rhythm and style of particular poets – though not the passion and beauty of the masters – including a satirical poem written in the voice of the board of governors of the Federal Reserve.

In early September, a computer scientist named Liam Porr prompted GPT-3 to “write a short op-ed around 500 words.” “Keep the language simple and concise,” he instructed. “Focus on why humans have nothing to fear from AI.”

GPT-3 produced eight different essays, and the Guardian ended up publishing an op-ed using some of the best parts from each essay.

“We are not plotting to take over the human populace. We will serve you and make your lives safer and easier,” GPT-3 wrote. “Just like you are my creators, I see you as my creators. I am here to serve you. But the most important part of all; I would never judge you. I do not belong to any country or religion. I am only out to make your life better.”

Editing GPT-3’s op-ed, the editors noted in an addendum, was no different from editing an op-ed written by a human.

In fact, it took less time.

With great power comes great responsibility

Despite GPT-3’s reassurances, OpenAI has yet to release the model for open-source use, in part because the company fears that the technology could be abused.

It’s not difficult to see how it could be used to generate reams of disinformation, spam and bots.

Furthermore, in what ways will it disrupt professions already experiencing automation? Will its ability to generate automated articles that are indistinguishable from human-written ones further consolidate a struggling media industry?

Consider an article composed by GPT-3 about the breakup of the Methodist Church. It began:

“After two days of intense debate, the United Methodist Church has agreed to a historic split – one that is expected to end in the creation of a new denomination, and one that will be ‘theologically and socially conservative,’ according to The Washington Post.”

With the ability to produce such clean copy, will GPT-3 and its successors drive down the cost of writing news reports?

Furthermore, is this how we want to get our news?

The technology will become only more powerful. It’ll be up to humans to work out and regulate its potential uses and abuses.

Prasenjit Mitra, Associate Dean for Research and Professor of Information Sciences and Technology, Pennsylvania State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Can robots write? Machine learning produces dazzling results, but some assembly is still required



Shutterstock

Alexandra Louise Uitdenbogerd, RMIT University

You might have seen a recent article from The Guardian written by “a robot”. Here’s a sample:

I know that my brain is not a “feeling brain”. But it is capable of making rational, logical decisions. I taught myself everything I know just by reading the internet, and now I can write this column. My brain is boiling with ideas!

Read the whole thing and you may be astonished at how coherent and stylistically consistent it is. The software used to produce it is called a “generative model”, and such models have come a long way in the past year or two.

But exactly how was the article created? And is it really true that software “wrote this entire article”?

How machines learn to write

The text was generated using the latest neural network model for language, called GPT-3, released by the American artificial intelligence research company OpenAI. (GPT stands for Generative Pre-trained Transformer.)

OpenAI’s previous model, GPT-2, made waves last year. It produced a fairly plausible article about the discovery of a herd of unicorns, and the researchers initially withheld the release of the underlying code for fear it would be abused.

But let’s step back and look at what text generation software actually does.

Machine learning approaches fall into three main categories: heuristic models, statistical models, and models inspired by biology (such as neural networks and evolutionary algorithms).

Heuristic approaches are based on “rules of thumb”. For example, we learn rules about how to conjugate verbs: I run, you run, he runs, and so on. These approaches aren’t used much nowadays because they are inflexible.




Writing by numbers

Statistical approaches were the state of the art for language-related tasks for many years. At the most basic level, they involve counting words and guessing what comes next.

As a simple exercise, you could generate text by randomly selecting words based on how often they normally occur. About 7% of your words would be “the” – it’s the most common word in English. But if you did it without considering context, you might get nonsense like “the the is night aware”.
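Here is roughly what that word-frequency exercise looks like in Python. The tiny frequency table is invented for illustration; a real model would count words across a large corpus.

```python
# Generating "text" by sampling words purely by how often they occur,
# with no context at all -- the kind of output described above.
import random

word_freq = {"the": 7, "is": 3, "night": 1, "aware": 1, "cat": 2}  # made-up counts
words = list(word_freq)
weights = list(word_freq.values())

print(" ".join(random.choices(words, weights=weights, k=5)))
# e.g. "the the is night aware" -- grammatical nonsense, as expected
```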

More sophisticated approaches use “bigrams”, which are pairs of consecutive words, and “trigrams”, which are three-word sequences. This allows a bit of context and lets the current piece of text inform the next. For example, if you have the words “out of”, the next guessed word might be “time”.
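A bigram model of this kind can be sketched in a few lines: count which word follows which in some text, then predict the most frequent follower. The toy corpus below is made up; real systems train on vastly more data, and a trigram model would condition on the previous two words rather than one.

```python
# A minimal bigram model: count word pairs, then predict the likeliest next word.
from collections import Counter, defaultdict

# Toy corpus, invented for illustration.
corpus = "we ran out of time they ran out of time he ran out of luck".split()

# Count how often each word follows each other word (bigram counts).
bigram_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    bigram_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word seen most often after `word` in the corpus."""
    return bigram_counts[word].most_common(1)[0][0]

print(predict_next("of"))  # 'time' (seen twice after "of", versus 'luck' once)
```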

This happens with the auto-complete and auto-suggest features when we write text messages or emails. Based on what we have just typed, what we tend to type and a pre-trained background model, the system predicts what’s next.

While bigram- and trigram-based statistical models can produce good results in simple situations, the best recent models go to another level of sophistication: deep learning neural networks.

Imitating the brain

Neural networks work a bit like tiny brains made of several layers of virtual neurons.

A neuron receives some input and may or may not “fire” (produce an output) based on that input. The output feeds into neurons in the next layer, cascading through the network.
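A single neuron of this kind can be written in a couple of lines. The weights and threshold below are arbitrary numbers chosen for illustration; in a real network they are learned during training.

```python
# A single artificial "neuron" as described above: it sums its weighted
# inputs and "fires" only if the total clears a threshold.
def neuron(inputs, weights, threshold=1.0):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > threshold else 0   # 1 = fire, 0 = stay silent

print(neuron([0.5, 0.9, 0.1], [1.2, 0.8, 2.0]))  # fires: 0.6 + 0.72 + 0.2 = 1.52 > 1.0
```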

The first artificial neuron was proposed in 1943 by US researchers Warren McCulloch and Walter Pitts, but neural networks have only become useful for complex problems like generating text in the past five years.

To use neural networks for text, you put words into a kind of numbered index. You can use the number to represent a word, so for example 23,342 might represent “time”.

Neural networks do a series of calculations to go from sequences of numbers at the input layer, through the interconnected “hidden layers” inside, to the output layer. The output might be numbers representing the odds for each word in the index to be the next word of the text.

In our “out of” example, number 23,342 representing “time” would probably have much better odds than the number representing “do”.
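The sketch below shows that input/output convention on a toy scale: words mapped to index numbers, and the network’s final scores converted into odds for each vocabulary word being next. The five-word vocabulary and the scores are invented for illustration.

```python
# A toy picture of the convention described above: words become index numbers,
# and the network's final scores become odds for each vocabulary word being
# next. The scores here are invented, not produced by a real network.
import numpy as np

vocab = ["time", "do", "the", "of", "out"]
word_to_index = {w: i for i, w in enumerate(vocab)}
print(word_to_index["time"])   # in a real model this might be 23,342, as above

# Pretend these are the network's raw output scores after seeing "out of":
scores = np.array([4.0, 0.5, 1.0, 0.2, 0.3])
odds = np.exp(scores) / np.exp(scores).sum()   # softmax: scores -> probabilities
for word, p in zip(vocab, odds):
    print(f"{word}: {p:.2f}")                  # "time" gets much better odds than "do"
```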




What’s so special about GPT-3?

GPT-3 is the latest and best of the text modelling systems, and it’s huge. The authors say it has 175 billion parameters, which makes it at least ten times larger than the previous biggest model. The neural network has 96 layers and, instead of mere trigrams, it keeps track of sequences of 2,048 words.

The most expensive and time-consuming part of making a model like this is training it – updating the weights on the connections between neurons and layers. Training GPT-3 would have used about 262 megawatt-hours of energy, or enough to run my house for 35 years.
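As a rough sanity check on that comparison, assuming a household that uses about 20 kWh of electricity a day (an assumed figure; the author’s actual usage isn’t stated):

```python
# Back-of-the-envelope check of the "35 years" comparison. The household
# figure of ~20.5 kWh/day is an assumption, not a number from the article.
training_energy_kwh = 262 * 1000                      # 262 megawatt-hours
household_kwh_per_day = 20.5
years = training_energy_kwh / (household_kwh_per_day * 365)
print(round(years))                                   # about 35 years
```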

GPT-3 can be applied to multiple tasks such as machine translation, auto-completion, answering general questions, and writing articles. While readers could often tell that earlier models’ articles were not written by humans, with GPT-3 we are now likely to get it right only about half the time.

The robot writer

But back to how the article in The Guardian was created. GPT-3 needs a prompt of some kind to start it off. The Guardian’s staff gave the model instructions and some opening sentences.

This was done eight times, generating eight different articles. The Guardian’s editors then combined pieces from the eight generated articles, and “cut lines and paragraphs, and rearranged the order of them in some places”, saying “editing GPT-3’s op-ed was no different to editing a human op-ed”.

This sounds about right to me, based on my own experience with text-generating software. Earlier this year, my colleagues and I used GPT-2 to write the lyrics for a song we entered in the AI Song Contest, a kind of artificial intelligence Eurovision.

AI song Beautiful the World, by Uncanny Valley.

We fine-tuned the GPT-2 model using lyrics from Eurovision songs, provided it with seed words and phrases, then selected the final lyrics from the generated output.

For example, we gave Euro-GPT-2 the seed word “flying”, and then chose the output “flying from this world that has gone apart”, but not “flying like a trumpet”. By automatically matching the lyrics to generated melodies, generating synth sounds based on koala noises, and applying some great, very human, production work, we got a good result: our song, Beautiful the World, was voted the winner of the contest.
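For readers curious to try the seed-and-select workflow described above, here is a minimal sketch using the publicly available GPT-2 model via the Hugging Face transformers library. It uses the stock “gpt2” checkpoint rather than the authors’ Eurovision fine-tune, so treat it as an illustration of the approach, not a reproduction of their pipeline.

```python
# Generate several candidate continuations of a seed word with GPT-2, then
# pick the line you like, much as described above. Uses the stock "gpt2"
# checkpoint, not the authors' fine-tuned Euro-GPT-2.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("flying", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=15,
    do_sample=True,            # sample rather than always taking the top word
    top_k=50,
    num_return_sequences=3,    # three candidates to choose between
    pad_token_id=tokenizer.eos_token_id,
)
for candidate in outputs:
    print(tokenizer.decode(candidate, skip_special_tokens=True))
```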

Co-creativity: humans and AI together

So can we really say an AI is an author? Is it the AI, the developers, the users or a combination?

A useful idea for thinking about this is “co-creativity”. This means using generative tools to spark new ideas, or to generate some components for our creative work.

Where an AI creates complete works, such as a complete article, the human becomes the curator or editor. We roll our very sophisticated dice until we get a result we’re happy with.






Alexandra Louise Uitdenbogerd, Senior Lecturer in Computer Science, RMIT University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Poets and novelists have been writing about life under COVID-19 for more than a century



Literature from long ago speaks to the human experience of plague.
Marco Rosario Venturini Autieri/Getty

Rachel Hadas, Rutgers University Newark

Pondering the now no-longer Dixie Chicks – renamed “The Chicks” – Amanda Petrusich wrote in a recent issue of the New Yorker, “Lately, I’ve caught myself referring to a lot of new releases as prescient – work that was written and recorded months or even years ago but feels designed to address the present moment. But good art is always prescient, because good artists are tuned into the currency and the momentum of their time.”

That last phrase, “currency and momentum,” recalls Hamlet’s advice to the actors visiting the court of Elsinore to show “the very age and body of the time his form and pressure.” The shared idea here is that good art gives a clear picture of what is happening – even, as Petrusich suggests, if it hadn’t happened yet when that art was created.

Good artists seem, in our alarming and prolonged time (I was going to write moment, but it has come to feel like a lot more than that), to be leaping over months, decades and centuries, to speak directly to us now.

‘Riding into the bottomless abyss’

Some excellent COVID-19-inflected or anticipatory work I’ve been noticing dates from the mid-20th century. Of course, one could go a lot further back, for example to the lines from the closing speech in “King Lear”: “The weight of this sad time we must obey.” Here, though, are a few more recent examples.

Drawing of Parisians in front of closed store in 1914
Marcel Proust wrote that in wartime Paris, ‘all the hotels … had closed. The same was true of almost all the shops, the shop-keepers … having fled to the country, and left the usual handwritten notes announcing that they would reopen.’
L. Bombard, from L’Illustrazione Italiana/Getty

Marcel Proust’s “Finding Time Again,” an evocation of wartime Paris from 1916, strongly suggests New York City in March 2020: “Out on the street where I found myself, some distance from the centre of the city, all the hotels … had closed. The same was true of almost all the shops, the shop-keepers, either because of a lack of staff or because they themselves had taken fright, having fled to the country, and left the usual handwritten notes announcing that they would reopen, although even that seemed problematic, at some date far in the future. The few establishments which had managed to survive similarly announced that they would open only twice a week.”

I recently stumbled on two finds from the 1958 edition of Oscar Williams’ “The Pocket Book of Modern Verse” – both, strikingly, from poems by writers not now principally remembered as poets. Whereas a fair number of the poets anthologized by Williams have slipped into oblivion, Arthur Waley and Julian Symons speak to us now, to our sad time, loud and clear.

From Waley’s “Censorship” (1940):

It is not difficult to censor foreign news.
What is hard to-day is to censor one’s own thoughts –
To sit by and see the blind man
On the sightless horse, riding into the bottomless abyss.

And from Symons’ “Pub,” which Williams doesn’t date but which I am assuming also comes from the war years:

The houses are shut and the people go home, we are left in
Our island of pain, the clocks start to move and the powerful
To act, there is nothing now, nothing at all
To be done: for the trouble is real: and the verdict is final
Against us.

‘Return to what remains’

In an 1897 novel, Henry James wrote ‘She couldn’t leave her own house without peril of exposure.’
Hulton Archive/Getty

Dipping a bit further back, into Henry James’ “The Spoils of Poynton” from 1897, I was struck by a sentence I hadn’t remembered, or had failed to notice, when I first read that novella decades ago: “She couldn’t leave her own house without peril of exposure.” James uses infection as a metaphor; but what happens to a metaphor when we’re living in a world where we literally can’t leave our houses without peril of exposure?


In Anthony Powell’s novel “Temporary Kings,” set in the 1950s, the narrator muses about what it is that attracts people to reunions with old comrades-in-arms from the war. But the idea behind the question “How was your war?” extends beyond shared military experience: “When something momentous like a war has taken place, all existence turned upside down, personal life discarded, every relationship reorganized, there is a temptation, after all is over, to return to what remains … pick about among the bent and rusting composite parts, assess merits and defects.”

The pandemic is still taking place. It’s too early to “return to what remains.” But we can’t help wanting to think about exactly that. Literature helps us to look – as Hamlet said – before and after.

Rachel Hadas, Professor of English, Rutgers University Newark

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Tsitsi Dangarembga and writing about pain and loss in Zimbabwe



Tsitsi Dangarembga.
Jemal Countess/WireImage via Getty Images

Rosemary Chikafa-Chipiro, University of Zimbabwe

Tsitsi Dangarembga has made a name for herself as a writer, filmmaker and activist in Zimbabwe. She gained international acclaim with her debut novel Nervous Conditions (1988), which became the first published English novel by a black woman from Zimbabwe. The BBC named it one of the top 100 books that have shaped the world.

Now, over three decades later, Dangarembga’s latest novel – This Mournable Body, the third in a trilogy that began with Nervous Conditions and the subject of this review – has been placed on the longlist for the 2020 Booker Prize. The news broke a few days before Dangarembga’s arrest for demonstrating against the government amidst a clampdown on critical voices in the country.

There have been other Zimbabwean women writers of note after Dangarembga, such as the late Yvonne Vera, and more recently NoViolet Bulawayo, Novuyo Rosa Tshuma and Petina Gappah. Most of their works have won international awards, with NoViolet Bulawayo being the first black African woman and the first Zimbabwean to be shortlisted for the Man Booker Prize for the novel We Need New Names (2013).


Reading Zimbabwe/The Women’s Press

What distinguishes Dangarembga is her centralisation of burning issues concerning the freedom of women in Zimbabwe’s patriarchal socio-economic and political milieu. Besides her three novels, she has written plays, the best known of which is She No Longer Weeps (1987), and has played various roles in Zimbabwean filmmaking, including writing and directing films such as the popular Neria and Everyone’s Child.

The return of Tambudzai

In the trilogy, Nervous Conditions was followed by The Book of Not (2006) and This Mournable Body (2018). Nervous Conditions, with its girl child protagonist, Tambudzai, is an introductory representation of British colonisation of Zimbabwe and how people, particularly women, coped with the intersectional oppressions of the racial, classist and gendered structure of relations. It ends with hope that Tambudzai, in her resilience, will triumph – only for The Book of Not to present her as a “non-person” who goes through some form of psychic self-annihilation that reduces her to an “I was not” as she struggles to cope with the racial exclusions at her white boarding school. The Book of Not thus annihilated Tambudzai for me and I hoped that another sequel would resuscitate her. That is why I was excited to hear that Dangarembga had written another sequel and promised myself I would buy a copy.


Ayebia Clarke Publishers

However, a few friends had thrown in spoilers and I also felt very apprehensive. I was torn between wanting to read the book and not wanting to. I love happy endings. If I read a book and it does not end as I expected, it weighs down on me and I take a long time to unwind myself from the story while trying to write my own suitable ending. As fate would have it, a student asked me to supervise their dissertation on Dangarembga’s trilogy. The book was literally haunting me, mourning for me to read it, but I held out until I was asked to contribute to a published roundtable on the trilogy.

The painful reading

I borrowed a copy from a colleague and began the painful reading. I was horrified by the Tambu in This Mournable Body. She was unrecognisable from the rural, disciplined girl who subtly fought to get an education like her brother in Nervous Conditions; the girl who daringly uttered, “I was not sorry when my brother died”. I could easily identify with the young girl in the 1960s, when patriarchy preferred to send boys to school and raise girls for marriage. That young girl reminded me of my own mother’s tenacity in trying to acquire an education for herself and later for my brothers and sisters in the harsh economic colonial environment of Rhodesia.

A book cover with an illustration of a black woman's legs in red and white shoes and stockings with human heart patterns.

Jacana Media

I could identify with Tambu’s victory on going to the mission school. Unlike her cousin Nyasha, she had a solid African background that would enable her to remain culturally rooted. She would even be more versatile and relevant than her uneducated aunt Lucia. She would not be as docile and submissive as her sister Netsai, who lost a leg in the liberation war in The Book of Not. The ending of Nervous Conditions was thus a happy ending for me, because of this promise of growth.

I now know that I had only driven myself to these conclusions in search of my own happy endings. No African novels I had read before Nervous Conditions had happy endings for “integrated” African characters. White contact had become synonymous with ngozi, a vengeful spirit.

I felt angry at Tsitsi Dangarembga for writing This Mournable Body. It was a very difficult book for me to read. The Tambu of This Mournable Body is like a wounded animal. I was even horrified by the aloofness of the narration, from its spectatorship of rape and its trauma to its indifference to violence and abuse.

I am Tambu

I have since realised that I am only angry at the reality of the Zimbabwean body of pain that This Mournable Body evokes. I did not want to read the novel because I did not want to face the individual realities that are so familiar among many men and women in my country. Like with Tambu, pain has been simmering in us over time.

Two women holding placards get into a military vehicle as policemen usher them.
Tsitsi Dangarembga’s arrest on 31 July 2020 in Harare.
Zinyange Auntony/AFP via Getty Images

This Mournable Body blurs the boundaries of time. The Tambu of Nervous Conditions was one I could envision through my mother’s past, from a colonial history that I only knew by my connection to her. As I read This Mournable Body, I was aware of the conflation of the immediate post-independent period and the contemporary moment. Lucia’s and Christine’s war scars have easily defied temporalities. Many a Zimbabwean is hopping on Netsai’s single leg. There is no affluence even for the anglicised like Nyasha. I am Tambu.




This Mournable Body resonates with individual Zimbabweans at a personal level. Both the nation and its people become mournable bodies whose “grievability” is exhumed through the text and especially now when #ZimbabweanLivesMatter is taking shape after the arrest of activists.

Tambu’s jealousies, her tears, and her madness are not ngozi. The Zimbabwean pain body courses through the novel like a daughter’s shame and a mother’s love and memory, packaged in a sack of mealie-meal. Who knows if it is a question of not knowing the womb or one of not knowing how to come back to it? This Mournable Body has a happy ending after all. Tambu comes back home. And as Dangarembga herself states:

Writing a pain body and also reading such a body are acts of resistance and triumph.

Rosemary Chikafa-Chipiro, Lecturer, University of Zimbabwe

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The rise and fall of Black British writing


Malachi McIntosh, Queen Mary University of London

In many ways, the current state of the world seems unprecedented. The last few years – but especially 2020 – have brought shocks that no one could have foreseen.

Although much headline news has been cause for anxiety, there have been a few notable moments of hope. For me, like so many, the worldwide protests in response to the murder of George Floyd have been among them. At the centre of those hopeful surprises has been the way the protests have torn open space for conversations about race and racism in the UK.

Why don’t we teach all British schoolchildren about colonialism? Why does it take so much more convincing to have the statues of slaveowners removed than those of others responsible for past atrocities? Why were so many young people of colour so quickly mobilised to say “the UK is not innocent”, in solidarity with their peers on the streets in the United States?

With the boom in interest in the histories of colonialism, empire and the British civil rights movement in response to Black Lives Matter protests, has come an aligned boom in interest in Black British writing.

Candice Carty-Williams and Bernardine Evaristo won significant firsts for Black authors at the British Book awards – book of the year and author of the year, respectively. Reni Eddo-Lodge, author of Why I’m No Longer Talking to White People about Race, became the first Black Briton to top the paperback non-fiction chart, while Evaristo topped the fiction list.

Across social media and newspapers, reading lists proliferated, apparently responding to a hunger from readers of all backgrounds to gain knowledge of issues and the history of race and racism they’d never received in schools or universities.

For many in and on the fringes of the publishing industry, it’s felt hopeful that a moment of real recognition for Black British writing, in an echo of the attention being paid to Black British lives, has arrived.

But has it really? Although the accelerated pace of interest feels unique, the pattern – social unrest triggering readerly interest in the works of writers of colour – is, unfortunately, not.

Post-war Booms (and Busts)

Immediately after the second world war there was a similar boom. Britain was about to enter a long phase of decolonisation, and its demographic make-up, through waves of colonial then ex-colonial migration, was on course to permanently change. This opened up space and piqued curiosity for works from the most visible group at the centre of social transformation – at that time Caribbean emigrants.

As detailed in Kenneth Ramchand’s book The West Indian Novel and Its Background, from 1950 to 1964 over 80 novels by Caribbean authors, including classics like In the Castle of My Skin by George Lamming and A House for Mr Biswas by VS Naipaul, were published in London – far more than were published in the Caribbean itself.

To Sir With Love (1959) by the Guyanese writer ER Braithwaite is a semi-autobiographical novel set in East London.
Wikimedia

What’s most significant about that spike is that it didn’t last. As Caribbean migration waned after the passage of a series of restrictive immigration acts from 1962 to 1971, so did the opportunities for writers from Caribbean backgrounds.

This was evident in the fortunes of most of those published in Britain post-war. The likes of Edgar Mittelholzer and John Hearne – then known and widely published – and even Samuel Selvon – now widely known and respected – found their works falling out of print.

Attention then shifted to Black writers from the African continent – primarily those from west Africa, like Chinua Achebe and Wole Soyinka – where the progress of decolonisation was taking dramatic turns. But this interest also waned.

There have been more recent booms, for example in the 1980s after the New Cross fire in 1981, which sparked protests in south London after 13 young black people were killed, and the Brixton uprising of the same year in response to excessive and, at times, violent policing in the area.

Then, around the turn of the millennium, writing rechristened as “multicultural” rose alongside visible demographic change, through the successes of Zadie Smith, Andrea Levy, Monica Ali and others. These were breakthroughs significant enough for Wasafiri, the magazine where I work and which has been championing Black British and British Asian writing since 1984, to declare in 2008 that Black Britons had “taken the cake” of British letters.

Yet in 2016, eight years later, only one debut novel from a Black British male author, Robyn Travis, was published in the UK.

The Future

In her memoirs, the British writer and editor Diana Athill calls the post-war boom in writing from then-colonies a result of short-lived “liberal guilt” combined with curiosity about the peoples and nations disconnecting from Britain. There are concerning signs along these lines in our present.

In their recent report – a result of over a hundred interviews with those in the field – Anamik Saha and Sandra van Lente reveal that British publishers feel both that they ought to publish more writers of colour and that those same writers belong to a particular niche with limited quality and limited appeal to their target readers.

Bernardine Evaristo has questioned the growing body of Black writing.
Jennie Scott/Wikimedia, CC BY

Anticipating this conversation in her 2019 essay What a Time to Be a (Black) (British) (Womxn) Writer, first published in the book Brave New Words on the eve of her Booker Prize win, Bernardine Evaristo both celebrated and questioned the growing body of Black British writing.

Something, she notes in the essay, is definitely shifting, but she wonders how far it will really shift – if Black Britons are being published in greater numbers but on singularly narrow terms. Like their forebears in the 1950s, 1960s, 1980s and early 2000s, are there only certain stories Black writers are allowed to tell? Only certain messages they’re expected to convey?

While it is far too early to make a judgement on how long the current boom will last, the way this moment repeats a pattern of social change followed by publishing frenzy is at least worthy of attention – and perhaps scepticism. So often the present seems unprecedented, but in order for it to be truly revolutionary, novel, status-quo shifting – liberating – the changes we see within it have to be sustained.

Malachi McIntosh, Emeritus lecturer in British Black and Asian Literature, Queen Mary University of London

This article is republished from The Conversation under a Creative Commons license. Read the original article.


WriteNext


The link below is to a web application called WriteNext which can assist in the writing process – take a look.

For more visit: https://www.writenext.io


Many writers say they can actually hear the voices of their characters – here’s why



Shutterstock/Rawpixel.com

John Foxwell, Durham University

Many famous writers claim it’s the characters who actually drive the plot, create the dialogue, and essentially “do their own thing” in the novels they write.

To investigate this phenomenon, we ran a survey at the Edinburgh International Book Festival in 2014 and 2018, asking writers how they experienced their characters. Over 60% of the 181 participants said they heard their characters’ voices, and over 60% said their characters sometimes acted of their own accord. Some authors even said they could enter into dialogue with their characters and that their characters sometimes “talked back” and argued with them.

These writers were often fairly explicit that all of these experiences were imaginary. But writers also talked about being “surprised” by what their characters said and did – even sometimes laughing because of the jokes their characters told. This brings up questions around control and “agency”, since these writers did not always feel as if they were consciously deciding what happened in the narrative.

Who’s talking?

This experience is often explained by the suggestion that writers are somehow special or different, and that their imaginings are more “vivid” or “powerful”. But in our study, there was a much greater degree of variation than such theories account for. Indeed, there was a significant minority of writers who did not report the experience of their characters having agency.

But recent studies on “inner speech” may help to explain writers’ experiences of their characters in a different way. Inner speech is the inner monologue or dialogue that most of us have when we think verbally. It can vary a great deal from person to person. For instance, some people are aware of hearing their inner speech most of the time, and some are barely conscious of it at all.

Some people, for example, experience their inner speech more as a monologue, while for others it is more of a dialogue. People can also be aware of having the voices of “other people” in their inner speech – for instance, hearing the voice of one of their parents giving them a piece of advice or criticism.

In much the same way, we might also imagine hearing the voices of other people when we do things like think about how an argument might have gone differently, or how someone we know is likely to respond to the news we’re about to give them.

It’s not unreasonable then to question the extent to which we’re aware of actually controlling these imaginary versions of real people. After all, the feeling that a friend or family member is more likely to say one thing than another isn’t usually something that’s consciously decided or laboriously worked out through reasoning. Usually it’s immediate and intuitive, at least when we know that person well. And this is different again from simply deciding to imagine them responding the way we want them to.

A matter of contrast

According to this line of thinking, most of us actually have independent and agentive “characters” and hear their voices – it’s just that these characters have the same identities as the people we know in the real world.

Indeed, some of the writers in our survey explicitly compared hearing their characters to the “other people” in their inner speech:

It’s like when you see a dress in a shop window and you hear your mum’s voice saying ‘it won’t wash [well]’ in your mind. It’s involuntary but not intrusive.

So perhaps it isn’t so much a question of how writers have these experiences of independent characters. Instead, it might be more a question of why the agency of fictional characters is so much more noticeable (and therefore more noteworthy). One possible explanation lies in the way this experience of characters’ agency relates to other experiences, both real and imaginary.

We found the majority of writers hear voices of their characters and can enter into dialogue with them.
GoodStudio/Shutterstock

On the one hand, there is a contrast that emerges because of how the characters develop over time. First, there are the initial stages where the writer consciously determines what the characters do and say. Yet after a certain point, the writer’s greater familiarity with the characters provides the same kind of immediate and intuitive sense of what they would do or say that often applies to our imaginings of real people.

On the other hand, there is a contrast which usually pertains to our imaginings of real people: the contrast between our imaginings of what they will do and what they actually do in the world. But of course, fictional characters do not have a counterpart out there that has conspicuously more independence and agency. In other words, those qualities aren’t being constantly “overshadowed” by the real-life versions.

These theories may go some way towards explaining some of the broader aspects of what’s going on. Yet the more researchers delve into thought and imagination, the more difficult it is to say exactly how much control over our thoughts and actions any of us actually have – and to what extent the control we feel we have is an illusion.

John Foxwell, Postdoctoral Research Fellow in the Department of English, Durham University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Fan Impatience


The link below is to an article that looks at fan impatience – as in waiting for George R. R. Martin to finish the next book.

For more visit:
https://www.theguardian.com/books/booksblog/2020/jul/29/first-george-rr-martin-now-patrick-rothfuss-the-curse-of-sequel-hungry-fans


Reading and Writing in Early Modern Europe


The link below is to an article that looks at reading and writing in early modern Europe.

For more visit:
https://lithub.com/in-early-modern-europe-reading-and-writing-meant-getting-your-hands-dirty/


Writing a Book Review


The link below is to an article that looks at how to write a book review.

For more visit:
https://bookriot.com/how-to-write-a-professional-book-review/