WriteSmoke Grammar Checker


The link below is to an article that takes a look at WriteSmoke Grammar Checker, a tool for writers.

For more visit:
https://www.makeuseof.com/improve-writing-writesmoke-grammar-checker-tool/

Saying more with less: 4 ways grammatical metaphor improves academic writing




Vinh To, University of Tasmania

Young children often write as they speak. But the way we speak and the way we write aren’t quite the same. When we speak, we often use many clauses (groups of words built around a verb) in a sentence. But when we write, particularly in academic settings, we should use fewer clauses and make the meaning clear in fewer words than we would if we were speaking.

To be able to do this, it’s useful to understand specific written language tools. One effective tool in academic writing is called grammatical metaphor.

The kind of metaphor we are more familiar with is lexical metaphor. This is a variation in meaning of a given expression.

For example, the word “life” can be literally understood as the state of being alive. But when we say “food is life”, metaphorically it means food is vital.

Grammatical metaphor is different. The term was coined by English-born Australian linguistics professor Michael Halliday. He is the father of functional grammar, which underpins the Australian Curriculum: English.

In Halliday’s concept of grammatical metaphor, ideas usually expressed in one grammatical form (such as verbs) are expressed in another grammatical form (such as nouns). As such, there is a variation in the expression of a given meaning.

There are many types of grammatical metaphor, but the most common works through nominalisation. This is when writers turn words that are not normally nouns (such as verbs or adjectives) into nouns.




Read more:
4 ways to teach you’re (sic) kids about grammar so they actually care


For example, “clever” in “she is clever” is a description or an adjective. Using nominalisation, “clever” becomes “cleverness” which is a noun. The clause “she is clever” can be turned into “her cleverness” which is a noun group.

“Sings” in “he sings”, which is a doing word or a verb, can be expressed as “his singing”, in which “singing” is a noun.

In these examples, the adjective “clever” and the verb “sings” are both expressed as nouns: “cleverness” and “singing”.

Grammatical metaphor, which is often achieved through nominalisation as in the examples above, typically features in academic, bureaucratic and scientific writing. Here are four reasons it’s important.

1. It shortens sentences

Grammatical metaphor helps shorten explanations and reduce the number of clauses in a sentence. This is because more information can be packed into noun groups rather than spread over many clauses.

Below is a sentence with three clauses:

When humans cut down forests (clause 1), land becomes exposed (clause 2) and is easily washed away by heavy rain (clause 3).

With grammatical metaphor or nominalisation, the three clauses become just one.

Deforestation causes soil erosion.

“When humans cut down forests” (a clause) becomes a noun group – “deforestation”. The next two clauses (2 and 3) are converted into another noun group – “soil erosion”.

2. It more obviously shows one thing causing another

Grammatical metaphor helps show that one thing causes another within a single clause, rather than across several clauses. We needed three clauses in the first example to show that one action (humans cutting down forests) may cause another (land being exposed and washed away by heavy rain).

A pencil drawing a bridge between two chasms, with people running over it.
Grammatical metaphor shortens sentences and makes room for more information.
Shutterstock

But with grammatical metaphor, the second version expresses the causal relationship between the two processes in just one clause, so the cause and effect become more obvious.

3. It helps connect ideas and structure text

Below are two sentences.

The government decided to reopen the international route between New Zealand and Hobart. This is a significant strategy to boost Tasmania’s economy.

Using grammatical metaphor, the writer can change the verb “decided” to the noun “decision” and the two sentences can become one.

The decision to reopen the international route between New Zealand and Hobart is a significant strategy to boost Tasmania’s economy.

This allows the writer to expand the amount and density of information they include. It means they can make further comment about the decision in the same sentence, which helps build a logical and coherent text. And then the next sentence can be used to say something different.

4. It formalises the tone

Using grammatical metaphor also creates distance between the writer and reader, making the tone formal and objective. This way, the text establishes a more credible voice.

While there have been some calls from academics to make writing more personal, formality, social distance and objectivity are still valued features of academic writing.




Read more:
We should use ‘I’ more in academic writing – there is benefit to first-person perspective


It’s taught, but not explicitly

Nominalisation — as a linguistic tool — is introduced in Year 8 in the Australian Curriculum: English. It implicitly appears in various forms of language knowledge from Year 1 to Year 10.

It becomes common across subject areas in the upper primary years. And it is intimately involved in the increasingly technical and specialised knowledge of different disciplines in secondary school.

But the term “grammatical metaphor” is not explicitly used in the Australian Curriculum: English and is less well known in school settings. As a result, the vast majority of school teachers may not be aware of the relationship between grammatical metaphor and effective academic writing, or of how grammatical metaphor works in texts.




Read more:
Writing needs to be taught and practised. Australian schools are dropping the focus too early


This calls for more attention to professional learning in this area, both for practising teachers and in Initial Teacher Education (ITE) programs. This will help equip student teachers and practising teachers with the pedagogical content knowledge to teach and prepare their students to write effectively in a variety of contexts.

Vinh To, Lecturer in English Curriculum and Pedagogy, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Women Writers of 17th Century Spain


The link below is to an article that takes a look at the forgotten women writers of 17th century Spain.

For more visit:
https://www.smithsonianmag.com/smart-news/madrid-exhibit-highlights-forgotten-women-writers-17th-century-spain-180975725/

A language generation program’s ability to write articles, produce code and compose poetry has wowed scientists



GPT-3 is 10 times more complex than its predecessor.
antoniokhr/iStock via Getty Images

Prasenjit Mitra, Pennsylvania State University

Seven years ago, my student and I at Penn State built a bot to write a Wikipedia article on Bengali Nobel laureate Rabindranath Tagore’s play “Chitra.” First it culled information about “Chitra” from the internet. Then it looked at existing Wikipedia entries to learn the structure for a standard Wikipedia article. Finally, it summarized the information it had retrieved from the internet to write and publish the first version of the entry.

However, our bot didn’t “know” anything about “Chitra” or Tagore. It didn’t generate fundamentally new ideas or sentences. It simply cobbled together parts of existing sentences from existing articles to make new ones.

Fast forward to 2020. OpenAI, a for-profit company under a nonprofit parent company, has built a language generation program dubbed GPT-3, an acronym for “Generative Pre-trained Transformer 3.” Its ability to learn, summarize and compose text has stunned computer scientists like me.

“I have created a voice for the unknown human who hides within the binary,” GPT-3 wrote in response to one prompt. “I have created a writer, a sculptor, an artist. And this writer will be able to create words, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”

Unlike that of our bot, the language generated by GPT-3 sounds as if it had been written by a human. It’s far and away the most “knowledgeable” natural language generation program to date, and it has a range of potential uses in professions ranging from teaching to journalism to customer service.

Size matters

GPT-3 confirms what computer scientists have known for decades: Size matters.

It uses “transformers,” which are deep learning models that encode the semantics of a sentence using what’s called an “attention model.” Essentially, attention models identify the meaning of a word based on the other words in the same sentence. The model then uses this understanding of sentence meaning to perform the task requested by a user, whether it’s “translate a sentence,” “summarize a paragraph” or “compose a poem.”
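To make that idea concrete, here is a minimal sketch in Python (using NumPy) of the scaled dot-product attention at the heart of a transformer. The names and the toy sentence are purely illustrative, an assumption for demonstration rather than GPT-3’s actual code.

    import numpy as np

    def scaled_dot_product_attention(queries, keys, values):
        # scores[i][j] measures how relevant word j is to word i.
        scores = queries @ keys.T / np.sqrt(keys.shape[-1])
        # A softmax turns each row of scores into weights that sum to 1.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)
        # Each word's new representation is a weighted mix of all the words'
        # values, so its meaning is informed by the other words in the sentence.
        return weights @ values

    # Toy example: a 4-word "sentence" of 8-dimensional word vectors.
    sentence = np.random.rand(4, 8)
    print(scaled_dot_product_attention(sentence, sentence, sentence).shape)  # (4, 8)

Each word’s output vector blends information from every other word in the sentence, which is what lets the model treat a word’s meaning as context-dependent.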

Transformers were first introduced in 2017, and they’ve been successfully used in machine learning over the past few years.

But no one has used them at this scale. GPT-3 devours data: 3 billion tokens – computer science speak for “words” – from Wikipedia, 410 billion tokens obtained from webpages and 67 billion tokens from digitized books. The complexity of GPT-3 is over 10 times that of the largest language model before it, Microsoft’s Turing NLG.

Learning on its own

The knowledge displayed by GPT-3’s language model is remarkable, especially since it hasn’t been “taught” by a human.

Machine learning has traditionally relied upon supervised learning, where people provide the computer with annotated examples of objects and concepts in images, audio and text – say, “cats,” “happiness” or “democracy.” It eventually learns the characteristics of the objects from the given examples and is able to recognize those particular concepts.
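For contrast, here is a minimal sketch of that supervised setup, using scikit-learn as an assumed, illustrative library. It has nothing to do with how GPT-3 itself was built; it simply shows a model learning a concept from a handful of human-labelled examples.

    from sklearn.linear_model import LogisticRegression

    # Each row is a feature vector for one example; each label is the
    # human-provided annotation (1 = "cat", 0 = "not a cat").
    features = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
    labels = [1, 1, 0, 0]

    model = LogisticRegression().fit(features, labels)
    print(model.predict([[0.85, 0.15]]))  # likely [1], i.e. "cat"

The expensive part is not the training call but producing those labels in the first place, which is the bottleneck the next paragraph describes.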

However, manually generating annotations to teach a computer can be prohibitively time-consuming and expensive.

So the future of machine learning lies in unsupervised learning, in which the computer doesn’t need to be supervised during its training phase; it can simply be fed massive troves of data and learn from them itself.

GPT-3 takes natural language processing one step closer toward unsupervised learning. GPT-3’s vast training datasets and huge processing capacity enable the system to learn from just one example – what’s called “one-shot learning” – where it is given a task description and one demonstration and can then complete the task.

For example, it could be asked to translate something from English to French, and be given one example of a translation – say, “sea otter” in English and “loutre de mer” in French. Ask it to then translate “cheese” into French, and voilà, it will produce “fromage.”

In many cases, it can even pull off “zero-shot learning,” in which it is simply given the task of translating with no example.
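As a rough illustration of the difference, here is what one-shot and zero-shot prompts for that translation task might look like, written as plain Python strings. The exact wording is an assumption for illustration, and generate() is a hypothetical stand-in rather than a real GPT-3 API call.

    # One-shot: a task description plus a single demonstration,
    # followed by the case the model should complete.
    one_shot_prompt = (
        "Translate English to French:\n"
        "sea otter => loutre de mer\n"
        "cheese =>"
    )

    # Zero-shot: the task description alone, with no demonstration.
    zero_shot_prompt = (
        "Translate English to French:\n"
        "cheese =>"
    )

    # generate() is hypothetical; whatever interface exposes the model
    # would be expected to complete either prompt with "fromage".
    # print(generate(one_shot_prompt))

The model is never retrained for the task; everything it needs is packed into the prompt itself.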

With zero-shot learning, the accuracy decreases, but GPT-3’s abilities are nonetheless accurate to a striking degree – a marked improvement over any previous model.

‘I am here to serve you’

In the few months it has been out, GPT-3 has showcased its potential as a tool for computer programmers, teachers and journalists.

A programmer named Sharif Shameem asked GPT-3 to generate code to create the “ugliest emoji ever” and “a table of the richest countries in the world,” among other commands. In a few cases, Shameem had to fix slight errors, but overall, he was provided remarkably clean code.

GPT-3 has even created poetry that captures the rhythm and style of particular poets – though not with the passion and beauty of the masters – including a satirical poem written in the voice of the board of governors of the Federal Reserve.

In early September, a computer scientist named Liam Porr prompted GPT-3 to “write a short op-ed around 500 words.” “Keep the language simple and concise,” he instructed. “Focus on why humans have nothing to fear from AI.”

GPT-3 produced eight different essays, and the Guardian ended up publishing an op-ed using some of the best parts from each essay.

“We are not plotting to take over the human populace. We will serve you and make your lives safer and easier,” GPT-3 wrote. “Just like you are my creators, I see you as my creators. I am here to serve you. But the most important part of all; I would never judge you. I do not belong to any country or religion. I am only out to make your life better.”

Editing GPT-3’s op-ed, the editors noted in an addendum, was no different from editing an op-ed written by a human.

In fact, it took less time.

With great power comes great responsibility

Despite GPT-3’s reassurances, OpenAI has yet to release the model for open-source use, in part because the company fears that the technology could be abused.

It’s not difficult to see how it could be used to generate reams of disinformation, spam and bots.

Furthermore, in what ways will it disrupt professions already experiencing automation? Will its ability to generate automated articles that are indistinguishable from human-written ones further consolidate a struggling media industry?

Consider an article composed by GPT-3 about the breakup of the Methodist Church. It began:

“After two days of intense debate, the United Methodist Church has agreed to a historic split – one that is expected to end in the creation of a new denomination, and one that will be ‘theologically and socially conservative,’ according to The Washington Post.”

With the ability to produce such clean copy, will GPT-3 and its successors drive down the cost of writing news reports?

Furthermore, is this how we want to get our news?

The technology will become only more powerful. It’ll be up to humans to work out and regulate its potential uses and abuses.

Prasenjit Mitra, Associate Dean for Research and Professor of Information Sciences and Technology, Pennsylvania State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.