Category Archives: Chat GPT

What does “to write” mean in 2023?

I used to think I knew what it meant to write and to teach how to write, but in recent weeks I am not so sure anymore.

I’ve read that the AI Chat GPT can write paragraphs that are hard to distinguish from student-written paragraphs.  I’ve read some of those Chat GPT paragraphs, and I can’t tell the difference.  Is this how students—those who can afford Chat GPT—will write now:  input information and receive coherent, grammatically correct output to turn in for assignments?

Since students will certainly use Chat GPT and other AI like it, what do teachers teach?  If not on the writing process, should the focus be on key words?  Should teachers look at the output and think, well, the input must have been pretty good to achieve an output this good, so I’ll give the student an A+ on input?  Are key words what we will be grading from now on, since we can expect the actual composing to be done by a machine?

Do teachers need to ask students to weave some highly local information—the spelling bee yesterday at XYZ School, the performance of substitute teacher Mrs. Poggi last week—into their writing, so that AI has no way to incorporate that local information into its output, and students are forced to write for themselves?

Do teachers need to look at the kind of writing AI can do well—description, for example, and historical summaries—and no longer assign that kind of writing?  Do teachers need to look at the kind of writing AI can’t do well—hypothetical situations, for example, or inference or human emotions—and assign writing embedding those concepts?  (If General Lee had asked for your advice when President Lincoln offered him command of the Union armies at the start of the Civil War, what would you have advised him in view of his reputation then and today?)

With visual information sources (streaming TV, YouTube, video games, and Facetime) replacing more static sources (newspapers, journals, and letters) in the 21st century, is the kind of writing teachers focused on in the 20th century no longer useful to students today?  Should English teachers stop asking students to write essays determining who is responsible for the deaths of Romeo and Juliet, and instead ask students to create a video comparing the open carrying of swords in 16th century Verona to the open carrying of guns in the US today—complete with photos of swords and guns and videos of sword fights and gun fights?

We are living in a period of rapid flux, with the technology of 2022 already out of date in 2023.  The teacher education I received in the early 1990s was outdated then, with no mention of how to incorporate computers into learning—and for that matter, no course on how to teach writing.  I assume courses on teaching writing are now offered, but I suspect none incorporate how to use Chat GPT as a writing tool.  And by the time they do, Chat GPT will have been supplanted by a more advanced technology.

Which brings me back to my point:  What does “to write” mean in 2023?

Chat GPT—just another calculator, stick shift, or spell check?

First, three stories.

One.  When I took the SATs years ago, no calculators were allowed.  Square roots?  Do the time-consuming math in longhand.  Sine, cosine, and tangent?  Draw and label the triangles, write the formulas from memory (no formulas were given on the exam), and compute.  Today, calculators are allowed during 55 minutes of the SAT.  Some of the drudgery of the test has been eliminated.

Two.  When I learned to drive a car, I needed to learn and be tested on a stick shift.  I had to depress the clutch every time I changed gears.  When starting uphill from a parked position, I needed to release the brake, depress the clutch, and maneuver into first gear all in one quick, smooth motion, or the car would stall.  But years later, I drove an automatic, and didn’t need to depress the clutch (there wasn’t any!) or shift gears or stress over getting into first gear going uphill.  It was so much easier.

Three.  A student wrote an essay and sent it to me online.  Squiggly lines suggested places where the software program perceived mistakes though it didn’t explain what the mistakes were.  The student clicked on each underlined word, and the software suggested corrections.  The student clicked on the suggested corrections, and the software instantly replaced the mistakes.  No dictionaries, no grammar handbooks, no need to even understand why the original mistake was wrong.

What do these stories have in common?  Technology—the kind that makes life easier.

The calculator makes computing math easier.  I still have to figure out what math to use and to input the numbers, but the calculating is done by a machine, freeing me for thinking.  An automatic transmission makes driving easier, allowing me to ignore the mechanics of driving so I can focus on the rules of driving and the actions of other vehicles.  Software backed by millions of data points and patterns suggests writing corrections that are usually correct.

This brings me to OpenAI’s Chat GPT, controversial software that searches for patterns in millions and millions of words, grammatical structures, and sentences.  As it finds patterns and incorporates them into its “brain,” Chat GPT becomes more and more able to suggest likely outcomes for various situations, including writing a student’s essay.

As with the examples of technology above, Chat GPT makes it easier to do something—in this case, to write logically.  You can ask Chat GPT for a paragraph about almost anything you need written, and you can suggest a style and vocabulary, such as that of a fourth grader.  Chat GPT can do that.  It searches its vast database for vocabulary and description likely to be used in the sentences of a nine-year-old, and then it writes whatever you need.

Chat GPT is at an early stage of its development.  It needs the correct input of data to produce the desired output of writing.  It can describe accurately, but it cannot “think” the way a human being can think.  It can tell what has happened, but it cannot predict.  It cannot tell you what won’t happen, or what won’t work.  If programmers corrupt the input data, the output is corrupt—perhaps untrue, perhaps using foul vocabulary, perhaps written in university-level academic vocabulary and sentence structure rather than those of a fourth grader.  (Garbage in, garbage out.)

We are already used to baby steps in this kind of technology, as when software offers suggestions for grammar or spelling.  Teachers use this kind of help for their own writing, so they are likely to allow it for their students’ writing as well.  So why then the brouhaha about Chat GPT?

Chat GPT goes beyond suggesting a synonym or a different spelling; Chat GPT can write the whole essay.  And often teachers cannot tell the difference.  Is it so different from the following story?

A college business student took an English writing class from me.  I questioned him about a paper he turned in because it seemed much better written than his in-class assignments.  “Oh, I gave it to my father’s secretary, and she fixed it,” he said.  “Fixed it?”  “Well, mostly she wrote it,” he said.  He justified the situation by saying he worked for his father, would inherit the business, and would always have a secretary to write for him.  “I don’t need to know how to write,” he said.  I explained this to the academic dean.  “Let it go,” she said.  The student graduated from a four-year college though he couldn’t write a coherent paragraph.

Is Chat GPT giving our students the latest iteration of a calculator, a stick shift, or spell check?  Or is Chat GPT giving students their own online secretaries—leveling the playing field for students who don’t have their father’s secretary to write their papers?  Is Chat GPT a bad thing?  Does it matter?