The AI invents a quote, badly

I mentioned a while back that one of the ChatGPT papers handed in to me last semester included a made-up quotation that not only doesn’t appear in the original but actually contradicts what the author said.

Here’s the AI:

Edgar Roberts defines a story as a “verbal representation of human experience” that “usually involves a sequence of events, a protagonist, and a resolution” (Roberts, 2009, p. 95).

And here’s what the real Edgar Roberts has to say on page 95, the first page of “Chapter 5 Writing about Plot: The Development of Conflict and Tension in Literature”:

Stories, drama, narrative poetry, and films are made up mostly of actions or incidents that follow one another in chronological order. The same is also true of life, but there is a major difference: Fiction must make sense even though life itself does not always seem to make sense at all. Finding a sequential or narrative order is therefore only a first step in our consideration of fictions. What we depend on for the sense or meaning of fiction is plot–the elements governing the unfolding of the actions.

The English novelist E.M. Forster, in Aspects of the Novel (1927), presents a memorable illustration of plot. To illustrate a bare set of actions, he proposes the following: “The king died, and then the queen died.” Forster points out, however, that this sequence does not form a plot because it lacks motivation and causality; it’s too much like life itself to be fictional. Thus, he introduces motivation and causality in his next example: “The king died, and then the queen died of grief.” The phrase “of grief” shows that one thing (grief) controls or overcomes another (the normal desire to live), and motivation and causation enter the sequence to form a plot. In a well-plotted story or play, one thing precedes or follows another not simply because time ticks away but more important, because effects follow causes. In a good work of fiction, nothing is irrelevant or accidental; everything is related and causative. (95)


ChatGPT: verbal representation of human experience, protagonist, resolution, p. 95

Edgar Roberts: no verbal representation of human experience, no protagonist, no resolution, not on page 95 or anywhere else in the book as far as I can tell. Roberts may well believe all these things, but if so he doesn’t think them important enough to discuss.

More to the point, Roberts specifically rejects the idea that a story can be defined as a sequence of events. For Roberts, a story is a sequence of events with motivation and causality. Stuff happens versus stuff happens for a reason: those are two different things.

This reminds me of the precision teaching distinction between EGGs and NEGGs: examples and nonexamples. What is a chair (examples)? What isn’t (nonexamples)? To teach well, you need both.

To teach well, you also need “close-in” nonexamples, or border cases. A close-in nonexample is missing one–just one–critical feature of the concept being taught. Obviously, close-in nonexamples are essential to definition, as Roberts understands. That’s why he devotes his chapter on plot to distinguishing between a sequence of events with motivation and causality (plot) and a sequence of events without (not plot).

It’s hard for me to imagine the AI ever distinguishing between EGGs and close-in NEGGs. Trial-and-error learning without any built-in guidance just doesn’t get you there (I think).

But shouldn’t the AI be able to tell whether a quotation actually appears on the page it’s citing?

That doesn’t seem hard to me, but am I missing something?
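It doesn’t seem hard in principle. A minimal sketch of the kind of check I have in mind, in Python: given the quoted string and the text of the cited page, test whether the quotation actually occurs there. (The function name and the normalization rules here are my own illustration, not any real citation-checking tool; a real checker would also need OCR or a digital text of the source, which is the genuinely hard part.)

```python
import re

def quote_appears_in(quotation: str, source_text: str) -> bool:
    """Return True if `quotation` occurs in `source_text`, ignoring
    differences in whitespace, case, and curly vs. straight punctuation."""
    def normalize(s: str) -> str:
        # Fold curly quotes and dashes to plain ASCII equivalents.
        s = s.replace("\u201c", '"').replace("\u201d", '"')
        s = s.replace("\u2018", "'").replace("\u2019", "'")
        s = s.replace("\u2013", "-").replace("\u2014", "-")
        # Collapse runs of whitespace and lowercase everything.
        return re.sub(r"\s+", " ", s).strip().lower()
    return normalize(quotation) in normalize(source_text)

# The opening of Roberts's page 95, as quoted above:
page_95 = ("Stories, drama, narrative poetry, and films are made up mostly "
           "of actions or incidents that follow one another in chronological order.")

print(quote_appears_in("verbal representation of human experience", page_95))   # False
print(quote_appears_in("actions or incidents that follow one another", page_95))  # True
```

The invented ChatGPT phrase fails the check; a phrase Roberts actually wrote passes it.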
