More examples of the ways A.I.-generated writing falls short of perfection.
In an earlier post, I went over a few examples of A.I.-generated writing and attempted to explain its failings (and the reasons for them). Since then, I’ve continued to work with A.I.-generated text, and unsurprisingly, I’ve found several additional examples. I’m sharing them here not only to help illustrate why A.I.’s writing is not and likely never will be a substitute for human-generated writing, but also to help human editors spot these errors in their own writing.
When labor markets soften, the importance of strong technical skills becomes increasingly critical.
Sounds reasonable. Unfortunately, this sentence contains an error that actually renders it false—because the skills themselves become more critical, not their importance. The importance per se is just a measure of how critical those skills are. While this sentence is still comprehensible, the fact that its subject noun is completely superfluous to its meaning sucks all the life out of it—it’s like gesturing to empty space, rather than to the thing you want the reader to focus on: “When labor markets soften, strong technical skills become increasingly critical.”
(This is as good a place as any to point out that I’ve never seen ChatGPT use a dash correctly. I used two ems in the paragraph above, both for emphasis, to mimic the cadence of human speech. If ChatGPT is capable of doing this, I’ve yet to observe it.)
Inflation is not only impacting the cost of consumer goods but also the labor used to manufacture and distribute them.
Human writers produce sentences like this quite often—too often!
Its problem is a common one: a breakdown in parallel construction. Whenever you see phrases like “both X and Y” or “not only X but also Y,” the paired (or listed) items need to be grammatically parallel: whatever form the first item takes, the items that follow must match it.
In this example (simplified as “not only impacting X but also Y”), “not only” precedes the present participle “impacting.” This sets us up for another participle after “but also”—“not only impacting X but also exacerbating Y,” say—but there is no participle there, just a plain old noun phrase. The human brain can compensate for this imbalance, but it takes effort, and the more work you make your reader do to interpret a faulty text, the more wearying and unpleasant the experience is.
For what it’s worth: I’d suggest “Inflation is impacting the costs of both consumer goods and the labor used to manufacture and distribute them.”
A failure to effectively communicate the bill’s contents has led to a lack of understanding and misconceptions about its probable effects.
There’s nothing grammatically incorrect about this sentence, technically, but it does fail to account for the way that people will read it. There’s ambiguity in the way the object nouns are phrased: does “a lack of understanding and misconceptions” mean that both understanding and misconceptions are lacking? Or were these meant to scan as two separate items? Increasingly, A.I. is being used to summarize texts written by other A.I.s, so it’s not hard to imagine a bot parsing this phrase incorrectly and confidently assuring its human readers that the lack of misconceptions is a pressing issue.
This is easy enough to fix with a simple transposition (“A failure to effectively communicate the bill’s contents has led to misconceptions and a lack of understanding about its probable effects”), but making that fix takes a human being who actually understands what this combination of words means.
In a similar vein: the problem with
Inflation has had a massive impact on the cost of goods and people.
isn’t strictly a grammatical one—it’s the implication that the people in question are also being bought and sold. A human writer probably would have caught that.
Bicycle lanes in urban areas are often unmarked, causing potential physical harm.
One of A.I.’s greatest weaknesses as a writer is its tendency to use phrases whose meanings are more or less clear even though they don’t technically say what we take them to say. Take a look at the placement of the modifier “potential” here. The potential for physical harm is present in the situation described above, so the phrase “potential physical harm” seems sensible. But where does that leave “causing”? With the qualifier placed after the participle rather than before it, the sentence actually says that unmarked bike lanes absolutely, positively cause potential physical harm. And at the risk of getting lost in the metaphysical weeds: how can you definitively cause something that only potentially occurs?
The proper position for the qualifier is before the participle. Unmarked bike lanes have the potential to cause physical harm—it’s the causation itself that may or may not happen. To wit: “Bicycle lanes in urban areas are often unmarked, potentially causing physical harm.” (Better yet, say it like an actual person: “Bicycle lanes in urban areas are often unmarked, which can lead to physical harm.”) Just because the A.I.-generated sentence is comprehensible doesn’t mean it’s correct.
Will there be a Part III? Stay tuned!