“AI” generated writing

Neil Clarke, editor of a respected science fiction magazine, reports on his blog that the number of spammy short fiction submissions to his publication is way up. He says that spammy submissions first started increasing during the pandemic, and “were almost entirely cases of plagiarism, first by replacing the author’s name and then later by use of programs designed to ‘make it your own.'”

Helpfully, he gives an example of what you get with one of the programs to “make it your own.” First he gives a paragraph from the spam submission, which sounds a little…odd. Then he provides the paragraph from the original short story on which the spam submission was based. Still, at that point, Clarke says: “These cases were often easy to spot and infrequent enough that they were only a minor nuisance.”

Then, in January and February, spammy submissions skyrocketed. Clarke says: “Towards the end of 2022, there was another spike in plagiarism and then ‘AI’ chatbots started gaining some attention, putting a new tool in their arsenal…. It quickly got out of hand.” It’s gotten so bad that now 38% of his short fiction submissions are spammy, either “AI” generated,* or generated with one of those programs to “make it your own.”

38%. Wow.

Clarke concludes: “It’s not just going to go away on its own and I don’t have a solution. … If [editors] can’t find a way to address this situation, things will begin to break….”

This trend is sure to come to a sermon near you. As commenters on the post point out, writers are already using chatbots to deal with the “blank page struggle,” just trying to get words on the page. (To which Neil Clarke responds that his magazine has a policy that writers should not use AI at any stage in the process of writing a story for submission.) No doubt, some minister or lay preacher who is under stress and time pressure will do (or has done) the same thing — used ChatGPT or some other bot to generate an initial idea, then cleaned it up and made it their own.

And then “AI” generated writing tools will improve, so that soon some preachers will use “AI” generated sermons. For UU ministers, it may take longer. There are so few of us, and it may take a while for the “AI” tools to catch on to Unitarian Universalism. But I fully expect to hear within the next decade that some UU minister has gotten fired for passing off an “AI” generated sermon as their own.

My opinion? If you’re stressed out or desperate and don’t have time to write a fresh sermon, here’s what you do. You re-use an old sermon, and tell the congregation that you’re doing it, and why — I’ve done this once or twice, ministers I have high regard for have done this, and it’s OK, and people understand when you’re stressed and desperate. Or, if you don’t have a big reservoir of old sermons that you wrote, find someone else’s sermon online, get their permission to use it, and again, tell the congregation that you’re doing it, and why. Over the years, I’ve had a few lay preachers ask to use one of my sermons (the same is true of every minister I know who puts their sermons online), and it’s OK, and people understand what it’s like when you’re stressed and desperate and just don’t have time to finish writing your own sermon.

But using “AI” to write your sermons? Nope. No way. Using “AI” at any stage of writing a sermon is not OK. Not even to overcome the “blank page struggle.” Not even if you acknowledge that you’ve done it. It’s spiritually dishonest, and it disrespects the congregation.

* Note: I’m putting the abbreviation “AI” in quotes because “artificial intelligence” is considered by many to be a misnomer — “machine learning” is a more accurate term.

The singularity as atheist religion

In a talk titled “Dude, You Broke the Future,” science fiction author and atheist Charlie Stross takes on Ray Kurzweil and other advocates of the “singularity,” the moment when all our problems will supposedly be solved by the emergence of transhuman artificial intelligence:

“I think transhumanism is a warmed-over Christian heresy. While its adherents tend to be vehement atheists, they can’t quite escape from the history that gave rise to our current western civilization. … If it walks like a duck and quacks like a duck, it’s probably a duck. And if it looks like a religion it’s probably a religion. I don’t see much evidence for human-like, self-directed artificial intelligences coming along any time now, and a fair bit of evidence that nobody except some freaks in university cognitive science departments even want it. What we’re getting, instead, is self-optimizing tools that defy human comprehension but are not, in fact, any more like our kind of intelligence than a Boeing 737 is like a seagull. So I’m going to wash my hands of the singularity as an explanatory model without further ado — I’m one of those vehement atheists too — and try and come up with a better model for what’s happening to us. …”

I find it delightful to see a self-proclaimed “vehement atheist” calling out other atheists for doing religion. This is especially admirable, since those other atheists would doubtless insist that they are not doing religion at all; they would claim that they are doing science. Not only that, those other atheists are doing bad religion — transhumanism is as bad as the Prosperity Gospel, insofar as both are barely believable, have no redeeming social worth, do not engage in worthwhile cultural production, assert that the vast majority of humanity will not be “saved,” and spread fear.

This is just a parenthetical remark in a much longer talk — and the rest of the talk is definitely worth reading, particularly for Charlie Stross’ argument that corporations are already a kind of slow AI, one that is accelerating global climate change.