Let us name it … ASS

People talk about “artificial intelligence.” They get corrected by people who say, “It’s not intelligence, it’s ‘machine learning.’” But actually machines don’t learn either. All this false terminology isn’t serving us well. It obscures the fact that the humans who design the machines are the intelligences at work here, and by calling the machines “AI” they get to dodge any responsibility for what they produce.

In a recent interview, science fiction author Ted Chiang came up with a good name for what’s going on:

“ ‘There was an exchange on Twitter a while back where someone said, “What is artificial intelligence?” And someone else said, “A poor choice of words in 1954”,’ [Chiang] says. ‘And, you know, they’re right. I think that if we [science fiction authors] had chosen a different phrase for it, back in the ’50s, we might have avoided a lot of the confusion that we’re having now.’ So if he had to invent a term, what would it be? His answer is instant: applied statistics.” [quoted by, originally in, emphasis mine]

Applied statistics is a much better term for understanding what is really going on here. When a computer running some chatbot application comes up with text that seems coherent, the computer is not being intelligent. Rather, the programmers have assembled a huge dataset and applied certain algorithms to it, and those algorithms generate text from that vast dataset that sounds vaguely meaningful. The only intelligence (or lack thereof) involved lies in the humans who programmed the computer.
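To make the “applied statistics” point concrete, here is a minimal sketch of the idea at its crudest: a toy bigram model that counts which word tends to follow which in a small corpus, then strings words together by sampling from those counts. This is my own illustration, not anything ChatGPT or Simmerlein actually uses (real systems are vastly larger and more elaborate), and the sample corpus, names, and numbers are invented for the example. But the principle is the same: statistics applied to a dataset, with no understanding anywhere in the machine.

```python
import random
from collections import defaultdict

# A toy "applied statistics" text generator: count which word follows which
# in a corpus, then sample the next word from those counts. No meaning, no
# understanding -- just frequencies.

corpus = (
    "blessed are the peacemakers for they shall be called children of god "
    "blessed are the merciful for they shall obtain mercy"
).split()

# Bigram counts: for each word, how often each successor word appears after it.
successors = defaultdict(lambda: defaultdict(int))
for current_word, next_word in zip(corpus, corpus[1:]):
    successors[current_word][next_word] += 1

def generate(start_word, length=12):
    """Produce text by repeatedly sampling a statistically plausible next word."""
    word = start_word
    output = [word]
    for _ in range(length):
        choices = successors.get(word)
        if not choices:
            break  # dead end: this word never appeared mid-corpus
        words, counts = zip(*choices.items())
        word = random.choices(words, weights=counts)[0]
        output.append(word)
    return " ".join(output)

print(generate("blessed"))
```

Run it a few times and you get pious-sounding fragments that were never thought by anyone; scale the corpus and the statistics up by many orders of magnitude and you get a chatbot sermon.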

Which brings me to a recent news article from Religion News Service, written by Kirsten Grieshaber: “Can a chatbot preach a good sermon?” Jonas Simmerlein, identified in the article as a Christian theologian and philosopher at the University of Vienna, decided to set up a Christian worship service using ChatGPT. Anna Puzio, who studies the ethics of technology at the University of Twente in The Netherlands, attended this worship service. She correctly identified how this was an instance of applied statistics when she said: “We don’t have only one Christian opinion, and that’s what AI [sic] has to represent as well.” In other words, applied statistics can act to average out meaningful and interesting differences of opinion. Puzio continued, “We have to be careful that it’s not misused for such purposes as to spread only one opinion.”

That’s exactly what Simmerlein was doing here: averaging out differences to create a single bland consensus. I can understand how a bland consensus might feel very attractive in this era of deep social divisions. But as someone who, like Simmerlein, is trained in philosophy and theology, I’ll argue that we do not get closer to truth by averaging out interesting differences into bland conformity; we get closer to truth by seriously engaging with people of differing opinions. This is because all humans (and all human constructions) are finite, and therefore fallible. No single human, and no human construction, will ever be able to reach absolute truth.

Finally, to close this brief rant, I’m going to give you an appropriate acronym for the phrase “applied statistics.” Not “AS”; that’s too much like “AI.” No, the best acronym for “Applied StatisticS” is … ASS.

Not only is it a memorable acronym, it also serves as a reminder of what you are if you believe too much in the truth value of applied statistics.