AI and UU sermons

Should Unitarian Universalists use so-called AI (Large Language Models, or LLMs) to write sermons?

Since Unitarian Universalists don’t have a dogma to which we must adhere, there will be many answers to this question. Here are my answers:

I/ Adverse environmental impact of LLMs

Answer: No. The environmental cost of LLMs is too great.

First, we all know about the huge carbon footprint of LLMs. And the more complex the answer required from the LLM, the more carbon is emitted. Deborah Pirchner, in a June 19, 2025, Science News article on the Frontiers website, sums up the impact by quoting a researcher who studied the energy use of LLMs:

“‘The environmental impact of questioning trained LLMs is strongly determined by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions,’ said … Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences…. ‘We found that reasoning-enabled models produced up to 50 times more CO2 emissions than concise response models.’”

Thus, not only do LLMs have a large carbon footprint to begin with, but a task as complex as a sermon, which invites exactly the kind of explicit reasoning Dauner describes, could produce a carbon impact up to 50 times greater than that of a simple, concise query.

Second, the data centers running LLMs use a tremendous amount of fresh water. In their paper “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models,” Pengfei Li (UC Riverside), Dr. Jianyi Yang (U Houston), Dr. Mohammad Atiqul Islam (U Texas Arlington), and Dr. Shaolei Ren (UC Riverside) state:

“The growing carbon footprint of artificial intelligence (AI) has been undergoing public scrutiny. Nonetheless, the equally important water (withdrawal and consumption) footprint of AI has largely remained under the radar. For example, training the GPT-3 language model in Microsoft’s state-of-the-art U.S. data centers can directly evaporate 700,000 liters of clean freshwater, but such information has been kept a secret. More critically, the global AI demand is projected to account for 4.2 – 6.6 billion cubic meters of water withdrawal in 2027, which is more than the total annual water withdrawal of … half of the United Kingdom.”

Third, on May 1, 2025, IEEE Spectrum reported that “AI data centers” cause serious air pollution. The article, titled “We Need to Talk About AI’s Impact on Public Health: Data-center pollution is linked to asthma, heart attacks, and more,” raises several concerns. The authors write:

“The power plants and backup generators needed to keep data centers working generate harmful air pollutants, such as fine particulate matter and nitrogen oxides (NOx). These pollutants take an immediate toll on human health, triggering asthma symptoms, heart attacks, and even cognitive decline.”

In sum: Because my religious commitments call on me to aim for a lower ecological impact, the environmental impact of LLMs alone is enough to stop me from using them to write sermons.

II/ Sermons as human conversations

Answer: No. I feel that sermons should be the result of human interaction.

You see, for me, a sermon should arise from the spiritual and religious conversations that people are having in a specific congregation or community. As a minister, I try to listen hard to what people in the congregation are saying. Some of what I do in a sermon is to reflect back to the congregation what I’m hearing people talk about. At present, an LLM cannot access the conversations that are going on in my congregation — an LLM can’t know that P— made this profound observation about their experience of aging, that A— asked this deep question about the reality of the death of a family member, that C— made a breakthrough in finding a life direction, that J— took this remarkable photograph of a coastal wetland. Some or all of those things affect the direction of a sermon.

Mind you, this is not true for all religions. Deena Prichep, in a July 21, 2025, article on Religion News Service titled “Are AI sermons ethical? Clergy consider where to draw the line,” states that “The goal of a sermon is to tell a story that can break open the hearts of people to a holy message.” In other words, according to Prichep, for some religions the role of the preacher is to open other people’s hearts to a holy message. Prichep quotes Christian pastor Naomi Sease Carriker as saying: “Why not, why can’t, and why wouldn’t the Holy Spirit work through AI?” I can see how this would be consistent with certain strains of Christianity — and with certain strains of Unitarian Universalism, for that matter, where the important thing is some abstract message that somehow transcends human affairs.

But that’s not my religion. My religion centers on the community I’m a part of. Yes, there is a transcendent truth that we can access — but as a clergyperson, I don’t have some special access to that transcendent truth. Instead, truth is something that we, as a community of inquirers, gradually approach together. Any single individual is fallible, and won’t be able to see the whole truth — that’s why it’s important to understand this as a community conversation.

As a clergyperson, one thing I can do is add other voices to the conversation, voices that we don’t have in our little local community. So in a sermon that’s trying to help us move towards truth, I might bring in William R. Jones, Imaoka Shinichiro, or Margaret Fuller (to name just a few Unitarian Universalist voices). Or I might quote from one of the sacred scriptures of the world’s wisdom traditions. Now it is true that an LLM might save me a little time in coming up with other voices; but given the huge environmental costs, it seems silly to save a small amount of time by using an LLM.

III/ Biases built into LLMs

Answer: No, because of hidden biases.

LLMs are algorithms trained on enormous quantities of digitized data, which for an LLM is mostly text. But we know that certain kinds of authors are under-represented in that training data: women, non-White people, working-class people, LGBTQ people, and so on. The resulting biases can be subtle, but they are nonetheless real.

As a Universalist, I am convinced that all persons are equally worthy. I have plenty of biases of my own, biases that can keep me from seeing that all persons are equally worthy of love — but at least when my sermons are affected by my own biases, my community can challenge me on them. If I use an LLM to write a sermon, a model riddled with biases that I’m not even aware of, it becomes that much harder for my community to help me rid my sermons of bias.


IV/ Final answer: No

Would I use an LLM to write a sermon?

No. It goes against too many things I stand for.

Should you use an LLM to write your sermons?

I’m not going to answer that question for you. Nor should you ask an LLM to answer that question for you. We all have to learn how to be ourselves, and to live our own lives. Once we start asking others — whether we’re asking LLMs or other authority figures — to answer big questions for us, then we’re well on the road to authoritarianism.

Come to think of it, that’s where we are right now — on the road to authoritarianism. And that’s a road I choose not to follow, thank you very much.