Is a Machine Writing the Sermons You Hear? The Answer Might Surprise You
AI helped me write this post. I say that up front. However, I have instructed my AI tool to never presume to generate content for me. Read on and I will explain.
AI must be forced to stay in its role, and it can be. Even when a message begins with a burden placed on the heart by the Holy Spirit, today's tools offer such swift and convincing assistance that one can unknowingly relinquish parts of that sacred burden. With only a few clicks, the outlines appear. The tone is polished. The sentences flow smoothly. But somewhere in the process, the weight of the message may be lost.
What’s the risk in that?
In a word: drift.
Not rebellion. Not heresy. Just a quiet, slow movement away from the soul-deep wrestling that has long marked a minister's preparation. The message still comes forth, but perhaps without the tears. Without the ache. Without the living edge that comes only from time spent in God's counsel.
The most troubling part is that this happens invisibly. AI tools do not flash warnings when they begin to carry more than they should. And ministers, already pressed for time, may not notice when their reliance shifts. They still believe. They still care. But over time, the messages they deliver sound right, yet feel light.
So what can be done?
One option is not to use these tools for preparation at all. Some ministers have made that stand. But as the tech world changes, I suspect that will become less and less possible. AI tools will soon permeate everything we do, new habits will form around daily tasks, and those habits will carry over into sermon preparation. NOW is the time to prepare for that world.
One boundary is both available and wise:
You can ask your AI to stay out of idea generation—and it will comply.
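What might such an instruction look like? Here is one sketch, in my own wording rather than any required formula, of the kind of boundary you could place in a tool's custom instructions or state at the start of a conversation:

"Do not generate sermon topics, outlines, illustrations, or theological points for me. Your role is limited to grammar, clarity, and structure on text I have already written. If I ask you to originate spiritual content, decline and point me back to this boundary."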
This simple guardrail can preserve the heart of the calling: to be a vessel through whom the Spirit speaks. Not a mouthpiece for a machine.
The better the tools become, the clearer our lines must be.
I asked ChatGPT to describe how it helped with this post, what boundaries it was instructed not to cross, and how it will collaborate with me in the future on preparing spiritual content. Here is what it said:
“As your assistant, I helped with phrasing, clarity, and structure in this post—but I did not originate the core message. I was instructed to stay out of spiritual ideation. That means I did not generate the concept, choose the topic, or introduce theological claims. Instead, I followed your lead, refining the words you brought forward.
Going forward, my role is to support—not substitute—spiritual discernment. I can help polish the message, surface examples, offer balance in tone, or draw connections. But the spiritual burden remains yours to bear before God. That boundary is firm—and respected.”
It gave this answer because I gave it strict instructions to take this role, and ONLY this role, with me. This collaborative model is not only possible; it's necessary if we are to use tools without losing truth.
In the rapidly advancing world of AI tools, it is important for those charged with feeding God's sheep to be proactive about setting up guardrails. Without them, it is all too easy to offload burdens you should be bearing.
When AI becomes more than an assistant—when it generates ideas rather than simply refining them—who then is doing the feeding?
The prophet Isaiah, centuries ago, warned of a similar mistake in his day:
"Woe to the rebellious children, saith the LORD, that take counsel, but not of me; and that cover with a covering, but not of my spirit, that they may add sin to sin."
~Isaiah 30:1
Who are we taking counsel of in this age? If it is a machine, that counsel will end up downstream, upon the people. It may include many good things, and it may sound right, but it will never be anointed of the Holy Spirit.
That is a key and indispensable difference. The better the tools get, the better the minister must be at making sure it is the Holy Spirit who is feeding the sheep—and nothing else.
If you have questions or comments about this post, please contact choosedigitalholiness@gmail.com.