There is a lot of ongoing chat – particularly as it pertains to ministry – about the relative merits of AI. I am not one of those who is hugely sceptical of the benefits. But I am one of those who is mindful of its severe limitations.
Where AI – of the chatbot variety – is particularly weak is when you rely on it for information that you are not wholly certain of yourself. I have run various test questions, asking ChatGPT about some niche and some less niche bits of information that I have a reasonable handle on. Its answers are a mixed bag, to say the least.
Now, in one sense, this is to be expected. Nothing is perfect. Human beings make claims and are wrong sometimes too. C’est la vie. Unfortunately – excepting a very particular kind of human – most people are more willing to admit when they are unsure than ChatGPT is. This, in my opinion, is one of its biggest flaws. AI bots are programmed not only to give you answers, but to give you incredibly confident answers. Worse, they are programmed to give confident-sounding answers even when what they are saying is entirely made up!
Of course, if you know your subject well, you will be able to catch these confidently stated untruths. The issue is that, unless you are very well acquainted with the information you are asking for, you will almost certainly get an incredibly confident answer back, and it is touch and go whether it will be true or a complete fabrication that sounds reasonably compelling. Of course, if you know your subject well enough to catch the falsehoods, it raises the question of why you need an AI bot to answer your question in the first place. If you don’t know your subject that well, it is hard to see how you can be at all confident that it won’t ply you with reasonable-sounding, confidently stated nonsense. As far as information gathering and factual content are concerned, chatbots are pretty horrendous.
Others have boasted of using AI to write their sermons for them, whether by adding in parts or generating a whole script. Quite apart from the problem stated above – which leaves little to no confidence that it could offer a credible interpretation of the passage in front of you – there are other issues here. Even if it manages to offer a decent overview of your passage, how can AI credibly apply a passage to your particular people? How can it know the needs of the congregation? How can it know whether your people need a hard challenge or a soft word, and precisely what that should be?
However, one area where I have found AI helpful is content summary. Our community groups are based on our Sunday sermons, and I have found ChatGPT does a pretty good job of taking my full sermon script and giving me a helpful summary. The summary serves as a reminder of the key points from Sunday’s sermon before we jump into the further points of application we will discuss. Getting AI to produce it is a time saver for me, inasmuch as I don’t need to think about the best way to summarise the sermon myself.
The issues above are avoided when it comes to content summary. For one, I am familiar enough with my own sermon script to know if the summary provided is no good. Further, I am feeding the bot the entire content: I am asking it for a summary of all the data I input, not relying on it to pull in “relevant” information that I have no access to. This means I can easily pick up any factual errors, because I know my content, and I don’t need to worry about it drawing on outside sources, because I am only asking it to summarise a limited piece of data. All my tests on this front have been remarkably successful, and it gives me a very good summary of what has been written.
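For anyone who would rather script this step than paste their sermon into the chat window each week, the idea can be sketched in a few lines. This is an illustrative sketch only – the function name and prompt wording are my own invention, not a fixed recipe – and the resulting prompt would still need to be sent to ChatGPT (or an API of your choice) by whatever means you prefer:

```python
def build_summary_prompt(sermon_script: str, max_points: int = 3) -> str:
    """Wrap a full sermon script in a summarisation request.

    Because the entire text is supplied, the model is only being asked
    to condense what it is given, not to pull in outside information.
    """
    return (
        f"Summarise the sermon below into {max_points} key points "
        "suitable as a reminder for a community group discussion. "
        "Use only the text provided.\n\n"
        f"--- SERMON SCRIPT ---\n{sermon_script}"
    )

# Example: build a prompt from a (shortened) script, then paste or send it on.
prompt = build_summary_prompt("For God so loved the world...", max_points=3)
print(prompt)
```

The key design point, as noted above, is that the prompt hands the model everything it needs and explicitly tells it not to go beyond that, which is exactly the situation where a chatbot is at its most reliable.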
As with most tools, AI is neither to be entirely shunned nor uncritically embraced. It is most useful where it can operate well and best avoided where it has severe limitations or even terrible flaws. The issue is less about whether we should employ it and more about when and where we should employ it.
