AI, humanity and Christian ministry

Jeremy Peckham BSc, FRSA

Hardly a day goes by without AI being in the news, whether some new “ground-breaking” achievement that is benefiting society, robot interviews, or gloomy dystopian forecasts of the threat it poses to our existence. In January 2023, ChatGPT[1] took the world by storm, reaching 100 million users in just two months. Just a few months later, questions about Christianity or the Bible, and even the sermon you heard at church, can now be answered by ChatGPT-powered chatbots trained with specific Christian content.[2] Over 300 people attended an experimental service at a Lutheran church in Germany conducted by a ChatGPT-powered AI avatar.[3]

These tools have enabled us, perhaps unwittingly, to create a digital priesthood, raising questions about how God desires to communicate His truth and His word. What will be the impact of replacing personal evangelism, our pastor or a Christian friend with an unaccountable digital intermediary whose words we cannot trace back to their source? Will such tools blunt our spiritual formation as we come to depend on chatbots rather than spend the effort reflecting on the word of God?

In this article we will briefly look at the ability of Generative AI to emulate human capabilities, its impact on two specific areas, truth and relationships, and the consequences for Christian ministry.

Minds, machines and gods

The most sophisticated AI tools are simply statistical pattern matchers, stochastic processes that rely on training data to build a model of a “world” such as language or business processes. This is why such algorithms will probably always be biased: not only are their data sets biased, but so are the people behind them, the developers who select and label the training data and who design the algorithms and the training approach. They also work from a limited, even if very large, data set from which to produce a model. The stochastic nature of these algorithms explains why they produce so-called hallucinations or confabulations, inexplicable outputs that are purely made up and not true. This is a feature that Christians should be particularly aware of when thinking about using AI in ministry and outreach. ChatGPT is not equivalent to doing a Google search!
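
To make this concrete, here is a deliberately toy sketch in Python, purely illustrative and with every word and probability invented, of the kind of weighted random sampling that lies behind such tools: run the same “prompt” several times and the answer varies, and a less likely, possibly false, continuation can still be selected.

import random

# Purely illustrative: a toy "language model" that, like a real system, picks
# the next words by sampling from a probability distribution learned from its
# training data. The continuations and probabilities below are invented.
next_word_probs = {
    "the Bible": 0.55,          # most likely continuation
    "church tradition": 0.30,
    "a 2019 blog post": 0.15,   # plausible-sounding but possibly false
}

def pick_next_words(probs):
    """Sample one continuation at random, weighted by its probability."""
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

# The same "prompt" gives different answers on different runs, and a
# low-probability (possibly untrue) continuation can still be chosen,
# which is the root of so-called hallucinations.
for _ in range(5):
    print("This teaching comes from", pick_next_words(next_word_probs))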

What you believe AI algorithms are capable of achieving, now and in the future, will depend on your worldview. If you have a materialistic worldview and see the brain as simply an evolved computing engine, then it is not unreasonable to assume that eventually we will be able to replicate it, including sentience and consciousness. It is fairly well acknowledged that we don’t yet have a theory of intelligence, let alone of consciousness, so we cannot program it. However, there are those who think that neural networks mimic the brain and that it might eventually be possible to replicate it in a computer, even if we can’t define intelligence. This explains why the Google software engineer Blaise Agüera y Arcas, in an article in the Economist, argued that neural networks were ‘making strides towards consciousness’. Of his interactions with LaMDA, Google’s “breakthrough conversation technology”, he stated: ‘I felt the ground shift under my feet … I increasingly felt like I was talking to something intelligent.’[4]

Becoming like us

My biggest concern is not that these algorithms are clever or will become clever, like humans are, but that we perceive that they are.

Herein lies the danger to society. Many applications are in fact doing quite a good job of simulating human capabilities and even sentient attributes, but they are neither intelligent (depending on your definition of intelligence!) nor sentient. What is unique about AI algorithms applied to replicate human capabilities, particularly when they interact with us, is that the only framework we have for dealing with such artefacts is that of human-to-human relationships. We will therefore tend to treat such artefacts as if they were human.

When a computer-generated image of a person or a humanoid robot produces a fluent, human-like, natural-language answer to our question, even with emotion, we will tend to trust that response, especially as the developers promote the narrative that these artefacts are smarter than us. Unlike a book or an article, these outputs are not the actual thoughts of a single identifiable person, but a synthesis of words from the training corpus that matches the input request with the highest probability. On occasion they are also prone to producing incorrect output, referred to as confabulations or hallucinations. Yet there is a danger that the convenience of such tools could, over time, diminish our critical thinking skills. Were we to use Generative AI to write a sermon or prayer, we might well ask: where is the Holy Spirit in all of this?

Truth in question

The emulation of human characteristics by the various facets of AI technology, especially Generative AI, poses one of the biggest threats to truth and reality in our times. How will we know what is true or what is real? This challenge is accentuated when bad actors access this technology and use it to generate conspiracy theories and fake news around election times, or to intimidate women with fake pornographic videos of themselves.

How will ChatGPT and Christian versions of it shape how we view the gospel, the canon of scripture and Christian tradition? No longer are we reading the wisdom of scripture or of a well-known theologian but a statistically based amalgam of the words they were trained on, sometimes with errors. We know that the devil is the father of lies, so we can expect Generative AI to be a tool that he will use against humanity. The scale and ubiquity of the digital world ensures that advances are rapidly disseminated and taken up, and the potential for bad actors is almost unlimited. As Elon Musk put it back in 2014, ‘With artificial intelligence we are summoning the demon’.[5]

Imaging Christ

As technologies simulate more and more human capabilities, the danger is that we come to rely on them and, in so doing, dumb down our true humanity. Authentic relationships are diminished as we lose the capacity to empathise, cognitive acuity is lost the more we look to machines to make decisions, and creativity is subverted as we turn to computers for the creative arts. Ultimately, as with self-driving vehicles, we delegate moral agency, a trait unique to humans, to a machine. We must engage this technology carefully if we are not to sacrifice our true humanity, and our authentic imaging of Christ, for the sake of convenience and efficiency.

Applications of AI, such as chatbots, affect how we value and regard relationships. In what way are we modelling Christ’s relational nature when we delegate evangelism or counselling to an anonymous chatbot? In fulfilling the Great Commission, it is tempting, in the digital age and with Generative AI, to envision spreading our reach to many more people of different languages across the globe. Yet Christ himself showed us through the gospels that His mission was personal and embodied. He spent time with the woman at the well, He personally taught His small group of disciples, sometimes choosing to move on rather than deal with the crowd seeking Him out. As the Son of God He could have manifested Himself throughout the world, yet He chose to be born in a lowly stable and to restrict His ministry largely to the tiny region of Galilee.

The downside of AI is not so much the havoc that such tools could create in the hands of bad actors, significant though that is, but its impact on ordinary people engaging with various applications day by day, and on a society becoming dependent on it and addicted to it. These are the steps that lead to modern-day idolatry. We are in danger of creating idols in our own image and putting our trust in them. They could become a new intermediary between us and God, a digital priesthood.

Our mandate to steward creation means that we also have a mandate to be wise in how we use AI: will it be a tool in our hands that we remain in control of, or a creation that we become enslaved by?[6]

Jeremy Peckham is a technology entrepreneur and author of the book “Masters or Slaves? AI and the Future of Humanity” published by IVP in 2021.

He spent much of his career in the field of Artificial Intelligence and was Project Director of a €20 million, five-year pan-European research project on Speech Understanding and Dialogue (SUNDIAL) that broke new ground in AI.

He founded his first company in 1993 through a management buy-out, based on the AI technology developed at Logica, and launched a successful public offering on the London Stock Exchange in 1996. Jeremy also served in church leadership for many years and writes and speaks on the ethical issues surrounding AI and on leadership.

_____________________

[1] ChatGPT (Generative Pre-trained Transformer) is an example of what has come to be known as ‘Generative AI’. These computer algorithms, in response to a user prompt, produce artificial content such as text, computer code, images, audio or video based on the data that they have been trained on.

[2] Examples of such chatbots are biblemate.org and pastors.ai.

[3] Associated Press News, Can a chatbot preach a good sermon? Hundreds attend church service generated by ChatGPT to find out, June 10, 2023, retrieved on 1 September 2023 at https://apnews.com/article/germany-church-protestants-chatgpt-ai-sermon-651f21c24cfb47e3122e987a7263d348

[4] The Economist, Artificial neural networks are making strides towards consciousness, according to Blaise Agüera y Arcas, September 2nd, 2022.

[5] Matt McFarland, Elon Musk: ‘With artificial intelligence we are summoning the demon’, Washington Post, 2014, retrieved on 2 August 2023 at https://www.washingtonpost.com/news/innovations/wp/2014/10/24/elon-musk-with-artificial-intelligence-we-are-summoning-the-demon/

[6] These questions are addressed in more detail in: Jeremy Peckham, Masters or Slaves? AI and the Future of Humanity, IVP, 2021.

The views and opinions expressed above are those of the author alone and do not necessarily reflect those of the Jubilee Centre or its trustees.
