Ever since ChatGPT exploded into the public sphere in November 2022, questions about its effect on employment and industry, and on the ethics of using new artificial intelligence (AI) systems more generally, have become more pressing than ever.
Both the scientific community and Hollywood have long been fascinated with the prospect of so-called ‘strong AI’—machines that function indistinguishably from a human mind. (Ex Machina and Ridley Scott’s Blade Runner are notable examples of film explorations of this idea.)
While this prospect raises curious hypothetical ethical questions, the real challenge lies not in the development of ‘strong AI’—which may not even be possible—but in the proliferation of ‘weak AI’, says Professor Neil Dodgson of Victoria University of Wellington, New Zealand.
A Professor of Computer Graphics and Dean of Graduate Research, with 25 years at the University of Cambridge behind him, Prof Dodgson will appear in Melbourne on Saturday 16 March for a panel discussion on ‘AI x Christianity: Gospel Wisdom for an AI world’. Organised by ISCAST, an institute committed to the conversation between science and Christianity, the event will explore how to navigate the challenges of artificial intelligence with real wisdom.
Ahead of his Melbourne appearance, Prof Dodgson says that one of the fundamental questions on the ethics of using AI is whether it is enhancing the development of the human person.
The use of AI tools is a question that universities are grappling with on many levels. A concrete example of this involves graduate students composing the (dreaded) literature survey chapters that are often a required component of a thesis.
‘Literature survey chapters are normally extraordinarily dull,’ he explains. ‘They’re really tedious to write because you’ve got to read a very large amount of literature and synthesise it down to a concise summary.’
With the emergence of ChatGPT, students can get the AI tool to do it for them, which might seem like a life-saver, except for a couple of things.
‘That’s [ethically] equivalent to you contracting it out to another human being, except it’s worse than that because ChatGPT is just a large language model. All it’s doing is stringing one word after another. So it might produce stuff that sounds good, some stuff that is accurate, and it’s likely to produce some stuff that’s completely wrong. It tends to simply hallucinate references.’
‘It’s not actually developing you as a human being,’ he says, ‘so that’s absolutely wrong. You should not be doing that. I believe that when you’re writing something, every reference you use should be something you’ve at least skim-read … You know what the content is and it’s backing up what you’ve said.’
Established AI tools that have previously supported student development are now moving into this territory. Grammarly, a tool designed to help students with spelling and grammar, now offers the ability to rewrite whole paragraphs.
‘That is getting dodgy, because it’s not just fixing your grammar. It’s possibly changing the whole meaning.’
This is just one small area in which AI presents ethical challenges. As someone who operates across a number of disciplines, Prof Dodgson has a good grasp of some of the major concerns.
In the job market, for example, one area where AI might have significant social implications is the removal, in some sectors, of apprentice-level work.
‘It’s already a thought that in some areas, the senior jobs like senior reporter or senior policy analyst, they stay in place. But the juniors, the entry-level jobs, which were people who collated information, wrote drafts, that sort of grunt labour that got people trained, you can use AI systems to do that grunt stuff,’ he says.
AI has also become better at reading human emotions. In the advertising space, Prof Dodgson says, AI can be used to test which advertisements ‘work’ at producing certain emotions during the trial stages of a marketing campaign.
On a broader level, the proliferation of weak AI could also exacerbate the loss of autonomy and privacy at the hands of tech companies who want to harness the ‘data’ of human behaviour for profit.
Behind all of this, Prof Dodgson says, is a particular, and problematic, worldview.
‘The way engineers look at the world is not the only worldview you can have,’ he explains. ‘If you think about some of the people that are driving the AI revolution—Mark Zuckerberg, Elon Musk, Bill Gates and others—they do have a very particular worldview, and that worldview is often that technology will solve everything.’
‘One of the things that the church knows,’ he says, ‘is that technological solutions do not fit well with how people really work.’
This way of looking at the world inevitably diminishes our understanding of what it is to be human.
‘The people who are creating these AIs are extremely successful people, many of whom have very little patience for people who are not successful,’ Prof Dodgson believes. With the constant drive towards greater efficiency, productivity and success that AI systems bring, there’s a real chance a lot of people will be left behind who don’t fit the mould of these AI ‘revolutionaries’.
‘We need to think about how this affects people generally, not just rich people or people who have been privileged or people who happen to find themselves in a space where they can benefit from it.’
Some of the challenging questions we must ask are: ‘How does this impact us as a society? What sort of society do we want to build? A society where we value other human beings is what we want. Are you AI gurus building that, or are you actually marginalising even more people for the benefit of a small number of people who look like you?’
Prof Dodgson, who converted to Christianity when he was 19, believes this is where the church can engage with this issue.
‘Where I think the Christian worldview works is that we fundamentally believe that people are valuable in and of themselves, and there’s a whole bunch of stuff that comes out of that belief—one of which is actually, when you’re using these AI tools, you should be using them to enhance you as a human being, not to replace you.’
‘We can’t just sit on the sidelines carping on about this,’ he insists. ‘I think the church has the ability here to re-inject humanity into the argument.’
Dr Chris Mulherin, Director of ISCAST and a lecturer at Catholic Theological College (CTC), will also be on the panel for the Melbourne event. He explains how it has grown out of the organisation’s belief that Christianity must regularly engage with science and technology.
‘We are very concerned and driven by the fact that our society is increasingly dominated by science and technology as the place to go for the answers, and increasingly less connected to religious and philosophical things,’ he says. ‘We believe there’s a harmony between all areas of knowledge, and therefore we have to promote that.’
In line with Prof Dodgson’s reflections, Dr Mulherin believes that dominant forms of technology also come with certain worldviews driving them.
‘The most reductionist view, which seems to be very common, is simply that human beings are a bunch of information,’ he explains, ‘[that] it’s all contained in the brain … The question of what it is to be human is absolutely fundamental in a post-Christian society.’
This event is designed to keep Christians informed and engaged. ‘We want to help people to think about these things and at least get up to speed a little bit with what is such a fast-moving area.’
Visit iscast.org/aixchristianity to learn more about the ‘AI x Christianity’ event or to register.
Banner image: depiction of humanity caught within the cogs and gears of technology. (Photo: OSV News/Reimund Bertrams, Pixabay.)