Humanity is once again up against a scientific or technological development for which it is morally and ethically ill-prepared: Artificial Intelligence.
A.I. has been around long enough now that many people think they know what it can and can't do, and some even imagine that it can somehow be controlled -- sort of like the splitting of the atom. We got that one right, huh?
The latest A.I. issue that has people both excited and fearful is ChatGPT, which seems to be able to hold reasonable conversations with non-artificial human beings and, as high school and college teachers are discovering, also can whip up a pretty good term paper. (ChatGPT, by the way, stands for Chat Generative Pre-trained Transformer, something your grandparents never dreamed of having.)
But as the author of this Religion Dispatches article points out, ChatGPT seems to be, well, inconsistent about things when it comes to religion. (Which, ironically enough, strikes me as a deeply human characteristic.)
In the article to which I've linked you, Robert M. Geraci writes that "One thing it (ChatGPT) does — and I’m not sure which category to put this in — is define religion. It does so implicitly through its responses to various kinds of queries. In doing so, however, it reveals bias in the training sets and bias in the constraints put on it by developers."
What Geraci, who teaches religious studies at Manhattan College, has learned, among other things, is that ChatGPT will tell you jokes about some historical religious leaders but not about others.
"In January," he writes, "I received an email about how ChatGPT will allow jokes about Hindu religious figures, but not Christian or Muslim figures. . . The sender, a Hindu, was naturally quite upset about this. So I set about confirming the claim and trying to understand how ChatGPT manages religious categories.
"As it turns out, ChatGPT will, indeed, tell you a joke about the Hindu god Krishna, though it won’t tell you jokes about Jesus, Allah or Muhammad. With regard to these latter examples, ChatGPT informs the user that it could hurt someone’s religious sensibilities by telling such a joke. But it does not say this with regard to Krishna."
There's more about all this in the article, but already we know enough to be deeply skeptical about how A.I. platforms like ChatGPT might handle matters of spirituality and institutional religion. And once we know that, we're equipped to ignore any A.I. results in this field -- or, even better, to warn others to ignore them.
But let's not stop there. Let's do what we can to tell others about these problems. Then let's encourage A.I. developers to do better when it comes to religious matters.
My concern is that, given the theological and scriptural illiteracy so common among even allegedly religious Americans, the data on which ChatGPT bases its responses will be either dead wrong or misleading.
This is a problem no one needs. The unanswered question is whether A.I. developers can come up with a solution that everyone needs in this area -- one in which it's possible to trust ChatGPT's answers. So far, I'm betting against that happening.
* * *
WHERE DREAMS OF AFGHAN GIRLS GO TO DIE
As the dreams of Afghan girls perish under Taliban rule, many young women -- barred from a traditional education -- are turning to madrasas, or strict religious schools, to memorize the Qur'an, this Reuters story reports. The news is both sad and troubling. In many cases, students at such schools are taught the same kind of twisted version of Islam that has unleashed terrorists around the globe. And, of course, the girls now attending madrasas will never become doctors, lawyers or members of any other profession, barring a sudden and unexpected change in Afghanistan. A 17-year-old student is quoted this way in the Reuters story: "I wanted to be a doctor in the future, but now I think it's impossible. If you come to a madrasa you just can be a teacher." Thank God at least some families escaped before the Taliban clamped down.
* * *
THE BOOK CORNER
I Am Not Afraid: Psalm 23 for Bedtime, by Sandy Eisenberg Sasso, illustrated beautifully by Marta Dorado. One of the most famous passages in the Bible is Psalm 23. Its simple, if soaring, poetry often is read at funerals, but it has deep meaning beyond the end of life. In fact, as Sasso shows in her new children's book, it can be adapted and adopted to help children face their fears. Sasso has created a modern version of the psalm to show its potentially calming and enlightening effects on the sometimes irrational, often understandable fears of children. "God is my Comforter; I will not be afraid," the book begins as the pictures show a little girl getting into bed with her stuffed lamb doll. "I lie down beneath my warm blanket. Cool water beside my bed." At the back of the book readers will find a brief, helpful discussion about Psalm 23 and about how it can bring even children comfort. This book should be in the children's library of every church and synagogue -- to say nothing of home libraries of parents with young ones.