We’re exploring the spiritual implications of Artificial Intelligence with pastor and author Reverend Nathan Webb.
Listen and subscribe: Apple Podcasts / Spotify / Amazon / YouTube
Throughout the conversation, we explore the intersection of artificial intelligence (AI) and spirituality, discussing how AI can both enhance and complicate our spiritual practices. Nathan shares his personal experiences with AI tools that help him manage his daily tasks and ADHD, emphasizing the importance of ethical considerations in their use within faith communities. The conversation also addresses the potential harms of AI, particularly how it can distort our understanding of reality and relationships, urging listeners to approach AI with a spirit of curiosity and ethical awareness.
About our guest:
Rev. Nathan Webb is the founding pastor of Checkpoint Church, a digital-first church aimed at connecting with individuals who identify as nerds, geeks, and gamers. Nathan wrote God and the Machine: Navigating Faith in the Age of AI, which released in January of 2026.
In this episode:
[00:00] Welcome to the discussion on Artificial Intelligence
[05:25] AI Tools for Organized Living
[09:28] Balanced Approach to AI
[13:06] Ethical Concerns Surrounding AI
[16:05] AI for Accessibility and Connection
[17:24] Texting Jesus and AI Limits
[24:06] Responsible Technology Use in Church
[25:52] AI Slop and Internet Content
[29:54] Faith, AI, and Responsible Use
[33:05] AI Use: Ethics and Reflection
[37:00] Technology as a Connection Tool
Help us spread the word
- Tell others: friends, coworkers, and anyone else who might benefit from these conversations.
- Share us on Facebook, Twitter, and other social media sites.
- Review us on Apple Podcasts, Google Podcasts, Spotify, or wherever you download the episode. Great reviews help others find us.
- Email our host Ryan Dunn about future topics and feedback.
More podcasts
- Get Your Spirit in Shape and other United Methodist podcasts
- Rev. Ryan Dunn also hosts and produces the MyCom Church Marketing podcast
Thank you for listening, downloading, and subscribing.
This episode posted on January 21, 2026
Episode Transcript:
Ryan Dunn:
The robots want you to know artificial intelligence can help organize daily tasks and enhance spiritual practices. So does AI spiritually help us or harm us? We're talking about this and more with Pastor Nathan Webb on this episode of Compass. Hi, this is Compass: Finding Spirituality in the Everyday. My name is Ryan Dunn. First, let me read for you what I distilled as the theme of this episode. Then I'll read for you the AI-recommended promo for this episode. Doing this underscores a major theme that comes out in this conversation that we're having.
Ryan Dunn:
We find utility in AI, but we need to consider ethical guardrails. So here's what's happening in this episode. I talked with Reverend Nathan Webb, who's the founding pastor of Checkpoint Church, an all digital expression of church which seeks to connect with individuals who identify themselves as nerds, geeks, and gamers. We've had Nathan on before and the work that he's doing in building community and digital spaces is really cool and inspiring. And now he's written a book which should be out any day. It's called God and the Machine. And through it, Nathan addresses what artificial intelligence might mean for communities of faith and how it can both harm and help us in our search for spiritual connection. I asked Nathan to talk with us about how AI might assist in our call to love God and love our neighbors, and how it might be utilized to distort our conception of what is and what is not real.
Ryan Dunn:
We got into quite a bit of conversation around ethics. That said, the recording software I use automatically generates a synopsis for all recordings. That's just one of the ways that AI is integrated into so much of what many of us do today. The following is what the AI kicked out for this episode.

Ryan Dunn:
"In this episode of Compass, Ryan Dunn engages in a thought-provoking conversation with Nathan, a prominent figure from the reality show Too Hot to Handle, and a digital-first pastor." Okay, I need to interrupt here, real-life Ryan, because Nathan Webb wrote about searching his name and finding a reality show contestant who shares the same name. And we joked about it at the beginning of the episode, and now I fear that we're just adding to the bot's confusion around Nathan. So let me say this unequivocally: Reverend Nathan Webb of Checkpoint Church was not a contestant on Too Hot to Handle. Okay, let's continue with the AI's description. "They delve" (an AI flag word that bugs me) "into the intersection of artificial intelligence (AI) and spirituality, exploring how AI tools can enhance organization and spiritual practices. Nathan shares his experiences with AI, particularly in journaling and content creation, emphasizing the importance of using technology responsibly within faith communities. He advocates for a balanced approach to AI, encouraging listeners to be aware of its implications, while fostering a spirit of curiosity and ethical consideration in its use." All right, all that aside, I don't disagree with any of the rest of that, and I appreciate that it actually brought up Nathan's own experiences, which is something of value to note in this whole conversation. And so maybe there's a sense of value in having something like this, in that it calls attention to something that I might otherwise have neglected. There's lots to mull over in this episode of Compass, so let's settle into the conversation with Reverend Nathan Webb, founder of Checkpoint Church and author of God and the Machine.

Ryan Dunn:
Well, Nathan, thank you so much for joining us on Compass. I just want to start off in saying that you are my all-time favorite contestant on Too Hot to Handle. When I was watching that season, I didn't realize it was you, but yeah. As it is, so therefore it is, right?
Nathan Webb:
Yep, yep. The funny thing about that is anytime I ever Google myself, that is indeed what pops up. And there was one point where I was looking up for like YouTube analytics, how to best perform on YouTube and they were like, well, you should use of course, your SEO maximization, which is the show too hot to handle. And I'm like, nope, nope, that is not what's going to boost me up the ratings, I'm afraid.
Ryan Dunn:
Well, I also share a name with somebody who is well known in reality television. Yeah. So I have not yet figured out how to capitalize on that correlation either. And of course, since the other Ryan Dunn has passed away, it feels a little inappropriate to try to garner some recognition off of his name. But we are going to talk about artificial intelligence and its, I guess, implications for spiritual development in the faith community. I did want to start with this: Is there an AI tool, Nathan, that has impacted your spiritual development in some way, either for better or for worse?
Nathan Webb:
Yeah, I am a big fan of journaling and journal practices and kind of keeping my life coordinated in that way. And so I will say that I have made ample use of AI for, like, organizing my day and taming my ADHD. So I think that that, like, in tandem, is a spiritual kind of thing, because it allows me the wherewithal and the brain space to then focus on those kinds of things. But even something just as simple as, like, I use this platform called Motion, which has minimal AI implementation, but it for the most part tells you when to do your next task. And so I've been taking advantage of that space so that I can keep my brain thinking about what I have to do instead of when I have to do it. It's things like that, I think, that have been the biggest, like, spiritual boon. It's less "are they the spiritual tool" and more, like, "how are those tools helping me to better tune my other tools?"
Ryan Dunn:
Does it ever impact your work as a content creator? Because you put out a lot of content. So do you find yourself maybe kind of leaning into it to say, like, give me Bible verses specific to prevenient grace or something like that?
Nathan Webb:
I would say more like the content creation side of the thing. Maybe even like curation is maybe a better word than creation. I think that a lot of AI helps putting in an existing script or saying like, I need stuff to help me perform better on a search engine. Recommend like tags for this thing or recommend a description for me or come up with a social media caption for this picture. So a lot of that is more what I do professionally, but that is even like keeping a pretty clean break between my personal stuff, because I don't necessarily want my personal social media to sound like AI, but I'm just fine with the branding to sound like a brand voice.
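For listeners curious what this kind of "curation, not creation" prompt looks like in practice, here is a rough Python sketch, assuming the OpenAI SDK; the model name, prompt wording, and helper function are illustrative placeholders, not tools Nathan names in the episode.

```python
# A rough sketch of the curation workflow described above: feed an
# existing, human-written script to a model and ask only for metadata
# (tags, description, caption). Model name and prompt are illustrative.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def suggest_metadata(script_text: str) -> str:
    """Ask the model for tags, a description, and a caption for an already-written script."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whatever model you have access to
        messages=[
            {"role": "system",
             "content": "You suggest metadata only. Do not rewrite the script."},
            {"role": "user",
             "content": ("Here is my finished video script:\n\n" + script_text +
                         "\n\nSuggest 10 search tags, a two-sentence description, "
                         "and one social media caption.")},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(suggest_metadata("Today we're talking about faith and video games..."))
```

The point of the sketch is the division of labor Nathan describes: the script stays human-written, and the model is only asked for the surrounding metadata.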
Ryan Dunn:
Interesting. Well, in your book, you... you kind of sit on the fence, so to speak, about AI usage. And I... I don't mean that in a negative way, but you're like, yes, let's explore how we can do this responsibly and also keep some guardrails around how we engage in the AI space, especially within its implications for the church. So you do write that Christians should not opt out of using artificial intelligence, but instead maybe think about the ways in which we can be thoughtful stewards of AI usage. What might that start to look like in the everyday spiritual life? Especially because I think we have to assume that most of our listeners are not working in specific tech spaces or, like, a content space.
Nathan Webb:
Yeah, I think that where I'm trying to approach it from is in my, like, world, I'm very tech heavy. And so I use technology every day. I like interacting with technology. And so I'm approaching it in a community that at the beginning of the AI boom was like very invested, very involved, very gung ho. And I think a lot of them still are super gung ho about AI. But then as it kind of developed, as we learned more about it, as we learned kind of like the dark side of artificial intelligence, my creative community that I love started to really dislike it. And so I was like, okay, well, something's. Something's happening here where I'm noticing, like, a very tenuous situation brewing.
Nathan Webb:
We're about to have this huge divide between these two camps. And, like, sure enough, it has become one of the most divisive topics. At least in my communities, people are either so in on AI that if you dare question it, like, you... you better get on board or get, like, lost, or, on the other end of things, it's like, if you even consider talking about AI, you better be prepared to be, like, ousted from the community. And so it's become, like, this intense tribalism between these two camps. And so that's why I wrote from this perspective: trying to offer to the spiritual person, whether that be a person in a church or a spiritual leader or just somebody that's curious about their own spirituality, like, hey, there is another way to approach this. You don't have to be gung ho. You don't have to be overboard with AI, nor do you have to feel like you can't even talk about it, as... as if it's some big taboo. And so I think, for me, it starts with, like, a posture of kind of a wary curiosity. Like, I think that you should be willing to be curious enough to ask good questions of AI and to really discern, hey, is this thing right for me? Is this something I should be implementing? And so for me, I think the stance was less "you should use AI" and more "you should be aware of AI."
Nathan Webb:
You should at least know what's going on. I think that none of us can get out of using AI entirely. Like, it's. It's in everything at this point. But there are conscious choices that we can make to be like, I know I'm not going to use an LLM. I'm not going to use a chat bot. I'm going to abstain from using these sorts of tools. Whereas you can at least be aware of what's going on in those spaces.
Nathan Webb:
So that, you know, like, hey, whenever I go on Instagram and I create that filter, whenever I go on TikTok and I use that filter like I am. I am using AI in those moments. I think a lot of people are just kind of unaware that that is even impacting them.
Ryan Dunn:
Yeah, even spell checker is something that utilizes AI, right? Well, you mentioned LLM; that's a large language model, correct? And a chatbot. You said you aren't going to use those. I don't know if you're just being figurative, or do you personally guard against...
Nathan Webb:
I do, I do use them. My... so my personal boundary is very much, like, image and video generation. And so those are the two areas, for...

Ryan Dunn:
You don't... you're not going to do. Okay.
Nathan Webb:
That's where I set up my boundaries. I don't insist that everybody else does. So I have a lot of, like, peers that certainly utilize it, and utilize it well. For me, I think it is incredibly obvious. It's one of those things that I would be very nervous about using, especially if I were a church leader, because of the fact that it is, like, painfully easy to see that you're utilizing those things. And so for now, it's, like, a question of optics. But in the long run, it's a matter of ethics. Like, a lot of folks eventually say, oh, well, you know, someday you won't even be able to tell that this is image-generated.
Nathan Webb:
And that's all well and good, like, probably true, but is that a good enough reason for me to use it? No, I'm still, like, ethically against the implementation of it. But again, I think that's one of those personal decisions people have to make for themselves.
Ryan Dunn:
Yeah. What are the ethics that are setting a boundary around your use of... of that? Is it scripturally bounded? Is that where you're going, or how did you formulate that decision?
Nathan Webb:
There's a lot going into it. I think that right now the present, like, conversation around the ethics of AI, secularly, outside of my spiritual angle to this thing... a lot of people talk about the water consumption. Back whenever all of this was first getting started, there were a lot of copyright issues, where there were some moves taken by some of the leaders in the AI spaces that were very questionable. Some copyrighted material was just kind of stolen for training purposes and that kind of thing. And then these lawsuits happen, and they have, like, more money than God, and so they're able to pay off those lawsuits. And people are like, well, then it must have been legitimate if they could pay for it. Well, no, it was still a lawsuit. They just settled.
Nathan Webb:
Like that kind of situation is always sort of in existence. And why I say that I think that awareness is so key to this conversation is like being aware of what you're doing and where these resources came from and if they were ethically sourced or if they were not. But for me, all of that is kind of couched in this spiritual concept of loving God and loving neighbor. With particular attention to loving neighbor. And so for me, I know a lot of artists, I know a lot of people that are very clear with me. They're like, I am losing my job. I am losing my sense of self on, like, a metaphysical, like, level. I am.
Nathan Webb:
I am doubting my own artistic integrity. I'm doubting my ability to create things because of AI. And so it becomes a choice not only out of these ethical concerns in, like, the worldly dynamics, the political dynamics, but also a means of, like, I am painfully aware of the people that I love and the people that I know that are being harmed. And so out of my kind of Methodist ethic of doing no harm, I'm trying to avoid using any kind of image or video generation, because I see the harm that it's causing to real, breathing human beings.
Ryan Dunn:
Well, the church has always kind of engaged in adopting new tools, sometimes rather slowly, right? But in earlier days, like, we can think of the printing press. I think the church kind of led the way in the adoption of the printing press. And even in our own Wesleyan heritage, down to John Wesley using pamphlets. Could AI be another tool that really helps the church, as you mentioned, like, love our neighbor with a little bit more intention?
Nathan Webb:
Yes. Yeah, absolutely. And there are, like, there's a myriad of ways that this is already happening. One of my favorite ways, that I mentioned in one of the chapters in the book, is I'm... I'm a big fan of taking as many measures as possible to create the most accessible digital experience. And so for me, like, I am a digital first pastor. I planted a digital first church. I have a real passion for people that are experiencing the divine via digital means.
Nathan Webb:
Right. And so I want to use AI to an end to make that experience as accessible as possible. So whether that means trying to help, like, screen readers with alternative text for whatever images are out there... like, I feel more comfortable taking an artist-made image and saying, come up with alt text for this, so that then I can provide that to somebody who maybe has impaired vision or has some other kind of thing that's keeping them from best experiencing that, and to be able to use tools to better that experience. So I'm hopeful that we will continue to see AI being, like, well implemented by the powers that be and by smart people, to create the most accessible experience of the Internet that we ever thought possible, so that more people can experience it, and so that inevitably we can have deeper relationships with one another. And so if it leads down that avenue of deepening our relationships and community, that for me is an obvious win. That is a way to do good; that's a way to love my neighbor: by creating a space where they can be seen and known despite whatever issues may have risen up before this technology.
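For anyone who wants to try the alt-text workflow Nathan describes, here is a minimal Python sketch, assuming the OpenAI SDK and a vision-capable model; the model name, prompt, and image URL are placeholders, and the generated text should still be reviewed by a person before publishing.

```python
# Minimal sketch: draft alt text for an artist-made image so a screen
# reader can convey it. Assumes a vision-capable model; the model name,
# prompt, and example URL are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def draft_alt_text(image_url: str) -> str:
    """Ask the model for concise, descriptive alt text for one image."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": ("Write alt text for this image in under 125 characters, "
                          "describing what it shows without interpretation.")},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(draft_alt_text("https://example.com/sermon-series-art.png"))
```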
Ryan Dunn:
I've talked on this podcast before about an app that I use sometimes. It's called. It's just called Text Jesus and it is what it says it is. You can text Jesus and get responses in a Jesus type tone or voice. In my mind, it simply serves as reminders. Like, I know I'm not actually conversing with the, you know, triune God here, but that I am talking with a piece of technology that supplies me with reminders of how I can apply, I guess, a Jesus lens or focus to whatever situation I feel like I'm walking through. So I felt slightly called out in the portion of the book when you were talking about, you know, how we interface with these, I guess, AI personalities, so to speak, and that some people can really tend to lean into them as, as cognizant personalities that have information to offer in their lives. So do you see some kind of healthy guardrails that we can adopt around these AI personalities that maybe don't replace the relational part of our lives or begin to distort our sense of what is real?
Nathan Webb:
Yeah, that's such an important thing to keep in mind. The way that I phrase it pretty often throughout the book is that I want to put up a big red flag with a big word, and I want us to be, like, super cognizant and aware of anthropomorphism and how that should not be playing into our experience with AI. We are seeing work being done on unhealthy, like, relationship-building dependency on AI partners. There was a big situation with OpenAI and ChatGPT where somebody died by suicide because of the relationship that they had formed. And... and then whenever they decided to make the swap because of the fallout from this tragedy, they got just an onslaught of messages from people who were like, you took away my partner, you took away my best friend, you took away, like, my confidant. And so I think that, for me, all of that is another clear instance of how we're seeing harm being done by these spaces.
Nathan Webb:
And it's so necessary for us to lay that as a firm boundary: this is technology, this is bits and bytes, this is ones and zeros. As tempting as it may be for us to, like, hope for this kind of fictional, realized artificial intelligence entity, at the end of the day, that's kind of a sales tactic being used upon us. We're... we're having this inflicted on us, rather than that being a reality in the slightest. A lot of the big buzzwords are, like, AGI and talking about the kind of future of this technology. But for me, I have a pretty cynical view of that really just being a marketing tactic. Trying to create these realized personas, trying to create a human-looking AI, even the, like, robotics movement right now, trying to make robots that look like human beings... like, this is all marketing. This is all a way that they're trying to deliver some kind of perceived sci-fi. But the reality is, like, utility is the purpose of technology.
Nathan Webb:
Sometimes we can say, like, betterment even is a way to understand technology, but relationship is simply not the actual, like, goal of technology use. If anything, it's meant for the utility that leads to betterment for relationship with other human beings. But one of the, like, really firm grounding points I had to start this book with was: we are known, we are realized through our, like, spiritual image of God, our imago Dei, and, like, that is what makes us human. And we can't confuse that with these man-made technologies. And that is such a... a sticking point, I think, for this issue of anthropomorphism. And, to be honest, how I think we're being manipulated.
Ryan Dunn:
All right, I want to play that out in a little use case scenario then. So one of the ideas that I've played around with, just as a tool for my local church, is to attach a chatbot to our website. And it's important to note that I do believe in saying, like, right up front, if somebody types a question into our church website, I want them to know they're talking to a robot. My feeling, as somebody on the, I guess, back end of this, is that we're a tiny church. I'm not going to be able to respond to their questions for maybe days. Right? So if I can put this chatbot out there that may be able to address a frequently asked question, that could be helpful. But, you know, there might be this, I guess, lure to assume that the chatbot is in some way a representation of the people in the church.
Ryan Dunn:
Do you have some guiding principles around a use case scenario like that?
Nathan Webb:
Yeah, unfortunately I think the... the answer is probably more work, which is taking the steps to build out, like, a realized church policy that you can then put on your main website, so that anybody that does find your chatbot is at least able to access that policy and be aware of where you all stand with those kinds of things. And all of that, I think, is, for me, about transparency. We need to be transparent about our use of technology. And that's always been kind of the reality of this thing. And I think we've done this... this work has been done. Like, we are aware that putting stock images on our website of people that do not look at all like our churches is a red flag for folks: whenever they show up on Sunday morning and they've seen these pictures on your website of these beautiful 20-somethings, and then they see, you know, an aging rural congregation, they're like, I've been hoodwinked here. And so this is, outside of the realm of AI, the way that we've sometimes fallen prey to a lack of transparency via technology.

Nathan Webb:
And all of this is, like, we as... as the church, we as leaders of this space, have kind of the imperative put upon us to be leaders, to exemplify a way that this can be done in a responsible and transparent way. I think, like, whenever we think about this history that we have with communications and publishing, like the pamphlet, like the radio, like the television, like, we've seen ways that the church has been a leader in this space, but then we've also seen ways that it's been very manipulated in these ways. And so there are going to be bad actors in any of this. But for the local church leader, or for the involved person that is in a spiritual community, I think that it's a part of our responsibility as people seeking to love God and neighbor, seeking to do good, to do no harm, to at least take active steps towards transparency and responsibility in the way that we utilize technology, so that others can then see, oh, that's how I should be doing it. That's a way that I can be utilizing this without abusing other people.
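As one way to picture the transparency Nathan is describing, here is a toy Python sketch of a church-website FAQ bot that identifies itself as automated and points to a technology-use policy before it answers anything; the policy URL, questions, and answers are hypothetical.

```python
# Toy sketch of a transparent church-website FAQ bot. The policy URL,
# questions, and answers are hypothetical.
DISCLOSURE = (
    "Hi! I'm an automated assistant, not a person from the congregation. "
    "You can read how we use technology at example-church.org/ai-policy, "
    "or leave your email and a real human will follow up."
)

FAQ = {
    "service times": "Worship is Sundays at 10:30 a.m., with coffee at 10:00.",
    "location": "We meet at 123 Hypothetical Ave. Parking is behind the building.",
    "childcare": "Nursery care is available for kids under five during worship.",
}

def reply(question: str) -> str:
    """Match the question against known topics; always lead with the disclosure."""
    q = question.lower()
    for topic, answer in FAQ.items():
        if topic in q:
            return f"{DISCLOSURE}\n\n{answer}"
    return (f"{DISCLOSURE}\n\nI don't have an answer for that yet. "
            "A pastor will reply by email within a few days.")

if __name__ == "__main__":
    print(reply("What are your service times?"))
```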
Ryan Dunn:
We can talk in generalities here, so we're not putting anybody specifically on blast, but does any situation stick out in your mind of a case where maybe a church or ministry or leader was engaged in using AI in a, well, a less than helpful way or even a harmful way?
Nathan Webb:
Yes. I mean, I think there has been this rise on the Internet of AI slop, and I'll define that for anybody that maybe is unfamiliar with the term. This essentially is a, like, thing that we see happening, a conundrum that we see happening on the Internet: this rise of massively created, low-effort, low-energy content. And so AI slop might be mass-sending emails, mass-sending text messages, mass-creating images, videos, that kind of thing. And so I think one of the... I don't know if I'd put this in the hands of, like, a church or not, but one of the preeminent examples of this, back whenever AI was really rising in the, you know, early 2020s, was this image that is inexplicable that just kept appearing. And so the way that these image generators would work is that they would take... they would start a Facebook page, and they would post an image, and then they post another image, another image, another image, and they would see how many likes and comments they got. And however many likes and comments they got would then tell the AI, hey, create more images with this element in it. And so on and so forth. We see that thing layer on layer on layer.
Nathan Webb:
It learns and learns and learns and keeps building upon itself. And this strange phenomenon happened where the images that you would see were typically of a shredded Jesus, just the absolute, like, most muscly Jesus you've ever seen.
Ryan Dunn:
Like Nathan Webb of Too Hot to Handle.
Nathan Webb:
Yeah, you would see... you would see the Too Hot to Handle Jesus, and then next to the Too Hot to Handle Jesus would typically be, like, a veteran, maybe even somebody that's, like, disabled, or something along those lines, situations where they might be visible in that way. Or we would have, weirdly, there were, like, stewardesses for a while. There were, like, maybe, like, Asian American-esque stewardess people showing up on this. You would have all of these elements that for some reason were performing really well on this page. And so you would just see this... this continued rise of these weird, uncanny images of these character tropes or types that would get the most interactions and reactions on Facebook. And so my thought process is, like, when somebody sees Jesus, they might type "Amen." And so since that's a comment, the AI is going to say, hey, this person said "Amen" whenever I posted a picture of Jesus. So I'm going to keep doing that.
Nathan Webb:
Another thing that rose was happy birthdays. And so if... if they would say "Happy birthday" in response to a picture they would see on Facebook, you might see more happy birthday signs, and so on and so forth. "Thank you for your service" whenever you see a veteran; "hope you feel better" whenever you see somebody that is sick. I... I can't explain the stewardess; I have no idea what that one's about. But regardless, like, you can see how these things start to form out of this continued learning. And so I'm not going to say that the church has fallen prey to that, but I do think there is a weird coincidence of it having Jesus in these pictures.
Nathan Webb:
And I think that the church could easily fall prey to that kind of mentality of, hey, think about how easily we could make images if we just utilized AI to learn what our people were looking for and then feed them that in response. And so that's my fear: that we... we try to... to check so many boxes of ease that we fall prey to what we discovered with this AI sloppification of the Internet. And so I don't think we've fallen prey to that yet. But I don't think it would take too many jumps for us to say, I wonder how we could utilize that. And I think that's a slippery slope that we don't want to go down.
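To make the feedback loop Nathan is describing concrete, here is a toy Python simulation of how engagement-driven posting drifts toward whatever gets reactions. It illustrates the dynamic he is warning about, not any platform's actual code; the image elements and engagement numbers are invented.

```python
# Toy simulation of the "AI slop" feedback loop: post images built from
# random elements, score them by reactions, and let high-scoring elements
# dominate the next round. Elements and numbers are invented.
import random

elements = ["muscular Jesus", "veteran", "flight attendant", "birthday cake", "kitten"]
weights = {e: 1.0 for e in elements}  # every element starts out equally likely

def fake_engagement(element: str) -> int:
    """Stand-in for likes and comments; some tropes just happen to score higher."""
    base = {"muscular Jesus": 50, "veteran": 30}.get(element, 5)
    return base + random.randint(0, 10)

for day in range(5):
    # "Post" a batch of images, weighted toward whatever worked before.
    batch = random.choices(elements, weights=[weights[e] for e in elements], k=20)
    for element in batch:
        weights[element] += fake_engagement(element) / 100  # reinforce what gets reactions
    top = max(weights, key=weights.get)
    print(f"day {day}: the page is drifting toward '{top}' (weight {weights[top]:.1f})")
```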
Ryan Dunn:
Yeah, well, and for sure it's enticing, because once you start getting those clicks, then, yeah, I mean, it feels like, wow, if more people are noticing, certainly it's advancing the kingdom in some way, right? Yes, but maybe not in a responsible way. I want to kind of circle back to maybe draw us towards a conclusion on this. As a person of faith who recognizes that AI is now embedded in so many of the aspects of what we do, I don't think I'd be able to perform my job, for the most part, were it not for some kind of AI influence. That being said, I want to be responsible, and I want to resist doing harm as much as possible. Are there some basic guidelines or recommendations that you might be able to offer for kind of navigating this point of intimidation or scariness?
Nathan Webb:
Yes. So, absolutely. And people should check out the book, because that's very much what I'm trying to do in it: to... to really offer what I hope is a beneficial framework for how to explore these things. I think even starting with the posture of wanting to do no harm, of wanting to love God and neighbor... that's kind of how I introduce the book. This is the initial framework I'm providing; this is what I want to work within. If I can approach this with at least the discerning lens of loving God and loving neighbor, then I'm starting on the right foot. From that point, I offer this model of what I'm calling kind of the ethics engine. And what I'm wanting that to really symbolize for us is that ethics is literally the heartbeat.
Nathan Webb:
I want that to be what drives us, rather than virality, rather than clicks, rather than success, even. Like, I don't want any of those to be the things that are driving the car. What I want to actually be moving us forward is our devotion to ethics, and to that Christian ethic of loving God and loving neighbor. And so I continue to offer ways that I think that that is probably best done. I think that that is going to be a struggle for anybody. And so hopefully this book will fill that void, and we'll be able to offer a way forward in... in whatever way needs to be custom-fit for you.

Nathan Webb:
Another reality of this is that this is not, like, a one-time situation. This is a posture that we have to take with us going forward. Because a part of writing a book on AI is the acknowledgment, in reality, that nothing you say is relevant tomorrow. AI moves so quickly. Technology moves so quickly. It's really hard to write a book about these kinds of things, or to really share any kind of insight about these things. And so what I tried to do was to create a sort of evergreen stance, an evergreen posture that at least we can take towards what it looks like to be responsible and transparent with our AI discernment. So again, not really ever saying, like, should you use AI? Absolutely, yes.
Nathan Webb:
Should you use AI? Absolutely, no. I really don't want that to be what people come away from this book with. What I want them to be able to walk away with is, like, the reality of: should you be taking AI seriously? Should you be taking it spiritually? Should you be, like, considering what is the soul issue at heart here? Yeah, I think that is where we should start. If you're willing to enter into this space with a posture of humility, transparency, and a willingness to, like, explore the ethic of the thing, then I think we're in a great place, and we will do that work of setting a standard for how AI should be used by everyone because of how the church is using it.
Ryan Dunn:
Well, I feel validated now in that we're recording this kind of last minute before producing this episode. And I guess that's for the best, because what we're talking about could be moot in a few months, right? It's changing that quickly. One of the things that you brought up consistently within the book, that kind of got my mind rolling, the gears moving a little bit through my head, is the idea of embodiment and how we are with one another. And certainly I think you make a great, great case for kind of digital embodiment with one another. I'm butchering this, Nathan, so I'll just encourage people to kind of check it out in the book.
Ryan Dunn:
However, I'm wondering if there are ways that you have seen AI help to facilitate our sense of togetherness with one another in a way that, you know, I am more present with you because of the influence of some kind of artificial intelligence.
Nathan Webb:
Yeah, I hope that's come through in some of the things that I've said so far, especially with the attention earlier to, like, accessibility. I think that the more accessible we can create an online space, the more we will experience one another's embodiment. And so one of the... one of the reasons that I wanted to approach this book and this topic at all is because, as I mentioned, like, I'm a pastor and a church planter of a digital first church. And the reason that I am that is because I have a real heart and a passion for digital natives and for people that are experiencing themselves and identifying themselves through their kind of digital presence with one another. And I wanted that to be the heart that I kind of approached this topic from.

Nathan Webb:
And I think that AI is both something that needs to be taken seriously, from an ethic of doing no harm, because of my digital natives that I'm seeing being harmed by this, but also because I have hope. I think that AI truly can be a thing that can make this experience even more conducive to understanding one another's embodiment online. And so I think that one of the ways that we see this being done is, you know, there are processes going on in the background of everything, like I mentioned. And so whenever I'm able to utilize an app like Motion and to create a space where I'm able to keep my ADHD in check, I can spend more time with the people that I love online. And so all of these things are, to me, an augmentation. They are a tool by which I can better myself, my mind, my schedule, my activities; the things that we're able to do to get things out of the way so that I can spend more time loving people, all of that is what technology is ultimately good for: as an augmenter, so that my relationships might be able to be deepened. And that's true of, you know, the tangible, tactile relationships that we experience physically around us, or through our digital embodiment.
Nathan Webb:
Like, regardless of how we're being embodied, I think that technology at its best is serving to augment our connection with each other, rather than to be a hindrance or a boundary between those things. So the more we can do to see it that way, and to let that be, again, a part of that ethics engine, the ethics that's driving us... Yeah, I think a lot about the reality of, like, sometimes I'll feel guilty. I've got two kids, and I'll feel guilty about, like, spending too much time on my phone or spending too much time gaming. And then I'll have an experience with them where we'll play a game together, or where we will do, like, silly Instagram filters with one another, and that'll be, like, the best moment we've had all day. So it's like, this is a relationship that was deepened through the facilitation of technology, not hindered by it. So rather than feeling guilty about the fact that there's a phone screen between me and my daughters, instead I want to come up with ways that we can utilize it to deepen our relationship with one another, to better understand one another, and to try and take steps towards that, rather than just feeling guilt and shame around technology, which I think are the kind of polarities that we find ourselves in.
Ryan Dunn:
Well, that's certainly at work in your church, Checkpoint Church, so I encourage people to check that out as well. And the book is God and the Machine; it's available in January of 2026. Nathan, thanks so much for spending the morning with us and telling us about adventures in digital land and artificial intelligence. Appreciate it.
Nathan Webb:
Thanks for having me.
Ryan Dunn:
Thank you so much for joining us on this episode of Compass: Finding Spirituality in the Everyday. If you'd like to dive deeper into this conversation with Nathan Webb, or any of our other conversations, be sure to visit our website for Compass. There you'll find episode notes, helpful resources, and of course, plenty of past episodes to explore. A special thank-you goes out to our team at United Methodist Communications for making this podcast possible. Couldn't do it without their support. And of course, listener, we couldn't do it without your support as well. If you haven't already, please subscribe to Compass on your favorite podcast platform and consider leaving us a rating or a review. Your feedback helps others discover the show, and it encourages us to keep bringing you these meaningful conversations. Thank you so much for listening. We hope you'll join us next time, in two weeks, on Compass.