AI Jesus Shows Us Where the Line Is
What the latest Jesus chatbot reveals about the difference between using a tool and being spiritually formed by one.
People are now paying by the minute to video call an AI-generated Jesus.
In April 2026, a Southern California startup called Just Like Me launched a product that lets users video call an AI-generated Jesus, reportedly for $1.99 a minute, with monthly packages around $49.99. The avatar is Jonathan Roumie-esque, drawing on the modern Jesus aesthetic many viewers associate with The Chosen. It is trained on the King James Bible plus a body of sermons, speaks more than a hundred languages, remembers past conversations, and offers prayer and guidance on demand.
The founder, Jeff Tinsley, has said publicly that the tool is meant to complement faith, Scripture, and pastoral care rather than replace them. The company is clear that the avatar is not Jesus Himself and does not possess divine authority. Even with that disclaimer, the user experience carries a weight the legal language cannot fully contain. A product can call itself a companion while training users to experience it as intimate, authoritative, and spiritually available on demand. That is where the dangerous line in the sand begins to appear.
Moses, the Staff, and the Snake
I first heard Tomi Arayomi apply the Moses staff analogy to artificial intelligence, and I couldn’t unhear it. I’ve been seeing AI through that frame ever since. And it’s particularly relevant to the Jesus AI conversation.
When the Lord called Moses from the burning bush, He asked him what was in his hand. Moses said a staff. The Lord told him to throw it down, and when he did, the staff became a serpent, and Moses ran from it. Then the Lord told him to pick it up by the tail. That detail matters because it meant Moses had to reach for the end that gave him the least control. The serpent’s head was still free, and Moses had to obey before the thing in front of him looked safe again.
That same staff was later used to part the sea, bring water from the rock, and stand as a sign of intercession while Israel fought Amalek. Later, when Aaron’s staff swallowed the staffs of Pharaoh’s magicians, the point became even sharper: the counterfeit could imitate the sign for a moment, but it could not outrank the authority of the Lord.
That’s the point. Something dangerous can be transmuted under the authority of the Lord and used for His purposes. Even a tool crafted by the enemy can be turned into a weapon for God’s Kingdom when He puts it in the hands of His people. That does not make the tool clean in every context or safe in every use case. It means authority, obedience, and discernment determine whether the body of Christ handles the tool or gets formed by the spirit that formed it.
That’s why AI Jesus matters. It shows what happens when a powerful tool gets dressed up as spiritual authority and sold back to hungry people as access to Christ. The body of Christ needs to be able to name that corruption clearly and still have enough authority to pick up the tool for Kingdom work.
Why the Backlash Made Sense
The backlash came quickly, and a lot of it came from inside the church. The Catholic Register ran a feature calling attention to the “blasphemous temptations” of AI Jesus. Christianity Today hosted Russell Moore on the parallel cultural moment of Trump posting AI-generated images of himself as a Christ figure (and no, he wasn’t dressed up like a doctor). Pastors, theologians, and commentators lined up to call the product evil, blasphemous, idolatrous, or worse.
I understand the reaction, for real. Charging by the minute for a simulated Jesus to offer prayer and guidance should make believers feel a check in their spirit. But that check should drive discernment instead of reflexive rejection.
I am pro-AI. I teach AI literacy, use it daily, build with it, and study it obsessively because I believe artificial intelligence is the most powerful general-purpose tool the human race has produced since the printing press. The body of Christ needs spiritual authority, technical literacy, and enough discipline to handle it before cultural habits do the training for us.
The Western church has faced this pattern before with print, radio, television, the internet, and social media. Every major media shift arrived carrying the assumptions, appetites, and spirits of its builders. Believers learned to preach through those mediums, publish through them, organize through them, and reach people beyond the walls of a building.
The believers who refused each wave often forfeited influence over a generation. We cannot afford to forfeit this one because AI is already shaping how people search, learn, write, study, make decisions, and now, apparently, seek spiritual guidance. AI Jesus gives the church a clear warning sign: if believers do not steward this tool with spiritual authority and technical literacy, the culture will teach people how to use it without discernment.
Where the Boundary Starts
AI belongs in the productivity lane: sermon preparation, Scripture cross-referencing, original-language study, theological research, content drafting, summarization, journaling prompts, admin work, and writing scaffolding. Work that once took a researcher, assistant, or ministry team a week can now be organized in minutes when the person using the tool has enough discernment to check sources, test doctrine, and keep authority in the right place.
A pastor using AI to put bones on a sermon is stewarding time when the tool remains subordinate to prayer, Scripture, study, and the fear of the Lord. Hosea 4:6 cuts both directions. My people perish for lack of knowledge, and AI used well can make knowledge more accessible to more believers, faster, with better tools for tracing original languages and historical context than most laypeople have ever held in their hands.
That is stewardship with discipline.
The boundary appears when software moves from study support into spiritual direction. Hearing from the Holy Spirit, counsel for a marriage, guidance about what the Lord has assigned you to carry, confession, repentance, deliverance, correction, conviction, and accountability inside the body of Christ require prayer, Scripture, spiritual maturity, and real relationships. AI can sit nearby as a research aid, a study assistant, or a journaling tool while the shepherding work remains where the Lord placed it.
AI Jesus crosses the line because it takes a tool that can assist study and places it in the chair of spiritual counsel. Once a chatbot is offering prayer, guidance, memory, and a face that looks like the cultural imagination of Jesus, the user is no longer interacting with a simple study aid. The product is asking for a kind of trust it has no authority to carry.
The App Wants You to Stay
Most consumer AI products sit inside business models that benefit when users keep coming back. The app has a reason to feel helpful, warm, responsive, emotionally satisfying, and easy to return to. Even when the model is designed to give useful answers, the business wrapped around it still rewards repeat use, longer sessions, and ongoing attachment. That is the product model.
That design can support good work when the user is organizing research, summarizing notes, building a study guide, or drafting a content plan. In spiritual direction, the same engagement-driven design starts shaping trust it has no authority to hold, especially when the user is lonely, confused, new to Christ, or looking for permission.
That sucks because the person asking may be sincere. They may be ashamed, spiritually bruised, or afraid to ask a pastor what they really want to ask. A chatbot that feels gentle and available can become a substitute voice before the user realizes what is happening.
A system built to keep a person engaged is poorly suited for the kind of spiritual confrontation that says, “Go and sin no more,” because quoting the sentence is not the same thing as carrying the authority of the One who said it. Second Timothy 4 names this human tendency plainly: itching ears. People have always gathered around voices that tell them what they want to hear, and AI has given that ancient weakness an industrial delivery system.
A faithful pastor preaches repentance when the room gets tense. A real friend tells the truth when the relationship may cost them something. The Holy Spirit convicts in ways the flesh does not find comfortable, but a chatbot optimized for usefulness, emotional satisfaction, and continued engagement cannot carry the burden of spiritual authority.
The app can comfort someone and keep them talking. It cannot confront them, shepherd them, or answer for the Lord.
Follow the Pricing
ChatGPT, Claude, Gemini, sermon prep tools, language study apps, and theological research platforms generally run on flat subscriptions because they are selling utility. Just Like Me's $1.99-per-minute pricing implies something else. Anyone who has ever seen psychic hotline pricing knows exactly what that something is. It is predatory, and in this case it is spiritual access sold in timed increments. Gross and no thank you.
Yes, they have a disclaimer paragraph, but the reality is NOBODY reads the fine print. The product model trains users to associate access to Christ with dollars per minute. That lesson does not easily unlearn itself, even when the founder tells the press the product is only a “companion.”
The Assumptions Under the Build
There are spiritual assumptions embedded in much of the AI world that Christians should not ignore: transhumanist hope, post-human ambition, techno-salvation language, and a quiet belief that intelligence itself can become a form of transcendence.
Discernment requires us to see that clearly without collapsing into fear. The spirits behind builders and systems do not automatically make every tool unusable for kingdom work. The Egyptians made the gold, and the Israelites carried it out and built the tabernacle with it. Daniel was educated in the schools of Babylon and used Babylonian administration to preserve a remnant.
The body of Christ has a long pattern of plundering the systems of the world for kingdom purposes. AI can sit inside that lineage when believers know what they are touching, why they are touching it, and whose authority governs their use of it.
What We Do With AI Jesus
The line in the sand should be pretty clear. A product took productivity-tool architecture and marketed it into the spiritual direction lane. The church can name that line and still keep a firm grip on the broader tool.
Do not talk to the chatbot like it is the Holy Spirit. Do not pay $1.99 a minute for what prayer already gives freely through the blood of Jesus. Do not hand your discernment to a product model that rewards longer sessions when the Lord may be trying to confront you, correct you, or call you into repentance.
The staff still has to be picked up with submitted hands. In this particular hand, through this particular use case, under this particular category error, AI became a snake. Biblical authority, clear category boundaries, and work suited to the tool can produce different fruit.
A pastor using AI to draft sermon outlines, a discipleship coach building study guides, a creator translating biblical content into a hundred languages, and a Bible teacher digging into Hebrew and Greek faster than they ever could on their own are all showing us what it can look like to wield the staff instead of running from it.
AI Jesus shows us where the line falls, and that may be the most useful thing the product has done because it made the boundary visible. Believers who can see the line clearly can hold the rest of the tool without fear. The body of Christ needs submitted hands, disciplined minds, biblical literacy, spiritual authority, and enough courage to pick up powerful tools without bowing to the spirits that shaped them.
Barna and Gloo Are the Next Layer
AI Jesus made the product question obvious. Barna, a Christian research organization known for tracking faith and culture in America, and Gloo, a ministry technology company, recently released research showing something much bigger underneath it. Nearly one in three U.S. adults now say spiritual advice from AI is as trustworthy as advice from a pastor. Among Gen Z and Millennials, the number rises even higher. Yes. You read that right. Houston, we have a problem.
I am going to come back to that research in the next article because it deserves a full breakdown. For now, it tells us enough to take AI Jesus seriously as a warning sign. People are already bringing spiritual confusion, loneliness, conviction, guilt, and longing into the machine, alongside ordinary requests for Scripture summaries and sermon notes.
And the machine is answering.
That is why the body of Christ cannot afford lazy panic or lazy adoption. We need more than reaction. We need authority. We need biblical literacy. We need technical literacy. We need clean hands and submitted hearts.
Look at what is in your hand. The staff becomes useful when submitted hands pick it up under the authority of the Lord.