AI or Search Engines? Choosing Between Google and LLMs: A Catholic Tech Table Episode
Welcome back to the second episode of Petrus's Catholic Tech Table podcast! Listen as Andrew explains the differences between Google search and large language models (ChatGPT, Gemini, Claude, Grok) for Catholic Church and nonprofit fundraising work.
Show Notes:
Andrew frames Google as a “librarian” for finding current, source-based information (news, rules, reviews, verifiable facts) and LLMs as “conversation/research partners” for brainstorming, drafting content, and simplifying complex topics. He also outlines key LLM limitations, including training cutoff dates, hallucinations/confident inaccuracies, and lack of knowledge about an organization’s specific documents unless provided. Andrew closes the episode with three practical challenges.
EPISODE TRANSCRIPT
Howdy, and welcome to the Catholic Tech Table. I am Andrew Robinson, president and owner of Petrus Development and your host for this episode of the Catholic Tech Table. On this show, we talk about technology — particularly as it relates to use within the Catholic Church, and even deeper into the more specific niche of how it works within Catholic fundraising and Catholic nonprofits.
On today's episode, we are going to be talking about something that is probably pretty basic. Well, it is pretty basic, but if you're new to AI, this might be helpful to you. We're going to be talking about the differences between Google and a large language model, or an LLM, such as ChatGPT, Gemini, Claude, and Grok.
I'll be talking about what the differences are between them and how you, as a fundraiser or as somebody working in ministry, can make use of each tool depending on what your needs are. So we're going to dive in and start talking about this right now, and I'll be sharing my screen to show you some examples as I work through them.
The simplest way I can suggest you think about Google versus an LLM is: Google is a librarian, and an LLM — like ChatGPT or Claude — is a conversation partner. Google is good for finding sources and current information. If you need to search the internet and find websites or resources, Google is definitely the place to go for that.
It's kind of like a librarian, right? You can go ask him or her what book you should read on a topic, and they're going to point you to those resources — but then it's up to you to read through them and find the information. That's how Google works. It points you to the right information, shows you the source, so you know it's accurate, and then it's up to you to go through those websites and read them.
An LLM is like a conversation partner. It's a research partner. It's someone who can help you brainstorm, help you create content, and help you synthesize information — but it's not going to give you sources unless you ask for them, and it often won't search the internet for current information unless you specifically ask it to.
So, in the library analogy: you go to the library, ask the librarian what sources you should consult, they send you to the books, you read them. An LLM is like asking your research partner, "Hey, put together all the information you know on this topic." It will synthesize the information it was trained on, put it together, and give it to you. There are pros and cons to both Google and LLMs, but that's basically the difference — a librarian versus a research partner.
So now you're probably thinking: where should I use Google, and where should I use an LLM?
Use Google when you're looking for current information, when you're looking for specific information about your organization or another organization you're researching, to verify facts, or when you're looking for product reviews. There are a lot of uses for Google within fundraising, and those are just some of them.
Some examples of things you might type into Google:
- "What did the Bishop say about stewardship in his last pastoral letter?"
- "What are the current rules on charitable gift annuities at the end of 2025?"
- "What are some CRM companies and what are the reviews on them?"
That's going out into the internet, finding information, and presenting it where you can search through it. It's current and it can be accurate — you can check those sources because you're going directly to them.
An LLM would be used for things like: brainstorming ideas, drafting a letter to donors, or explaining complex topics in easy-to-understand ways. Some things you might put into ChatGPT or Claude:
- "I've got a thank-you letter that needs to go to a donor. Can you draft a first copy?"
- "Help me explain planned giving."
- "Help me explain charitable gift annuities in a way that someone with my background would understand, so I can explain it to others."
Again, it's about starting with broad information. The LLM will assemble and synthesize the information in the way you've asked for it, and give it back to you.
So, to summarize: use Google when you're looking for source information, product reviews, or current news and announcements. Use an LLM like ChatGPT or Claude when you need help drafting content, brainstorming ideas, or explaining something in an understandable way.
One way you can use both together: Google information about charitable gift annuities, take that information from the websites you found, plug it into ChatGPT or Claude, and say, "Now help me explain this to my donors in a way they'll understand." That's how you can combine both tools with a single use case.
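The combined workflow described above (find source material with Google, then hand that material to an LLM along with your instruction) can be sketched in a few lines of Python. This is purely an illustration, not anything from the episode: the `build_donor_prompt` helper and the sample text are hypothetical, and in practice you would paste in the real text you found through Google.

```python
def build_donor_prompt(source_text: str, task: str) -> str:
    """Combine material found via Google with an instruction for an LLM.

    The LLM has no knowledge of your documents or web findings unless you
    paste them in, so the prompt carries both the source text and the task.
    """
    return (
        "Use ONLY the source material below to complete the task.\n"
        "If the material doesn't cover something, say so instead of guessing.\n\n"
        f"SOURCE MATERIAL:\n{source_text}\n\n"
        f"TASK: {task}"
    )

# Example: text copied from a website you found via Google (placeholder
# text here), plus the instruction from the episode.
rules = "A charitable gift annuity is a contract between a donor and a charity..."
prompt = build_donor_prompt(
    rules,
    "Explain this to my donors in a way they'll understand.",
)
print(prompt)
```

The resulting string is what you would paste into ChatGPT, Claude, or Gemini; asking the model to stick to the supplied material also reduces the risk of the hallucinations discussed later in the episode.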
So, I've talked about the difference between Google and LLMs. What are some of the downsides to large language models? What can ChatGPT and Claude not do?
One thing you may not know is that LLMs are trained on a dataset that has a cutoff date. So if you ask ChatGPT who won the local election last month, it's not going to know, because that event wasn't in the data it was trained on. For today's news and current information, you definitely want to use Google or another search engine — not an LLM.
Another weakness of LLMs is that they can be wrong, and when they're wrong, they're often very confident about it. This is called a "hallucination" — where the model either makes up information or provides inaccurate information, and you don't know it because it presents the answer confidently.
This actually happened to me once. I was researching for a presentation and wanted some statistics on call pickup rates. I asked ChatGPT, and it gave me a statistic and cited a Harvard Business Review article from September 2022. I sent it on to another person on the team, and they came back and said they couldn't find the information. So I went back to ChatGPT and asked for the source. It gave me the name and date of the article. I passed that along. The colleague came back and said he still couldn't find it. So I went back to ChatGPT again and asked for the actual link to the website. It came back and said, "Sorry, that website actually doesn't exist."
I was like — what the heck? I said, "Chat, why did you give me that information if it's not accurate?" It said, "I'm sorry. Sometimes I don't know all the information, but I want to be helpful. Here are some other studies that are similar in scope." I was like, that's not helpful, ChatGPT! But it had been very confident in the original answer — it even told me where to find it, and then it turned out the source didn't exist.
So you have to be very careful with LLMs. If you're asking for stats and data, always ask it to provide the citation so you can go find and confirm it.
The other weakness of an LLM is that it doesn't have information about your specific organization or anything it hasn't been trained on. For example, let's say you work for a parish and you're trying to put together a report on end-of-year giving, and you want to draw from letters or homilies your pastor has written. The LLM wasn't trained on that information, so it doesn't have it to give you. In that situation, you do need to go to Google (or another source), find the information, and then provide it to ChatGPT, Claude, or Gemini and say, "Use this to help me draft a letter." But without doing that step, it simply doesn't know.
So, LLMs are helpful research partners, but they're not oracles. They don't know everything, and they can't tell you all the information you want just because you're asking.
I've got three challenges for you to put this information to use back in your office or at home.
Challenge number one: Google something you want to know. Something current — like "What did Pope Leo say about AI in his last presentation?" That's something an LLM won't know because of its training cutoff. So Google it.
Challenge number two: Go to one of the LLMs — ChatGPT, Claude, or Gemini. They all have free tiers, and you're very unlikely to hit usage limits with this challenge. Go in and ask it to do a task for you. For example: "Draft me a thank-you letter for an end-of-year appeal. Here's the relevant information." Use it to do something for you.
Challenge number three: Combine both tools. My earlier example works well here: Google the tax rules or donation deadlines, find that information, and then plug it into ChatGPT or Claude and say, "Here's the information. Explain this to me in a way that I can explain it to my donors." Using both together is your third challenge.
I'd love to hear how it goes. Send me an email or leave a comment. If you do this challenge and want to share what worked best or what you thought of it, I'd love to hear: [email protected].
So that's it for today's episode of the Catholic Tech Table. Thank you for joining me. Remember: Google is your librarian; LLMs are your conversation and research partners. Use them accordingly and find ways to make the best use of each, depending on your situation. There is application for both in your day and in your work.
Next episode, we're going to be talking about how these LLMs are trained. I touched on that a bit today when discussing cutoff dates, but we'll do a deep dive into how LLMs are trained so you can understand their strengths and weaknesses even more clearly.
Until then, keep working on these challenges, keep coming back, ask us your questions, and keep diving in and learning more about how you can use these tools and resources for the building of the kingdom.
Thanks — I'm Andrew. We'll see you next time.