How To Turn Google Reviews Into Profit With George Swetlitz

Table of contents

About The Podcast

How To Turn Google Reviews Into Profit With George Swetlitz

Published: January 4, 2026

Host Bio

Matt Bertram is a Houston-based digital growth strategist and fractional CMO who leads digital strategy at EWR Digital, a consultancy founded in 1999. He works at the intersection of enterprise SEO, revenue operations, and AI-driven automation—helping organizations build durable “visibility + conversion” systems where trust and accuracy matter (especially in legal, industrial, energy, and enterprise B2B environments). As the host of The Best SEO Podcast, Matt translates complex shifts in search and AI discoverability into practical operating decisions, with a consistent focus on how buyers build confidence and choose vendors in high-stakes categories. Across advisory work, content, and team leadership, his through-line is building repeatable growth infrastructure: aligning marketing signals, intake workflows, and on-page/off-page credibility so customer experience and reputation are reflected clearly across the public web and AI systems.

Summary

Matt Bertram frames this conversation as urgent because he’s watching “LLM visibility” spill into every part of digital marketing—and, in his own work, reviews have become the bottleneck for scaling multi-location businesses. Early on, he shares that he’s “working on a strategy for a major law firm to spin up… a hundred something locations,” and that “generating reviews for each one of those locations has been quite difficult.” That lived constraint is why he brings George Swetlitz on: to get practical about review volume, review responses, and how AI can remove friction without making the brand feel generic.

Swetlitz sets the table with his operator background: he previously ran a “220-location roll-up of hearing aid clinics,” where the goal wasn’t just marketing activity—it was getting more people to “just call you because that’s the most cost-effective route.” That pushed the team deep into Google Business Profile performance and, specifically, reviews as the lever that quietly compounds visibility and conversion.

A central concept in the episode is leakage. Swetlitz describes multiple places revenue “leaks” out of the funnel when reviews aren’t treated as part of the growth system. One leakage point is simply not capturing enough reviews: “Anyone who doesn’t leave you a review that could is leakage,” because to Google, reviews function as a popularity signal. He explains that “a review to Google is a proxy for popularity”—more reviews implies more real-world demand, and the profile’s strength rises accordingly.

But the episode goes further than rankings. Swetlitz emphasizes that reviews are where decision-making happens in public, in real time. He calls review readers “the most bottom of the funnel people around,” because they aren’t browsing—“they’re making a decision.” If they read your reviews and go elsewhere, that is also leakage. Matt immediately validates this framing: “I think reviews are absolutely critical,” especially in scenarios like his law-firm rollout where a “blank” profile won’t convert.

Matt also connects this to the newer “answer-engine” behavior he’s seeing. He says, “people are asking AI what they think of your business,” describing a bottom-of-funnel pattern where a prospective buyer leans on an LLM or tool like Perplexity to summarize trust signals. In that world, reviews stop being a reputation nice-to-have and become an input to machine-mediated decision-making.

From there, the conversation moves into how AI can improve the review request process—the first major operational choke point. Swetlitz names the core failure plainly: “the biggest problem with review requests is getting people to actually write the review,” which he frames as a “request to review conversion rate.” His first lever is personalization: if the request is “more personalized, more emotional,” the recipient is more likely to follow through.

Matt’s response is grounded in SMB reality: even when clients are happy, they often don’t leave reviews. He notes that many agencies have “a lot of great clients that have never… left us a review,” and that the issue isn’t willingness so much as friction. He describes a practical constraint: “you gotta hold people’s hands sometimes because they don’t want to use the cognitive load to write the review.” The most important part of that line is the operator insight—people may be satisfied, but they won’t invest mental energy unless the process makes it easy.

Swetlitz then explains how RightResponse AI approaches personalization: pulling context from a CRM to create an emotionally specific ask. In a law-firm example, the AI can incorporate details like “the type of case,” whether it was “a negotiated settlement or a trial,” and even “the name of the paralegal that helped” so the request feels real and remembered, not mass-produced. Matt highlights why this matters in the messy real world: most businesses don’t maintain perfect CRM hygiene, but “LLMs can grab it in unstructured data… to pull it all together.” Swetlitz agrees: “it doesn’t have to be perfect… the LLM will do a good job of understanding the context.”
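The CRM-to-request flow described above can be sketched as a simple prompt builder. This is an illustrative assumption, not RightResponse AI's actual code: `build_request_prompt`, the field names, and the prompt wording are all hypothetical. The only load-bearing idea, taken from the episode, is that empty fields are tolerated and unstructured notes ride along for the LLM to interpret.

```python
# Hypothetical sketch: turning messy CRM context into a prompt for an LLM
# that drafts a personalized review request. Field names and prompt wording
# are illustrative assumptions, not RightResponse AI's actual API.

def build_request_prompt(business: str, crm_fields: dict, crm_notes: str) -> str:
    """Assemble an LLM prompt from whatever CRM context exists.

    Missing or empty fields are simply omitted -- per the episode,
    the data "doesn't have to be perfect"; the LLM infers context
    from the unstructured notes.
    """
    known = "\n".join(f"- {k}: {v}" for k, v in crm_fields.items() if v)
    return (
        f"Draft a short, warm review request on behalf of {business}.\n"
        "Reference specific, emotional details of the client's experience\n"
        "(case type, outcome, staff who helped) so it feels personal.\n"
        f"Structured CRM fields (may be incomplete):\n{known or '- none'}\n"
        f"Unstructured CRM notes:\n{crm_notes.strip() or '(none)'}\n"
        "Keep it under 120 words and end with a direct review-link ask."
    )

prompt = build_request_prompt(
    "Smith & Ortiz Law",                              # hypothetical firm
    {"case_type": "motorcycle accident", "outcome": "negotiated settlement",
     "paralegal": "Dana", "attorney": ""},            # empty fields are fine
    "client v happy, settled fast, Dana handled most paperwork",
)
print(prompt)
```

Sloppy shorthand like "client v happy" survives intact in the prompt; interpreting it is left to the model, which is exactly the division of labor Swetlitz describes.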

The discussion also covers tactics for reducing friction when someone doesn’t know what to write. Swetlitz suggests adding rotating “inspiration” prompts at the bottom of a request: “here are some things that people reading… reviews like to know about.” He even points out that location details can help discovery: asking what part of town someone lives in can be “really great for Google” when a review naturally includes that context.
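The rotation mechanic can be sketched in a few lines. The question pool and the `inspiration_for` helper are invented for illustration; the point from the episode is only that consecutive requests surface different prompts so reviewers never face a blank page.

```python
# Illustrative sketch of rotating "inspiration" prompts: each outgoing
# review request appends a different small set of questions. The question
# list is invented, loosely based on examples from the episode.

INSPIRATION = [
    "What type of case did we handle for you?",
    "What did you like most about the person who helped you?",
    "What part of town are you in?",
    "How did the outcome compare to what you expected?",
    "Would you recommend us to a friend, and why?",
]

def inspiration_for(request_number: int, per_request: int = 2) -> list[str]:
    """Rotate through the question pool so consecutive requests differ."""
    start = (request_number * per_request) % len(INSPIRATION)
    doubled = INSPIRATION + INSPIRATION   # lets a slice wrap past the end
    return doubled[start:start + per_request]

# Three consecutive requests get three different question pairs.
for n in range(3):
    print(n, inspiration_for(n))
```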

The second major operational area is responding to reviews—where Swetlitz argues most businesses underperform because responses are treated as a checkbox. He describes an “agentic flow” that evaluates incoming reviews, starting with legitimacy: “Is this a legit review?” If it’s spammy or irrelevant, it’s filtered out. The system also evaluates whether to use a reviewer name (some are initials, all caps, or business names) and whether a review was updated, which matters because platforms may only show the new version. If RightResponse AI has the earlier record, it can detect the delta and respond appropriately—“should we be happy? should we be sad?”
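The pipeline shape of that agentic flow can be sketched as a series of small, single-purpose checks. In the real system each step would presumably be an LLM call; the rule-based stand-ins below (`is_usable_name`, `rating_delta`, and the heuristics inside them) are assumptions for illustration only.

```python
# Toy sketch of the "agentic flow" described above: each incoming review
# passes through narrow, single-purpose checks before a response is
# drafted. Real systems would likely use LLM calls per step; these
# rule-based stand-ins just illustrate the pipeline shape.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Review:
    reviewer: str
    text: str
    rating: int

def is_usable_name(name: str) -> bool:
    """Skip initials, all-caps handles, and obvious business names."""
    if len(name.replace(".", "").replace(" ", "")) <= 3:
        return False          # e.g. "J.D."
    if name.isupper():
        return False          # e.g. an all-caps handle
    if any(t in name.lower() for t in ("llc", "inc", "corp")):
        return False          # looks like a company, not a person
    return True

def rating_delta(old: Optional[Review], new: Review) -> int:
    """If we stored the earlier version, respond to what changed."""
    return new.rating - old.rating if old else 0

r_old = Review("Jane Miller", "Slow intake process.", 3)
r_new = Review("Jane Miller", "They fixed it -- great follow-up!", 5)
print(is_usable_name("J.D."), is_usable_name("Jane Miller"))
print(rating_delta(r_old, r_new))   # +2: "should we be happy?" -- yes
```

A positive delta on an updated review answers Swetlitz's "should we be happy? should we be sad?" question before any reply is drafted.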

What Swetlitz calls the most differentiating piece is the “fact library.” On onboarding, they “go back and reread the last thousand reviews” to understand what customers actually talk about, then compare that against typical response behavior. His critique is blunt: “most people use templates or they use generic AI,” and those responses “don’t add anything to the review… they just parrot back the review.” In contrast, the fact library is used to produce responses that add helpful, business-specific information—the kind of detail a prospective customer can act on.

He gives a concrete example: for a natural food store, if a reviewer mentions loving the milk selection, a generic response would merely thank them; a more effective response would add specifics (“goat milk,” “raw cheeses,” and other items) pulled from the business’s materials. The point isn’t keyword stuffing—it’s making the response useful and informative for humans reading it later.

Matt reacts strongly to this “non-template” philosophy, recognizing it as both a credibility play and a conversion lever. Swetlitz explains why it works in two directions at once. For the original reviewer, the reply feels engaged—“they’re actually sharing something with me that I didn’t know.” For prospective customers scanning reviews, the pattern is unmistakable: one business looks indifferent (“thanks for a review”), while another looks operationally present and informative. And as Matt has been emphasizing, LLMs are also parsing these signals.

The episode also touches the broader tension of AI skepticism versus practical leverage. Matt recounts talking with a VP of Innovation who believed “there’s really no use cases for AI,” and he pushes back: “everywhere I look, if I put on my AI glasses, there are use cases.” The conversation repeatedly returns to the idea that AI wins when it removes micro-friction and improves relevance—not when it tries to sound vaguely “human.”

Swetlitz gives a specific implementation “pro tip” that mirrors this: “the narrower we make that AI prompt, the better it is.” If you ask one prompt to evaluate a review against 30 facts, quality degrades—“it always gets the first one right and always gets the last one wrong.” Matt connects this to his own work building agentic flows, where enterprise-grade systems often assign “almost every data point… its own agent.” His takeaway is straightforward: “breaking it down, making it very narrow… that’s a fantastic pro tip.”
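The fan-out pattern behind that pro tip can be sketched as follows. The `narrow_prompts` helper and the fact list are hypothetical; in practice each generated prompt would be dispatched to its own LLM call or agent rather than printed.

```python
# Sketch of the "narrow prompt" pro tip: rather than asking one prompt to
# judge a review against 30 facts at once (where quality degrades toward
# the end of the list), fan out one tiny, self-contained prompt per fact.
# The fact library below is invented for illustration.

FACTS = [
    "We carry goat milk.",
    "We stock raw cheeses.",
    "We have a bulk-foods aisle.",
]

def narrow_prompts(review_text: str, facts: list[str]) -> list[str]:
    """One self-contained question per fact -- never a 30-item list."""
    return [
        f"Review: {review_text!r}\n"
        f"Fact: {fact!r}\n"
        "Answer yes/no: would mentioning this fact add useful, "
        "relevant detail to a reply to this review?"
        for fact in facts
    ]

prompts = narrow_prompts("Love the milk selection here!", FACTS)
print(len(prompts))   # one narrow LLM call per fact, not one giant prompt
```

Each prompt carries its own copy of the review, so no single call has to keep 30 facts in working memory, which is the failure mode Swetlitz describes.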

Finally, Swetlitz shares a multi-location case study: a client with “around a hundred locations” originally planned to have regional managers handle review responses, but the workflow became so simple that a single person decided to run it. With “so few negative reviews,” positives could be handled automatically, and negatives required direct outreach anyway. The outcome was improved consistency, higher quality, and reduced internal effort—regional managers could “focus on improving the business” instead of writing replies.

The episode closes with two “unknown secrets” that summarize the thesis. First, “no one is closer to the bottom of the funnel than someone reading your reviews.” Second, “more people read your reviews than read your website,” which leads to a practical directive: “bring the website to the review”—put genuinely useful information where customers (and models) are actually paying attention.

Q&A

  1. I run a local business—should I respond to every review or only the negative ones?
    Responding matters because review readers are making a decision right then. Automation can handle many positive reviews, while negative reviews often require direct outreach and follow-up.
  2. What does Google get from reviews—why do they affect visibility?
    Reviews act as a proxy for popularity. More reviews signal that more real customers are choosing the business, which strengthens the business profile’s presence.
  3. What is review leakage and how do I spot it in my own operation?
    Leakage shows up when people who could leave reviews don’t, or when people read reviews and choose a competitor. It’s lost momentum at the profile level and lost conversion at the decision point.
  4. If my star rating is good, why should I care about the written responses?
    People read reviews at the bottom of the funnel, and responses can shape trust. Helpful responses that add useful detail can influence the reader’s decision.
  5. How do I avoid replies that sound like templates or generic AI?
    Build responses around business-specific facts so the reply adds something new, instead of parroting what the reviewer already wrote.
  6. What’s the fastest way to increase review volume if my customers are happy but busy?
    Reduce friction and increase personalization. Make the request emotional and specific, and give people simple prompts so they don’t have to start from a blank page.
  7. What kind of CRM data actually helps personalize review requests?
    Use contextual details that reflect the customer’s experience—such as the type of engagement, outcome details, and the names of staff who helped—so the request feels specific.
  8. My CRM is messy—do I need perfect data hygiene for personalization to work?
    Not necessarily. Unstructured information can still be used to understand context and craft a better, more specific request.
  9. Should I include photos in review requests?
    In some industries, adding a relevant photo can increase emotional connection and make the request more personal.
  10. What prompts can I give customers who don’t know what to write in a review?
    Provide rotating “inspiration” questions about what readers care about, such as what they liked most, what was handled, or location context.
  11. How can RightResponse AI help with filtering spammy or irrelevant reviews?
    It can evaluate whether an incoming review appears to be a legitimate review of the business and filter out content that doesn’t belong.
  12. What should I do when the reviewer name is weird—initials, all caps, or a company name?
    Decide whether using the name improves the response. If the name looks unreliable or awkward, skipping it can be better than forcing personalization.
  13. How should I handle an edited or updated review?
    If you can compare the old and new versions, respond to what changed—whether the update signals improvement, disappointment, or a new issue.
  14. How do I keep review replies compliant in sensitive industries like healthcare?
    Use a dedicated safeguard step that checks for personal health information and prevents it from appearing in a public response.
  15. I manage many locations and response quality varies by region—how do I standardize without adding headcount?
    Simplify the workflow so responses are consistent and high-quality, then automate routine positives and keep humans focused on exceptions and negatives.
  16. Why do reviews sometimes matter more than my website content?
    More people read reviews than read websites, and review readers are often closer to making a decision. That makes reviews a primary trust surface.
  17. What does it mean to bring the website to the review?
    Include genuinely useful, business-specific information in responses—details a customer would normally have to visit the site to learn.
  18. If I’m skeptical about AI, where should I start without boiling the ocean?
    Start with small, high-impact steps that are easy to run reliably, then expand as you build confidence and operational clarity.

Transcript

Announcer: 0:00
This is the Unknown Secrets of Internet Marketing, your insider guide to the strategies top marketers use to crush the competition. Ready to unlock your business's full potential? Let's get started.

Matt Bertram: 0:14
Howdy, welcome back to another fun-filled episode of the Unknown Secrets of Internet Marketing or the Best SEO Podcast. We're focused on LLM visibility. I'm really going to try to get that uh switched up after 12 years. We also will have a new intro coming in. We are on uh YouTube at Best SEO Podcast. So we would ask you to go check that out. Uh leave a review, a like, a comment, let us know. It's it's kind of quiet over there. Uh, but thank you so much wherever you're listening. Um, I have an exciting guest for you today. As I've been going down the rabbit hole of AI and LLM visibility, um, it is expanding into every area of digital marketing and beyond. And one of the things that I'm doing selfishly is I brought somebody on. Uh, I'm working on a strategy for a major law firm to spin up uh, you know, a hundred something locations across the United States. Uh, and generating reviews for each one of those locations has been quite difficult. And so uh I have an expert that also I just finished my Harvard uh, you know, business executive course, but I have someone here that has actually an MBA from Harvard. Also, I've been taking a lot of uh AI courses uh from Wharton, uh, and I have somebody here that has actually graduated from Penn. So so so I've done the certifications, but I have have somebody that that that's done the actual work and is uh focused on on AI. I've taken a bunch of certifications from IBM. He was a managing director at IBM, so there's some associations, but uh he's the real deal. So I want to bring on George Swetlitz with RightResponse AI. That's a uh AI review management platform. So George, welcome to the show.

George Swetlitz: 1:55
Yeah, no, that was a very interesting and, you know, unique introduction. So thank you, Matt. Happy to be here.

Matt Bertram: 2:02
Yeah, no, I um I I'm I'm excited to talk to you because there's there's a lot of kind of uh points that cross over. I would first say uh I'm in a uh Oxford program right now, uh for AI, uh, which uh sat what is it called? Uh stochastic algebra and uh gradient descent and all this stuff with AI is uh really having to dust off uh what I learned in school. So it it's really challenging me. Enjoyed the the Harvard program, uh got a better understanding of kind of the Harvard associations to uh the different companies out there and how papers are structured and who writes the different books for the uh Harvard Business Review. So it was very insightful. I can also tell you that um out of all the courses I've taken, the the Wharton classes for AI were just my favorite. I don't know. Uh and I I did have a buddy uh that went to Wharton and I was like, man, I should have gone to Wharton like that. Like I love all this stuff. So um I I thought that that was uh interesting. And um, you know, what you're doing right now makes a ton of sense. Uh, there was actually uh a number of tools that people are launching, uh, even like you know, uh like a LinkedIn kind of assessing um how people write on LinkedIn in their profile to understand maybe what DISC category they go in or how you should speak to them and like sentiment analysis we were talking in the pre-interview is huge. And and also if there's kind of like I I I if it's not on your roadmap right now and you're not currently doing it, I know you will, where you have agentic agents that are like figuring out, oh, hey, we need this kind of review, go get this kind of review, you upload your list. Like, there's just so much you can do with workflows today. But I would love to hear you kind of set the table for why you decided to launch RightResponse AI and kind of what is the the core um problem that it's solving.

George Swetlitz: 4:01
Yeah, no, great. So, you know, prior to doing RightResponse AI, I was the CEO of a 220-location roll-up of hearing aid clinics. And so, you know, when you're running a lot of locations, you can do paid search and you can do paid social to bring in customers, but ideally, what you want people to do is just come to you. You just want them to call you because that's the most cost-effective route. And so we spent a lot of time trying to figure out how do we get more people to just call us, and that led us to obviously the Google Business profile and your presence there. And so, as we dug into that, it was a really dynamic space. You know, you have you have leakage coming from the profile in the sense because you don't get you don't get enough people leaving you reviews. Anyone who doesn't leave you a review that could is leakage, and it's reducing the power of your profile on Google because a review to Google is a proxy for popularity. The more reviews you get, the more Google says, man, this is a really popular place. So review leakage is an element of that. Then you get a review, and sometimes they're negative. And so, what can you learn from that to do better so that your average rating's higher? And then the reality is that regardless of whether you're advertising or not advertising, people read reviews, they read reviews today more than they visit websites, which is fascinating. So when they come to your review, you know, you talk about terms like bottom of the funnel, they are the most bottom of the funnel people around. They're not reading your review for fun, they're reading your review because they're making a decision. And so if they read your reviews and go somewhere else, that's leakage. So we noticed in my old company that uh locations that we had that uh had great reputations had higher response rates from paid social and paid search advertising. Why? And we didn't quite get it then. But the reason is is because of the reputation. 
They come to your profile, they read your reviews, they're great reviews, and so they buy from you. So so after we exited from this company and ChatGPT came out in 2022, I sat back and said, you know, I think I think AI could really help us kind of deal with this problem at scale. And so I brought a team together and we built RightResponse AI, not as a review management system, but as a revenue enhancement system, right? To cut out the leakage. So back to you.

Matt Bertram: 7:15
I I love that. And and there's a couple points that uh I I need to be taking better notes as you're talking. So I might I might miss some of these. But the the first is I love the idea of uh leakage, right? Like um, you you should try to be mapping one-to-one. That's kind of what digital is, like your digital profile versus your public, who you are. You're trying to map those, and and that's what Google's trying to do is create kind of a twin, a digital twin across the internet. And so every patient should leave you a review. And typically the bad reviews, people leave are uh there's a higher likelihood that the data says to leave a bad review than good reviews. So you got to kind of pull out the the the good reviews from the people, but I like that idea of mapping it, like it gives you a pulse on on how you're doing. Uh, also reviews, uh, besides the name of the profile in GMB is the the number one uh kind of decision factor. And why is that? Going back to what you're saying, um, well, it's people trying to decide what they're gonna do. I'm even seeing this in the data right now. I'm doing uh a couple of different studies, and people are going to ChatGPT uh or or whatever they're using, uh, the Perplexity uh you know, search engine. Um, a lot of people are starting to use that as well. I think that that's why ChatGPT just came out with theirs. But essentially, um people are asking AI what they think of your business. Okay. And so people are doing that bottom of the funnel. Let me try to make a decision. And then you said something else is the ads that you're running uh are converting better than if you have good reviews. Well, why is that? Like we can't track attribution very well. I mean, Google gets a lot of last click attribution, like Google's working great, but it in reality, it's all the other things that people are doing in that customer journey to make that decision. But the most influential thing you can do is leave a review. 
And then the last thing that I would say, and I forget exactly what this term is called, but people are busy today and they don't have time to do all this deep research, which I think a lot of people are starting to lean on. Um, you know, the large language models to do that deep research for them, which makes a lot of sense. But people are using uh reviews as a proxy, uh, and I forget the exact term, but I the data was like something like that. You have to see at least seven to 12 reviews for it to be a uh a snapshot and uh enough that someone's gonna take action on it or believe it, like if you only have one or two reviews, and so um people are just taking reviews as a proxy for is this product good or not? And then you even see the summaries that are happening on Amazon or wherever, where they're summarizing the sentiment and the reviews, and they're giving people an even shorthand form of even looking at the reviews themselves, and so that's also what the AIs are doing. Like, is this a good company or not? So I I think reviews are absolutely critical. Finding a way to to pull out those reviews, to get those reviews, uh, and to get people to leave those reviews is a full-time job. And and uh it used to be on the different locations, and and also, you know, I've talked to a lot of the different review tools, which they try to wrap reviews into other tools and it kind of bloats the product. And and we've danced around on a couple different uh services uh from a review standpoint, and in this even project for this law firm that I'm working on, it's the number one issue. Like, if we're gonna spin these up, like you can't just have a blank review, like like that's not gonna convert anybody, that's not gonna be helpful. That might even hurt us if we have like zero reviews across you know all these different businesses. So I think reviews are absolutely critical. 
I would love to hear more about the logic or the thought process of how you injected AI into your review service.

George Swetlitz: 11:13
Yeah, no, absolutely. So let's like let's break it down. Let's go step by step and talk about each element. So let's start with the review request. So what what's the what's the biggest well? I won't ask you the question. I'll just, you know, put it the the biggest problem with review requests is getting people to actually write the review. So we talk about the request to review conversion rate, right? Because you know, you know, every especially if you're a law firm, like we we work with a number of personal injury law firms, and you know, you don't have that many customers. It's not like McDonald's. You you know, you every customer you have is important right in terms of getting a review. Yeah. So one of the things that we found that's really useful to drive that conversion is personalization. So if you can make that review request more personalized, more emotional, then that person has a high likelihood of writing a review. Right. So hey, so you know, how do you how do you react to that?

Matt Bertram: 12:29
Well, I know that digital marketing agencies, uh, unless you're like high volume, high churn, uh, we we have a a lot of great clients that have never, like never left us a review, right? And I and I put it on the account managers to like, hey, you need to get a review. Like you've done something that should like that's review worthy this month. Like, what is one thing that you've done for a client that you can show as a success factor and ask for a review? And not only that, there's there's kind of starting to be a proliferation of all these different review sites and people are checking out different stuff. And so reviews in general are just becoming a big issue, and and I think you just have to have a process, but typically it's like an email, right? And it's not like I'm even thinking, oh my gosh, like what if you could like ask a form of like you ask a question, they put a word in, they put a word in, they put a word in, and then it generates the review for them or like a couple different options, and then they can select one because you you gotta hold people's hands sometimes because they don't want to use the cognitive load to write the review. Like they might want to give you a review, but it's it's too much friction. That's that's what I've found.

unknown: 13:39
Like that.

George Swetlitz: 13:40
Yeah, yeah, no, that's great. That's actually in our development path. But what we're doing now is providing our clients with the ability to pull in from their CRM any information that that would be useful to making that request more emotional. So think about a law firm, the type of case, it was a motorcycle accident. Was it a negotiated settlement or a trial, a victory, you know, kind of a win at a trial? Um, what's the name of the paralegal that helped, the lawyer, right? So you can pull all that information in and AI can write this really nice request that incorporates all those elements.

Matt Bertram: 14:20
Yeah, I can see that. That that's fantastic. And it is unstructured data, because I I think the the the sticking point previously that I've found with salespeople or CRMs is uh if you don't have somebody constantly focused on keeping that uh the data hygiene clean, um, you know, if you were just grabbing fields, a lot of times they wouldn't be filled out or whatever. But LLMs can grab it in unstructured data uh to pull it all together. Yeah.

George Swetlitz: 14:48
That's right. So it doesn't have to be perfect, yeah. But the but the LLM will do a good job of understanding the context.

unknown: 14:55
Yeah.

George Swetlitz: 14:55
So that's the first point. The second point is in some industries, maybe not uh law firms, but for example, real estate agents, you can include a photo. Yeah, include the photo of the couple in front of the house that you took anyway. So include that into the request. And the third kind of very different point is, and you talked about that before a little bit. Sometimes people don't know what to write about. Yeah. So if you can put at the bottom of the request, hey, if you're looking for some inspiration, here are some things that people reading reviews like to know about. And then you can put at the bottom and it can switch up. You can have 10 questions, say, that rotate. What type of case did we handle for you? What was the thing that you liked the most about Joe, the attorney? What part of town do you live in? Because it's really great for Google if the review says, oh, I'm in South Houston or I'm over here, wherever you are, so it knows that you are in that area.

Matt Bertram: 16:04
And I and I can see that for different industries or even different areas, you can you can build like templates that they can load in that gives them suggestions, right? Or even agentic in the background, knowing some of this stuff with the the the background information on the company. Um, no, I I I think this is a great use case. I I was actually talking to this VP of innovation on this other project I was working on, and there was like a big article that came out. Now, now I saw the article of like, you know, now now layoffs are happening and AI's take like laying off 10,000 people or something. I think I saw that yesterday. But previous to that, there was an article out there going the AI hype's over, and there, you know, whatever. And I felt like this VP of Innovation read that same article as me because I felt like he was kind of speaking the talking points, and he was like, Yeah, there's really no use cases for AI. And I was just like, uh everywhere I look, if I put on my AI glasses, there are use cases, and um, so I think like we need to get into the language of speaking AI, and then and and we went through like four or five points almost immediately. And this is a publicly traded company, yeah. And he's like, Yeah, that that that's a good idea. That's a good idea, that's a good idea. And and and I would I would say that this is firmly in that category of uh, you know, all the friction, I just call it friction, of putting a review together. Um, AI can help you solve that, can help you reach out, can help you get it. And and reviews are probably for for digital marketing. I would say, yeah, uh, and we need to highlight this more. I wish I had a data point on it, but it's probably one of the most, if not the most important thing that you should be focused on generating for your business is making sure the leakage uh that every single review uh that's possible to get, you should get it. 
Like you should like I have customers that have been customers for a long time that honestly, like I have it on my list. Like, we need to get a review from them, but we haven't done it yet, or we, you know, there's not a proper way to ask for it, or like it just hasn't been a focus. And and sometimes clients end in their contract, like we finished whatever project, we built that website for them. And I was like, hey, get a review. Yeah, we're gonna get a review. My team, like, we're gonna get a review, and then they don't get a review, and so I can think of I would say probably more than half of our clients uh and past clients haven't left a review, and the number's actually probably even higher than that. So I I I and and we don't have that many clients, like we need reviews, right? Like it, like you know, we're not doing the high volume, like bring them in, and you got you know 10 to 40 uh patients a day that you get reviews off of. Like we get like you know, one to two new clients a month, you know, and I mean, so uh yeah, so I I think they're critical.

George Swetlitz: 18:56
Yeah, they're critical. Everyone's critical. So what we're planning in the future on the development path is that if we follow up a second and a third time, maybe the third time we say, "Look, we understand you're busy. Just type a couple of words, some phrases. Don't worry about perfect grammar or perfect language. Just tell us simple things and we'll draft a review that you can edit." And then they can take that. So if you give them an emotional reason, if you suggest things to talk about, and then later even help them draft it, the conversion of request to review is going to go up. And that's the first leakage point.
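The "type a few phrases and we'll draft it" step George describes could be sketched roughly like this. Everything here is illustrative: `call_llm` is a placeholder for whatever LLM client you use, and the prompt wording and word limit are assumptions, not RightResponse AI's actual implementation.

```python
def build_draft_prompt(phrases, business_name):
    """Assemble a narrow prompt from the customer's own rough phrases."""
    bullet_list = "\n".join(f"- {p.strip()}" for p in phrases if p.strip())
    return (
        f"Draft a short, first-person Google review of {business_name} "
        f"using ONLY these points from the customer:\n{bullet_list}\n"
        "Do not invent details. Keep it under 60 words."
    )

def draft_review(phrases, business_name, call_llm):
    """Return a draft the customer can edit and approve before posting."""
    prompt = build_draft_prompt(phrases, business_name)
    return call_llm(prompt)
```

The key design point is the constraint in the prompt: the draft must be built only from the customer's own phrases, so the review stays authentic and the customer only has to edit, not write.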

Matt Bertram: 19:44
So I love the emotional trigger standpoint and the sentiment, and then pulling in all the relevant data from the CRM, hooking into that. I feel like when I talk to most businesses, even medium-sized businesses, and I work with a lot of medium-sized businesses, they are not effective with their CRMs. And the CRM system is so critical, whether for sales or client management, depending on how they're using it. You need to bolt everything into that. That's your hub and spoke, right? And I don't think I've talked to one review company that does this, and maybe they do, but I haven't heard it. That's critical: plug into that data, pull it out, and use it to help craft the reviews. I love that.

George Swetlitz: 20:32
Yeah, yeah. The integration part is complicated. And I make the point to companies like the ones you mentioned, low to medium volume, that this is so important it's worth someone spending the time, even if you do it in a Google Sheet; it's not that hard to do. If you can double or triple the number of reviews you're getting by spending an extra 30 minutes a month putting these important variables into a Google Sheet, it's worth it. Some of our larger clients spend the time to add fields in their CRMs that integrate into our systems, but that's tougher for small and medium businesses.

Matt Bertram: 21:20
So I would love to geek out for a second and go into what's going on under the hood, where AI plugs into this process in the workflow. I would love it if you would share whatever you're willing to share about that.

George Swetlitz: 21:36
Yeah, I think the more interesting side of the actual workings is on the response side, the response to the review. So let's fast forward to that, and we can talk about what's going on under the covers. When a review comes in, we bring the review in, and then we have an agentic flow. We look at a lot of things. Is this a legit review? There are a lot of elements to the term "legit," but first step: does this seem like it's a review of this business? Because a lot of times people just type stupid stuff and it's not really a review.

Matt Bertram: 22:23
I see that on Facebook right now. Facebook reviews are not even helpful anymore; it's just a bunch of spam.

George Swetlitz: 22:30
Right.

Matt Bertram: 22:31
Yeah, right.

George Swetlitz: 22:32
And that comes up sometimes on Google or other platforms. If that happens, we filter it out. Then we look at the name: we have an agent that looks at the name and decides what name should be used in the response.

Matt Bertram: 22:51
Uh-huh.

George Swetlitz: 22:52
Right? Because sometimes it's a company name. Sometimes it's just initials. It might be all uppercase. The agent looks at it and says, "I'm confident we should use this name," or, "I don't think we should use a name at all; let's just skip it."

Matt Bertram: 23:10
Okay.

George Swetlitz: 23:10
We look to see whether the review was updated. Is this an updated review? And if so, do we have a record of the earlier review? Because on Google, you don't get that; you just get the new review. If we've been working with a client, we actually know what the update is, and we can look at what changed. Should we be happy? Should we be sad? So we do this whole series of things. And then the part that's really the most interesting, I think, and what makes our responses the best there are, is that we create a fact library for the client. When the client onboards, we go back and reread the last thousand reviews, and we look at the kinds of things people talk about. Then we look at the responses and ask: is there anything in the responses to those reviews that has a marketing element to it? Most of the time the answer is no, because most people use templates or generic AI. The responses don't add anything to the review; they just parrot it back.
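The agentic flow George walks through, a legitimacy check, a name decision, and fact matching, can be sketched as a chain of small functions. This is a minimal illustration under stated assumptions: the word-count legitimacy test and the keyword-based fact matcher are simplistic stand-ins for the LLM agents he describes, and all names are hypothetical.

```python
def is_legit(review_text):
    """Step 1: filter out noise that isn't really a review of the business."""
    return len(review_text.split()) >= 3  # stand-in for an LLM legitimacy check

def usable_name(reviewer_name):
    """Step 2: decide whether to greet by name (skip initials, ALL CAPS, etc.)."""
    name = reviewer_name.strip()
    if len(name) < 3 or name.isupper():
        return None
    return name.split()[0]

def matching_facts(review_text, fact_library):
    """Step 3: pull the facts relevant to what the reviewer mentioned."""
    text = review_text.lower()
    return [fact for trigger, fact in fact_library.items() if trigger in text]

def draft_response(review_text, reviewer_name, fact_library):
    """Chain the steps into one draft response (or None for spam routing)."""
    if not is_legit(review_text):
        return None
    name = usable_name(reviewer_name)
    greeting = f"Thanks, {name}!" if name else "Thank you!"
    facts = matching_facts(review_text, fact_library)
    extras = (" Next time, " + " ".join(facts)) if facts else ""
    return greeting + " We're glad you enjoyed your visit." + extras
```

The structure is the point: each decision lives in its own small function, so each one can later be swapped for a dedicated LLM agent without touching the rest of the chain.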

Matt Bertram: 24:31
Wow. So you're helping tweak the reviews to increase the conversion rate. I mean, we try to mention those things to clients, like "these are the keywords," or "mention the service," but helping tweak the review so it speaks to different emotional elements, or even marketing elements, I like that. And I like that idea of going back and looking at all the reviews, which I actually haven't done; that's a brand new idea, so thank you. We build brand guides for clients, we look at all the information and build questions people would ask, but understanding the review architecture of what they have, I think, is absolutely critical. I like that.

George Swetlitz: 25:21
Yeah. So this point we're talking about is why we have the customers we have, because we do a better job of responding to reviews than anybody else. For example, I was just doing a demo the other day and I used a natural food store. I just scraped their website and put it into our system, and we developed a whole set of facts. If somebody writes in and says, "I just love the selection of natural milk products," we had a fact that said: if somebody talks about the natural milk products, because that's something a lot of people talk about, mention that we carry goat milk and raw cheeses and all of these products the store carries. We pulled that off the website. A generic AI responder would just say, "Oh, we're glad you loved our selection of natural milks." Our responder says, "We're glad you loved our selection of milks. But next time you come in, make sure you look for our goat milk, our raw cheeses," all of these products that they have.

Matt Bertram: 26:38
So, George, you're putting that together, you're scraping that on the sales side, right? And then you're saying, "Hey, here's information about you that you didn't even know." And then you're also saying our review tool will help enhance that so more clients will talk about it, and we'll increase your conversion rates by doing it. "We have the answer to this through this software, if you sign up." That's a very strong sales pitch.

George Swetlitz: 27:07
Yeah, it's really cool. And why is that important? It's important for a lot of reasons. For that customer, they're gonna get a notification of your response, they're gonna read it, and they're gonna say, "Wow, they're actually sharing something with me that I didn't know." Now a prospective customer is reading these reviews, and in their brain they're noticing that these responses are good. They're actually helpful. "I'm learning something from reading these responses." So if I'm trying to decide where I'm gonna go, and I look at one company and it's just "thanks for your review, thanks for your review," and this one says, "Hey, thanks, that's great, but we have this and we have that," you think, these guys care. They're more engaged in their business; they care about the reviews and how they engage with them. And to your point earlier, LLMs are reading these things. They're reading the reviews and they're reading the responses. If you go into Perplexity and ask about the reviews, and then you say, "Well, what about the responses?" it'll actually tell you there's really nothing useful in the responses, they don't really engage. The whole world's changing. So what we're doing is trying to reduce the leakage at that point of engagement. We want someone to come look at those reviews and then decide to buy. That's the goal.

Matt Bertram: 28:56
So, George, one of the things that's starting to happen on this podcast: we started using AI a lot, and then I realized I couldn't speak about AI because I didn't know it well enough yet. So I went on this flurry of learning everything I could about it. I'm constantly doing certifications, going to conferences, learning everything. And anybody that's listening, if you have an interesting AI-first company, reach out to me, because I interviewed a driver's ed company, a legit AI-first driver's ed company, and based on their projections they're gonna be the largest such company in North America in the next three years. And why is that? I'm stepping out of the conversation for a second to talk to everybody on the side: something like 71 or 72 percent of businesses still operate like they're in the industrial age. They're just now moving into the digital transformation age, and now we're moving into the AI age. Businesses that are using agentic flows, which are like employees, are building in not just basic automation but decision making, and enhancing what they're doing. That's why, in the stock market, you've got these companies that are just taking off: they're becoming AI-first companies. Companies need to either build themselves to become AI-first or use tools and companies that are AI-first to get that kind of leverage. And I just want to say, George, in some of the interviews I've had with business owners from AI-first companies, when you wear the AI glasses, it just makes so much sense. So this is really exciting to see what you're doing.
And I already know you're doing great things, but everybody, this was off an inbound pitch. I didn't know George; he's not part of my network, but I love everything you're doing, George. So I just wanted to insert that. What else, when you talk about this or you're talking to customers, do you think would be useful to add to this conversation? Or are there any case studies that you think are really impactful that you'd like to share?

George Swetlitz: 31:35
Yeah, so when we started talking about the response stuff, that was a little bit of a digression from the agentic flow, so let me go back to that for a minute. When that review comes in, we look at the review, we look at all those facts the business has, and we determine which ones are relevant. Then, when we write the response, we incorporate all of that. It's exactly what you're saying. One of the things we've learned, and this is kind of an AI pro tip, is that the narrower we make the AI prompt, the better it is. To make that real: if somebody has 30 facts and we have a single prompt that says "evaluate this review for all 30 facts," it always gets the first one right and always gets the last one wrong. The quality just goes down the more you ask it to do in a single go.
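George's narrow-prompt pro tip can be sketched as a fan-out: instead of one prompt asking the model to evaluate all 30 facts, ask one tiny question per fact and collect the answers. This is a hedged illustration, `call_llm` is a placeholder for any LLM client, and the prompt wording is an assumption.

```python
def evaluate_fact(review_text, fact, call_llm):
    """One narrow question per call: is this single fact relevant?"""
    prompt = (
        f"Review: {review_text}\n"
        f"Fact: {fact}\n"
        "Answer YES or NO: is this fact relevant to the review?"
    )
    return call_llm(prompt).strip().upper().startswith("YES")

def relevant_facts(review_text, facts, call_llm):
    # One call per fact keeps each prompt focused, so answer quality stays
    # flat whether there are 3 facts or 30 (at the cost of more calls).
    return [f for f in facts if evaluate_fact(review_text, f, call_llm)]
```

The trade-off is latency and token cost for reliability: each call does one job, which matches the "almost every data point has its own agent" pattern Matt describes next.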

Matt Bertram: 32:35
So, to make this tangible for people that are not deep in AI right now: I'm building some agentic flows for content creation, for example. Previously, our workflow and our prompting used larger chunks of information for the model to process. Then I got onto an enterprise-style tool, with their workbooks of how they're building it, and I was like, oh my gosh, almost every data point has its own agent. The question each one is asking is very narrow. Where I would do one prompt for all five outputs, the enterprise agentic flow has five different agents, one for each, and all each one is evaluating, all it's focused on, is that one field or data point, like a meta description or a title. And you're right, the longer you go, sometimes they'll get confused and mix things up. So yeah, breaking it down, making it very narrow. I think that's a fantastic pro tip.

unknown: 33:47
Right.

George Swetlitz: 33:48
Yeah, yeah. So if it's a medical business, for example, we have a separate agent that looks for PHI, right?

Matt Bertram: 33:59
PHI, sorry, what's PHI?

George Swetlitz: 34:01
Protected health information. Right. You don't want to include anything in there that would be a violation of HIPAA and things like that. So we identify those things in a separate agent; it just sits on the side, and then later it goes in and makes sure none of that makes it into the response.
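A side-channel PHI gate like the one George describes could look something like this. The regex patterns below are deliberately simplistic stand-ins for a real PHI classifier (a production system would use a dedicated detection agent or service); the function names are hypothetical.

```python
import re

# Toy patterns for illustration only; real PHI detection is far broader.
PHI_PATTERNS = [
    re.compile(r"\b(diagnos\w+|prescri\w+|surgery|hearing loss)\b", re.I),
    re.compile(r"\b\d{1,3}\s*(mg|ml)\b", re.I),  # dosage mentions
]

def flag_phi(text):
    """Return the spans a human or redaction agent should review."""
    return [m.group(0) for pat in PHI_PATTERNS for m in pat.finditer(text)]

def safe_response(draft):
    """Gate: block any draft response that still contains flagged PHI."""
    return draft if not flag_phi(draft) else None
```

Keeping the check in a separate step, rather than asking the response-writing prompt to also police itself, mirrors the narrow-prompt principle: one agent writes, another independently verifies.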

Matt Bertram: 34:19
George, what's that called? What's the scientific term for narrowing the flow like that? What would you call it?

George Swetlitz: 34:30
You know, I don't really know.

Matt Bertram: 34:32
Okay, I was just curious.

George Swetlitz: 34:33
Yeah, I don't know if there's a technical term. Anyway, I would say what everyone needs to think about is, regardless of what business you're in, what are the micro steps you can take to leverage AI? We do one piece of what companies have; we're just focused on this one little element, but there are lots of elements. And my advice to business managers, leaders, and owners is: don't start with the big things. Start with the small things that have a higher likelihood of running, and running effectively. Start that way and build your way forward as you gain more expertise with those systems.

Matt Bertram: 35:32
Awesome. Give us a case study before we go.

George Swetlitz: 35:36
Yeah, absolutely. We have a client, and this is a great example because they have around a hundred locations with regional managers. They were looking for a tool that would do a better job of helping their regional managers respond to reviews, because that's how they'd been doing it. Of course, the managers all responded with varying degrees of sophistication, so there was a lot of variability across the regions. They came to us, we generated these facts, we gave them a trial, and then we put it into practice; they signed up. And I realized, when I was looking at the account, that they hadn't invited the regional managers in. So I called up the guy I was working with and said, "You know, I thought we were gonna include the regional managers." And he said, "Well, you've made it so simple, and the responses are so good, that we decided I'm just gonna do it myself. There are so few negative reviews, and for the good reviews, we generate and reply automatically, so that entire process is automated. For the negative ones, I need to reach out to the customer anyway, so I'm just doing it all myself." When you think about that, the quality went up and the amount of resources went down pretty dramatically. Those regional managers can now focus on improving and running the business, not worrying about this stuff. And this guy, who handles the marketing-related work, is so happy because he can now do that job in a fraction of the time at a much higher level of quality. That was a very recent proof point to us that we're on the right track.

Matt Bertram: 37:42
I love it. All right, you've already shared some great tips, but here's one of the things we ask, since we're trying to make some shorts and get into the YouTube game, even though we're a podcast; the definition of podcast has changed, everything's changing. What are some unknown or underutilized secrets of internet marketing? There are probably things related to reviews that people haven't thought about or should consider.

George Swetlitz: 38:10
So I think there are a couple of things people don't think about. I said it a little bit before, but no one is closer to the bottom of the funnel than someone reading your reviews. That, to me, is something everyone needs to think about all the time. Where are my prospective customers? They're reading your reviews. So I'd say that's one secret. Another unknown secret is that more people read your reviews than read your website. Businesses will spend a tremendous amount of time on their websites, but the secret is you've got to bring the website to the review, because that's where people are engaging. You can sit there and ignore the fact that everyone's reading your reviews, or you can bring your website to the review. I'd say that's the second secret.

Matt Bertram: 39:17
I love it. I love it. So, George, how do people follow you, hear your thoughts, get in touch with you, and find out more about RightResponse AI?

George Swetlitz: 39:27
So, we have rightresponseai.com. And if you go to rightresponseai.com/bestseo, that's your podcast, Best SEO, there are two things in there. One is the ability to set up a call with me directly, and the other is a coupon code you can use to get 3,000 free credits if you upgrade to a paid account with us. So just a couple of things for all of your listeners.

Matt Bertram: 40:06
Awesome. Well, thank you, everyone that's listening. Go check it out. I think George made a very strong case for why you need to focus on reviews. And thanks so much for coming on the show, George. Everyone, if you need help with these strategies, if you have a problem and you're looking for an outcome, reach out to us at EWR Digital. They're the sponsor of our show that keeps the podcast going. I've recently launched MatthewBertram.com. Also, from an entity standpoint, since we're talking about AI, George, you'll appreciate this: I've published books, I've been doing this for a long time, and I used to go by Matt Bertram. My name is actually Matthew Bertram, and I have MatthewBertram.com. The LLMs and the search engines thought I was two different people, so I'm trying to unify and merge those two profiles. So check it out, guys, and let me know. Until next time, my name is Matt Bertram. This is the Best SEO Podcast. Bye-bye for now.