Guy Powell is the President and Founding Partner of ProRelevant Marketing Solutions, where he focuses on helping marketing leaders connect activity to measurable business outcomes. His work emphasizes customer-centric strategy supported by analytics—reducing guesswork and helping teams make clearer decisions about where to invest, what to prioritize, and how to improve performance over time. Powell is also known for training and presenting practical measurement methods to marketers around the world, translating complex ROI and effectiveness concepts into tools teams can actually use. He has authored books including Marketing Calculator: Measuring and Managing Your Return on Marketing Investment and ROI of Social Media, both centered on bringing rigor to modern marketing and demonstrating impact. With an MBA from the University of Chicago and an engineering background from Lehigh University, he blends quantitative discipline with real-world marketing leadership—an approach that maps naturally to customer experience and reputation work, where credibility, consistency, and customer trust must be earned and proven.
Guy Powell frames this conversation as a results problem first, not a “cool marketing tech” problem—because, as he puts it, marketing ultimately answers to outcomes: “it’s results … regardless of what we want to do in marketing.” That lens is what makes the discussion about online reviews—and how AI can support them—feel practical rather than theoretical.
Early on, Powell introduces George Swetlitz and sets the stage: Swetlitz built RightResponse AI after living the review-management pain firsthand. Swetlitz explains that when he was CEO of Alpaca Audiology—a business with “220 clinics”—they faced a very operational challenge: reviews were constantly coming in, they needed to analyze sentiment, and they needed to respond, all while trying to “grow organically through map … getting higher and higher on the map.” He describes the pre-LLM tool landscape as both expensive and insufficient for what they needed in 2021, and that gap became the motivation for building something new once early versions of ChatGPT appeared. In his words, it became “essentially the company that we wish we could have worked with.”
From there, the conversation moves into Powell’s key concern: how do you scale personalization with AI without taking unacceptable risks? Powell names the problem directly—“personalize at scale”—and asks about approvals and the danger of removing humans from the loop. Swetlitz’s answer is a practical risk segmentation: where AI gets “riskiest” is with “negative reviews like 1 2 and 3 star reviews that have comments.” For those, Swetlitz says they recommend not using full auto-response; instead, AI should draft and a human should review. The core value is workload compression: AI can handle the bulk of low-risk work so the team can focus on the small set of reviews that truly require human judgment and customer recovery.
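The routing rule Swetlitz describes is simple enough to sketch. This is an illustration of the risk split only — the function name and the exact rule are ours, not RightResponse AI's code:

```python
def route_review(stars: int, comment: str) -> str:
    """Route a review per the risk split described in the episode:
    negative (1-3 star) reviews *with comments* get an AI draft plus
    human approval; everything else can be auto-responded.
    Illustrative only -- not RightResponse AI's actual logic."""
    if stars <= 3 and comment.strip():
        return "draft_for_human_review"
    return "auto_respond"

# A 1,000-review month where only a handful need human judgment:
reviews = [(5, "Great visit!")] * 950 + [(2, "Waited an hour.")] * 50
queued = sum(route_review(s, c) == "draft_for_human_review" for s, c in reviews)
print(queued, "of", len(reviews), "reviews reach a human")  # 50 of 1000
```

The point of the rule is the compression ratio: the human queue shrinks to the small slice where judgment and customer recovery actually matter.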
That idea lands with Powell immediately. After Swetlitz describes using AI to draft “highly personalized” replies and letting staff focus on the minority of risky reviews, Powell paraphrases the shift in a way that’s easy for operators to remember: “using an AI suggestor as opposed to an AI responder.” He reinforces that it changes the job from grinding through hundreds of low-value replies to focusing on customer satisfaction and retention—work that is inherently more human.
The conversation then pivots to review generation and why personalization matters before a review even exists. Swetlitz describes the “review ecosystem” as a chain with different intents at each step. If the goal is improving map rankings, he argues the biggest lever is getting more reviews—review generation—and not just sending requests, but getting the customer to convert into an actual “review writer.” Swetlitz’s stated mechanism is personalization. He contrasts a generic request (“thanks for coming … can you write us a review”) with a message that uses real relationship context, like loyalty duration and staff names. Powell strongly validates this point because it aligns with what he experiences as a recipient: “I’m so tired of getting that … please write a review and tell us how great we did.” He says the generic versions get ignored and filtered out—“you just kind of put them into the junk box and move on”—whereas the personalized version makes engagement more likely.
This is where the conversation confronts a common tension: people criticize AI because it “takes the humanity out” of interactions. Swetlitz flips the argument. He says the humanity is already missing today—not because of AI, but because the scale is too large to handle manually. In his phrasing, “the humanity isn’t there today because the scale is so great.” The “cool thing,” he says, is that AI can “put the humanity back into it” if humans supply the context and decide what personalization should look like. Swetlitz emphasizes that this isn’t about robotic tone-matching; it’s about relevance and truthfulness in the context you already know about your real customers. Powell captures the risk on the other side with a term he repeats: “fake personalization.” He describes the kind of AI-driven outreach that scrapes public info and pretends familiarity—“I’m getting all these fake fake personalized requests”—and agrees with Swetlitz that review workflows are different because the people are actual customers and the business already has real context it can responsibly use.
A major chunk of the episode is dedicated to voice-of-customer analysis that goes beyond star ratings. Swetlitz explains the operational challenge: reviews are “unstructured text,” and a single review can contain multiple, even conflicting, signals (“I love the burgers but hate the fries…”). His approach is to aggregate and structure that chaos into actionable trends by building categories and topics per business, initially derived from the Google Business Profile description and then editable for companies that already have scorecards. When reviews arrive, they “chop that review up into phrases,” discard phrases that aren’t actually about the business, and map the remaining phrases to the topic model.
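A toy version of that pipeline, with a hand-written keyword list standing in for both the per-business topic model and the LLM-based classification the platform presumably performs:

```python
# Hypothetical topic model; real ones are derived per business from the
# Google Business Profile description and then edited by the company.
TOPICS = {
    "food_quality": ["burger", "fries", "fresh", "ingredients"],
    "service": ["service", "staff", "friendly", "wait"],
    "cleanliness": ["dirty", "clean", "floor"],
}
OFF_TOPIC = ["mcdonald"]  # phrases about other businesses get discarded

def analyze(review: str) -> list[tuple[str, str]]:
    """Chop a review into phrases and map each on-topic phrase to a topic."""
    hits = []
    for phrase in review.replace("!", ".").split("."):
        p = phrase.strip().lower()
        if not p or any(name in p for name in OFF_TOPIC):
            continue  # not about this business -> discard
        for topic, keywords in TOPICS.items():
            if any(k in p for k in keywords):
                hits.append((topic, p))
    return hits

print(analyze("I went to McDonald's and hated it. "
              "I love the burgers but the floor was dirty."))
```

A single phrase can legitimately map to several topics, which is exactly the "conflicting signals in one review" problem the structured view resolves.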
The KPI Swetlitz highlights is “percent positive,” which he defines as positive mentions of a topic divided by total mentions (positive + negative). The point isn’t to replace the star rating, but to create sub-ratings that tell managers what to fix. He explains the manager problem clearly: if you’re “yelling at the manager because his rating’s too low,” the manager’s fair response is, what exactly should I change? The structured topic view answers that question directly. Powell reacts as a customer experience practitioner would: “now I finally know how to fix things,” and contrasts that with the uselessness of an isolated number: “you got a 3.7 star rating … what do I do with that.”
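The KPI itself is a one-line ratio per topic. A minimal sketch with invented phrase-level data:

```python
from collections import defaultdict

def percent_positive(mentions):
    """mentions: (topic, sentiment) pairs, sentiment in {"pos", "neg"}.
    Returns {topic: positive mentions / total mentions}."""
    pos = defaultdict(int)
    total = defaultdict(int)
    for topic, sentiment in mentions:
        total[topic] += 1
        if sentiment == "pos":
            pos[topic] += 1
    return {t: pos[t] / total[t] for t in total}

# Hypothetical phrase-level results aggregated across many reviews:
mentions = (
    [("freshness", "pos")] * 80 + [("freshness", "neg")] * 20
    + [("service", "pos")] * 30 + [("service", "neg")] * 70
)
print(percent_positive(mentions))  # {'freshness': 0.8, 'service': 0.3}
```

A manager looking at this output knows freshness is fine and service is the thing to fix — which is the answer the bare star rating can't give.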
That critique extends into a discussion of NPS. Swetlitz calls the single-score approach “the worst possible thing” because it tells you people won’t come back “but you have no clue as to why.” Powell adds his own field observation: NPS got gamed over time, with employees pressuring customers to give a high score for bonuses, which can distort actual sentiment and still fails to reveal what needs improvement.
Later, the conversation touches pricing philosophy—but more as a business-model reflection than a pitch. Swetlitz argues AI changes SaaS economics because marginal costs aren’t effectively zero. He describes seeing fixed-price SaaS models “breaking” as providers do more and more work without pricing alignment. He then recounts a customer-driven shift toward lower base access plus usage-based billing, so locations with low review volume aren’t forced into a flat fee that feels misaligned.
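The model he lands on — a small base fee plus metered usage — is easy to express. The $10 base and the per-unit rates below are illustrative placeholders, not RightResponse AI's actual price list:

```python
BASE_FEE = 10.00  # per location per month (illustrative)
RATES = {"review_response": 0.10, "sentiment_report": 2.00, "rank_check": 0.05}

def monthly_bill(usage: dict) -> float:
    """Base access fee plus metered charges for what was actually used."""
    return BASE_FEE + sum(RATES[item] * qty for item, qty in usage.items())

# A low-volume location (5 reviews) no longer pays a $33 flat fee:
print(monthly_bill({"review_response": 5}))  # 10.5
# A busier location pays in proportion to the work the AI performs:
print(monthly_bill({"review_response": 120, "sentiment_report": 1, "rank_check": 90}))
```

Because each AI call carries a real marginal cost, this structure keeps price aligned with cost in a way a flat SaaS fee cannot.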
Powell asks for real-world lessons. Swetlitz shares two concrete outcomes. First, a beta customer using a personalized review requester saw a very high request-to-review conversion rate—“something like 60%”—which he attributes to using multiple personalization elements, including photos. Second, he describes restaurant chains using sentiment analysis to diagnose location differences and improve outcomes, citing average rating increases on the order of “point 2 point three” (for example, “from a 4.4 to a 4.7”) over narrower, more relevant time windows. Swetlitz stresses that Google’s visible “all time” averages can be slow to move, while what matters operationally—and for ranking—is what’s happening recently. He states this bluntly: “Google doesn’t care what your average rating was four years ago.”
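Why the "all time" average barely moves while a trailing window responds quickly comes down to the denominator. A synthetic example:

```python
from datetime import date, timedelta

def window_avg(reviews, days, today):
    """Average star rating over a trailing window of `days` days."""
    cutoff = today - timedelta(days=days)
    recent = [stars for d, stars in reviews if d >= cutoff]
    return sum(recent) / len(recent) if recent else None

today = date(2025, 1, 1)
# 2,000 old reviews averaging 4.0, then 100 recent reviews averaging 4.8:
reviews = (
    [(today - timedelta(days=400), 4.0)] * 2000
    + [(today - timedelta(days=30), 4.8)] * 100
)
all_time = sum(s for _, s in reviews) / len(reviews)
print(round(all_time, 2))                       # 4.04 -- huge denominator
print(round(window_avg(reviews, 60, today), 2)) # 4.8  -- what's happening now
```

The recent window is the operationally useful signal — the number a manager can actually move this quarter.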
The episode closes with forward-looking AI product thinking and career advice. Swetlitz expects “more and more personalization and more and more … analysis,” and he discusses how newer model controls (like adjustable reasoning/verbosity via API) can make production AI both more capable and more cost-effective. Powell also raises compliance concerns in healthcare: Swetlitz describes a “HIPAA scrubber agent” that identifies protected information in reviews, with configurable strictness, and feeds those constraints into response drafting. Finally, Powell asks what new marketers should do as AI disrupts jobs. Swetlitz’s answer is direct: “everyone in marketing has to be an AI expert,” including learning beyond basic chat usage to more sophisticated, API-level capability.
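For a sense of the shape of such a scrubber — not its actual implementation; the real agent presumably uses an LLM rather than regexes, and these patterns are far from HIPAA-complete — a minimal redactor with a strictness toggle might look like:

```python
import re

# Illustrative patterns only; a production HIPAA scrubber would be far broader.
PATTERNS = {
    "phone": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
    "date": r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
    "provider_name": r"\bDr\.\s+[A-Z][a-z]+\b",
}

def scrub(text: str, strict: bool = True) -> tuple[str, list[str]]:
    """Redact likely protected info; in strict mode also redact dates."""
    found = []
    for label, pattern in PATTERNS.items():
        if label == "date" and not strict:
            continue  # configurable strictness: lenient mode keeps dates
        for match in re.findall(pattern, text):
            found.append(f"{label}: {match}")
            text = text.replace(match, f"[{label.upper()}]")
    return text, found

clean, flags = scrub("Dr. Smith saw me on 3/14/2024, call 555-123-4567.")
print(clean)  # provider name, date, and phone number all redacted
```

The `found` list is what would feed downstream response drafting, so the reply never echoes protected details back into a public review thread.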
0:09
hey Guy Powell here and welcome to the next episode of The Backstory on Marketing and AI and if you haven't already done so
0:16
please visit prorelevant.com and sign up for all of these episodes and podcasts
0:22
I am the author of the upcoming book The AI Marketing Machine and you can find out more information on that
0:28
at marketingmachine.prorelevant.com Today I'm interviewing George Swetlitz of RightResponse AI
0:37
and let me tell you a little bit about George he is the founder uh co founder and CEO and board leader of RightResponse AI
0:46
which is a reputation management platform built to help location based businesses grow
0:53
through smarter AI powered review response insights and strategies
0:59
George is a Harvard MBA and a former CEO of Alpaca Audiology where he scaled a 220 location healthcare business
1:08
he brings deep experience in operations customer acquisition and applied AI with a focus on results
1:16
not just theory and results is what the boss wants regardless of what we want to do in marketing
1:23
it's results George welcome so great to have you on today nice to be here absolutely
1:30
so what is your back story on AI and marketing how did you get involved in all of this well as you mentioned
1:35
I was the CEO of a company called Alpaca Audiology which was a consolidation
1:41
of hearing aid and audiology clinics and we struggled with these issues
1:47
we had 220 clinics reviews are coming in we have to analyze those reviews for the sentiment
1:56
we have to respond to those reviews we're trying to grow organically through map
2:02
you know through getting higher and higher on the map and we struggled with this issue
2:08
the the platforms that were out there now this is back in 2021
2:13
the platforms that were out there were expensive and frankly didn't really meet our needs
2:21
in terms of what they were capable of doing so we ended up exiting the business
2:26
it was private equity backed and we exited the business in 2021 and then of course in 2022
2:33
the first versions of ChatGPT were launched and I started thinking
2:39
this might be the solution to this problem that we had and so I got my team together
2:47
the development team and we decided to create RightResponse AI as a way of creating it
2:53
essentially the company that we wish we could have worked with when we were at alpaca
3:02
yeah and you know you're right now one of the things though and certainly for larger businesses
3:09
is always an issue is is you know where you could potentially use AI
3:15
to personalize at scale uh so to speak and that sounds like what you're trying to do
3:21
how do you handle kind of the approval process if you're uh
3:26
responding to reviews and other customer inputs using AI without like a a human in the middle
3:35
so where where AI can be the riskiest
3:40
is when you're dealing with negative reviews like 1 2 and 3 star reviews that have comments
3:47
and so we recommend for example to our customers that they not auto respond to those reviews
3:57
we do have an autoresponder and what makes our product unique is the fact that we
4:04
allow people to personalize responses through the use of what we call facts
4:09
and you know 99% of the time
4:15
you can do that in a very automated way with terrific results
4:20
what we find though sometimes on the negative reviews it doesn't quite capture the context quite right
4:26
and you're better off having a human in the middle but the benefit of AI
4:32
is that it can draft these highly personalized reviews
4:37
and essentially allow your team to focus only on those one two and three star reviews in effect
4:45
which for most businesses are maybe 5%
4:50
so they might get you know 1,000 reviews but they only have to deal with 50 and that yeah
4:57
changes the whole nature of the work effort because now you're focused on actually
5:03
working with those customers satisfying those customers you know regaining those customers
5:11
as opposed to spending your time on 950 reviews where you're not really adding that much value
5:17
as a person so it's kind of like using an AI suggestor as opposed to an AI responder and focusing it in
5:25
I really like that on the on those where you might actually uh you know
5:31
I guess that's a question you know it would be interesting to know is if you were to change the sentiment of those negative
5:39
reviewers to be more positive is that worth more than turning a
5:46
a positive review into a super positive experience
5:52
or a super positive individual I wonder which one you know ends up having having the most value there
6:00
yeah I I think I you know the review ecosystem is very complex and
6:06
you know there's all different elements to it starting with the review request and each step of the chain
6:13
it's for a different person or a different party so you know
6:19
when you're trying to get higher in the map rankings
6:26
the most important thing is getting more reviews and so that's review generation
6:33
and it's not enough just to request the reviews you need the person to actually write the review right
6:40
they have to actually convert into a review writer and in our opinion in our experience
6:48
the key to that is personalization so you can just send out a review request that says
6:55
thanks for coming we love you can you write us a review or you can say thanks for having been here
7:03
we value your loyalty over the last 4 years Judy was so appreciative that you came here last week
7:11
it would really help us get the word out if you left us a review we would all appreciate right it's a much more personalized review
7:18
so you're doing that you know for you know so my point is every step of the way
7:23
it has a different intent yeah well you know I like your point
7:29
cause I'd be honest with you I'm so tired of getting that well I really we're so glad that you were here
7:34
please write a review and tell us how great we did whereas I do like what you said there is uh
7:39
is to say and Judy would really love to uh appreciate uh
7:45
you know how you felt about the experience and and and that personalization is so different from what you
7:51
you know what you get so many emails with this stuff you just kind of put them into the junk box and move on
7:56
as opposed to a personalized one like that you you might actually be more likely to write a review
8:03
that's right that's right when you connect with somebody at any stage of this process
8:09
when you connect with someone you're more likely to get them to engage with you right
8:14
it's kind of just natural and what people what people write about
8:19
negatively about AI is it takes the humanity out of
8:26
the problem is that the humanity isn't there today because the scale is so great
8:32
you get so many requests there's so many reviews it's impossible to do each one individually
8:40
you can't do it and the cool thing about AI is it actually allows you to put the humanity
8:46
back into it because you're the one you're the human that's deciding how you're going to personalize it
8:56
and that puts the humanity that that makes it real it's not AI just responding to something without any context
9:04
you're providing the context and that's what that in my mind that's how you need to use AI
9:12
AI to differentiate yourself it's not the kind of the robotic thing
9:18
but it's how do you and it's not fake personalization you know I get I get all these emails
9:23
I'm a B to B right so I get B to B
9:28
you know things all the time where people will go out and they'll look at my
9:35
you know they'll look at my they don't know me but they'll go look at my LinkedIn or they'll go look at the website
9:41
and they'll fake personalize their engagement with me oh yeah through AI right
9:49
but in the review ecosystem it's all real they they actually are your customers
9:54
you know a lot about them already and so all you're doing is saying
9:59
I want to incorporate why they were here their their existing loyalty
10:04
the names of their favorite provider whatever it is I want to incorporate that in that request that's a real personalization not fake personalization
10:13
and that is I I wrote that term down fake personalization cause that is what you get nowadays and
10:19
and I'm getting all these fake fake personalized requests for something
10:25
whatever it is you know sell me this sell me that whatever buy this buy that and that is so true
10:31
it's a you know Guy I saw this on you know blah blah blah and I really think this would be a great fit for you
10:37
and you and it's written so much the same as everything else so uh but I do like you know
10:43
like you said that that that you know had Judy especially if you know that in your case with the audiologist
10:50
you know that she was my audiologist and she's the one that spent the you know the half hour with you
10:56
and she's the one that you got to know and spoke with her and asked about her kids or whatever you know
11:02
so that that really is that's great I really like that you know now one of the things though with reviews is
11:09
um uh you know there's certainly the ratings um you know which is you know you get one star
11:14
you get five stars and I like your breakdown about the you know the negatives so to speak and the positives and
11:20
and then there's other kinds of reviews um different types of reviews of
11:25
you know one of them is this topic topic level rating tell me a little bit about that tell me about how those compare and how you use those
11:34
so so you know when you think about reviews
11:41
it's a lot of unstructured text and reviews are very complicated
11:47
I love the burgers but hate the fries the service is great but the floor was dirty
11:52
you know I mean there's you know there's so many things that are said in a paragraph
11:59
so when you look at a single review it's just like a mish mash but if you look at 100 reviews or 200
12:07
you can start to pull out trends so what we do is
12:12
we create a set of topics for every business
12:19
based on the Google Business Profile how they describe themselves we create a set of topics and you know
12:27
so a category in a restaurant might be food quality and a topic might be freshness
12:37
um ingredients whatever it is and that so you have this list of categories and topics
12:46
which we create but then you can edit a lot of big companies have their own scorecards already
12:52
they know how they want to think about their business so they can go in and modify these things to be consistent
12:57
with the way that they examine their business then when a review comes in we chop that review up into phrases
13:05
and we look at every single phrase and we say one is this about the business
13:11
right because someone might say I went to McDonald's and hated it and then I came to your restaurant and loved it a typical sentiment analysis will say
13:18
that first phrase is negative but it had nothing to do with your business so we look at that and we discard it
13:26
then we look at all the phrases that have something to do with your business and we map them against the topic
13:34
and then we roll all of that up and we calculate something that we call percent positive
13:40
so it's the percent of positive it's the essentially the number of positive mentions of freshness
13:48
divided by the total mentions of freshness positive and negative and then we chart all that
13:53
and so the beauty of that is that your rating is a single number you got a 4.5
14:03
but it might be because your service was great but the food wasn't or it might because the food was great
14:10
but the service wasn't and if you have a number of locations and you're yelling at the manager
14:17
because his rating's too low his response is gonna be what do I need to fix
14:24
and you can either say to that person well go read the reviews and figure it out yourself
14:30
or you can say here it is what you need to fix is something about the
14:36
you know the freshness or the ingredient something about the quality isn't working
14:42
and then you can go to the other location and say look your issue is
14:47
there's something going on with your staff they're not you know friendly enough to the customers
14:55
so essentially you get a sub rating hmm right it's a it's a rating
15:01
but it's at a lower level than the whole business that allows you to actually create change
15:08
and drive your average rating higher over time yeah that uh
15:13
makes so much sense and uh and that's actually part of uh you know some of the uh
15:19
methods for experience uh customer experience except now taking then what's
15:25
you know really the core of what's driving that experience which are each of and every one of those
15:31
I don't know if you call them touch points but attributes related to the experience and then rating those and I really like your point is
15:39
because now I finally know how to fix things whereas if I just give you well you got a 3.7 star rating yeah
15:46
what do I do with that that's that really makes a lot of sense yeah I mean you know you have this whole
15:51
there was this whole industry that grew up around you know that I I it's slipping out of my head right now
15:57
but it's kind of that single number that single score about whether people do you know what I'm talking about
16:03
yeah that's the NPS NPS right the net promoter score gotta love it
16:08
everyone needed to have an NPS everyone needed to do that and as I as I got more involved in kind of
16:14
RightResponse AI to thinking about it I was thinking like that's like the worst possible thing because yeah
16:20
you know that people don't want to come back but you have no clue as to why
16:28
you can't do anything about it and so essentially you looking at unstructured text
16:34
I think is a far superior way it's the most honest way of evaluating and
16:42
what's going on in your business yeah yeah absolutely and that's you know when NPS first came out
16:48
it was definitely groundbreaking and then it was applied to business systems and what have you and it
16:54
I think it really made a difference early on and then the employees figured out how to game the system and
17:02
you know cause I don't know about you but you you get these uh you know you experience some kind of a service
17:08
and then the employee the service provider says to you and rate and if you can rate me a 9
17:14
because that will give me a you know a bonus or whatever so of course you know hey I like the guy he did a good job
17:20
you know OK I'll give him a 9 but that isn't truly first of all it's not really representing my sentiment
17:26
and then second of all you're you're right you can't you can't make changes to your business for those people that you know
17:33
are just giving you a number cause you don't know what it is you have to do to fix it and I man I really like what you're talking about there
17:42
yeah so now one of the things you talk about as well is different pricing models and
17:49
you know and pricing of different SaaS based
17:55
software as a service based services are uh you know kind of a challenge because in some cases
18:03
you know you on the one side the the the accounting guy just wants a you know a continuous price uh
18:09
price or cost so we can budget for it and then on the other hand I'm paying for stuff that I didn't get uh
18:15
so how do you how do you work through all of that yeah that that's a great question I mean
18:20
it's very complicated especially when you when your business is built around AI
18:25
you know the typical historical SaaS business you built the platform and then your marginal cost was like zero right
18:33
I mean other than your infrastructure your marginal cost is zero but with AI it's not zero you know you always have additional costs
18:39
and so you know you at you know at as
18:44
as we talked about before as things get more precise you'll you can do more things
18:51
you can generate more analyses but if you're stuck in this kind of fixed cost model
18:58
you can't afford to deliver it and now I'm actually starting to read articles about that
19:03
about how you know SaaS businesses are breaking because they have fixed cost models
19:10
but they're doing more and more and they're kind of losing you know they're losing money so what happened is
19:16
it started for us with the realization that we could do more over time
19:22
and how do we account for that in our pricing but then I had a customer and he he
19:29
you know he had a number he had a number of locations and he called me and he said
19:35
you know can you give me a discount at that point in time we were charging I think it was $33 a month
19:42
and then we kind of had a cap on the number of reviews you know kind of the typical
19:49
and I said why what's going on and he said well you know some of my locations get five or six reviews a month
19:55
like I can't spend $33 for five reviews it's just it's just so much money
20:03
and I thought about it and I was like you know he's like he's right that was like what I was so frustrated about when I was at alpaca
20:11
you know Birdeye Podium they wanted so much money I had 220 locations you know
20:17
$200 to $300 times 200 locations $60,000 a month I can't spend that kind of money
20:23
it's just I don't have it for my marketing budget so so
20:30
I look I said well what about if I charge you 10 bucks and then you pay for your usage
20:35
and he goes I love it I'll do it and then I found out that he actually had all these other businesses that I didn't know about
20:41
that he never put on the platform because he wasn't gonna spend $33 for right and he put all those businesses on the platform
20:49
and I realized that that's kind of you know that's a it's a it's a better model
20:55
and so we went back and we rewrote essentially the whole billing element of the platform
21:01
to be usage based and so when you're stuck in that fixed price model
21:08
you come in and I have to charge you separately for the requestor I have to charge you if you want the map rank tracker I have to charge you for your own reviews
21:14
so with our model you pay 10 bucks you get access to the entire platform and you just basically pay for what you use
21:22
and people just love it they just love being able to turn stuff on and turn stuff off and use this and use that
21:29
and experiment and not have to worry that you know
21:34
that we're making money on breakage essentially yeah which is how you know this is how it all works in typical yeah
21:44
well and you know at some point um you know even though it is a usage based model
21:51
which is potentially relatively unpredictable at some point though
21:56
you get averages and it does become predictable so you know you
22:01
you know to your point you know $33 a month per store or per location times 200 locations bang
22:09
it's $66,000 and um you know I can budget that the controller can budget that the marketer then says
22:15
yeah but I can't spend that amount you know it's just too much so let me spend you know let me eat you know
22:21
let me spend based on what I can eat or what I need to eat and uh uh and yet that also becomes more predictable
22:29
you know as soon as you kind of get going with it and you start you know when they 10 locations and
22:34
you know and then you can I guess uh build up to it that makes uh you know a lot of sense and it still becomes predictable because you
22:41
you know what your history is it uh and once you have a history then you can start to predict things yeah and
22:46
and what happens is that what we find is that our customers experiment so they start with the review responder
22:53
and then they start playing with the map rank tracker and then before you know it they're doing
22:58
three keywords every month across their locations and then they you know
23:04
call us about sentiment analysis and help us set that up so what we see is that they naturally grow
23:11
revenue per location per customer grows over time as they
23:17
experiment and get comfortable with the platform and as we launch new things we're always launching new
23:23
new features so you know we now consume
23:30
you know 1,000 reviews the sentiment analysis of 1,000 reviews
23:35
and we summarize it so that a marketer can send it to the location manager
23:43
so we built that now and we charge like two bucks per report
23:51
well now they're like oh well that for two bucks I'll send it to all my managers well they have 50 locations
23:57
but still it's a hundred bucks right so now we've just increased our you know our revenue for that customer by $100 a month
24:04
yeah so yeah and and it's for value right it's for value we delivered them something they wanted
24:10
and they're willing to pay for it so it I I really I think the usage based model is where things are going
24:18
Stripe just launched you know their usage based product right
24:24
so you can kind of hook into Stripe to do your billing we did it before Stripe had a usage based model
24:30
so we had to do all of that ourselves but that's kind of the direction that SaaS is headed
24:39
yeah yeah yeah exactly so um well give me a couple of for your application
24:46
uh RightResponse AI uh tell me give us a couple of different like
24:53
case studies and really some real world lessons that you've Learned from analyzing your
24:59
your customers' activities and reviews yeah well
25:05
you know what we find is that you know so for example we have a business that never really did a lot of
25:17
they were they didn't really have a lot of reviews and they didn't really know how to go about
25:23
getting a lot of reviews and so we launched when we launched the review requester
25:29
we asked for some beta you know customers to work with us and we got them to work with us and
25:37
the personalization of the review requester what they found was that their
25:44
their request conversion rate was super high
25:49
it was something like 60% so 60% of the requests that they sent out were being
25:55
turned into reviews which is really really high in the industry and it was because they were using all the elements
26:03
right? They were personalizing the request, they were enclosing photos,
26:10
and so it was such a personalized request that the conversion rate was very,
26:16
very high. So we've seen that kind of thing. We've also seen people
26:24
really drive their average rating through leveraging the sentiment analysis
26:32
really figuring things out. We've had
26:38
restaurant chains that struggled with understanding what the difference was between their locations,
26:46
and we were able to see average review score increases of
26:54
0.2 or 0.3, so like from a 4.4 to a 4.7,
26:59
because they were able to fix the problems that they were experiencing in the restaurants
27:05
Hmm, wow. And that's especially hard when you get up near the top,
27:11
where it's so hard. I mean, you can't get higher than 5 on the scale, and as you get up towards the top
27:17
it gets harder and harder to go from 4.7 to 4.8 to 4.9
27:22
and being able to do that is pretty remarkable. Yeah.
27:28
So let me add something. When we talk about those things, what people think about is what you see on Google,
27:38
which is all-time reviews and the all-time average rating. All-time reviews and
27:44
all-time average rating, those numbers can be very hard to change because you have a huge denominator, right,
27:52
that you're dealing with. But Google doesn't care
27:59
what your average rating was four years ago. It's kind of irrelevant,
28:05
right? What matters is what's happening in the business today. Google doesn't tell you that;
28:12
we do. So we tell you what your average rating and your
28:18
average monthly reviews are for the last two months and the last six months,
28:24
because that's what matters: what's happening now.
28:30
And when you're trying to change things, you want to see, well, what's my
28:35
average rating over the last six months versus my average rating over the last two months? Am I getting better or worse?
28:42
And so when I talk about those improvements, it's over those much narrower time frames,
28:49
which is really what drives your ranking anyway. Yeah.
28:54
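The trailing-window view described here (last two months versus last six) is straightforward to sketch. A minimal illustration, assuming reviews arrive as (date, stars) pairs and approximating a month as 30 days; this is not RightResponse AI's implementation:

```python
# Compare trailing 2-month and 6-month average ratings to see whether a
# location is improving (illustrative sketch; data format is assumed).
from datetime import date, timedelta

def trailing_average(reviews, months, today):
    """Average star rating of reviews within the last `months` months.
    reviews: iterable of (date, stars) tuples."""
    cutoff = today - timedelta(days=30 * months)  # approximate months
    recent = [stars for d, stars in reviews if d >= cutoff]
    return sum(recent) / len(recent) if recent else None

reviews = [
    (date(2024, 1, 10), 3),  # older, lower ratings
    (date(2024, 2, 5), 4),
    (date(2024, 5, 20), 5),  # newer, higher ratings after fixes
    (date(2024, 6, 1), 5),
]
today = date(2024, 6, 15)
print(trailing_average(reviews, 6, today))  # 4.25, dragged down by old reviews
print(trailing_average(reviews, 2, today))  # 5.0, the recent trend: improving
```

The same numbers that barely move an all-time average (the "huge denominator") show up immediately in the two-month window.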
Yeah, absolutely. So what's the future? Where do you see things going?
29:00
more and more personalization and more and more
29:06
I would say analysis. So everyone complains about GPT-5,
29:14
like every day you read, oh, GPT-5 was this big failure, blah blah blah.
29:19
But it wasn't. If you're in the business of using AI in production,
29:29
GPT-5 gives you this whole world that you didn't have before.
29:35
So you can go into GPT-5 using the API, if you're in production mode,
29:40
and you can set the level of reasoning and the level of what they call verbosity,
29:47
like how much output and how much reasoning. And the ability to go in and say,
29:54
here's a prompt that does a very specific thing that before I needed to use GPT-4o for,
30:00
which was a very expensive model, but now I can use GPT-5 mini with a medium level
30:09
of reasoning and get a better answer for less money
30:17
So what does that allow you to do? It allows you to do more and keep your pricing the same,
30:24
or it allows you to cut your pricing, and that is huge. And so for us,
30:31
we use a lot of different models, we have backups, but right now we're really experimenting,
30:37
working diligently on how we take advantage of this much more granular
30:45
settings capability within GPT-5 that we didn't really have
30:52
quite the control over before. Hmm, yeah, interesting.
30:59
Just curious: when you're personalizing reviews, let's say in the medical field,
31:05
you came out of audiology, how do you make sure that you're not violating
31:12
HIPAA in the way that you're, let's say, requesting reviews, or
31:19
I guess it's really requesting reviews. How do you make sure that you're not
31:25
giving up any potentially HIPAA-related data? Yes, so we actually have
31:31
a HIPAA scrubber agent that we use for medical businesses,
31:38
and the business can turn it on and say, I want the agent. And the agent analyzes
31:46
the review. For a review and a review response, it analyzes the review,
31:52
it identifies all the protected information
31:57
in the review. We have different settings, like a strict setting, because
32:03
strictly speaking, you can't say anything. From a strict perspective,
32:11
just responding acknowledges that that's your patient, and is technically a violation of HIPAA,
32:19
but I don't know anyone who's ever been held to account for that. So we have a moderate version, which says,
32:27
if somebody says something like, I got a test result and it was terrible,
32:32
it'll say: test result, can't mention it, just talk about their experience. Right?
32:38
The scrubber goes through, identifies the phrases,
32:44
and then suggests what should be done. And then the results of that agent
32:51
get put into the responder, which incorporates that
32:58
when writing the response. Yeah, interesting. Actually, as you were talking about that, though,
33:04
I could also see, if I had just gotten some
33:10
negative news, my XYZ score was
33:16
out of range and it's bad, and now I'm giving a review
33:24
about what happened, it's going to be clouded by this. You know, I had a,
33:30
well, I had a great experience, but the outcome was bad, therefore
33:35
I can't rate the great experience as great; it's just the way my sentiment works,
33:40
I'm going to rate it lower. Does that happen, or how do you scrub for that, or how do you
33:46
fix stuff like that? Yeah, I mean, that's always tough. You never really know why somebody
33:54
says the things they say.
34:00
We find that
34:06
if you respond in a nice way, it helps. We
34:11
also try to do matching, right? So we try to provide our
34:17
customer with information about who provided that review,
34:22
right? Because we know we sent the request, and we know when we sent it. If we get a review in
34:28
within a relatively short period of time, we can sometimes tie it back. So we can do some matching, and
34:37
oftentimes you can go back to people
34:42
outside the review process and ask them to modify the review
34:49
when you see something like that. Something similar is what we call the review response,
34:56
or the review-rating, mismatch: somebody leaves you a great review, but they're confused and they
35:03
star it as one star. So we have an agent,
35:08
a very simple agent, that just looks for mismatches. Yeah, right. And of course,
35:14
if it's a bad review with a five-star rating, we don't say anything.
35:20
But if it's a great review that's one star or two stars, we highlight that, so that in the response we can say,
35:30
hey, we think you got the stars wrong. It's a great review; could you go and change it to
35:39
a higher star rating? And so there are things that you can do
35:44
to manage that process. Yeah, well, or you could just exclude them altogether, I guess,
35:52
or something like that. Well, this has been fascinating. I've been taking notes over here.
36:00
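The mismatch agent George described a moment ago can be sketched with a simple heuristic. Illustrative only; the word lists are assumptions, and a production agent would presumably use an LLM or a sentiment model instead:

```python
# Toy review/rating mismatch check: flag glowing text paired with a low star
# rating so the response can politely ask the reviewer to fix the stars.
POSITIVE = {"great", "amazing", "wonderful", "excellent", "love", "fantastic"}
NEGATIVE = {"terrible", "awful", "rude", "worst", "dirty", "slow"}

def is_mismatch(text: str, stars: int) -> bool:
    """True when positive-sounding text carries a 1- or 2-star rating."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    # Per the conversation, a bad review with five stars is left alone:
    # only positive text with a low rating gets flagged.
    return stars <= 2 and pos > neg

print(is_mismatch("great service amazing staff", 1))  # True, flag it
print(is_mismatch("terrible experience", 1))          # False, rating fits
```

Flagged reviews then get the "we think you got the stars wrong" line folded into the response.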
We've got to break here, but I did have one last question for you, and that is:
36:07
AI, especially in marketing, is definitely disrupting the employment marketplace,
36:15
so what would be your advice for an up-and-coming new marketer as they're trying to break
36:21
into the marketing workspace? Yeah, I think that everyone in marketing has to be an AI
36:31
expert. I don't think it's possible to be in marketing
36:37
and not be able to leverage these tools in a sophisticated way
36:42
And so I think it means that people are going to have to not just get ChatGPT,
36:51
but have access to the API, know how to
36:57
leverage it, and understand the details. Like,
37:02
I'm in my 60s and I'm learning that stuff,
37:08
and so if you're in your 20s or your 30s,
37:14
you have a long road ahead, much longer than mine, and so you've got to start now.
37:19
You've got to be in there, you've got to know how to use it in a very sophisticated,
37:25
in-depth way. If you think you're going to get there by using somebody else's applications,
37:33
or playing with ChatGPT, in my view that's not going to cut it.
37:39
Yeah, thank you for that. I think that's great advice.
37:44
George, this is wonderful. I've learned quite a bit, and I really appreciate that,
37:51
and as I said, I took a bunch of notes, so definitely thank you for participating today.
37:57
And where would you like viewers to go to learn more about you and your company?
38:03
So RightResponseAI.com, that's our site.
38:08
We even have the ability to schedule an appointment; anybody can schedule an appointment if they want to talk to me.
38:14
They just say, we want to talk to George, put it right into the appointment request, and I'm the guy who will show up.
38:19
So we're very service-oriented; we love talking to people,
38:25
and that's the best way to get in touch. Fantastic. Well,
38:30
I want to talk to George too. But for the audience, please go to RightResponseAI.com,
38:37
RightResponseAI.com, and otherwise please stay tuned for many other videos in this series
38:44
of The Backstory on Marketing and AI. And if you'd like to learn more about my upcoming book,
38:50
The AI Marketing Machine, please go to marketingmachine.prorelevant.com. George,
38:56
thank you so much. This has been really eye-opening, and a fantastic way to apply AI
39:02
to a critical piece of the marketing puzzle. Well, thanks, Guy. It's great to be here, great talking to you.
39:09
It's been a wonderful conversation, thank you. Absolutely, thank you.