073: The Risks of AI-Generated Content at Scale: How One Business’s Strategy Led to Traffic Declines (Office Hours)
In this special episode, Senior SEO Strategist Mandy Politi reveals key strategies for mastering SEO in the age of AI, focusing on Google’s EEAT guidelines. Through the compelling case study of a real Optidge client, Mandy and Danny highlight the risks of relying too heavily on AI-generated content for the sake of ramping up quantity, emphasizing the importance of aligning content with user intent. The conversation shares expert insights on on-page and local SEO, comprehensive content audits, addressing technical challenges, and best practices to ensure long-term success in search rankings.
An Optidge “Office Hours” Episode
Our Office Hours episodes are your go-to for details, how-to’s, and advice on specific marketing topics. Join our fellow Optidge team members, and sometimes even 1:1 teachings from Danny himself, in these shorter, marketing-focused episodes every few weeks. Get ready to get marketing!
Key Points + Topics
- [3:10] Mandy stresses that AI isn’t inherently wrong; it’s about following Google’s EEAT (expertise, experience, authoritativeness, and trustworthiness) guidelines. When content prioritizes search engine rankings over users, it leads to unqualified traffic and poor performance, as we see with Client X.
- [5:14] She details the dangers of using AI to mass-produce low-quality, unoriginal content. These “bad” practices target irrelevant keywords, leading to poor user experience, high bounce rates, and long-term damage to website performance, including not being indexed.
- [6:37] Mandy says driving qualified traffic is crucial because it positions content in relevant search results, attracting users genuinely interested in your services or products. Unqualified traffic results in users quickly leaving the site, signaling to Google that the content is irrelevant, harming the site’s performance.
- [7:25] Targeting irrelevant keywords negatively impacts a website’s performance and rankings. Users will quickly leave when they realize the content isn’t catering to them, answering their questions, or prompting them to learn more. In industries like healthcare, failing to establish the EEAT criteria will lead to losing credibility, making the site seem irrelevant and harming long-term success.
- [9:48] Mandy describes how client X targeted irrelevant keywords, such as whether cats can have ADD, leading to traffic not aligning with their services. Although the blog post appeared in search results and brought thousands of clicks, the audience that the business serves was children – not cats – so users were quickly exiting the site and the traffic was producing low-quality, irrelevant leads given the business’s target audience.
- [11:02] This approach signaled to Google that the content was irrelevant, hurting the site’s performance despite high traffic numbers.
- [11:18] Mandy illustrates how, over time, this website’s performance dropped as their reliance on the SEO consultant increased. The proposed and adopted strategy of targeting unqualified traffic resulted in users leaving the site quickly. Despite initial gains, this caused a significant decline in overall traffic and page performance.
- [12:58] In all of Mandy’s years of SEO experience, she’s seen many focus on upward traffic trends. But upon closer examination, she found that much of that traffic was irrelevant or heavily branded, which can often cause confusion about whether traffic is attributed to organic search or the SEO work being done to a site.
- [13:40] As she says, true optimization isn’t just about clicks or growth; it’s about targeting the right transactional keywords that drive meaningful engagement.
- [14:42] Mandy explains how Google specifically targets “Your Money Your Life” websites, such as those in medical, financial, or legal fields, which must meet higher standards of expertise, trustworthiness, and experience due to their potential real-life impact on users’ well-being.
- [16:37] Mandy details how harmful duplicate content can be and how it dilutes SEO value, as seen in Client X’s case. The duplicate pages they had were competing for the same keywords, confusing Google and resulting in none of them ranking well.
- [18:29] Mandy stresses that subheadings are crucial for on-page SEO optimization. They help search engines understand content and allow users to assess its relevance quickly. But, Client X had multiple pages with the same structure, which confused both Google and readers, reducing overall effectiveness.
- [21:38] Mandy indicates that Client X’s site had major SEO issues, including duplicated meta titles, poorly optimized meta descriptions, missing H1 tags, and images without alt text, which are basic yet essential for search engine optimization.
- [22:22] She added that their pages linked to external websites rather than internal content, which could drive users away from the site and weakened its SEO performance by handing valuable keywords to other websites.
- [23:40] Mandy indicates that the website also had poor anchor text optimization.
- [23:55] The website also lacked a clear differentiation between linking to blog posts and key service pages, with no calls to action or mention of the services offered.
- [24:15] She adds that basic SEO practices, like avoiding duplicate content and targeting relevant keywords, were neglected, which should have been foundational knowledge for any SEO effort.
- [25:41] Mandy explains that canonicals are a soft hint to Google about which page to prioritize when multiple variations exist, commonly used in e-commerce. But in Client X’s case, using canonicals instead of redirects to address duplicate pages led to potential technical issues, causing Google to sometimes ignore the hint and harming site performance, especially for a medical website where confusing URLs can signal spam.
- [31:48] Mandy advises that it can be helpful to use AI for location pages, but suggests the need for a human reviewing and optimizing the content. While AI can help create templates, a human touch is essential to ensure relevance, avoid duplication, and add localized details.
- [37:30] She predicts that with the proper audits and action plan, Client X will see an initial drop in traffic as irrelevant pages are removed. Still, they’ll see increased qualified traffic and improved conversion rates over time.
Guest + Episode Links
Danny Gavin Host
00:55
Hello, I’m Danny Gavin, founder of Optidge, a marketing professor, and the host of the Digital Marketing Mentor. Today, we have a very special guest, Mandy Politi, who’s a senior SEO strategist at Optidge. Mandy is our resident SEO genius, and we’ve been lucky enough to have her on the podcast before, so this is actually her third time on the show to break down SEO best practices, run through case studies, and share her expertise.
Today, we’re back with another scenario breakdown: content at scale. We’ll dive into Google’s EEAT guidelines, best practices, and why content at scale through AI can lead to problems at scale, using Client X as the example. We’re not going to tell you who the client was, to keep that confidential, but you’ll still learn from the case study and get the information. How are you, Mandy?
Mandy Politi Guest
01:39
Good, good. I’m very excited to be talking about this case study.
Danny Gavin Host
01:42
Yes, I know you put a lot of effort into the original audit and case study, so I’m excited that we can now bring it to everyone else. Client X thought they were on the right track with their SEO strategy. They were cranking out tons of AI-generated content and racking up hundreds of backlinks weekly. They were working with an SEO specialist known for getting a lot of traffic and quick wins, and any time quick results are promised, issues tend to follow. Some SEOs swear by this approach, but the cracks started showing when Google stopped indexing their new content. Despite all the effort, their rankings weren’t improving. That’s when they turned to Optidge for answers.
02:25
It became clear that, while this approach might seem practical at first and bring a lot of traffic, recent Google algorithm updates have left their site and others like it vulnerable to penalties. Before we dive into Client X, let’s first discuss some of Google’s changes and best practices. The most recent guidance came in February, actually more than a year ago now; I need to remember that we’re in 2024. In February 2023, Google provided specific guidance on AI-generated content, stating that its ranking systems aim to reward original content that aligns with E-E-A-T: expertise, experience, authoritativeness, and trustworthiness. So, Mandy, that’s a lot of jargon. What does this mean in lay terms, and why should we be concerned?
Mandy Politi Guest
03:10
So, basically, it means that AI is not inherently bad. It’s just another tool, and there are best practices and bad practices. Of course, before AI, Google would recommend that we create original, human-generated content, so using AI to create content back then wouldn’t have been a best practice.
03:35
But since February 2023, Google has updated the guidelines because AI is a reality, and people are using it. It came down to this: we don’t mind how you create the content. It needs to follow the guidelines of quality and relevance and be optimized for users and search engines, but primarily for users. Now, the problem is that when these guidelines are not followed, as many industry experts have shown through research, websites may perform a little better at first. Then Google catches up, traffic drops dramatically, and the entire website can be de-indexed, which is what happened to our client. They started seeing traffic drop, and then they saw that a big portion of all this content they were creating wasn’t being indexed and wouldn’t bring any traffic. So, at the end of the day, it comes down to using AI in a way that follows the guidelines and does not try to manipulate search results.
Danny Gavin Host
04:33
Basically, yeah, that’s really good. So, to stress the point, as you said, this concept of EEAT, making sure that your content has these qualities, isn’t directly related to AI. It’s something that’s existed for a while, but with AI coming along, it’s much easier to produce content quickly, which means it’s also a lot easier to create content that doesn’t have these qualities. And if you’re producing content on a mass scale and it’s missing these qualities, especially with our client, who is medically or health-related, then you’re right; it’s a recipe for doom. What is considered bad practice? How do you risk violating the guidelines?
Mandy Politi Guest
05:14
First, it’s creating content for ranking reasons rather than for users: creating content at scale that doesn’t help the users and just aims at putting your website in the search results for any given, even remotely relevant, keyword. It’s also using automation to generate low-quality or unoriginal content at scale. Again, it comes down to the content being useless; it doesn’t provide any information other than aggregating different pieces of information from here and there, just trying to create enough text to give the website some chance of appearing in the search results. And it’s also websites that focus so much on creating content that they don’t pay enough attention to user experience. We saw exactly that with this client: the content was not optimized, the paragraphs were too long and wordy, and there was a lot of information, because the focus was to create as much text and content as possible. But when it comes down to creating content, you also need to ensure a good user experience. You need to include things that will help the user go through the content, read it, understand it, and get something useful out of it. Another thing is content created just to target every possible keyword, content that doesn’t drive qualified, relevant traffic. We have examples of that to discuss in a bit.
06:37
But imagine this: trying to target every single keyword. By default, it’s not going to work. Even if a keyword is a little bit relevant to what the client does, you shouldn’t be targeting all of them. Some keywords make sense. Some keywords don’t. Some keywords seemingly make sense, but if you really think about it, the search intent behind the query is so different that you shouldn’t be targeting it. And something else: why is it important to drive qualified traffic? It’s not only because you want to position yourself in the relevant search results and attract users interested in reading your content and potentially asking for more information about your products or services. It’s also because if you drive someone uninterested in what you offer and they click back, this signals that your content is irrelevant.
07:25
So, in the long term, this impacts performance. It impacts your website’s rankings; imagine now having that at scale. Imagine having 1,000 pages that all target irrelevant keywords. Over time, Google will consider your website irrelevant; it’s not going to perform, and it shouldn’t be performing, especially because you’re in the medical and health industry and, as we said in the beginning, you need to establish your expertise, experience, authoritativeness, and trustworthiness, all of these qualities. So people land on your website because you’re trying to clickbait them into clicking, and then they leave because what they see is not what they wanted to find, and they click back.
Danny Gavin Host
08:07
Over time, your website is going to start seeing drops in performance. It’s as if the SEO specialist they were working with was just taking a machine gun and shooting wherever possible, not realizing that damage happens. Because you produce a bunch of content, it’s like, okay, some of it’s good, some of it’s bad, but the bad, who cares? The problem, as we’re about to see, is that it can actually hurt things. So, coming back to Client X, they had a tremendous amount of content that Google was not indexing, and remember, that was the first sign: hey, Danny and Mandy, for some reason, our newest blog posts aren’t being indexed. So, we did a deep-dive audit. I’m saying we, but it was primarily Mandy, to determine some of the reasons why this was happening.
08:54
I will give a quick overview, and then we’ll go through this in depth. When you’re not even keeping track of what’s going on, a lot of duplication can happen. They had suboptimal on-page SEO; a good part of SEO is the content, how you format it, your tagging, and all of that was missing. They had non-indexed pages, like we spoke about, and most of their pages were performing poorly. Finally, they created content designed for search engines rather than content intended to attract and be valuable to their target market, resulting in unqualified traffic. The traffic coming in was not people interested in their services, but rather just popular traffic that had nothing to do with their business. So, let’s now go through this in depth. Can you share an example of what driving unqualified traffic could look like?
Mandy Politi Guest
09:48
With this specific client, as we said, they were trying to target every possible keyword, so one of them was whether cats can have ADD. If you think about that, they performed for this term, and the blog post they published appeared in the search results for relevant queries. But at the end of the day, you must consider two things here. First, do you want to position your website on the first page of search results, or in the top three positions, for a term about whether cats and animals can have ADD? The second thing is, what is the search intent behind that? Is the person looking for general information, a fun fact, something they might have thought to look up at night? Or are they looking for the exact services you offer, which are for human beings and not animals? And then, what are these people going to do? They’re going to find your website. Okay, they’re going to click on it. They’re going to start reading. They will decide that this is not relevant, that it’s an irrelevant website. As we said, they will click back, and this signals that the content is irrelevant to the search intent. Are they going to spend time on the website? Are they going to click on other internal links? Will they actually ask for information about ADD services for children or adults?
11:02
No. So again, no matter how much traffic this blog post brought, it brought thousands of clicks, yes, and thousands of impressions.
11:13
But all of the non-branded terms that it drove traffic for were irrelevant.
11:18
The terms were relevant to whether cats could have these kinds of symptoms, and, yeah, at the end of the day, this blog post started dropping in performance over time. And this is another point: even some pages that performed a little would start dropping over time, because of all these signals of people leaving the website and not staying on it. So you would see the line go up, even for relevant terms. But because the overall strategy was poor, after a while you started seeing the page performance drop. If you group the pages and filter accordingly, it’s devastating to see: all of the website’s traffic was just sinking. Again, you have a lot of content at scale targeting unqualified leads, like the example of whether cats can have ADD or not. The confusing thing for the client is that they’re seeing this report, and they’re seeing all this.
Danny Gavin Host
12:17
You know, the traffic coming in, it’s growing, great, great, great. But then you break down that traffic and see, wow, 10,000 clicks are from people searching for whether cats have ADD. It’s just totally misleading, because we can assume that most people searching for that don’t care about a site that helps children with ADD. So that’s why digging into your queries and seeing where the traffic is coming from, from what pages and what queries, matters: is this increase in traffic we’re getting actually good, is it valuable? That’s important.
Mandy Politi Guest
12:58
What I have seen in the years I have been working in the industry is that a lot of people report on traffic and the trend line going upwards, but at the end of the day, if you go and check the queries, most of the time they’re either branded or irrelevant. It’s not only about seeing the line going upward. It’s not about seeing the number increase month over month and year over year. Is it for the keywords you want to target, the keywords you include in your blog posts? And is it for your money keywords, your transactional terms, the ones where you want people to reach out through your website to get a quote or ask for more information about your services? This is where true optimization lies, and this is where you can measure success.
13:40
It’s not just about the number of clicks. It’s about what exactly the page performs for. In this case, you might have 10,000, 20,000, or 30,000 clicks on one blog post, but if you look at it, it’s irrelevant, and most people don’t realize that over time it can actually hurt you. It’s not just “let’s have the clicks; let’s continue creating content.” You need to fix that. You need to remove this content. You don’t want it there. Over time, it will hurt even the content on your website that works a little bit better.
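The query audit Mandy describes, splitting reported traffic into branded, qualified, and irrelevant buckets before trusting an upward trend line, can be sketched roughly in Python. Everything here is a made-up illustration: the term lists and sample queries are hypothetical placeholders, not the client’s actual keywords or data.

```python
# Hypothetical term lists for illustration only.
NEGATIVE_TERMS = {"cats", "dogs", "pets"}      # off-topic words, like the cat-ADD queries
BRAND_TERMS = {"acme"}                         # placeholder brand name
SERVICE_TERMS = {"adhd", "add", "children"}    # placeholder service keywords

def classify_query(query: str) -> str:
    """Bucket a search query as irrelevant, branded, or qualified."""
    tokens = set(query.lower().split())
    if tokens & NEGATIVE_TERMS:
        return "irrelevant"   # off-topic wins even if a service word also appears
    if tokens & BRAND_TERMS:
        return "branded"
    if tokens & SERVICE_TERMS:
        return "qualified"
    return "irrelevant"

def traffic_breakdown(rows):
    """Sum clicks per bucket from (query, clicks) rows, e.g. a Search Console export."""
    totals = {"branded": 0, "qualified": 0, "irrelevant": 0}
    for query, clicks in rows:
        totals[classify_query(query)] += clicks
    return totals
```

The point of the sketch: a query like “can cats have ADD” lands in the irrelevant bucket even though it contains a service keyword, which is exactly the kind of breakdown that exposes a misleading traffic trend.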
Danny Gavin Host
14:08
Yeah, Google sees that the traffic coming from this is irrelevant. And like you said, if the article about cats having ADD is itself already starting to drop over time, how much more so the articles that are actually relevant but are hidden amongst all the junk? Of course, they’re going to be affected as well. Let’s pivot over to EEAT. Since EEAT is essential for all content, why is it critical for clients like Client X, who specifically deal with the medical industry?
Mandy Politi Guest
14:42
Basically, when Google wrote this update, they wanted to target specific websites that fall into the Your Money or Your Life category: websites whose content can impact the life of the users, their well-being and welfare in general, or their finances. For example, this is anything related to medical, financial, or legal services. These specific websites are monitored, let’s say, even more closely by Google. Google puts more weight on reviewing them and making sure that the websites that perform on the first page align with the guidelines on establishing your expertise, trustworthiness, and experience, because these websites can have a real-life impact on users. Especially when you’re operating in the medical vertical, you need to be even more careful about that. You cannot afford to signal to Google that your website might be spammy, that it might have thin content, that it’s not optimized, that it’s a little bit black hat, let’s say. For this specific client, of course, this had an impact, because they fall into the category of Your Money or Your Life websites.
Danny Gavin Host
15:43
Right. So if you have content that’s not optimized, or the wrong type of content, you actually feel the effect in a way that’s much worse than, let’s say, a site about baseball cards.
Mandy Politi Guest
15:55
Exactly, and it’s generally a bit more challenging to establish an online presence for these websites in the ranking results, precisely because Google is a little more strict about whether you are 100% following the guidelines. Are you doing all that you can to establish your expertise in the results? So, it’s already a little bit difficult to position these websites on the first page of results. Imagine now if you’re actively doing something that violates the guidelines. It gets a bit more tricky to manage performance, and even to optimize later to undo all of this bad strategy.
Danny Gavin Host
16:25
So, with a lot of AI content, we discussed how often multiple articles are written on the same subject. In SEO terms, that means duplicate content. Mandy, is that really that bad? Is duplicate content harmful?
Mandy Politi Guest
16:37
Yes, because basically you are telling Google, “I have five pages that target the same thing, and I can’t decide which one I want to keep.” Then Google doesn’t know which one to choose, and it ends up that none of them performs. Again, this is what happened to this client. You have page competition, you dilute the SEO value, and you perform poorly. Google doesn’t know what to rank from your content, so it just decides not to rank anything. And it’s something that you can easily resolve. It’s a simple SEO analysis and a content audit: identifying the duplication and creating one page that better targets the content in a way that will be helpful for the user.
17:22
For this client, they created a lot of pages for a specific issue; let’s say, children with ADD having difficulty concentrating. They ended up with five pages that were very similar. The headlines were duplicates. The content was overlapping. The questions they were trying to answer and the subtopics were the same. They were not optimized, and if you check, you will see that none of these pages was performing. They would get the same few clicks and impressions, indicating that Google couldn’t decide what to rank for the specific keyword, so none performed.
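One way to surface the kind of title overlap Mandy found is a simple word-overlap (Jaccard) comparison across page titles. This is a minimal illustrative sketch, not Optidge’s actual audit tooling; the example titles and the 0.5 threshold are assumptions for demonstration.

```python
def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two titles (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def find_cannibal_pairs(titles, threshold=0.5):
    """Return title pairs whose word overlap suggests they target the same topic."""
    pairs = []
    for i in range(len(titles)):
        for j in range(i + 1, len(titles)):
            if jaccard(titles[i], titles[j]) >= threshold:
                pairs.append((titles[i], titles[j]))
    return pairs
```

Run against a site’s exported page titles, a check like this flags clusters of posts competing for the same keyword so they can be consolidated into one stronger page.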
Danny Gavin Host
17:58
Going a little bit deeper into some of the titles: why do you need a blog post about concentration in ADD, and then another one about managing concentration in children with ADD, and another about concentration difficulty symptoms and treatment? All three of those basically cover the same topic. There’s no reason to have three separate posts. If you’re not keeping track, it gets messy and troublesome. Not only did we find that the headlines were very similar, but we also found that many of those articles’ subheadings were very similar. So, Mandy, why do subheadings matter?
Mandy Politi Guest
18:29
First, they are a very important part of on-page SEO optimization. Secondly, if used correctly, you can target keywords and help Google and other search engines understand your content. Thirdly, they are perfect for users, especially in the digital era, who like to scan through content quickly using headers, headings, and subheadings. They help the user quickly understand whether the content is relevant, whether it covers everything they want, and how it’s organized, and then they decide to stay on the page and read the paragraphs. If you have a good subheading structure, optimization, and strategy, you can tell what the blog post is about just by looking at the heading structure. You shouldn’t even need to read the paragraphs.
19:12
Now, imagine having five pages with the same structure. Google gets confused. Then you also have subheadings that, again, try to target every possible topic. It doesn’t even make sense to have such a long blog post; for example, no one will read something that is more than 5,000 words. So subheadings are very important for SEO and for users. They must be optimized, and you must be mindful of what you create and how the subheadings differ across pages.
Danny Gavin Host
19:39
Yeah, and in addition to that, I would also say there’s the concept of content cannibalization. I would explain it like this: if you have five articles on the same topic, imagine you just had one article and it got 100 visits. It’s not that, now that you have five, those 100 visits simply get spread out, 20 visits to each article, the same 100 visits in total. No! When you have five articles competing for the same keywords and topic, it affects everything. Often, you’re going to show less, or not at all, because Google just doesn’t know what to do, and therefore, instead of getting those 100 visits across five articles, you may now get only 40 or 30; it really depends. But that’s what cannibalization does. It spreads out the visits, takes them away, and can destroy the effect you’re trying to accomplish.
Danny Gavin Host
21:16
We mentioned that on-page SEO is really important. For some people this is obvious: obviously you need title tags, obviously you need certain tags. But we saw with this client that a lot of on-page SEO was missing. Do you want to get into a little more detail about what happened in this specific case with the on-page factors, and what’s important when it comes to on-page?
Mandy Politi Guest
21:38
So, of course, you want to have optimized meta titles and meta descriptions. In this case, we found that meta titles were basically duplicates of the H1s or other headings. The meta descriptions were either too short or too long. They didn’t include the keyword. They didn’t include a call to action. Again, meta descriptions are not a direct ranking factor, let’s say, but they still have a role to play and must be optimized. Then we noticed many pages didn’t have an H1 tag, which is fundamental SEO. It’s one of the first rules you ever learn when you start working in the industry: every page needs a unique H1, optimized and not spammy. It helps search engines understand the page, and it helps the users as well. A lot of images didn’t have alt text.
22:22
We had a lot of pages linking to external content, and that’s an issue, because links do help a lot. In my opinion, they help tremendously when it comes to optimizing content, re-optimizing content, and indexing new content. But one, they didn’t link to existing internal blog content. So we had blog posts talking about a topic, and instead of linking internally, the SEO expert who worked on them decided to link to an external website, effectively driving users away from the client’s website. And two, they were linking to external pages using the keywords they wanted the page they were optimizing to rank for, which is bad practice. You don’t want to tell Google that there is another external page that maybe even explains or develops the topic better than the one you are trying to get Google to index and rank you for. It was really astonishing for me to find, because this is basic SEO.
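Checks like the ones Mandy describes, missing H1s and images without alt text, can be approximated in a few lines of Python with the standard library’s HTML parser. This is a rough sketch for illustration, not the audit process actually used on Client X; the sample HTML in the usage is invented.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Minimal on-page check: counts H1 tags and images lacking alt text."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img":
            attr_map = dict(attrs)
            # Empty or absent alt attributes both count as missing.
            if not attr_map.get("alt"):
                self.images_missing_alt += 1

def audit(html: str) -> dict:
    """Feed a page's HTML through the parser and summarize the findings."""
    p = OnPageAudit()
    p.feed(html)
    return {
        "missing_h1": p.h1_count == 0,
        "multiple_h1": p.h1_count > 1,
        "images_missing_alt": p.images_missing_alt,
    }
```

A crawler would run `audit` over every page and flag the ones violating these basic rules, exactly the class of issues that turned up at scale in this case study.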
Danny Gavin Host
23:17
The example I would like to give is to imagine if you’re McDonald’s, and when someone walks in the door, there’s a big sign that says, “Ah, the best burgers are at Burger King. Don’t eat here.” That’s crazy. Do you think McDonald’s would actually do that? But a lot of people do exactly that in their content. They take those keywords whose home base should ideally be on this page, but then they link them out and tell Google, hey, no, that other site is really the place for this, and it just confuses everyone. Should I eat at McDonald’s or Burger King?
Mandy Politi Guest
23:40
On top of that, they also had poor anchor text optimization. The links, even internal links, didn’t use anchor text the correct way. You’re going to link differently to a blog post than you would to your main service pages, for example, if it’s a service that you want to promote. They didn’t have that. There was no CTA and no mention of why you should choose this specific client or the services they offer.
24:15
So it’s not only that you see content at scale that overlaps, creates duplication, targets irrelevant keywords, and all of that. This is basic SEO. When you start in the industry, the first thing you learn is how to optimize a basic page, but many people don’t really pay attention to that. They don’t think these things matter. They try to implement all sorts of crazy, weird strategies they’ve seen somewhere. At the end of the day, what separates bad content from good content is this: do you even follow these basic ten rules, which are literally the first thing someone learns? When I analyzed this website, it hurt me to see that someone who was supposed to be a search person would do that: publish a piece of content that was not optimized for anything, and link to external pages for keywords we wanted this website to rank for.
25:06
So, again, it’s not about the quantity. A lot of people can generate content and start publishing. It’s better to have five pieces of well-optimized content that drive less but more qualified traffic than 10,000 pages in this state. Over time, that will help you, and it did help this client, as we will see.
Danny Gavin Host
25:24
Awesome. Thank you for that overview. So we know that the main issue, or the issue that raised the flag, was that the new content being produced wasn’t being indexed. So, let’s talk a little about indexing and canonicalization. I know that’s a hard word for some, so let’s first explain the purpose of a canonical tag on a page of a website.
Mandy Politi Guest
25:41
It’s a soft hint. You tell Google that you have a page similar to another page, and you would prefer the other page to rank, not the non-canonical version. Canonicals are used a lot on e-commerce websites. For example, if you have a product, a t-shirt, and then you have variations of this t-shirt in different colors, let’s say black, red, white, and blue, you don’t want all of these different variation pages to start competing in the same results. So basically, you tell Google: okay, I have a blue t-shirt, a red t-shirt, a white t-shirt, and so on, but the main one I want to perform is the main t-shirt page. So, you implement canonicals to help resolve a bit of the cannibalization and competition that comes with having similar pages. This makes sense in e-commerce, but it doesn’t make sense on other websites like this one.
26:37
Another way is to set redirects if you have a duplicate page. In this client’s case, there was a technical issue generating variant pages of the same blog post, and canonicals were set instead of redirects to resolve the issue. So basically, you are telling Google, I have two pages, and I prefer you to rank this one.
26:57
But remember, a canonical is a soft hint. Sometimes, Google doesn’t really follow the hint; they just ignore it. I’ve often encountered cases where I’ve set canonicals, and Google would prefer, for whatever reason, to rank the non-canonical version. So, in this case, it was also a matter of poor technical hygiene. You had a problem that could quickly be resolved by setting a redirect. But a canonical was chosen instead, and this impacted performance further, because Google couldn’t really decide, and the site would end up ranking with the variant pages. Now, the URL does impact performance, and you don’t want users, especially on a medical website, to end up seeing a weird URL, because URLs with weird characters can signal a spammy, non-legitimate website, and again, that’s another negative signal if you operate in the medical industry.
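The difference Mandy describes can be sketched in code. A canonical tag is only a hint left on the variant page, while a redirect rule actively collapses every variant URL into one clean URL. Below is a minimal, hypothetical sketch of such a rule: it strips the query string (the kind of parameter that generated the variant pages) so all variants resolve to the single blog-post URL a 301 redirect would point to. The URLs are made up for illustration.

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_target(url: str) -> str:
    """Return the clean URL a 301 redirect should point to.

    A hypothetical rule for the variant-page issue: drop the query
    string (tracking or pagination parameters) and any trailing slash
    so every variant collapses to one canonical blog-post URL.
    """
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    # Unlike a rel="canonical" hint, a redirect leaves Google no choice:
    # the variant URL simply no longer serves its own page.
    return urlunsplit((scheme, netloc, path.rstrip("/") or "/", "", ""))

# Example: a parameter variant of a blog post collapses to the clean URL.
variant = "https://example.com/blog/add-therapy/?utm_source=x&page=2"
print(redirect_target(variant))  # https://example.com/blog/add-therapy
```

In practice this logic would live in the server or platform redirect configuration, not application code; the sketch just shows why a redirect is decisive where a canonical is only advisory.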
Danny Gavin Host
27:50
I think it’s worth saying that the client’s platform for their website was Webflow. WordPress, which we highly recommend, or Wix, for example, are two platforms that we think are really good from an SEO perspective. I know they can also have their problems, but it seems like with Webflow, these parameter pages were being generated automatically, causing many issues that maybe wouldn’t have existed if a different platform had been used. And yes, there are canonical tags that you can put there to try to solve the issue, but it just makes things messy and even more difficult. So, as we discussed, Mandy, they ended up with a lot of non-indexed pages, including a slew of location pages. Do you want to tell us more about that?
Mandy Politi Guest
28:30
So, locations were very important to this client. However, looking at the data, we found that a lot of the location pages were not indexed. Also, over time, the general performance of all the location pages increased slightly and then dropped dramatically. And when we looked into that early good performance, we found that almost all of it was branded. If we were to remove the brand’s name, we would end up with one non-branded term that didn’t even include the location. It was an irrelevant term that brought one click over a year and then disappeared. So, yeah, that was a big hit because, as I said, if you offer this type of service, like this client, location pages are very, very important.
29:19
So we started looking at why and at what they did, and basically, the location pages were duplicate content, the exact same content with just the location changed. There was no interlinking, and there was no structure to organize them in a way that made sense, into, let’s say, regions, sub-regions, etc. And, as I said, most importantly, everything was duplicated and AI-generated. Now, with location pages, you will inevitably have some overlapping content because, of course, if you’re operating in, let’s say, all the states across different regions, it’s impossible to generate 100% unique pages. But there are ways to tackle this. Try paraphrasing, mixing up the structure a bit, and adding unique local content that will resonate with the specific location you want to target. This is a very good way to deal with the problem of duplication. And then, of course, there are the basics, like we said: optimizing the metadata, making sure you have a call to action, and including your services, which is very, very important.
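The approach Mandy describes, a shared template plus a block of genuinely local detail per page, can be sketched as a tiny script. Everything here is hypothetical (the cities, the copy, the similarity threshold); the point is that a quick word-overlap check confirms the pages are no longer near-duplicates once unique local content is added.

```python
# Hypothetical shared template plus unique local detail per location.
TEMPLATE = "We provide ADD therapy in {city}. {local_detail} Book a consultation today."

LOCAL_DETAILS = {
    "Houston": "Our Houston clinic sits near the Texas Medical Center and serves Harris County families.",
    "Austin": "Austin patients can reach us just off MoPac, with evening hours for working parents.",
}

def build_page(city: str) -> str:
    """Fill the shared template with one location's unique detail."""
    return TEMPLATE.format(city=city, local_detail=LOCAL_DETAILS[city])

def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity: 1.0 means identical word sets."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

pages = {city: build_page(city) for city in LOCAL_DETAILS}
overlap = jaccard(pages["Houston"], pages["Austin"])
print(f"overlap: {overlap:.2f}")  # well below 1.0 thanks to the unique local blocks
```

With only the city name swapped, the overlap would sit near 1.0, exactly the duplication pattern found on the client’s site; the unique local block is what pulls it down.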
30:18
You cannot have a page without saying, I am this provider of medical services and this is what I can do, which again was the case for this client. It worked until Google caught up, and you started seeing performance drop. But looking at it again from the expert perspective, digging into the data, we figured out that it actually never worked in the first place, because every term that drove traffic to these location pages was branded. It wasn’t non-branded, which is the whole point of optimizing with an SEO strategy.
Danny Gavin Host
30:49
So, what would you say is the best practice for using AI for location pages?
Mandy Politi Guest
30:55
I think you can use it to generate a template and then, as Google recommends, have a human review the content, paraphrase it, optimize it, and add unique information. There are ways to create prompts that add unique content to each page. You have your main template, which you then paraphrase as much as possible. Then you create specific content or prompts for each specific location, and that way, you target ultra-local terms and make sure your pages are not duplicated, which is, for example, a good way of using AI. You save time by generating a template, and then you write special local prompts that will allow you to produce specific content that is not duplicated and is highly relevant to the location.
Danny Gavin Host
31:48
Love it! And I’m going to stress that Mandy, who’s the queen of doing things manually, is not saying “no” to AI. But, once again, you need to use AI in a way that helps you, and you still have to make things unique, special, and valuable. The biggest problem is when you turn things on autopilot. You get a website like the one we are reviewing here today. We’ve just gone over all the site issues, but how did they affect the performance at a higher level?
Mandy Politi Guest
32:15
First of all, of the top 10 keywords they ranked for, only a few were relevant. If you looked at non-branded terms only, there was almost nothing. So, the top terms they were performing for were irrelevant. Second, the general performance over time decreased, and the total amount of clicks didn’t match the amount of content they had. If you have 100 pieces of content, like 100 blog posts, you expect to see a certain amount of clicks; if you have 1,000 blog posts, you expect even more if it’s healthy traffic. So, for the amount of content and the investment they’d made to create all of it, they didn’t really have any valuable, relevant, non-branded, qualified traffic, which is a shame, because you spend money to create this content, then you see that it has no return on investment, and you don’t get the right users. And then the important thing is that the most important pages, the ones we call the money pages, the service pages and the location pages, basically didn’t perform at all. Again, you run a blog strategy because you want to complement and promote these pages.
33:32
Exactly. You write a blog post. You offer useful information. You target people who are looking, who are considering, who might not even know yet that they need your services or that they want to sign up with you. If they have these problems, you then link them to your service pages. They didn’t have that. You link to your location pages. They didn’t have that either. So the main pages that were supposed to be boosted by this strategy performed almost not at all, and even when they seemed to perform for a while, it was unqualified traffic.
Danny Gavin Host
34:01
And just to give our listeners a bigger perspective on the numbers. Imagine, like we spoke about earlier, cats and ADD had over 10,000 clicks over a period of 12 months. ADD therapy, which is what they’re primarily focused on, had only 130 clicks, which is barely 1% of that. And then ADD therapy for children, which is even more specific and would be like their best phrase, had zero clicks. So you can clearly see that all this wonderful traffic they saw wasn’t really wonderful. It was a lot of traffic, once again going to the wrong queries, but in truth, if you go deeper, as we said before, and start to look at the parts, it’s like, ah, this doesn’t really add up. So, Mandy, what’s next? How would this client, Client X, fix the situation? Or someone from another company who is in a situation like this? What would they need to do to get things fixed up?
Mandy Politi Guest
35:07
So, the first thing is a complete content and technical audit. You want to have full visibility into every single page of the website. You need to have everything in one document and start analyzing it to figure out exactly what is wrong, then fix it to align with Google’s best practice guidelines. What we recommended was a content audit. What does that include? You export all the website pages and then start analyzing them, in groups and one by one, using data to drive your decisions about what to optimize and how.
35:39
A content audit is also very useful for us as search experts because it allows us to identify content gaps, duplication, cannibalization, and low-quality or poor-performing pages. From there, we can decide what we can consolidate, how to tidy up the content, and what needs to be removed because it’s irrelevant and attracts traffic like the dogs and cats having ADD.
35:58
So we recommended doing the content audit, resolving content duplication, eliminating anything irrelevant, reviewing the basic on-page SEO elements, and recommending optimizations.
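One step of the audit workflow above, exporting pages with their query-level clicks and flagging pages whose traffic is entirely branded, can be sketched as a toy script. The brand name, URLs, and numbers below are all made up for illustration; a real audit would read an export from Search Console or a similar tool.

```python
# Hypothetical audit export: one row per (page, query) pair with clicks.
BRAND = "clientx"

audit_rows = [
    {"url": "/blog/cats-and-add", "query": "do cats have add", "clicks": 10000},
    {"url": "/services/add-therapy", "query": "clientx add therapy", "clicks": 120},
    {"url": "/services/add-therapy", "query": "add therapy", "clicks": 130},
    {"url": "/locations/houston", "query": "clientx houston", "clicks": 45},
]

def flag_branded_only(rows, brand):
    """Return URLs whose every click-driving query contains the brand name."""
    non_branded = {r["url"] for r in rows if brand not in r["query"].lower()}
    all_urls = {r["url"] for r in rows}
    return sorted(all_urls - non_branded)

print(flag_branded_only(audit_rows, BRAND))  # ['/locations/houston']
```

This is exactly the pattern Mandy found on the client’s location pages: strip the brand term and almost nothing is left, which is what the flagged list surfaces at scale.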
36:11
We also recommended doing a technical audit because we found technical issues that required fixing, such as the canonicals.
36:22
Usually, when you see that a blog post is not optimized and doesn’t even follow the basic SEO guidelines, chances are that your technical SEO also needs some attention. So we always do a content audit and a technical audit together, just to make sure that Google can see the pages, to find out if there is anything else that might be blocking Google from indexing them, and to identify any errors that might actually impact performance. These two usually go hand in hand. We suggested rewriting the metadata because their site-wide metadata was just not optimal. We also recommended a rigorous internal linking strategy to improve the way Google understands the structure of the website and to guide users and search engines toward the service and location pages, helping their performance a little bit more.
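One concrete check from the internal-linking review described above is finding “orphan” money pages that no other page links to, which was the client’s situation: blog posts that never linked to the service or location pages. The link map below is hypothetical; a real audit would build it from a crawl.

```python
# Hypothetical internal-link map: page -> list of pages it links to.
link_map = {
    "/": ["/blog/cats-and-add", "/blog/add-myths"],
    "/blog/cats-and-add": [],
    "/blog/add-myths": [],
    "/services/add-therapy": [],   # money page, never linked to
    "/locations/houston": [],      # money page, never linked to
}

def orphan_pages(links: dict) -> list:
    """Pages that exist in the map but receive no internal links (homepage excluded)."""
    linked_to = {target for targets in links.values() for target in targets}
    return sorted(page for page in links if page not in linked_to and page != "/")

print(orphan_pages(link_map))  # ['/locations/houston', '/services/add-therapy']
```

Every flagged page here is a money page, which mirrors the finding in the episode: the blog content existed but never funneled link equity or users to the pages that mattered.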
Danny Gavin Host
37:16
So I know we can’t predict the future, although we try, or at least people expect us to. But if client X or anyone else chooses to move forward with the recommendations that you just listed, what do you see actually changing related to performance?
Mandy Politi Guest
37:30
You may see some drop in your traffic at first, before you have relevant traffic again. But what I have seen in my experience optimizing websites with similar issues is that, a little later, we start seeing the line go up again, and it’s for the exact keywords you want to target. It’s for the right search intent; it brings qualified traffic. You see your conversions increasing. You see that people stay on your website; they spend more time and stop clicking away. If you do all of that, you will start seeing your rankings and organic traffic increase over time. It also helps, like I said, to target the correct keywords for the correct search intent, and then it allows the right clients and customers to find your content, your services, and your website. It helps increase traffic and conversions and gives users a better experience. People who land on your website stay there because what they find is relevant. And it also helps a lot with local.
38:33
Local, again, can be a little tricky because of all that we discussed, how you can end up targeting the same things with only the location changed, but with a good local SEO strategy, you can also see improvements in local results. That’s very important because people who search specifically for local terms tend to have a higher intent to sign up for a service or buy a product. So, you might not have thousands of clicks per month, because local terms have lower search volumes by default, but they have higher search intent. It’s one thing to bring in 20 people who aren’t really interested in buying anything and are just looking around, and a different thing to bring in 20 who are already almost convinced they want to sign up for your services or buy something you sell.
Danny Gavin Host
39:17
This SEO analysis illustrates the profound impact content at scale, using AI in particular, can have on website traffic, and the consequences of poor-quality content at scale. The changes that Optidge suggested are about more than boosting traffic. They’re about driving meaningful engagement, increasing conversions, and improving the overall visibility of Client X’s essential services. What we learned today is that you really have to look behind the numbers. What are the strategies? What are you doing? Are SEO best practices being followed? Is the traffic I’m getting really what I want?
39:51
And even if it is something that I want, if I’m doing a lot of things that are against the guidelines, how long is that traffic actually going to last, and when is it going to turn a corner, like we saw here, where new content stops being indexed and pages that did work well start to drop? That’s why you have to consider the strategy when choosing an SEO company. Are you going to take the long path, which eventually leads to the short path, or will you take the short one, which seems reasonable, but when you get to the end of it, there’s a long path waiting for you? Mandy, I don’t know if you wanted to share any closing words.
Mandy Politi Guest
40:29
My general comment is that many SEO experts and clients are using AI content. Initially, with my linguistics background and the way I liked working, I was reluctant to try it. But it’s a reality now. So, as I said at the beginning, AI is not inherently bad. It’s a tool. The question is how you use this tool. Use it with a best-practice strategy, in a way that does not hurt you, because short-term results that last for a month or two on irrelevant terms just risk harming your website in the longer term. So, it comes down to making sure you follow the best practice guidelines in everything you implement.
Use AI, but use it intentionally and consciously, as a tool. You wouldn’t export a keyword list from SEMrush and just target everything. You still need to think about how to target specific terms, what kind of content to target them with, where to put them on the website, etc. So, you can’t take the human factor and the expertise out of the strategy. It’s not about feeding a few keywords into ChatGPT and creating 10,000 blog posts. It’s about what the strategy is and how you implement basic SEO, which is the essence of what we do on a daily basis.
Danny Gavin Host
41:47
I agree, and I’ll stress basic SEO and the SEO fundamentals again. If you miss those, it’s like building this beautiful house without a really good foundation.
41:58
So, yeah, I think the bottom line is this was the perfect storm of possible issues all occurring on one website. We hope that Client X takes this on board, and that all the listeners who may have tried these SEO strategies step back and think: is this the best thing long-term for your brand or website? Mandy, thank you so much for being a guest on The Digital Marketing Mentor, and thank you, listeners, for tuning in. We’ll speak with you next time.