026: Sitemaps, Search Console, and SEO Content Quality – A Case Study on What Makes Successful SEO (Office Hours)

Category: Podcast

Get ready to dive deep into a case study with a real Optidge client. Their website had more than 12,000 pages, but they were getting little organic traffic. That, coupled with a “Couldn’t Fetch” error for the sitemaps in Google Search Console, is what led Optidge down the rabbit hole that is “low-quality content” in SEO. 

An Optidge “Office Hours” Episode

Our Office Hours episodes are your go-to for details, how-to’s, and advice on specific marketing topics. Join our fellow Optidge team members, and sometimes even 1:1 teachings from Danny himself, in these shorter, marketing-focused episodes every few weeks. Get ready to get marketing!

Key Points + Topics

  • [1:25] Company ABC* (anonymized company name) came to Optidge for help with their SEO. Their website had over 12,000 pages and functioned as a search engine for social media influencers. Based on the quality and abundance of their content, they believed they should have been receiving much more organic traffic than they were. They had tried submitting their sitemap to Google Search Console but kept getting the error "Couldn't Fetch."
    • Google Search Console is a tool created by Google that every website should have set up. It shows you which pages have been indexed, whether there are issues with your site, and gives you a lot of information about how your webpages appear in the SERP, like the actual search queries for which a page appears, how many clicks and impressions it has received, and more. It's very powerful. 
    • Sitemaps are a file you provide to Google (and other search engines) that shows Google what pages you have on your site that you want them to index. This way, Google doesn’t have to manually crawl all websites. You can have multiple sitemaps and tiered/nested sitemaps. If your website is built in WordPress and you use the Yoast Plugin, your sitemap(s) will be automatically generated. 
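Since sitemaps come up throughout this episode, here is a minimal sketch of what generating one can look like, using only Python's standard library. This is an illustration, not the client's setup; the URLs are hypothetical placeholders.

```python
# Minimal sketch: build a sitemap XML string with the standard library.
# The URLs below are hypothetical placeholders for illustration.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/influencers/houston-tx/",
])
print(sitemap)
```

In practice a CMS plugin (like Yoast, as mentioned above) generates this file for you; the point is only that a sitemap is a simple, machine-readable list of the pages you want indexed.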
  • [6:00] At first glance, the issue with Company ABC’s website seemed simple – a basic, technical miscommunication in Google Search Console. However, when we began our analysis, we realized content quality might be the bigger issue. So, we began a content quality assessment to determine which pages were high quality and which were low quality. For more info on how to write high-quality content that works great for SEO, be sure to listen to Mandy’s episode.
  • [6:55] High-Quality Content vs. Low-Quality Content
    • High-quality content is beneficial for the user. It provides reliable information that helps the user learn something or utilize some knowledge. 
    • Low-quality content can be borderline “spam” content, such as scraped or copied content from another website; it’s not unique. Other times it’s thin content with little text that doesn’t provide users with much useful information.
      • Low-quality content isn’t always created to thwart Google intentionally. Sometimes it is simply the way that a website is designed and functions. Search results pages are often recommended to be blocked from indexing, as they are viewed as low-quality content pages. Regardless of the intention of the content, Google views certain things as low quality, and you have to put yourself in the shoes of the end user of your website when creating new content and new pages. 
  • [10:19] We started analyzing the website and discovered it had a low-quality content issue. As a result of the site’s function (a social media influencer search engine), many of the pages were copies of social media pages already in existence. Many of the pages were search result pages. Others were numerous account pages. One of the issues with having so much content that is low quality in Google’s eyes is that it prevents Google from finding all of your high-quality pages. All of the info on these pages is actually good for the user, as it functions as an aggregator tool. However, Google realizes that most of this information is found on other sites like social media pages; therefore, Google would rather show those pages (the original source) in SERP. 
  • [14:50] What do you do if you have low-quality content?
    • You have a couple of options.
      • You can get rid of it – this doesn’t necessarily mean permanently deleting or removing the pages from your website, though that is an option. You can also block the pages from being seen by Google via the robots.txt file or with a “noindex” tag on the page. 
      • You can improve the quality of your content. This means adding content and value to those pages by optimizing them: adding more content, and more useful content, specific to the assumed search intent of the terms likely to bring users to that page. This is certainly the more time- and effort-intensive of the two options. 
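For the "get rid of it" option, the blocking rules live in a plain robots.txt file (or in a noindex robots meta tag on the page itself). As a rough sketch, Python's standard-library robots.txt parser can show how crawlers read such rules; the `/search/` pattern below is an assumed example of blocking internal search-result pages, not the client's actual configuration.

```python
# Sketch: verify how a crawler interprets robots.txt rules, using the
# standard library. The Disallow pattern is an illustrative assumption
# (blocking dynamic search-result pages under /search/).
import urllib.robotparser

rules = [
    "User-agent: *",
    "Disallow: /search/",  # keep internal search-result pages out of crawling
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/search/?q=sports"))
print(rp.can_fetch("*", "https://example.com/influencers/houston-tx/"))
```

The noindex alternative is a tag in the page's `<head>`, typically `<meta name="robots" content="noindex">`. The practical difference: robots.txt stops crawling, while noindex lets Google crawl the page but tells it not to index it.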
  • [17:54] Po-tay-to or po-tah-to, does it really matter? Keyword research says the words you use to refer to your services, products, and offerings DO matter. And they should match what people are searching for, not necessarily what YOU call them internally. Company ABC didn’t have content featuring the terms users were searching for. They needed to see what phrases and verbiage people were using, and what terms their competitors were targeting and ranking well for. Since this company had not done keyword research before, the only page ranking well was the homepage, and it was bringing in traffic primarily from branded search terms (think someone searching “Company ABC search site” in Google because they didn’t know the actual site). Finally, in addition to using keyword research to improve the content on the pages, you also want to interlink pages on related topics. Not only does this help site performance, but it’s also truly helpful to users. 
  • [21:30] Let’s try an example case. Someone searches “sports influencers in Houston TX” because they’re running an advertising campaign and want to hire a couple of influencers to promote their new energy drink. Company ABC’s website HAS the information they’re looking for, but with 12k+ pages, how would Google even find those pages? From the keyword research, you know this term gets around 1,000 searches per month. So, you build a static page (not one dynamically generated from a search result). This page is focused on this content and includes local influencers and location-specific information. 
  • [22:48] Then you can start to think about your navigation. Originally, there was no real site navigation/menu. It was basically just a search bar. Now, you have the content to create categories like influencer niches, locations, and more, all linked through your homepage navigation. Now Google can easily crawl, analyze, index, and rank those pages much more easily. 
  • [26:35] Even if this site did NOT have a low-quality content issue, they would still have had the “Couldn’t Fetch” error in Google Search Console. In our analysis, we checked all of their submitted sitemaps; all were accessible, and many were already indexed by Google. Despite knowing that Google could see them, they were still getting this annoying error. Through our research, and the generous sharing of experience and knowledge in the digital marketing community, we learned that submitting your sitemap in Google Search Console is mostly just a suggestion. If Google doesn’t accept it, for whatever reason, your only real choice is to be patient.
    • There are a couple of ways to improve your sitemaps. Company ABC’s sitemaps had some extra code and script present, which we recommended they remove. Additionally, there are certain best practices regarding sitemap schema that weren’t yet followed. Finally, you can try an alternate file type like .xml, .txt, or an RSS feed. If all else fails, there’s also a Google Search Console API and a pinging tool that you can utilize. 
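As a historical illustration of the pinging tool mentioned above: the ping was just an HTTP GET request with the sitemap location passed as a query parameter. Note that Google has since deprecated this ping endpoint, so treat this sketch as background; the Search Console UI or API is the supported route today.

```python
# Sketch: construct the (now-deprecated) Google sitemap "ping" URL.
# Shown for illustration only; do not rely on this endpoint today.
from urllib.parse import urlencode

def ping_url(sitemap_url):
    """Return the URL that historically notified Google of a sitemap."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(ping_url("https://example.com/sitemap.xml"))
```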

Guest + Episode Links

Full Episode Transcript

Danny Gavin    00:05 

Hello everyone, I’m Danny Gavin, founder of Optidge, marketing professor, and the host of The Digital Marketing Mentor. Today we have a very special episode where we’re going to take a deep dive into a real-life case study where a company came to Optidge with an SEO problem. And at first glance it was, you know, a simple technical issue. But when you dig a little deeper, there was a lot more to be found. So I’m really excited that Mandy Politi is with me here today. She’s a senior SEO strategist at Optidge, and Mandy’s first episode on The Digital Marketing Mentor was a huge hit. So it’s really awesome to have her back here to discuss this special case study with me. How are you, Mandy?


Mandy Politi    01:06 

Good, Danny, thanks a lot for having me again on the podcast. I’m really excited.


Danny Gavin    01:11 

Yeah, this is going to be great. So we had a company, and due to privacy, we’re not going to give the name of the company during the episode, but we’re going to talk a lot about it and you’ll still get a good idea of who they are and what they do. The company came to Optidge with a website of about 12,000 pages, and in general, this website that they have is like a search engine, the ability to search social posts and different creators and different things like that. So it’s a website filled with content, a lot of really good content. So naturally, at first glance, when you have a website that has a lot of content, it’s like, oh, Google’s going to love this. It is a very rich website, and honestly, Google should index all the pages. We should be found on multiple SERPs and we should be getting a lot of traffic. And obviously that wasn’t the case, and they looked around at a lot of different companies to see, you know, how they could help them. In particular, there was a technical SEO issue with Search Console. So Google Search Console is a free tool that Google has that allows you to see the back end of your website from a search engine perspective. Some of the things it tells you are technical errors, and also search query data, which means what actual phrases you show up for in Google, both the ones that bring you clicks and the ones you just show up for, which would be impressions, and what your average position is. There’s a lot of other neat little things inside Search Console, and since it’s a free tool, there’s really no reason why anyone shouldn’t just connect it to their website and get access to that data. So one of the areas of Search Console is submitting a sitemap. For those who don’t know what a sitemap is, it’s usually an XML file, which is a type of file. The bottom line, what it is, is a list of web pages, all the pages that are on your website. So instead of Google, which is naturally a crawler, coming in and 
going through your website, starting at the home page and going through a link from one click to the next, and figuring out all the pages on the website, here you provide them with one file, which could lead to other files as well. You could have multiple sitemaps, and we’ll talk about that in this case; that’s actually the situation with this client. Google then has the ability to see all of the pages that are on the website all in one spot, without having to uncover them manually by crawling the website. So this client had quite a few sitemaps. Most of them were indexes, which means one sitemap which led to a bunch of others. So for example, let’s say they had pages of people starting with the letter A for their name. You know, so imagine that there’s going to be a link that says, OK, these are all the pages from A to M, and then these are all the pages from N to Z. And when you clicked on that, then you would go to the other sitemap. So imagine you have like a master sitemap and then smaller sitemaps which break it up. And part of the reason is that it allows you to keep things more organized and allows you to update the sitemaps. But also, there are limitations with sitemaps: you cannot have more than 50,000 links on one sitemap, and it’s crazy to think that would even happen. But there are many reasons why you’d want to break out your sitemaps. A cool thing: on WordPress websites, if you use the Yoast plugin, Yoast will automatically generate your sitemap and will break it down into like a video sitemap and an image sitemap. So there’s a lot of cool things with sitemaps. But getting back to the case at hand: when submitting their sitemap to Google Search Console, they got an error, and that error is “Couldn’t Fetch.” And if you look around, with a “Couldn’t Fetch” error, there are a lot of reasons for that, right? 
It could be that Google just hasn’t had time, or there’s a problem with your sitemap; there are a lot of potential places where the issue could come from. But the bottom line is the client came to us because they thought that their site had a lot of content and they were not getting the amount of organic traffic that they expected. And they felt that one of the reasons for this was that inside Google Search Console, for some reason, Google could not fetch those sitemaps. So imagine we’re telling Google, hey, here are all these pages, check them out. But Google’s saying, sorry, I can’t fetch them, I can’t read them. So in their mind, they’re like, ooh, we’re not getting traffic because Google can’t fetch our sitemaps. That’s the problem here. How can we go ahead and fix this issue so that Google can now read our sitemaps? And hopefully our content will now actually rank and we’ll get all that traffic which we should be getting. At first glance, it’s a pretty, you know, simple issue, right? It’s a Search Console issue, a technical issue, that for some reason Google can’t read those sitemaps. At the end of our talk today, we will talk about solutions for this problem. However, in the initial analysis, we saw that content quality might be the bigger issue here rather than the sitemap. So therefore we decided, yes, we need to come up with a solution for the sitemap issue, but we need to go a little bit deeper and we need to assess which pages on the website are high quality or low quality. Like, what’s the situation here? Maybe the reason that Google doesn’t like the content on this website is because of the actual content that’s there. And just in general, for more info on how we approach SEO content, be sure to listen to our SEO content episode with Mandy, where we go a lot deeper into how one can optimize content for SEO. So now, Mandy, I’m going to turn it over to you. 
Before we dive in and say what the actual issue on this website was, let’s first talk about the difference between high-quality content and low-quality content.


Mandy Politi    07:01 

If you think about what high-quality content is, it’s basically content that is really useful for the user. It provides reliable information, and it’s there to really help the user learn something or do something, versus just ranking or providing information for the search engines. So at the end of the day, high-quality content is content that is really helpful and provides a good experience to the user who lands on the page and reads it. Now, when it comes to low-quality content, we basically have content that can even be considered borderline spammy. Anything that’s, for example, scraped content, like copied content from another website that’s not really unique. Thin content, for example, thin pages on websites that only include images or just a little bit of text. These pages by default don’t really provide any useful information, so users most likely are going to land there and then just exit to find something more relevant to what they’re looking for. And then we also have search results pages, for example, pages that shouldn’t even be indexed, that should be blocked in the first place, because they are internal pages that shouldn’t be there. This is a common problem with ecommerce websites, where sometimes the clients don’t really block these pages, and then you end up with a lot of pages that are just search results. Users find them in the search results, they click, they go, and it’s just empty pages. This is basically the definition of a low-quality page.


Danny Gavin    08:38 

And Mandy, the crazy thing about low-quality content is, like, by definition it’s like, ooh, you put it there to try to trick Google or to easily rank without doing any hard work. But in truth, sometimes you don’t do it in a bad way. It’s just the way that your site is, the way that you’ve set up your content. The content is considered low quality by Google, and therefore Google’s not going to like it. And that really was the issue that we had here with this client. A lot of the content, right, even if they had 12,000 pages, a lot of the content in our eyes, we really felt that Google would likely consider low-quality content. And you think about it, you know, the foundation of a website where you’re using it as a search engine to help find creators on social media and different things like that, a lot of that content, right, could be low quality or thin. So let’s talk a little bit about some of the things that we actually found.


Mandy Politi    09:37 

I think it’s also a very good point, what you just said, Danny, that sometimes clients don’t create low-quality content just to manipulate the results or because they want to rank by creating a lot of pages. Sometimes it just happens. You don’t really think, OK, probably I need to add more content. You just create the page you think is useful. But this is why it’s really important to actually put yourself in the shoes of the user and think, OK, if I land on this page, am I going to learn what I came here to learn? Like, am I going to find useful information? So it’s really an interesting point that sometimes it just happens. It’s not always because you are trying to manipulate the search engines to make your website rank better. Going back to the case study, I think that’s a really important point, that we started analyzing the content itself to identify what’s going on there before we went to the sitemap. And what we found is that basically this website indeed had a low-quality content issue. They had a lot of copied content from the company’s social media accounts, they had a lot of search results pages that were empty of content, and they also had a lot of account pages. Now, the account pages, for example, are not an issue in themselves. But if you think about how Google would look at them, they’re empty pages. They don’t have a lot of content, so they wouldn’t really provide the user the information they would be there to find. So the problem now is that when you have a lot of low-quality content, you are basically preventing Google from finding your good content, your high-quality content, and that’s an issue. So when you find that this is the problem, you need to sit down and think, OK, I have all these pages. Do I want Google to actually see all these pages? Is there a reason why I have them there? 
And then, what if I remove them or try to improve them, so that I help the actual high-quality pages be found and be discovered, not only by Google or the search engines, but also by the users? It’s going to be easier for a user to navigate a smaller, more targeted website than a website that has 12,000 pages but cannot even include all of them in the navigation, you know?


Danny Gavin    11:41 

Yeah, and so the crazy thing here, and I want to clarify, is that all this information is actually, in this case, good for the users, right? I’m going onto a site and I’m searching for an influencer, and then I want to see the information about him. I want to see his social media posts. That’s all great. And that’s the function this website serves. It’s an aggregator tool. It brings everything together. But the problem is that in Google’s eyes, it’s great that you’re aggregating this information, but this information is found in other places, right? That information is already on Instagram. That information is already on Facebook. Or the search results pages, right? You’ve got these search results pages, and a lot of the results are similar, or there aren’t really enough attributes on the page to make them unique. So the search result page is still helpful for the user, because, you know, I’m searching for this influencer in this area, and then I get a list of all those people. That’s helpful to me. But in the way that Google analyzes websites, you know, they just don’t want you to give them 10,000 search results pages and suddenly they’re going to index all of them. That’s actually a problem. So it’s funny here: the whole product, you know, that they created is helpful for the user, but just because that content’s helpful for the user doesn’t mean that Google likes it. And therefore it was like the total opposite. It’s like, the fact that you’ve set up your thing this way, Google’s not gonna like it. Exactly, and also, if you really think about it from the Google perspective, you have a page that is a duplicate of, for example, let’s say, Instagram. So why would Google prefer to show the website’s result and not Instagram directly? Probably because it identifies Instagram as the right result for the specific user who types the specific query. So again, 
it is useful, but at the same time, is it really that useful? Because perhaps the user actually wanted to find the Instagram page instead. So again, you need to consider how to create something that’s going to be good for Google, to pass, let’s say, the assessment of “this is good, useful content,” but also to make sure that you’re putting there the page that is exactly what the user is looking for with this specific query.


Danny Gavin    13:59 

I love that example, Mandy. Because remember, what we’re talking about here is, you know, they’re looking for people to find them through Google Search. But when someone is typing, you know, they’re looking for the content of, let’s say, a specific influencer within Google Search, right? They’re most probably looking to go to the actual Instagram page or something else like that. It’s different, right? So that’s the intent of the person coming to Google. So therefore Google wants to give them the actual source of the information, while this actual website, it’s a really good website, but it has a different function. And therefore, we’re going to start talking about part of the solutions: maybe they should have created different content in order to rank in Google, right? If you wanted to actually rank in Google, what are we going to do? So we’ll talk about keyword research in a little bit. But before we do, I think now’s a great time to talk about: OK, if you have low-quality content on your website, and let’s take the positive view, not because you did it on purpose to try to trick Google, but just because you’ve got pages that are thin, or copied, different things like that, how can we deal with that? And I believe there are two ways of dealing with it. One is to get rid of it. The other is to add content or value. So Mandy, why don’t you go ahead and explain that to us?


Mandy Politi    15:08 

So like you said, Danny, there are two solutions. The first solution is: get rid of it. You don’t necessarily need to delete or remove it completely from the website, but what you can do is block it. You can either block it through the robots.txt file or you can use a noindex or nofollow tag. This is a good solution because you don’t need to remove content that you probably want to keep, or that you think is useful for your audience. At the same time, it helps you get a little bit of control over this 12,000-page situation and focus on the pages that are actually bringing traffic or are very useful and unique to the user, and by extension to the search engines.


Danny Gavin    15:46 

And blocking through the robots.txt or adding a robots meta tag is a very common thing, especially in the case of this client, where they need all those pages, right? They don’t want to get rid of them, but they want to block them from Google and say, hey Google, we have these pages, but you don’t need to see them, because the goal for these pages is not for people to find them through Google, but rather for people to find them through our website. Once they’re at our website, we want people to see these pages. And we see this with ecommerce as well, right? The best practice is, on the search results page, you know, every time you type in a query and a search result page pops up, ideally you want to block those. And once again, you don’t get rid of them, you don’t get rid of the search engine on your website, but you block the pages so that when Google finds them, you tell Google, hey, it exists, but please don’t take a look at it, because it’s not required. And it’s a really fast and efficient solution, to be honest. So the second solution now is to review these pages and optimize them, improve them, add more content, and by extension add more value to them. Now, this requires a little bit more work, because you actually have to go through all these pages, review them, and see what kind of unique content and useful information you can add. And there are also location-based pages, which in the case of this client was also an issue. They were using dynamically created pages where the location would change, which again would create a lot of pages with thin content, pages where you would have the title with the offering plus the location. And in this case, for example, you can keep the location pages, because sometimes this is how users are searching. They’re looking for results near them. 
But try to add unique, location-focused information to create a page that’s going to be relevant to the location that you want to target, and this is a good way of fixing the problem. It will take a little more time, but you will have to actually go and do it. Which takes us to the next step, the keyword research: you need to first find what the users are looking for and try to either optimize your existing pages according to that, or create new pages that target these terms in a good and useful way. Exactly, so just to introduce the whole keyword research concept: why would a website like this need to do keyword research? Because even though they had 12,000 pages, they were getting the same amount of organic traffic as some of our clients with fewer than 200 pages. So, 12,000 pages versus 200 pages. What does that mean? It’s because the majority of the content on their website doesn’t match what people are searching for on Google. Now, of course, the information on the website is very valuable to people, but people aren’t necessarily searching for it in Google, right? So if you want to generate traffic from Google to your website, you need to find those things that people are actually searching for and that will provide value, right? You know, like we said before, to post the same exact page that’s on Instagram, that’s not going to provide value. Instagram can do that better. But if you can provide a piece of content, a landing page, an article that is going to provide value to that user and match what they’re actually looking for, then that’s magic. And the more of those pages that you can create, the better. Let’s talk a little bit further, specifically in this case study: why was it so important for them to do keyword research?


Mandy Politi    19:02 

They didn’t have content that was matching what the users were looking for. Or they might have had this content, but they didn’t actually target the right terms; they didn’t actually have the keywords within the page. So it was a really important recommendation from us to the client that they need to actually do this. They need to conduct keyword research. They need to find what the users are looking for that’s relevant to this client’s offering. They would also check what competitors were targeting, what was ranking, and what was appearing on the first page of Google for these terms. Using all these insights, they would then have a nice list of keywords and terms that would match the right search intent of the user and would help bring the right traffic in. So that was the basic recommendation. But because they hadn’t done that up to this point, the website was basically ranking only for branded terms, and only with the home page, despite the fact that they had 12,000 pages. So they didn’t really perform for relevant non-branded terms that users were using to find services and products similar to what they were offering. So it was very important for us to recommend to the client that they need to perform keyword research and either optimize existing pages or build new pages that target these terms well, in the usual on-page SEO ways. And part of this recommendation was also to make sure that you interlink these new pages with optimized anchor text, to make sure that you guide the users and the search engines to the correct pages. At the same time, you help the discoverability of other relevant pages, because that’s really important too. It’s really important for search engines to be able to go through your website and find and discover other relevant pages. But it’s also really useful for the users, because sometimes users are looking for something, but they are also interested in finding other relevant 
pages, similar content, so it’s really important. So it’s not really just about doing the keyword research. It also boils down to actually implementing it well, either by optimizing existing content that you have identified as content that can potentially become high quality, like for example the location-based pages, or by creating new pages. It can be landing pages, it can be blog posts, anything that you identify during your research that can target these terms in a way that is useful for the user, and interlinked so that it can also help search engines and users find your relevant content.


Danny Gavin    21:26 

So to break it down, like with an actual example, if let’s say someone on Google is looking for sports influencers in Houston, texas and why do they want to look for sports influences in Texas Cuz they’re running a campaign. And they want to be able to go ahead and, you know, hire a couple sports influencers to promote their new energy drink. So, you know, that makes sense. That phrase is something that someone would type into Google and actually search and look for someone. So it could be that this website has those results, right? Right because you could find sports employees in Houston. But remember, they’ve got 12.000 thousand results pages or even more. So how’s Google going to even find that? So what we’re saying is, okay, you did the research. You see that’s something that people search for. There’s a lot of volume. Let’s imagine, you know, a thousand people are searching for it a month. So now you want to come to your website and actually build a static page, right? An exact page that says sports influencers in Texas or in Houston and you know, show a couple of the top sports influencers and why these guys are the best. And you know, potentially bring some other examples or related people to that and so on and so forth and build out this really good page. Now that you have this really good page, now you can start thinking Okay, now maybe in our navigation, right until now we didn’t, we really didn’t have a navigation, it was just a search bar. But now we can talk about Okay. We’ve got different categories, we’ve got influencers, we’ve got photographers and then we’ve got different areas, Houston, so on and so forth. And in that navigation you say Okay influencers, click on that. Now you have a page with all the links. Like, OK, we got influencers in Texas, influencers in New York, and then we click on the Texas one and then it takes us to this page. But imagine we’re now having like this really organized structure and Google understands what’s going on. 
There’s some organization and rules to what’s happening. And now we’ve got this wonderful page that’s integrated into an understandable site architecture that Google can index. So when a person actually searches for something like that, they’ll be able to find it. Once again, that content, the best influencers in Houston, Texas, is on the website, but not in a way that’s Google-friendly, not in a way that Google can digest it and show it to the person who’s searching for it. And by doing something like this, if you want organic traffic to this website, then you need to start building content out like this, starting with the keyword research. Exactly. And another very important point is that this example is a perfect example of a quick-win opportunity. Like we said, this specific client got most of their traffic on the homepage. So identify the best categories for them to target, in terms of what is most important for them as a business, but also what most users are looking for, for example in Texas, relevant to this offering. It’s a great idea to link this main category directly from the homepage, because then you’re taking advantage of the fact that the homepage is currently the page that performs best, and it’s where Google is going to start when looking at the other pages, and also where most users are most likely going to land. So it’s a really good example of identifying quick-win opportunities where, with a minimal change, adding a static page linked from the homepage, you can get a lot of results faster.
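As a rough illustration of the nested navigation described here, the category hierarchy might be marked up something like this (all URLs and labels are hypothetical, not the client’s actual pages):

```html
<!-- Hypothetical category navigation: each level links to a static,
     indexable category page instead of a search-results URL -->
<nav>
  <ul>
    <li><a href="/influencers/">Influencers</a>
      <ul>
        <li><a href="/influencers/texas/">Influencers in Texas</a>
          <ul>
            <li><a href="/influencers/texas/houston-sports/">Sports Influencers in Houston</a></li>
          </ul>
        </li>
        <li><a href="/influencers/new-york/">Influencers in New York</a></li>
      </ul>
    </li>
    <li><a href="/photographers/">Photographers</a></li>
  </ul>
</nav>
```

The point is simply that every category and location combination worth ranking for gets a crawlable, linked URL of its own.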


Danny Gavin    24:59 

And it’s funny, Mandy, I always think: let’s do an experiment. Why do we even need category pages? Let’s just have 12,000 results pages, or 12,000 product pages. And that’s exactly what this was, right? That’s what this site is, and it shows you what happens when you don’t have that structure, when you don’t give Google that idea of what’s important. Google is not just going to take 12,000 pages, figure it out on its own, and decide what it’s going to be. But by going in and actually figuring out what the categories are, what’s important and what’s not, and using things like schema markup and other types of HTML identifiers, it gives Google an idea. And when you can help Google understand what’s important, what’s not, what they should index and what they shouldn’t, then they’re going to do it. But if you don’t, here’s a perfect example where it’s not going to work too well, even with how powerful Google is.
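For instance, the schema markup Danny mentions could take the form of a JSON-LD block on a category page. This is only a sketch, assuming an `ItemList` type, with made-up names and `example.com` URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "name": "Sports Influencers in Houston, Texas",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "url": "https://example.com/influencers/jane-doe" },
    { "@type": "ListItem", "position": 2, "url": "https://example.com/influencers/john-smith" }
  ]
}
</script>
```

Markup like this tells Google explicitly that the page is a curated list of items, rather than leaving it to guess from raw search-results HTML.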


Mandy Politi    25:55 

Yeah, I couldn’t agree more, and I think in the case of this specific client, a mix of the two solutions was best: try to identify what you should set to noindex, to block Google from showing it in the search results, and at the same time identify the low-quality pages with a good opportunity to turn into high-quality pages and work on them, and also see if you need to create something new. With this, they should be able to resolve the problem.
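Concretely, setting a thin results page to noindex is usually a one-line change in that page’s `<head>` (a sketch; whether to keep `follow` depends on the internal-linking strategy):

```html
<!-- Ask search engines not to index this page, while still following its links -->
<meta name="robots" content="noindex, follow">
```

For non-HTML resources, the same directive can be sent as an `X-Robots-Tag: noindex` HTTP response header instead.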


Danny Gavin    26:22 

So obviously this discussion is more about the journey than the destination. But I’m sure all of you are wondering: okay, Danny, what’s up with the Couldn’t Fetch errors on the sitemap? So let’s talk about what our recommendations were in that arena. Once again, all things being equal, if this website had had the right content set up in the right way, this Couldn’t Fetch error might have been more of an issue. But let’s talk about how we approached it. Number one, we checked all of the sitemaps, and they seemed to be accessible, and many were actually already indexed by Google. By going into Google, you can see whether a page is actually in Google’s index. Checking the pages that exist in that file, are they indexed? And we saw that they were. So Google is seeing them, right? If the sitemap shows up in Google itself, that’s proof that Google saw it. Despite this, they were still returning Couldn’t Fetch errors, and I think that’s the annoying part. One of the beautiful things about the digital marketing community, and specifically the SEO community, is that on social media, even when you haven’t met, people are very friendly and very generous with giving advice. So I reached out to some of my colleagues and friends on social, and between them and our own research, we learned that when you submit a valid sitemap to Google through Search Console, it’s mostly just a suggestion, and if it doesn’t work right away, the main option is just to wait. What that means is you can submit a sitemap, but that’s not the key to everything. It doesn’t mean Google is going to listen to you, and Google can still see your site even without it. I understand it’s frustrating if there’s an error there that says Couldn’t Fetch, but really, it boils down to being one of those things that’s kind of out of your hands.
You’ve got to wait and hope Google will change that error message. But as we saw in this specific case, we could prove Google was already seeing everything else, right? It was seeing the pages, it was seeing the sitemap. But for some reason, in Search Console itself, Google wasn’t ready to take that error message and update it. There were some other potential ways to improve the sitemaps to better meet Google’s preferences in this case. First, we found some extra code and script at the top of some of these sitemaps, so we recommended that the client remove those. Remember, if you’re using something like Shopify or WordPress with the proper plugins, the sitemap gets generated automatically. In this case, this was a custom-built site, so naturally they were building these sitemaps on their own, and that can lead to errors and things like that. Number two, within the sitemap they were using multiple schemas. Sitemaps have their own schema, a type of language and categorization, and they were doing it in a way that didn’t follow best practices, so we recommended they remove those as well. Another option: like we said at the beginning, an XML sitemap is the traditional and most common way of providing a sitemap to Google, through that XML file. However, you can also provide Google with sitemaps using an RSS feed or a text file. So another suggestion was: hey, if you’re really gung-ho on getting these sitemaps to show up, why don’t you try an alternate way of sending them in, and maybe that will be the trick that gets Google to say, okay, we’ve read it. The final alternative is that you can submit sitemaps outside the Search Console interface: Search Console has an API, and there is a ping tool, and you can use some of these other tools to submit your sitemap. So those are basically the solutions regarding this specific problem.
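Since the client’s sitemaps were hand-built, here is a minimal sketch of generating a spec-compliant XML sitemap containing nothing but `urlset`, `url`, and `loc` elements, avoiding the kind of extra code and scripts mentioned above. The URLs are hypothetical:

```python
# Minimal XML sitemap generator (illustrative sketch, not the client's code).
# The sitemap protocol expects a <urlset> in the sitemaps.org namespace,
# containing one <url><loc>...</loc></url> entry per page.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    # Register the default namespace so elements serialize without prefixes
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc in urls:
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(["https://example.com/influencers/texas/"]))
```

The resulting file can then be submitted in Search Console or referenced from `robots.txt` with a `Sitemap:` line; the plain text-file alternative mentioned in the episode is even simpler, just one URL per line.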
But I think what we can truly see from this case study, and in general, is that we could prove Google saw everything, and the reason the website was not ranking was not because the sitemap couldn’t go through. It was because the majority of the content on the website was not Google-friendly. It wasn’t content that Google wanted or even understood how to index, and a lot of it could have been low-quality content, spammy content, not on purpose, but in Google’s eyes. And therefore, even if you’ve got 12,000 pages, they’re not going to generate the traffic you want unless you go ahead and make some significant changes. Mandy, any last thoughts?


Mandy Politi    30:54 

No, the only thing I would like to add is that this is a really good example of why, when it comes to SEO, it’s never what you think it might be in the first place. You always need to check a little deeper. You need to consider how the website is set up, the content. In this case, it’s not just the Search Console issue, it’s not just the sitemap issue; it’s not always just one specific kind of issue. It’s always the symptom of a bigger problem. So I think it’s really exciting to have this as an example, and it also provides a lot of insight into how you should approach these kinds of issues. When clients approach you with a problem, you should never just look at what you think it might be, but spend the extra time and go the extra mile to consider that perhaps it’s something else. Make this additional effort just to make sure that you have ticked all the boxes on the list of what you have to check before you actually make a recommendation. Because in this case, we figured out that it was something actually bigger than just the problem of the sitemap. I love that lesson, and thank you so much for reiterating it. That made me think: when we approached the client with our analysis and solution, it was totally left field for them, but then it really opened up their eyes, like, ooh, yeah, there is something deeper here, because we’ve been trying for so long and we thought this was the problem. When you’re looking to do a site audit, or any sort of audit, just going into a tool and printing out the technical issues isn’t necessarily going to do anything. One really has to go deep, and like we spoke about last time, Mandy, I know you obviously use tools, but you like to manually go in.
You know, go into the site and breathe it and feel it, and that’s really good, because as we see in this case, if we had just said, okay, that’s the problem, here’s the solution, we would have lost all this value of educating the client: ooh, you have to think a little bit deeper. Exactly, yeah, exactly. And also, I am a firm believer that no tool will ever tell you what the issues are with your website. You just need to go to Google itself. I am a firm believer in that, and in actually going to the website and clicking around, trying to understand how it is structured, reading the content. It’s never just downloading a report from a tool, be it Search Console or anything else. I think this is what has brought the most value to our clients: that we actually go and check everything. It’s not just an export of a report, you know.


Danny Gavin    33:24

Well, Mandy, it’s been wonderful discussing this special case study with you today. Another awesome Office Hours episode. I’m sure we’re going to do many more together. So thank you for being our guest today to help me present this case study. And thank you, listeners, for tuning in to The Digital Marketing Mentor. We’ll talk to you next time.


Subscribe Now