Clean Up the Clutter: Making Your Website More Search Engine Friendly

 

Broken links, useless directories and a large amount of thin content can drastically reduce your authority and ultimately your SERP rankings. Unfortunately, for most webmasters, clutter can be hard to detect; it’s usually not as visible as, say, a missing navigation bar or a broken image. So how can clutter reduce your rankings? Simply put, it wastes crawler resources, reduces usability for both bots and humans (something Google has been focusing on recently) and gives your site an unmaintained appearance.

Crawl Errors

Let’s begin by looking at the crawl errors in Google’s Webmaster Tools. If you’re noticing a lot of Soft 404 or Not Found errors, you’ll want to determine which of these pages should be blocked from the crawler’s view and which should be redirected. Here’s how I classify my pages with errors, with a quick example of each fix below:

They should be 301 redirected if:
  • It’s an old article that you’ve recently replaced.
  • They’re pages whose content you’ve pulled into a single, more concise page.
  • They’re pages that were useful to your visitors, such as old contact or location pages.
  • It’s an old page that has a relevant replacement and carries a good amount of high-authority links.
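On an Apache server, a 301 takes one line per URL in your .htaccess file. This is just a sketch – the paths below are made up for illustration, so swap in your own old and new URLs:

    # .htaccess – permanently redirect a replaced article to its successor
    Redirect 301 /blog/old-candle-guide/ /blog/new-candle-guide/
    # An old contact page that moved
    Redirect 301 /contact-us.html http://www.example.com/contact/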
They should be 404’d if:
  • They’re pages that don’t support your business’s main focus.
  • They’re pages that are too outdated or useless to your visitors.
  • They’re old product pages for items you no longer carry or support and have no replacements for.
They should be blocked from a crawler’s view if:
  • It’s a category or tag directory.
  • It’s an excess page type your CMS creates on its own, such as individual attachment pages for photos in WordPress.
  • It’s a plugin or similar directory, such as “wp-content/plugins/” in WordPress.
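For the WordPress cases above, the robots.txt entries might look like the sketch below. The tag and category paths assume the default permalink structure, so adjust them to match your own setup:

    User-agent: *
    Disallow: /wp-content/plugins/
    Disallow: /tag/
    Disallow: /category/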
You should introduce a canonical if:
  • You’ve created duplicate pages nested under different parent pages on purpose.
  • Product identifiers such as color or size are creating unique product page URLs.
  • Category pagination is creating duplicate content errors (rel="prev" & rel="next" also work for this).
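The canonical tag goes in the <head> of each duplicate and points at the one version you want indexed. The URLs below are hypothetical:

    <!-- On every color/size variant of a product page -->
    <link rel="canonical" href="http://www.example.com/candles/pillar-candle/" />

    <!-- Or, on page 2 of a paginated category -->
    <link rel="prev" href="http://www.example.com/candles/page/1/" />
    <link rel="next" href="http://www.example.com/candles/page/3/" />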

Crawl Errors Webmaster Tools Report

Don’t forget to check the mobile tabs as well.

Once you’ve implemented the fixes for the list of errors Google has served up, it’s time to mark them as fixed within Webmaster Tools. Marking them as fixed helps you determine whether a problem has actually been resolved or is still causing an issue with the bots. If a certain URL keeps returning an error after you’ve attempted to fix it, you may need to try a different approach, such as adding a “noindex” meta tag to the <head> of the offending page.
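The tag itself is a one-liner; “noindex, follow” asks the bots to drop the page from the index while still following its links:

    <meta name="robots" content="noindex, follow" />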

Broken Links

Once the crawl errors have been addressed, I search for broken links. Not only are these inconvenient for both bots and human visitors, they also make your site appear “broken”. An unmaintained appearance, at least to bots, can reduce your rankings – Google doesn’t want to serve up sites in the SERPs that aren’t regularly maintained or updated. So how do I find these broken links? I use tools like Raven, Screaming Frog and Broken Link Checker.

Note: Screaming Frog (free for up to 500 URLs) and Broken Link Checker are both awesome and free to use.

I’ll use Broken Link Checker for this example. To begin, enter the URL of your website into the input box and then click the “Find Broken Links” button.

Broken Link Checker

After clicking the button you’ll be redirected to the results page. Here you’ll have to wait patiently as the bots crawl through your website looking for broken links. As they begin to find them, you’ll see a list being generated at the bottom of the page similar to this one:

Broken Link Checker Results Page

As you’ll notice, this tool gives you the exact URL the problem was found on and the type of error the server returned: either a 404 or a Bad URL in this example. The fix will depend on the type of error. The Bad URL in this case can easily be fixed by adding the missing “h” to http://, and some of the 404s appear to be pointing to posts that were moved or removed – a 301 redirect may be the answer.
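If the moved posts all follow the same URL pattern, a single RedirectMatch rule in .htaccess can catch the whole batch. This sketch assumes Apache and a date-based permalink structure, which you’d replace with your own:

    # Send every post that lived under /2012/ to its new home under /blog/
    RedirectMatch 301 ^/2012/(.+)$ http://www.example.com/blog/$1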

Ninja Tip: Use Broken Link Checker to find broken external links on sites you want links from, then contact the webmaster and suggest replacement articles/pages of your own. In some cases you may need to write a completely new article to earn the link.

Thin and Non-Focus Content

Now that you’ve addressed the link situation, it’s time to weed out the thin and non-focus content. It’s really survival of the fittest out in the SERPs, so you want to make sure your website is bringing its “A” game. Before diving in, take a step back and ask yourself: “What’s the main focus of my website?” Once you’ve come up with an answer, begin auditing your website’s content – if a page doesn’t fit within that focus, you really need to think about removing it completely. Why remove excess content? The truth is you can’t rank for everything, so it’s very important to put most of your resources toward the site’s main focus.

Example: Let’s say your website is an eCommerce store that only sells designer candles. Now let’s say you’ve also got tutorial and recipe sections filled with content on how to make your own candles. Sure, it’s the same industry, but is it helping you sell more candles? Probably not. In fact, it may be hurting sales, and on top of that your product pages may be getting pushed down in the SERPs by your tutorials.

It’s also important to keep your audience in mind – people buying designer candles probably aren’t the same ones looking to make a batch of their own. If you’re running into a similar situation, it may be best to 404 these old pages, as scary as that may sound. Unfortunately, it can look suspicious if you 301 redirect hundreds of pages to a single URL such as the homepage.

Note: Not all instances of non-focus content are going to be this easy to identify; you may have to sit down and take a good hard look at your website.

I mentioned thin content above as well, so let me give you an example of that too. The most common instance of thin content (at least from what I’ve seen) happens when companies try to target multiple geo-locations. Many companies offer services in dozens of cities and subdivisions, so in order to rank well for “Carpet Cleaning in Dublin” or “Car Detailing in Detroit” they make a separate page for each location. Creating a page for each location isn’t bad in itself; it gets shady when the exact same content is copied over and only the city names are replaced. Ask yourself: “Would having 23 pages with the exact same content (minus the location names) be useful to customers looking for information on my products and services?” Usually the answer is no. So how do you fix this?

You may not like this answer, but the best fix is to create a unique page for each location highlighting characteristics and features of the area. If you own a water garden supplies store, for example, you may want to write about the types of fish and foliage that work best in each area. Creating a unique page for every storefront location isn’t an easy task; if you find it to be a waste of resources, you may want to remove the location pages completely and put more time into local listings or even local ads.

Now that you’ve learned all about website clutter, you no longer have an excuse. In all seriousness, if you want to gain authority or remain an authority, you need to maintain your website constantly. There’s no bigger turn-off than a 1990s GeoCities background and a page full of outdated content and broken links.

Geocities Website Example

Have I missed something, or is there another tactic you use to keep your site well-groomed? Please feel free to share it with me in the comments below.

  • http://www.techmaish.com/ Tech Maish

You have forced me to look into these issues for my blog. I just went through my GWT and found some of the issues you have highlighted. Thanks for sharing, Phillip. I will get back to you soon with some questions. Hope you will not mind.

    • http://phillipmoorman.com/ Phillip Moorman

Hey Tech Maish, you’re welcome, and thanks for checking out my blog. I’ll answer any of your questions – feel free to send them over.

  • http://www.techmaish.com/ Tech Maish

Phillip, what about the broken links in the comments? If there are a lot of them, should we care about them or just ignore them?

    • http://phillipmoorman.com/ Phillip Moorman

      I wouldn’t worry about comments – Google and other search engines don’t pay much attention to them.

  • http://www.techmaish.com/ Bilal Ahmad

I agree, but I have ranked a blog on the first page of Google for a keyword with just the links from blog commenting. And they were nofollow.

So they do pay attention to these links. Although they deny it, the reality is that they do consider them as backlinks.

That is why I am just wondering if broken comment links can also affect the reputation of a site.

    • http://phillipmoorman.com/ Phillip Moorman

It wouldn’t hurt to fix them, but if that’s not possible then I would ignore them.

  • http://www.techmaish.com/ Bilal Ahmad

    Thanks Phillip.

  • http://www.technotrait.com/ Rafaqat Ali

Hey Phillip, thanks for sharing these valuable words. I would like to clarify one thing, and it is about the removal of outdated content. If the permalink URL has a date in it, I think it will be difficult to remove that outdated content – wouldn’t it be better to rewrite it instead?