Date: October 29, 2018 | Author: infotech | Category: Digital Marketing

Top 10 Technical SEO Issues (and How To Fix Them)

In the age of “Penguin,” “Panda,” “Hummingbird” and other big Google algorithm updates, winning the search engine optimization (SEO) game means publishing high-quality content that earns links. But all the quality content in the world won’t help your search rankings if your site has structural or other technical issues.

Here are the top 10 SEO technical issues, along with tips on how to address them.

(1) Overlooking 301 redirect best practices

The problem:

Search engines consider http://www.example.com and http://example.com to be two different websites. If other sites have linked to yours using a mixture of the two URLs, you are effectively splitting your ranking potential in half. For example, one site analyzed in Open Site Explorer had 143 unique root domains linking to its www version but only 75 linking to the non-www version.

The solution:

Choose the version you prefer (www or non-www) and implement a site-wide 301 redirect from every URL on the other version to its counterpart on the preferred one, consolidating all of your ranking potential.
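
As a minimal sketch, here is what that rule could look like in an .htaccess file on an Apache server with mod_rewrite (the domain and the choice of www are assumptions; adapt them to your setup):

    # Send every non-www request to the www version with a 301 redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]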

(2) Duplicate Content

The problem:

Similar to the first point, search engines have a harder time knowing which pages on your site deserve to rank if you have a lot of duplicate content. Duplicate content often stems from common CMS functionality, most frequently sorting parameters. This is an issue because search engines only have a finite amount of time to crawl your site on a regular basis, and too many duplicate pages can cause them to value your content less than if you served just one “clean” version of each page.

The solution:

Avoiding duplicate content will always depend on the CMS used and a variety of other factors, but one of the most common ways of telling search engines which page you want to rank is to place a rel="canonical" link element on all of the duplicate pages, pointing to the page you do want to rank. As a general rule of thumb, every unique page should be accessible at only one URL. In some cases you may be better off using 301 redirects or excluding certain pages via robots.txt, but be sure to do your research first.
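
As a sketch (the URL is hypothetical), each sorted or parameterized variant of a page would carry a tag like this in its <head>, pointing back to the one version you want indexed:

    <!-- Placed on every duplicate/parameterized variant of the page -->
    <link rel="canonical" href="http://www.example.com/widgets/blue-widgets" />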

(3) Not Helping Search Engines

The problem:

Again, search engines only have a fixed amount of time to crawl sites on the web. Helping them crawl your site efficiently ensures that all of your important pages get indexed. One common mistake is not having a robots.txt file that identifies which sections of your site you DON’T want search engines crawling. An even bigger mistake is not having a sitemap.xml file, which shows search engines which pages on your site are the most important and should be crawled most frequently.

The solution:

Be sure to include a hand-built robots.txt file at the root of your server that lists the pages/sections of your site you don’t want search engines to crawl. Also at the root of your server, include a curated sitemap.xml file that outlines all the unique pages on your site. This will help Google crawl it more efficiently.
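
As a rough sketch (the paths and URLs are placeholders), a robots.txt might look like this:

    # http://www.example.com/robots.txt
    User-agent: *
    Disallow: /admin/
    Disallow: /cart/

    Sitemap: http://www.example.com/sitemap.xml

And a minimal sitemap.xml listing one page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/widgets/blue-widgets</loc>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>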

(4) Poor Meta Tag Usage

The problem:

Some CMSes auto-generate page titles for you, but in many cases the results are less than ideal. The <title> tag and meta description tag are two of the most important pieces of information you can serve to search engines, as they tell them how you would like your pages to appear in search results.

The solution:

Where possible, handwrite the meta title and meta description for every page on your site. Keep titles under 65 characters and meta descriptions under 150 characters to avoid truncation. The title tag is an ideal place for the keyword(s) you want a given page to rank for. Think of the meta description as a headline: try to convince the searcher to click on your result rather than a competing site in the search results.
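
A short sketch of what that looks like in a page’s <head> (the business name and copy are made up):

    <head>
      <!-- Roughly 45 characters: keyword up front, brand at the end -->
      <title>Heavy Duty Blue Widgets | Example Widget Co.</title>
      <!-- Roughly 80 characters: written like a headline for the searcher -->
      <meta name="description"
            content="Shop 3-inch heavy duty blue widgets with free shipping and a 5-year warranty.">
    </head>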

(5) Disorganized URL Structure

The problem:

Presenting a clean URL structure to search engines is very important. Search engines look at your URLs to see how you silo off sections of the site. A search engine can tell that www.example.com/widgets/3-inch/heavy-duty/blue-widgets is a page about 3″ heavy duty blue widgets. Some CMSes don’t produce such clean URLs out of the box, instead serving long query-string URLs, but having clean, descriptive URLs is very beneficial to SEO.

The solution:

You should always aim for URLs that a reader can look at and know exactly what type of page they are on; that helps both the user and the search engines. How to get there depends on the CMS at hand and can be an arduous task. When migrating an entire site from “ugly” to “clean” URLs, a one-to-one 301 redirect mapping of almost the entire site is usually needed to ensure that minimal SEO value is lost.
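
A sketch of one such mapping in an Apache .htaccess file (the old parameter and the new path are hypothetical, and each old URL would get its own rule):

    RewriteEngine On
    # Map an old parameterized URL to its clean equivalent with a 301;
    # the trailing "?" drops the old query string from the redirect target
    RewriteCond %{QUERY_STRING} ^id=123$
    RewriteRule ^index\.php$ /widgets/3-inch/heavy-duty/blue-widgets? [R=301,L]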

(6) Poor Use of Local Search Data & Structured Markup

The problem:

One, or rather, two big missed opportunities are sites that don’t take advantage of local search data or structured data markup. In 2014, Google started recognizing local search intent better than ever, and sites that maintain a presence on the major local search data providers such as Yelp, Foursquare, Facebook, Bing, YellowPages, etc. can see boosts for local searches within their immediate city. Taking advantage of structured data markup can also qualify pages for enhanced listings in Google search results, which can show a picture, star ratings, or other details beside the link, giving users a more enticing reason to click. There are dozens of kinds of schema markup as well, covering products, breadcrumbs, publishers, local businesses, and more.

The solution:

Claim all the major local search listings you can for your site, especially the free “big name” ones. Where relevant, look into the kinds of schema markup that could be implemented on your site to help differentiate it from competitors in the search results.
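
For instance, a minimal schema.org LocalBusiness block in JSON-LD might look like this (all of the business details are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Widget Co.",
      "url": "http://www.example.com",
      "telephone": "+1-555-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "postalCode": "12345"
      }
    }
    </script>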

(7) Shady Link Building

The problem:

One big issue that some sites still run into is buying links from SEO companies. Beginning with the “Panda” update in 2011, which targeted sites that repost low-quality content from other sites, and continuing with the “Penguin” update in 2012, which targeted sites that accumulate lots of “artificial” links simply to game the search engines into ranking them higher, Google has cracked down on what it considers unnatural practices. Both of these algorithms are still updated with great regularity.

The solution:

Only link to sites where it is natural to do so, and vice versa. You should never have to pay for a link unless it is a sponsored placement, and in that case the link should include the rel="nofollow" attribute or it risks raising a red flag with Google. Buying links in bulk from SEO companies is a very bad practice that will likely lead to eventual penalization in the search engines.
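
For reference, a paid or sponsored link marked so that it passes no ranking credit could look like this (the URL and anchor text are made up):

    <a href="http://www.example.com/partner-offer" rel="nofollow">Partner offer</a>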

(8) Broken Links/404

The problem:

A “broken link” is a hyperlink that points to a page that is no longer active (also known as a 404 page). There are few things more frustrating than finding a resource you need, only to follow the link and discover it no longer exists. Search engines recognize this and will downgrade the rankings of sites that accumulate large numbers of broken internal links.

The solution:

Avoid this by keeping an eye on the 404s Google finds on your site in Google Search Console (formerly Google Webmaster Tools). Do regular “housecleaning” to keep the number of 404s to a minimum, implementing 301 redirects when moving a resource from one URL to another or when combining heavily similar pages.
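
A sketch of such a redirect on an Apache server (both paths are hypothetical):

    # The whitepaper moved; 301 the old URL to the new one instead of serving a 404
    Redirect 301 /old-whitepaper.pdf /resources/new-whitepaper.pdf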

(9) Slow Site Speed

The problem:

The speed at which pages load for your visitors may be so slow that they abandon your pages, or circle back and click another search result. Pages that render quickly provide a good user experience, while slow pages drive visitors away. If slow speeds heavily affect the experience of your mobile visitors, Google will weigh that when serving your site in mobile search engine result pages.

The solution:

Regularly monitor your average page load times in Google Analytics and also run your site through Google’s PageSpeed Insights tool. Follow the recommendations provided to improve your overall and mobile site speeds.
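
Two of the most common recommendations are enabling text compression and browser caching. On an Apache server (assuming mod_deflate and mod_expires are available), that might be sketched as:

    <IfModule mod_deflate.c>
      # Compress text assets before sending them to the browser
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
      # Let browsers cache images for a month
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
    </IfModule>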

(10) Thin On-Page Content

The problem:

At the end of the day, one of the main ranking factors remains on-page content. If your page about blue widgets has only 100 unique words of content but a competitor’s page about blue widgets has over 3,000 unique words, the search engines will almost always give more algorithmic weight to the site with more quality, unique content, all other things being equal.

The solution:

One of the simplest ways to potentially improve your rankings is to revisit the top pages of your site every few months and revise and expand their content. As long as it’s high quality, the more content the better. Every time new pages are launched, be sure they include plenty of content that is helpful for your users and describes the product or service in question. Always write for your visitors, not the search engines, and avoid keyword stuffing just to try to rank higher.
