
10 Common Technical SEO Problems and Their Solutions

Technical SEO can deliver real gains, but the problems it surfaces can be very frustrating, especially when you run into them over and over again. Listed below are ten of the most common ones, compiled by SEOmoz. Read on to learn how you can solve them.

1. Lowercase vs. uppercase URLs

This is common on websites built with the .NET framework, where the server will often serve the same page at both uppercase and lowercase addresses. Search engines have gotten better at picking one version on their own, but they still get it wrong in some instances, and the duplicates can split link equity. Solve this by installing a URL rewrite module on your server and using it to enforce lowercase URLs.
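On IIS, for example, the URL Rewrite module can enforce this with a rule like the following. This is a minimal sketch, not your exact configuration: the rule name is arbitrary, and the fragment belongs inside the rewrite rules section of web.config.

    <!-- 301-redirect any URL containing an uppercase letter to its lowercase form -->
    <rule name="Enforce lowercase URLs" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>

Any request whose path contains an uppercase letter gets a single permanent redirect to its lowercase form, so search engines consolidate their signals on one version.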

2. Multiple homepage versions

This is another problem common on .NET websites, where the homepage may resolve at several addresses. Search engines can often work this out on their own, but it is better to take action yourself. The best solution is to add a 301 redirect from each duplicate version of the homepage to the correct version.
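A sketch of such a rule for IIS follows; default.aspx is just one common example of a duplicate homepage address, so adjust the pattern to whatever versions your own server exposes.

    <!-- 301-redirect the duplicate homepage address to the canonical root -->
    <rule name="Canonical homepage" stopProcessing="true">
      <match url="^default\.aspx$" />
      <action type="Redirect" url="/" redirectType="Permanent" />
    </rule>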

3. Query parameters at the end of URLs

This is common in database-driven eCommerce websites, where sorting and filtering options append parameters to URLs and create many near-duplicate pages. Solve this by deciding which attributes you actually want indexed. If the unwanted parameter URLs have not been indexed yet, block those URL patterns in your robots.txt file; if they already have been, use the rel=canonical tag as a plaster over the problem so the duplicates consolidate to the main version.
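A sketch of both fixes, with made-up domain, path, and parameter names: the link tag goes in the head of every parameter variation of the page, while the robots.txt lines keep unindexed parameter URLs from being crawled.

    <link rel="canonical" href="https://www.example.com/category/blue-widgets" />

    # robots.txt: keep crawlers out of sort-parameter variations
    User-agent: *
    Disallow: /*?sort=
    Disallow: /*&sort=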

4. Soft 404 errors

A soft 404 is a "page not found" page that returns a 200 status code. Because the server reports success, you will not be able to spot broken pages or the areas of your site where users have a bad experience. To fix this, ask a developer to set the page to return a real 404 status code instead of 200.
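You can verify what a missing page actually returns with curl; a correctly configured server answers 404 (or 410), while a soft 404 answers 200. The URL here is a placeholder for any address on your site that should not exist.

    # Print only the HTTP status code a missing page returns
    curl -s -o /dev/null -w "%{http_code}\n" https://www.example.com/this-page-does-not-exist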

5. 302 redirects, not 301 redirects

Website users will never notice the difference between these, so it usually takes a developer to find them; the catch is that a 302 is a temporary redirect and may not pass link equity the way a 301 does. Find URLs that are 302 redirected with a deep crawler and ask a developer to change them to 301 redirects.
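A quick spot check with curl shows which status code a given redirect returns (the URL is a placeholder):

    # Show just the status line of the response
    curl -sI https://www.example.com/old-page | head -n 1

If the output reads "HTTP/1.1 302 Found" rather than "HTTP/1.1 301 Moved Permanently", the redirect should be changed server-side.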

6. Broken or outdated sitemaps

XML sitemaps are often neglected, but they can be very useful to search engines for discovering your URLs. To solve this, remove broken links from your sitemap, then ask your developer to make the XML dynamic so that it updates automatically on a regular basis.
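For reference, a minimal valid sitemap looks like this (the URLs and dates are placeholders); a dynamically generated sitemap simply rebuilds this document from the live list of pages.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per live, canonical page -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2012-11-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
        <lastmod>2012-10-15</lastmod>
      </url>
    </urlset>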

7. Wrong order of commands in robots.txt

This can lead to pages being crawled or indexed even though you blocked them in robots.txt, because directives are not applied the way many people expect: a crawler obeys only the most specific user-agent group that matches it. Google states this in its guidelines. Check your commands carefully to make sure they are grouped and ordered correctly.
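For example, in the file below (directory names are hypothetical) the Googlebot-specific group overrides the catch-all group entirely:

    # The * group does NOT apply to Googlebot once a Googlebot group exists
    User-agent: *
    Disallow: /private/

    User-agent: Googlebot
    Disallow: /temp/

Googlebot matches its own group and ignores the wildcard group, so /private/ stays crawlable for it; any directive meant for Googlebot has to be repeated inside the Googlebot block.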

8. Invisible characters in robots.txt

Fetching the file via the command line can reveal an invisible character (often a stray byte order mark) that has made its way into the file, causing search engines to misread your directives. To fix this problem, simply rewrite the robots.txt file, then run it through the command line once again to recheck.
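Dumping the first bytes of the live file makes the stray character visible; a UTF-8 byte order mark, a common culprit, shows up as ef bb bf before the first directive (the URL is a placeholder):

    # Dump the first bytes of the live robots.txt to expose hidden characters
    curl -s https://www.example.com/robots.txt | hexdump -C | head -n 2
    # sample output with a BOM ahead of "User-agent: *":
    # 00000000  ef bb bf 55 73 65 72 2d  61 67 65 6e 74 3a 20 2a  |...User-agent: *|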

9. Google crawling base64 URLs

This can lead to a big increase in the number of 404 errors reported for your site. Google will usually stop requesting these URLs on its own, although it will take some time. In the meantime, you can add wildcard patterns to the robots.txt file so Google will not crawl these URLs.
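Note that robots.txt does not support full regular expressions, only the * and $ wildcards, but those are usually enough to block a recognizable pattern. In this sketch, /api/token/ is a hypothetical path under which the encoded strings appeared:

    User-agent: Googlebot
    # Hypothetical path where the base64-encoded URLs appeared
    Disallow: /api/token/
    # URLs ending in base64 padding characters
    Disallow: /*==$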

10. Misconfigured servers

This can cost a site its rankings even when everything seems to be working fine in a browser. Diagnose it by changing your user agent to Googlebot and fetching your pages: if the server returns different content or HTTP headers to the crawler than to regular visitors, the server configuration is to blame. Once it is fixed to serve everyone the same responses, the lost rankings can recover.
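To see what the crawler sees, fetch a page with Googlebot's user-agent string and compare the response against a normal request (the URL is a placeholder):

    # Request the homepage as Googlebot, headers only
    curl -sI -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/
    # Request it again as a regular client and compare the two responses
    curl -sI https://www.example.com/

If the two responses differ in status code, redirects, or headers, the server is treating the crawler differently and needs to be reconfigured.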
