How to Perform a Website Audit for SEO

A website audit is performed after a site's design and on-page elements are complete. Before starting any off-page work, it is necessary to run a website audit.

The audit helps you identify your site's errors: what you are doing right, what needs improvement, and which issues may be hurting your website. Whether you are analyzing your own site, handling a client's site, or simply checking for website-related issues, a website audit helps you find problems and resolve them.

Crawling Errors

The first step of a website audit is to crawl your website to diagnose any issues with the site. To do this, you can use tools such as Screaming Frog's SEO Spider or Xenu's Link Sleuth. Let's look at these crawling tools in more detail.

Screaming Frog’s SEO Spider

It is a powerful tool that crawls your site and quickly analyzes the data from an SEO perspective. It reports on internal links, 404 errors, title tags and descriptions, meta keywords, site response time, and more. The tool is free for the first 500 URLs; to remove this limit you can buy a licence for £99 per annum (excluding VAT). It can export an overall report to a CSV file, or you can export individual reports.

Google or Bing Webmaster Tools

Another good place to find your site's errors is Google Webmaster Tools. If you haven't registered with these tools yet, do it now!

The actual analysis starts here.

Accessibility in a Website Audit

Check whether your site is being indexed properly in search engines; if it is not, the cause may be your site's robots.txt file. Make sure your site is not facing any of the following accessibility issues.

Robots.txt

The robots.txt file is used to restrict search engine crawlers' access to your website. Manually check your site's robots.txt file and make sure it is not blocking search engine access. You can view the file by appending /robots.txt to your domain, e.g. http://www.example.com/robots.txt. You can also check the status of your robots.txt in Google Webmaster Tools.

The example below blocks all search engines from crawling your site:

User-agent: *

Disallow: /

A default robots.txt file that allows all search engines to crawl your site:

User-agent: *

Disallow:
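
If you want to check programmatically whether a given URL is blocked by your robots.txt, here is a minimal sketch using Python's standard urllib.robotparser module; the domain and paths below are placeholders for your own site.

# Minimal sketch: check whether robots.txt allows a URL to be crawled.
# The domain and paths are placeholders; replace them with your own.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# True means the named user agent may fetch the URL, False means it is blocked.
print(rp.can_fetch("Googlebot", "http://www.example.com/some-page/"))
print(rp.can_fetch("*", "http://www.example.com/private/"))

Looping can_fetch() over a list of your important pages is a quick way to confirm that none of them are accidentally blocked.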

Robots Meta Tags

Robots meta tags are applied to a particular page (and the rel attribute to an individual link) to tell search engines whether they may index that page or follow that link.

E.g.:

<meta name="robots" content="noindex, nofollow" />

<a href="signin.php" rel="nofollow">sign in</a>
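
A quick way to spot pages that are accidentally set to noindex is to fetch them and read the robots meta tag. The sketch below uses only the Python standard library; the URL is a placeholder.

# Minimal sketch: report the robots meta tag of a page, if one is present.
from urllib import request
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of a <meta name="robots"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.robots_content = attrs.get("content")

# Placeholder URL; replace with the page you want to check.
url = "http://www.example.com/"
with request.urlopen(url) as resp:
    html = resp.read().decode("utf-8", errors="replace")

parser = RobotsMetaParser()
parser.feed(html)
print("robots meta tag:", parser.robots_content or "not present")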

HTTP Status Codes

Check for site URLs that return 4XX or 5XX HTTP status codes (including soft 404 errors). These URLs cannot be accessed by search engines or by users.

Find the URLs that no longer exist on your website and redirect them to relevant URLs. Make sure you use only 301 redirects, not 302s, meta refreshes, or JavaScript-based redirects, because a 301 redirect passes almost all of the link juice to the destination URL.
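
For a quick programmatic pass over a handful of URLs, here is a minimal sketch using Python's standard urllib that prints the HTTP status code of each one. The URL list is a placeholder; in practice you would feed it the URLs exported from your crawler.

# Minimal sketch: print the HTTP status code for each URL in a list.
from urllib import request, error

# Placeholder list; replace with URLs exported from your crawl.
urls = [
    "http://www.example.com/",
    "http://www.example.com/old-page/",
]

for url in urls:
    try:
        # HEAD keeps the check lightweight; some servers may not support it.
        req = request.Request(url, method="HEAD")
        with request.urlopen(req) as resp:
            print(url, resp.status)
    except error.HTTPError as e:
        # 4XX and 5XX responses end up here.
        print(url, e.code)
    except error.URLError as e:
        print(url, "unreachable:", e.reason)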

Site Map

An XML sitemap gives search engines a roadmap of your site. The sitemap should be generated according to the sitemap protocol; see sitemaps.org for details.

After generating an XML sitemap, submit it to Google so that all of your site's pages are indexed correctly.

Update your sitemap frequently and make sure all pages are listed in it.
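
As a sanity check, the short sketch below (Python standard library; the sitemap URL is a placeholder) downloads your sitemap and counts the URLs it lists, which you can compare with the number of pages you expect to have.

# Minimal sketch: count the URLs listed in an XML sitemap.
from urllib import request
import xml.etree.ElementTree as ET

# Placeholder sitemap URL; replace with your own.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(len(urls), "URLs listed in the sitemap")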

Site Architecture

Hierarchy

Your site's categories should follow a logical flow so that users can easily understand and navigate your site; the structure should never confuse them.

Landing pages

Every landing page should be set up consistently with respect to the home page, and relevant internal links should be added to pass link juice.

Number of category pages

Avoid displaying a huge list of subcategories; keep only as many categories as users actually need.

Pagination/Faceted Navigation

Pagination on your site can create duplicate content issues. Make sure you use nofollow attributes on your pagination links so that search engines do not index the same results repeatedly.

Technical Issues

Proper use of 301s

Check that all redirects are 301s rather than 302s, but keep the total number of 301 redirects within reason.
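
To see exactly which status code a redirect returns, you can make a request without following the redirect. Here is a minimal sketch using Python's http.client; the URL is a placeholder and assumes plain HTTP (use HTTPSConnection for HTTPS sites).

# Minimal sketch: show the redirect status code and target of a URL.
import http.client
from urllib.parse import urlparse

# Placeholder: a URL you expect to redirect.
url = "http://www.example.com/old-page/"
parts = urlparse(url)

conn = http.client.HTTPConnection(parts.netloc)
conn.request("HEAD", parts.path or "/")
resp = conn.getresponse()

# 301 is what you want to see; 302 or 307 means a temporary redirect.
print(resp.status, resp.getheader("Location"))
conn.close()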

Use of JavaScript

No matter how good your site architecture is, it will not help if your navigational elements are not accessible to search engines. Even though search engine bots are getting smarter, it is still better to avoid building navigation with Flash or JavaScript.

Use of iframes

Do not serve your content through iframes; content pulled in via iframes may not rank well in search engines.

Use of Flash

Flash websites are still not SEO friendly. If your business website is built with Flash, you will have trouble ranking in search engines.

Site Performance

Most users have a very limited attention span. If your site takes too long to load, they will leave, which increases your website's bounce rate.

Similarly, search engine crawlers have only a limited amount of time to spend crawling your site. A site that loads quickly is crawled more frequently than one that loads slowly.

Tools such as Google PageSpeed, YSlow, and Pingdom's Full Page Test can check your site's loading time and performance.
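
If you just want a rough number from your own machine, the sketch below (Python standard library; the URL is a placeholder) times how long the raw HTML takes to download. It ignores images, CSS, and JavaScript, so the tools above give a much fuller picture.

# Minimal sketch: time how long it takes to download a page's HTML.
import time
from urllib import request

# Placeholder URL; replace with your own page.
url = "http://www.example.com/"

start = time.perf_counter()
with request.urlopen(url) as resp:
    resp.read()
elapsed = time.perf_counter() - start

print(f"Downloaded {url} in {elapsed:.2f} seconds")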

Indexability

To check how many of your site's pages have been indexed, use the well-known site: operator, which lists a website's indexed pages in Google; for example, searching for site:example.com shows the pages Google has indexed for that domain. Almost all major search engines support this operator.

This shows the number of pages indexed by Google, but it may differ from the actual number of pages on your site, since many pages may not have been indexed yet. If your sitemap has listed all of your site's links since the development stage, you can tell how many pages are still waiting to be indexed.

There are three possible indexing scenarios.

The number of indexed pages is roughly equal to the number of pages in the sitemap: your site is performing well, and search engines are successfully crawling and indexing all of your pages.

The indexed count is much lower than the actual count: some of your web pages are not being crawled by search engines. You need to find the actual cause and check whether your site has been penalized.

The indexed count is higher than the actual count: your site is generating duplicate URLs and serving the same content on multiple URLs. If you suspect a duplicate content issue, the site: operator also helps confirm it.
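
One practical way to investigate the second and third scenarios is to compare the URLs in your sitemap with the URLs your crawler actually found. The sketch below (Python standard library) assumes a local sitemap.xml and a crawl export CSV with an "Address" column; the file names and column name are placeholders you would adapt to your own exports.

# Minimal sketch: compare sitemap URLs with the URLs found by a crawler.
import csv
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Placeholder file names; adjust to your own sitemap and crawl export.
sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").getroot().findall("sm:url/sm:loc", NS)
}

with open("crawl_export.csv", newline="") as f:
    # Assumes the export has a column named "Address" holding each URL.
    crawled_urls = {row["Address"] for row in csv.DictReader(f)}

print("In the sitemap but not found by the crawler:", len(sitemap_urls - crawled_urls))
print("Found by the crawler but missing from the sitemap:", len(crawled_urls - sitemap_urls))

Pages that appear in only one of the two sets are good starting points for finding crawl gaps or duplicate URLs.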

Simply append "&start=990" to the end of the Google search results URL for your site: query.

Then look at the bottom of the page for Google's warning message about similar results being omitted.

Also check whether all of your site's important pages have been indexed. If not, try to get them indexed, or investigate whether a penalty is the cause.

Brand Searches

When you search for your website's name in Google, your site should appear first in the search results. Whether the query is your site name or your brand name, your site should rank in the top position. If it does not show up, it is time to investigate for penalties.

Penalties

If you suspect your site has been penalized, look for the root cause of the penalty. Google generally sends webmasters a message explaining why a site was penalized, so check the messages in your Webmaster Tools account. If you have received such a message from Google, you can skip most of the investigation.

Fix the issues that caused the penalty and submit a reconsideration request to Google. For more information, see Google's support page on Reconsideration Requests.