08 Sep SEM Tips ~ Making friends with Google
Having Google crawl your site on a regular basis is a little like making a new friend. A level of trust needs to be established before Google will come out to play (you live in a good neighbourhood, your family isn't crazy).
So how do you appear to Google as a legitimate, authoritative source of information? With the plethora of auto-generated ad websites, blogs, spam, and redundant content, it is getting harder and harder for Google to tell the difference between good and evil. There are a few key things that you can put in place prior to launching your website, as well as several tasks that will need to be maintained on an ongoing basis.
To see how Google has indexed your site so far, the following tools are recommended:
- Google Analytics (http://www.google.com/analytics/)
- Google Sitemaps (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184)
- Google Webmaster Tools (https://www.google.com/webmasters/tools/)
- Reviewing Google's current cache of your website (just type site:www.yourdomainname.com into Google)
The frequency and depth of how often Google visits your website is greatly affected by the following factors:
- Original content and page uniqueness.
It’s good for all your important pages to have significant and unique original content.
- Frequently updated content.
Updating content regularly is a key factor in Google's continued interest in crawling your website.
- Domain importance.
Your site's crawl rate and depth of crawling are roughly proportional to your PageRank. You can check a page's PageRank by installing the Google Toolbar.
- Follow the Google Webmaster Guidelines.
These are essential tips for any developer or business to keep in mind as they grow their website. The full guidelines are published in Google's Webmaster Central.
PageRank is computed from backlinks, which are absolutely central to indexation. If a site's page count is growing fast but the site is not earning enough new links, this may suggest to Google that the content is of low quality, which is likely to reduce your crawl and indexation rates.
- Deep Linking.
Backlinks to individual pages within your website are an effective way to ensure those pages get indexed and stay in the main Google index (as distinct from the supplemental index). Internal links to the same pages also help. Make sure that at least your most important pages get enough of both kinds of links.
- XML sitemaps.
Sitemaps are set up and managed via Google's Webmaster Tools. Most of our new websites are built using WordPress, into which we integrate automatic XML sitemap generation. The plugin generates a special XML sitemap that helps search engines like Google, Bing, Yahoo and Ask.com better index your blog. With such a sitemap, it's much easier for the crawlers to see the complete structure of your site and retrieve it more efficiently. The plugin supports all kinds of WordPress-generated pages as well as custom URLs, and it notifies all the major search engines of the new content every time you publish a post. Sitemaps also support <changefreq> and <priority> attributes, whose use may influence the crawl, although the impact is likely to be minor.
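To give a sense of what the plugin produces, a minimal sitemap entry looks like the following (the URL, date, and values here are only placeholders; in practice the plugin generates all of this for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, the rest are optional hints -->
  <url>
    <loc>http://www.example.com/blog/my-first-post/</loc>
    <lastmod>2010-09-08</lastmod>
    <!-- how often the page tends to change: always, hourly, daily, weekly, monthly, yearly, never -->
    <changefreq>weekly</changefreq>
    <!-- relative importance within your own site, from 0.0 to 1.0 (default 0.5) -->
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that changefreq and priority are hints, not commands: Google may crawl a "weekly" page daily, or an "hourly" page once a month.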
- Duplicate content reduction.
In general, duplicate content on a site is not a significant problem and does not entail "Google penalties." However, on very large sites, high-volume duplicated content (identical pages sitting under different URLs) can confuse Google and impede proper indexing. One classic example of duplication occurs under different forms of site URLs: those that include the www. subdomain and those that don't (e.g. http://example.com/file1.html and http://www.example.com/file1.html typically have the same content). The way to handle this and other kinds of duplication is via some form of URL canonicalization (see next item).
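As one common sketch of a fix, assuming your site runs on an Apache server with mod_rewrite enabled, the www/non-www duplication can be resolved with a 301 redirect in your .htaccess file (replace example.com with your own domain):

```apache
RewriteEngine On
# Permanently redirect the bare domain to the www form,
# so every page exists under exactly one canonical URL
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 (permanent) redirect tells Google which version of the URL is canonical and consolidates any link value onto that single version.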
- Unique title tags.
If you use the same title tags across multiple pages, Google may assume that those pages are duplicates and be reluctant to index them. Make your titles unique.
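For example, rather than repeating the site name alone on every page, give each page a descriptive title of its own (the site and product names below are purely illustrative):

```html
<!-- Home page: site name plus a short description -->
<title>Acme Widgets - Handmade Widgets in Vancouver</title>

<!-- Inner page: put the unique, page-specific part first -->
<title>Blue Widget, Model 42 | Acme Widgets</title>
```

Leading with the page-specific words also makes each result easier to scan when several of your pages appear together in a search listing.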
- Manual crawl rate setting.
Google's Webmaster Tools offer a choice between letting Google determine the crawl rate automatically and setting it manually via a slider. Although setting it manually to the maximum is unlikely to boost the crawl rate dramatically, it may bring about a marginal improvement.
- Social Media.
Links from social media, although they are nofollow, help Google discover and index new content. Including sharing buttons on your pages and promoting them on social media sites can help get your pages into the index faster.