
Dealing with Crawlers – Make effective use of robots.txt

Prepared by Mohammad Jubran

A “robots.txt” file tells search engines whether they can access and therefore crawl parts of your site. This file, which must be named “robots.txt”, is placed in the root directory of your site.

You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine’s search results. If you do want to prevent search engines from crawling your pages, Google Webmaster Tools has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you’ll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
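To illustrate the subdomain point (example.com below is only a placeholder domain), a file served at

http://www.example.com/robots.txt

applies only to pages on www.example.com; a subdomain such as news.example.com is governed by its own separate file at

http://news.example.com/robots.txt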

There are a handful of other ways to prevent content from appearing in search results, such as adding “NOINDEX” to your robots meta tag, using .htaccess to password-protect directories, and using Google Webmaster Tools to remove content that has already been crawled. Google engineer Matt Cutts walks through the caveats of each URL blocking method in a helpful video.
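For reference, the page-level NOINDEX variant is the single line below, placed inside the page’s <head>; it tells compliant search engines not to index that page even when they can crawl it:

<meta name="robots" content="noindex">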

User-agent: *
Disallow: /images/
Disallow: /search

(1) All compliant search engine bots (denoted by the wildcard * symbol) shouldn’t access and crawl the content under /images/ or any URL whose path begins with /search.

(2) Our robots.txt file is served from the root of the site; for the example site used in this guide, its address would be http://www.brandonsbaseballcards.com/robots.txt.

Keep a firm grasp on managing exactly what information you do and don’t want crawled!

Best Practices

Use more secure methods for sensitive content

You shouldn’t feel comfortable using robots.txt to block sensitive or confidential material. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don’t acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don’t want seen. Encrypting the content or password-protecting it with .htaccess are more secure alternatives.
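As a rough sketch of the .htaccess approach on an Apache server (the realm name and AuthUserFile path below are placeholders, and the .htpasswd file must be created separately, for example with Apache’s htpasswd utility):

AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/example/.htpasswd
Require valid-user

Unlike robots.txt, this serves the content only after a successful login, so it protects against curious users and non-compliant crawlers alike.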

Avoid:

allowing search result-like pages to be crawled (users dislike leaving one search result page and landing on another search result page that doesn’t add significant value for them)

allowing URLs created as a result of proxy services to be crawled

 

Glossary

Robots Exclusion Standard: A convention to prevent cooperating web spiders/crawlers, such as Googlebot, from accessing all or part of a website that is otherwise publicly viewable.

Proxy service: A computer that substitutes the connection in cases where an internal network and an external network are connecting, or software that provides this function.

Links

robots.txt generator: http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html

Using robots.txt files: http://www.google.com/support/webmasters/bin/answer.py?answer=156449

Caveats of each URL blocking method: http://googlewebmastercentral.blogspot.com/2008/01/remove-your-content-from-google.html

 

Dealing with Crawlers


Be aware of rel=”nofollow” for links

Combat comment spam with “nofollow”

Setting the value of a link’s “rel” attribute to “nofollow” tells Google that certain links on your site shouldn’t be followed and shouldn’t pass your page’s reputation to the pages they link to. To nofollow a link, add rel="nofollow" inside the link’s anchor tag (1).

When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam (2). Nofollowing these user-added links ensures that you’re not giving your page’s hard-earned reputation to a spammy site.

Automatically add “nofollow” to comment columns and message boards

Many blogging software packages automatically nofollow user comments, but those that don’t can most likely be manually edited to do so. This advice also goes for other areas of your site that may involve user-generated content, such as guestbooks, forums, shout-boards, referrer listings, etc. If you’re willing to vouch for links added by third parties (e.g. if a commenter is trusted on your site), then there’s no need to use nofollow on their links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, like using CAPTCHAs and turning on comment moderation (3).

(1) If you or your site’s users link to a site that you don’t trust and/or you don’t want to pass your site’s reputation, use nofollow.

<a href="http://www.shadyseo.com" rel="nofollow">Comment spammer</a>

 

(2) A comment spammer leaves a message on one of our blog posts, hoping to get some of our site’s reputation.

(3) An example of a CAPTCHA used on Google’s blog service, Blogger. It can present a challenge to try to ensure an actual person is leaving the comment.

Glossary

Comment spamming: Refers to indiscriminate postings, on blog comment columns or message boards, of advertisements, etc. that bear no connection to the contents of said pages.

CAPTCHA: Completely Automated Public Turing test to tell Computers and Humans Apart.

Using “nofollow” for individual pieces of content, whole pages, etc.

Another use of nofollow is when you’re writing content and wish to reference a website, but don’t want to pass your reputation on to it. For example, imagine that you’re writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don’t want to give the site some of your reputation from your link. This would be a good time to use nofollow.

Lastly, if you’re interested in nofollowing all of the links on a page, you can use “nofollow” in your robots meta tag, which is placed inside the <head> tag of that page’s HTML (4). The Webmaster Central Blog provides a helpful post on using the robots meta tag. This method is written as <meta name="robots" content="nofollow">.

<html>
<head>
<title>Brandon’s Baseball Cards – Buy Cards, Baseball News, Card Prices</title>
<meta name="description" content="Brandon’s Baseball Cards provides a large selection of vintage and modern baseball cards for sale. We also offer daily baseball news and events in">
<meta name="robots" content="nofollow">
</head>
<body>

Make sure you have solid measures in place to deal with comment spam!

(4) This nofollows all of the links on a page.

Links

Avoiding comment spam: http://www.google.com/support/webmasters/bin/answer.py?answer=81749

Using the robots meta tag: http://googlewebmastercentral.blogspot.com/2007/03/using-robots-meta-tag.html

 Source: Google SEO Guidelines

SEO Basics

Search engine optimization is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site’s user experience and performance in organic search results. You’re likely already familiar with many of the topics in this guide, because they’re essential ingredients for any web page, but you may not be making the most out of them.

Even though this guide’s title contains the words "search engine", we’d like to say that you should base your optimization decisions first and foremost on what’s best for the visitors of your site. They’re the main consumers of your content and are using search engines to find your work. Focusing too hard on specific tweaks to gain ranking in the organic results of search engines may not deliver the desired results. Search engine optimization is about putting your site’s best foot forward when it comes to visibility in search engines, but your ultimate consumers are your users, not search engines.

Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we’d love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum.

 


Things To Do When Doing SEO

Designing a website to make the search engines happy does not have to be a remarkable feat, just a reasonably focused effort to follow the guidelines, avoid spam, and make your visitors comfortable by taking care of their needs. Here is what needs to be done.

Whatever search engines you target, make sure you follow their guidelines about submitting sites, the kinds of sites they permit, and what they recommend for optimized content. Watch for updated webmaster guidelines from Google on forms of search engine manipulation that are considered spam. All search engines hate spam. Google has also stated in its SEO guidelines the advice webmasters need to heed when picking an SEO.

AltaVista, Bing, and Yahoo! have similar SEO guidelines, much on par with what Google has done.

Sometimes, even without knowing it, webmasters inadvertently use techniques that can cause a site to be penalized in the search engines’ rankings. Do the proper research: keep track of the SEO guidelines mentioned above, watch for changes, and tweak your site to conform. If you offend, you will be penalized; you must then remove the offending material and contact the search engine that penalized your site to get back in its good graces. To avoid all this, just do not spam.

Remember to build your sites for your visitors, not the search engines. To do this, make your site as user-friendly as humanly possible by including high-quality, relevant information that helps your visitors find what they are looking for and encourages them to return.

Your purpose, if you are in business on the net, is to convert your visitors into paying customers. You can only do this with easy site navigation and simple, descriptive, quality copy. Make sure all your forms and shopping carts are working and up to speed. What pleases a visitor will always please a search engine, so user-friendly is the way to make your bank account grow.
