Posts Tagged ‘seo tips’

Update Your SEO Vocabulary

SEO Tips

by Terri Wells

The SEO field changes constantly in response to the search engines. These days, though, visitors find your site via Facebook, Twitter, and other resources in addition to Google. That means your approach to SEO needs to expand – and so does your vocabulary.
SEO now embraces a range of fields. A well-rounded SEO understands some of the technical aspects of websites, the wider field of Internet marketing, conversion metrics, how to test ads, and much more. If you want to keep up with the field and understand where it is going, you need to get acquainted with the concepts I’ll be explaining in this article. Give yourself a pat on the back if you’ve already heard of these, but don’t be surprised if you see a few you’ve never encountered before.

Let me give credit where it’s due. Tad Chef, writing for SEOptimise, covered these terms and more. If you want to see them explained in greater detail, he links out to an article for every single one of them.

We’ll start with a number: 503. You know about 404 (not found) codes, 301 (permanent) redirects, and probably even 200 (which means the request succeeded). What is a 503 code? It tells search engine crawlers, and anyone else who visits your site, that it is temporarily down for maintenance. Google notes in a Webmaster Central blog post that using a 503 HTTP result code is a way “to deal with planned website downtime…that will generally not negatively affect your site’s visibility in the search results.” You can even specify how long your site will be down in an optional Retry-After header, letting Googlebot know when it can come back for something to crawl.
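As a rough sketch of what serving that status looks like, here is a minimal WSGI application in Python; the name maintenance_app and the one-hour Retry-After value are illustrative, not from the article:

```python
def maintenance_app(environ, start_response):
    """Hypothetical WSGI app that answers every request with
    503 Service Unavailable during planned downtime."""
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/plain; charset=utf-8"),
        # Optional Retry-After header: tells crawlers like Googlebot,
        # in seconds, when they can come back for something to crawl.
        ("Retry-After", "3600"),
    ])
    return [b"Down for planned maintenance. Please try again later.\n"]
```

Because the whole site returns 503 rather than 404 or 200, crawlers know the content is not gone for good and hold off until the maintenance window passes.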

The next term I’d like to discuss is “A/B testing.” Also known as split testing, this is not a new technique. You’ve probably known about it for years. Google even offers ways for you to do A/B testing of your ads with them. In its simplest form, split testing involves comparing two versions of a page to find out which one gets more visitors to do what you want them to do: make a purchase, sign up for a newsletter, request more information, and so forth. Doing it right is both an art and a science, as small changes can sometimes lead to big differences in a page’s conversion rate.
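To illustrate the “science” side, a two-proportion z-test is one common way to judge whether the difference between two page variants is real or just noise. This sketch uses only the standard library; the function name and the sample numbers are mine, not the article’s:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a split (A/B) test.

    conv_*: conversions for each variant; n_*: visitors shown each
    variant. |z| > 1.96 indicates significance at the 95% level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: variant B converts at 6.5% vs. A's 5.0%.
z = ab_test_z(120, 2400, 156, 2400)
```

With these hypothetical numbers the difference clears the 1.96 bar, which is exactly the “small changes, big differences” effect the paragraph above describes.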

Things To Do For Search Engine Optimization

SEO Tips
Designing a web site to make the search engines happy does not have to be a remarkable feat; it takes a reasonable, focused effort to follow the guidelines, avoid spamming, and make your visitors comfortable by taking care of their needs. Here is what needs to be done.
Whatever search engines you target, make sure you follow their guidelines about site submission, the kinds of sites they permit, and what they recommend for optimized content. Watch for updated webmaster guidelines from Google on forms of search engine manipulation that are considered spam. All search engines hate spam. Google’s SEO guidelines also spell out the advice webmasters need to heed when picking an SEO.
AltaVista, Bing, and Yahoo publish similar SEO guidelines that are much on par with Google’s.
Sometimes, without knowing it, webmasters will inadvertently use techniques that cause a site to be penalized in the search engine rankings. Do the proper research: keep track of the SEO guidelines mentioned above, watch for changes, and tweak your site to conform. If you offend, you will be penalized; you must then remove the offending material and contact the search engine that penalized your site to get back in its good graces. To avoid all this, just do not spam.
Remember to build your sites for your visitors, not the search engines. Make your site as user friendly as humanly possible by including high-quality, relevant information that helps your visitors find what they are looking for and brings them back to your site.
Your purpose, if you are in business on the net, is to convert your visitors into paying customers. You can only do this with easy navigation and simple, descriptive, quality copy. Make sure all your forms and shopping carts are working and up to speed. What pleases a visitor will generally please a search engine, so user friendly is the way to make your bank account grow.


Latest SEO Tips

by Jubran

The latest SEO tips can put your web content in front of the widest possible audience. Search engine optimization helps your website succeed.
When you think about SEO, you probably assume that the number one trick for getting your content in front of the largest audience possible is using your main keyword frequently throughout your text. Unfortunately, the landscape of SEO has changed enough that this is no longer the case. If you repeat your keyword too much, especially in places where use of your keyword would seem awkward or out of place, your placement can now be penalized by popular search engines. Improper use of keywords might cause some websites to be delisted entirely.


One of the latest SEO tips for avoiding accidental keyword stuffing is simply varying the keywords you use. When you pick your main keyword, research a list of alternate keywords as well. Try to avoid using your primary phrase more than two or three times within your content; instead, work in alternates where they sound natural. By doing this, you will provide better and more interesting content while helping a search engine see that you are not simply stuffing keywords. As a benefit, your readers will be able to find your content through alternate searches as well, allowing you to reach a wider audience and drive up traffic. This strategy will help you succeed with modern search engine algorithms.
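To keep an eye on how often your primary phrase and its alternates actually appear in a draft, a small counter like this Python sketch can help; the function name and the two-to-three-use threshold it supports checking are illustrative:

```python
import re

def keyword_usage(text, primary, alternates):
    """Count case-insensitive occurrences of the primary keyword
    phrase and each alternate, to spot accidental keyword stuffing."""
    def count(phrase):
        return len(re.findall(re.escape(phrase), text, flags=re.IGNORECASE))
    return {phrase: count(phrase) for phrase in [primary, *alternates]}
```

Running it over a page draft lets you confirm the primary phrase stays within two or three uses while the alternates carry the rest of the load.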

Dealing with Crawlers – Make effective use of robots.txt

Prepared by Mohammad Jubran

A “robots.txt” file tells search engines whether they can access and therefore crawl parts of your site. This file, which must be named “robots.txt”, is placed in the root directory of your site.

You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine’s search results. If you do want to prevent search engines from crawling your pages, Google Webmaster Tools has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you’ll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.

There are a handful of other ways to prevent content appearing in search results, such as adding “NOINDEX” to your robots meta tag, using .htaccess to password protect directories, and using Google Webmaster Tools to remove content that has already been crawled. Google engineer Matt Cutts walks through the caveats of each URL blocking method in a helpful video.

User-agent: *
Disallow: /images/
Disallow: /search

(1) All compliant search engine bots (denoted by the wildcard * symbol) shouldn’t access and crawl the content under /images/ or any URL whose path begins with /search.

(2) The address of our robots.txt file. 
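You can verify rules like the sample above with Python’s standard-library robots.txt parser; example.com here stands in for your own domain:

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules as the sample robots.txt and ask what a
# compliant crawler may fetch.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /images/",
    "Disallow: /search",
])

blocked = rp.can_fetch("Googlebot", "http://www.example.com/images/logo.png")  # False
allowed = rp.can_fetch("Googlebot", "http://www.example.com/about.html")       # True
```

Checking your rules this way before deploying them helps ensure you are blocking exactly what you intend and nothing more.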

Keep a firm grasp on managing exactly what information you do and don’t want being crawled!

Best Practices

Use more secure methods for sensitive content

You shouldn’t feel comfortable using robots.txt to block sensitive or confidential material. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don’t acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don’t want seen. Encrypting the content or password-protecting it with .htaccess are more secure alternatives.
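As a sketch of the .htaccess option mentioned above (the file path and realm name are placeholders, and this assumes an Apache server with Basic authentication modules enabled):

```apache
# Protect this directory with HTTP Basic authentication.
# Create the password file first, e.g.: htpasswd -c /path/to/.htpasswd username
AuthType Basic
AuthName "Private area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Unlike a robots.txt entry, this actually denies access to the content rather than merely asking crawlers to stay away.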

Avoid:

- allowing search result-like pages to be crawled (users dislike leaving one search result page and landing on another search result page that doesn’t add significant value for them)

- allowing URLs created as a result of proxy services to be crawled

 

Glossary

Robots Exclusion Standard: A convention to prevent cooperating web spiders/crawlers, such as Googlebot, from accessing all or part of a website which is otherwise publicly viewable.

Proxy service: A computer or piece of software that relays connections between an internal network and an external network.

Links

robots.txt generator: http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html
Using robots.txt files: http://www.google.com/support/webmasters/bin/answer.py?answer=156449
Caveats of each URL blocking method: http://googlewebmastercentral.blogspot.com/2008/01/remove-your-content-from-google.html

 

Dealing with Crawlers


Be aware of rel=”nofollow” for links

Combat comment spam with “nofollow”

Setting the value of the "rel" attribute of a link to "nofollow" tells Google that certain links on your site shouldn’t be followed, and shouldn’t pass your page’s reputation to the pages they link to. Nofollowing a link means adding rel="nofollow" inside the link’s anchor tag (1).

When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam (2). Nofollowing these user-added links ensures that you’re not giving your page’s hard-earned reputation to a spammy site.

Automatically add “nofollow” to comment columns and message boards

Many blogging software packages automatically nofollow user comments, but those that don’t can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guestbooks, forums, shout-boards, referrer listings, etc. If you’re willing to vouch for links added by third parties (e.g. if a commenter is trusted on your site), then there’s no need to use nofollow on those links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, like using CAPTCHAs and turning on comment moderation (3).
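For software that doesn’t do this automatically, the idea can be sketched in Python. This regex-based helper is mine, not from any blogging package, and a production implementation should use a real HTML parser instead of a regex:

```python
import re

def nofollow_comment_links(html):
    """Add rel="nofollow" to every anchor tag in user-submitted HTML
    that doesn't already carry a rel attribute. Regex-based sketch;
    use a proper HTML parser for anything beyond a demonstration."""
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)
```

Run over a comment body before it is stored or rendered, this ensures user-added links never pass your page’s reputation along.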

(1) If you or your site’s users link to a site that you don’t trust and/or you don’t want to pass your site’s reputation, use nofollow.

<a href="http://www.shadyseo.com" rel="nofollow">Comment spammer</a>

 

(2) A comment spammer leaves a message on one of our blog posts, hoping to get some of our site’s reputation.

(3) An example of a CAPTCHA used on Google’s blog service, Blogger. It can present a challenge to try to ensure an actual person is leaving the comment.

Glossary

Comment spamming: Indiscriminate postings, on blog comment columns or message boards, of advertisements and other content that bears no connection to the contents of said pages.

CAPTCHA: Completely Automated Public Turing test to tell Computers and Humans Apart.

About using “nofollow” for individual contents, whole pages, etc.

Another use of nofollow is when you’re writing content and wish to reference a website, but don’t want to pass your reputation on to it. For example, imagine that you’re writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don’t want to give the site some of your reputation from your link. This would be a good time to use nofollow.

Lastly, if you’re interested in nofollowing all of the links on a page, you can use "nofollow" in your robots meta tag, which is placed inside the <head> tag of that page’s HTML (4). The Webmaster Central Blog provides a helpful post on using the robots meta tag. This method is written as <meta name="robots" content="nofollow">.

<html><head><title>Brandon’s Baseball Cards – Buy Cards, Baseball News, Card Prices</title>

<meta name="description" content="Brandon’s Baseball Cards provides a large selection of vintage and modern baseball cards for sale. We also offer daily baseball news and events in">

<meta name="robots" content="nofollow">

</head>

<body>

Make sure you have solid measures in place to deal with comment spam!

(4) This nofollows all of the links on a page.

Links 

Avoiding comment spam: http://www.google.com/support/webmasters/bin/answer.py?answer=81749
Using the robots meta tag: http://googlewebmastercentral.blogspot.com/2007/03/using-robots-meta-tag.html

 Source: Google SEO Guidelines
