

Wednesday, November 16, 2011

Introducing G+ Business Pages


Until now, Google+ users have only been able to connect with other people. This Monday, however, Google announced the introduction of Google+ Pages, a service that allows you to connect with the organizations and brands you care about.
As Google describes it, “Google+ has always been a place for real-life sharing, and Google+ Pages is no exception. After all: behind every page (or storefront, or four-door sedan) is a passionate group of individuals, and we think you should be able to connect with them too. For you and me, this means we can now hang out live with the local bike shop, or discuss our wardrobe with a favorite clothing line, or follow a band on tour. Google+ pages give life to everything we find in the real world.”
It’s likely that Google+ Pages will have a high level of visibility thanks to their integration with basic Google search. Pages will generally turn up in the regular search results below an organization’s own website. Google also plans to enable a feature called Direct Connect. A limited number of pages have this feature at the moment, but Google states that many more are coming. You will be able to go directly to a page with Direct Connect enabled simply by prefixing the page's name with '+' in your search query.
Although the service is promising, a few hangups still exist. Currently, only one user can be registered to any given page, making management difficult for a larger organization or department. Some early users have also noticed that it is far too easy to publish information to an account unintentionally. Finally, Google+ Pages steer users away from companies’ actual websites in favor of their G+ page. This may be a drawback for some; however, owning a space on Google+ is yet another web presence that no business will want to miss out on.
Ready to get your business set up on Google+ Pages? Head to the Google+ registration page to get started. Businesses are asked to classify themselves under one of the following categories:
  • Local Business or Place
  • Product or Brand
  • Company, Institution or Organization
  • Arts, Entertainment or Sports
  • Other
Businesses are allowed to create as many pages as they like, giving you the ability to have a separate page for a particular product, product line, or event. Your Google+ page will have a +1 button, similar to those now found around the web. As a page, however, you won’t have the ability to +1 other pages.
Local business pages have additional functionality that allows you to provide a phone number and address, which adds a map and local contact information to your page. Although similar to the existing Google Places service, the two presently remain separate services that must be managed individually.
We’re excited about the new ability to connect with our audience on G+.  Check out the SEO.com G+ page and let us know what you think.

Wednesday, November 9, 2011

Google’s Latest “Freshness” Algorithm To Affect 35% of Searches


Google announced the freshness algorithm update this morning. Its intended purpose is to provide the best results when you’re seeking the most recent information.


This algorithm update follows several Panda updates and last year’s Caffeine update, which focused on the speed of indexing new content.


    If I search for [olympics], I probably want information about next summer’s upcoming Olympics, not the 1900 Summer Olympics (the only time my favorite sport, cricket, was played). Google Search uses a freshness algorithm, designed to give you the most up-to-date results, so even when I just type [olympics] without specifying 2012, I still find what I’m looking for.


In my quick review, several searches appear normal. One that stands out is a search for Las Vegas (we’re attending Pubcon next week): it shows a few more universal search components (maps, images, news, etc.). On another unrelated search, I noticed that Google+ Circles information is beginning to appear next to pictures of the people in my circles.

Friday, October 7, 2011

Top 5 Steps to Killer Keyword Research For SEO


Without keywords you have nothing in SEO. You can consider keywords the vehicle of your campaign, and inbound links your fuel. Much of your success and results will be driven through each of the phrases targeted.
In this initial process of keyword research, it can be easy to go one of two ways—just barely scratching the surface or going overboard and wasting time. With a process set in place, you will do much better at achieving quick, quality results.
Quality keyword research includes 5 phases:

1. Baseline Keywords

To begin your keywords research you will need a starting point. This will usually be a smaller list of phrases that define what products, services, or pages you would most like to promote. Here are a few great ways to get your baseline keywords:

Keyword Rankings or Top Referring Keywords



Knowing what kind of ground your website has already gained in the search engines is a great way to determine how they view you. Whether you are assessing rankings manually or through a ranking tool, this should be a fairly quick snapshot of potential opportunities that should not be passed up. Take those keywords and add them to your preliminary list.
Checking top referring keywords is very similar to checking which keywords you rank for; naturally, the keywords that rank well (or rank at all) are going to be those that drive traffic to the website. Typically I would not recommend pulling both rankings and top referring keywords to get your baseline. Choose one or the other and you should be good to go.

2. Keyword Expansion

Let Search Volume Set Priority

While keyword selection is not determined solely by search volume, volume is certainly a major factor in the final selection of target keywords. Moreover, as one of the few ways we can clearly communicate the potential value of targeting a certain phrase, it can be a great motivator for clients. The process of compiling a list of keywords to filter through and discover your trophies should be fairly quick and easy when using the Google Adwords Keyword Tool. Knowing how to use this nifty tool correctly is important; check this post out for the best practices for the Adwords Keyword Tool.
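As an illustration of letting search volume set priority, here is a minimal Python sketch. The phrases and monthly volumes are invented placeholders standing in for an export from a keyword research tool:

```python
# A minimal sketch of letting search volume set priority.
# The keywords and monthly volumes below are made-up illustrations,
# standing in for an export from a keyword research tool.
candidates = {
    "link building": 14800,
    "link building services": 2900,
    "building links": 1900,
    "link acquisition": 320,
}

# Sort descending by volume so the highest-demand phrases surface first.
prioritized = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)

for phrase, volume in prioritized:
    print(f"{volume:>6}  {phrase}")
```

In practice you would filter this ordered list against relevancy and competitiveness rather than chase raw volume alone.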

3. Keyword Filtering

There are a few questions to ask yourself as you begin eliminating keywords from the list. For example: How quickly would you like to rank? Which keywords from the list support one another? Which keywords and phrases represent the biggest money makers for the website in question?
The answers to these questions will come through the following processes:

Use Your Client

If you haven’t already done this in the “baseline” process, make sure to speak with your client about the keywords you are looking at. Determine which of these represent their best money makers. Having this knowledge in mind could hugely change the end result of what is selected. A phrase with a lower search volume, driving only 10% or less of the traffic of another phrase, could still earn much more.

Clustering

This should be a process familiar to most—the idea of bringing supportive and similar phrases together to target one page. The idea is to spread the value of link building and on-page optimization across multiple phrases while keeping the content natural and not forced. For instance, when you are writing an article on “link building,” it is likely you will use the phrases:
  • link acquisition
  • link bait
  • building links
Together with “link building” itself, we have four different phrases that can be used almost interchangeably to support one another.
When beginning your clusters, first choose what you would consider your “trophy” keywords. These will typically be those that are super competitive, have a high potential for traffic, or are the big money makers. From these phrases, start breaking your list out into iterations that follow them.
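The grouping step above can be partly automated. Here is a rough Python sketch that clusters phrases under a head ("trophy") keyword when they share a word stem with it; the head keywords and phrase list are illustrative, and real clustering still relies on human judgment:

```python
# A rough sketch of keyword clustering: group phrases under a "trophy"
# (head) keyword when they share a word stem with it. Real clustering
# relies on human judgment; this only automates the obvious groupings.
heads = ["link building", "keyword research"]

phrases = [
    "link acquisition", "link bait", "building links",
    "keyword research tools", "researching keywords",
]

def stems(text):
    # Crude stemmer: lowercase each word and trim common suffixes.
    out = set()
    for word in text.lower().split():
        for suffix in ("ing", "s"):
            if word.endswith(suffix):
                word = word[: -len(suffix)]
        out.add(word)
    return out

clusters = {head: [] for head in heads}
for phrase in phrases:
    for head in heads:
        if stems(head) & stems(phrase):  # any shared stem -> same cluster
            clusters[head].append(phrase)
            break
```

A real workflow would follow this with a manual pass, since token overlap misses synonyms ("link bait" only matches here because it contains "link").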

What Does The Competition Look Like?

If you ask yourself how quickly you need to achieve a quality ranking and the answer is “NOW!”, stop doing keyword research for SEO and start building out lists for a great PPC campaign. If your answer is “soon,” you will want to make sure the keyword has less established competitors. Typically I like to fan out the timeframe possibilities for ranking and competitiveness in the final target keywords—classing them into three different levels of difficulty and estimated time to performance. To evaluate your competitors for a certain phrase, consider these processes:
First, run a search for your phrase. Then take your top 4-5 results URLs. These are your main competitors.

Open Site Explorer


For a quick, in-depth analysis of your competitors’ link portfolios, the SEOmoz Open Site Explorer is your source. While this tool can be used for a plethora of purposes, for our intent here you will want to pay attention to the following:
  • Linking Pages – take a quick look at what type of pages the website has gained links from. Is there only one or two with super high authority, or is the entire first page of linking pages high in authority? If you have a wide spread of authority in linking pages, you can assume this is a more competitive website that will take more time to overcome. (At this point you should also be asking yourself whether acquiring these same links is possible for the website in question. If so, how difficult will they be to acquire?)
  • Full Metrics – once you have the snapshot of linking pages in mind, go to the Full List of Metrics. Gather the data found there in a spreadsheet to compare against the other competitors and your own website. This will be a great indicator of what you are up against.

PageRank

The PageRank of your competing URLs will also help you gauge the difficulty level. Use a toolbar or extension to grab those digits quickly and throw them into the spreadsheet as well. Together, these will give you a comprehensive enough snapshot to know your difficulty level relative to the other keywords in question.
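Once the spreadsheet is filled in, the three difficulty levels mentioned above can be assigned with a simple rule of thumb. This Python sketch uses invented competitor data and arbitrary PageRank thresholds purely for illustration:

```python
# A sketch of turning competitor metrics into a rough difficulty tier.
# The URLs and numbers are illustrative placeholders, not real data;
# in practice they would come from your link-analysis spreadsheet.
competitors = [
    {"url": "competitor-a.com", "linking_pages": 850, "pagerank": 5},
    {"url": "competitor-b.com", "linking_pages": 120, "pagerank": 3},
    {"url": "competitor-c.com", "linking_pages": 40,  "pagerank": 2},
]

def difficulty(comps):
    # Average PageRank across the top results as a crude proxy for how
    # entrenched the competition is; thresholds here are arbitrary.
    avg_pr = sum(c["pagerank"] for c in comps) / len(comps)
    if avg_pr < 3:
        return "easy"
    if avg_pr < 5:
        return "medium"
    return "hard"

print(difficulty(competitors))
```

The point is not the exact thresholds but having a consistent rule, so difficulty labels are comparable across every keyword you evaluate.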

4. Keyword Selection

What Are The Trends?


At this point your list should be looking narrowed and nearly final. For quality assurance, you will want to check the trends of your keywords. Google Trends is a good way to get a snapshot of year-over-year (YoY) comparisons in the demand for key phrases.
Search optimization is a long-term approach to your online success. With that in mind, even search volume as it stands today should not entirely determine your selection of keywords. I will usually use Google Trends at two different points in my keyword research: first, when I am selecting my “head keywords”—those I will then select iterations from to build a cluster of supportive phrases, as explained above; and second, at the final quality assurance check of the selected target keyword clusters.

5. Keyword Mapping

What Are You Targeting?

Throughout the entire process of keyword research, we as SEOs have link building in the back of our minds. This is the end goal and the very reason we are choosing key phrases to target. But just as important as the selection of phrases is where you will be pointing them: the target page.
Choosing which URLs to target can happen throughout the process or at the end. The important thing to be aware of is whether building new pages is in the realm of possibility. It is likely you will complete your research and determine that one or two of your ideal keyword clusters do not have sufficient, relevant content to link to. At that point, a new URL and webpage are essential.

Saturday, September 24, 2011

SEO and PR — Getting the Best of Both Worlds

Search engine optimization and public relations need to become best friends. Why? Because SEO as we knew it two years ago is dead. As our approaches to link building constantly evolve, a stronger tie between these two marketing fronts is emerging.
With that said, let’s examine just how these two audience-building genres can play nice. First off, just what is public relations?
Public relations is the process of building or maintaining a company’s reputation and image through positive offline and online coverage. Link building, on the other hand, builds a site’s “popularity” by acquiring a large number of diverse links from various domains in an effort to establish top ranking positions in search engines. Where I see SEO and PR butt heads is in the work rather than the perspective. SEOs think, “how can I gain rankings, traffic and ROI?” whereas public relations experts think, “what coverage will provide us with the most prestigious image and engage us in positive events and conversations both on and offline?”
In the end, both thought processes will achieve similar results. But here’s the good news—there is a way to get the best of both worlds. We can achieve top rankings, as well as a solid reputation through a cohesive approach and understanding. Here are some pointers for the best approaches to killing two birds with one stone:

1. Optimize press releases for SEO

When crafting press releases, be sure to do so with keywords in mind. Most companies have a specific SEO campaign underway. Within that campaign will be a designated list of phrases designed to increase search rankings and relevant traffic. Narrow the list according to relevancy to your subject matter. Also, be strategic in adding these keywords to your content. Place focus keywords in the headline and beginning of the body of your content.
Outside of keyword optimization, remember that links and branding are HUGE ranking factors. Be sure your website is linked to the company name. Within the release you should also include other branding signals search engines use to identify a business:
• Address
• Phone Number
• Email
Ensure identical information is used in press releases and other content online.
Finally, don’t forget that getting your press release seen by pushing it out through the correct distribution channels is essential.

2. Host “SEO friendly” contests

No doubt about it, social is now a big part of SEO. Whether it affects rankings directly or indirectly, it plays a role in the success of organic rankings. Hosting contests on social networks like Facebook and Twitter will benefit your site by building brand awareness as well as driving social signals to your site. Keep in mind that those who find your site through social media are often the people most likely to link to it. Contests and giveaways via social networks are a great way to generate interest and conversation. Get your target market involved in your organization, and acquire links along the way.

Friday, September 16, 2011

SEO Tips for Unavoidable Errors



One of the most common habits of Internet users is to use search engines. When we are stuck on some random topic, we open a search engine and type what we want to find. The results flash onto our screens within seconds, and they are enormous in number. Yet no one ever reads how many results the search engine found in that fraction of a second; all we care about is the first page of results, and we usually select just the first 3-5 of them. We use the Internet to search for the latest recipe for Italian omelets, to find interesting blogs, and to find companies related to our business.

SEO (Search Engine Optimization) is a technique for structuring your website so that it ranks higher and is found more easily than other websites offering the same products and features. In short, it is a tool that helps drive increasing traffic to your website through effective use of search engines.
Search Engine Optimization Tips to nullify common errors:
  • Graphic header: Never use a company logo that occupies most of the width of the front page. Search engines cannot make use of images, so the prime position for placing your keywords and text is wasted. Use a hybrid form: place the logo in one corner and use the remaining space for a text header.
  • Graphic navigation menu: The same situation as above, and an avoidable error. SEO ranking is achieved through keywords in internal links. If removing the graphic menu is impossible, then at least specify correct ALT attributes for all the images.
  • Script navigation: Never use scripts for site navigation, as search engine robots cannot read and execute scripts. If you do want to use scripts, provide duplicate HTML links so they are visible to both robots and human visitors.
  • Session identifiers: Search engines have algorithms to identify mirrors and pages with the same content, and correct recognition and indexing of session IDs is difficult. If possible, it is best to avoid session identifiers.
  • Redirects: As these can hamper your SEO results, avoid using redirects on your site if possible.
  • Hidden text: Text that is invisible to human eyes but visible to robots is sometimes used for optimization. This is a deceptive SEO method that may result in your site being banned and excluded from search engine databases.
  • One-pixel links: Search engines are programmed to detect tiny graphic image links, down to one pixel wide, and treat them as unethical SEO. Avoid these practices, which together can lead to your site being banned.

Wednesday, September 14, 2011

A New Way To Submit URLs to Google

Google has announced one more way to help site owners request that a specific web page be crawled. The Fetch as Googlebot feature in Webmaster Tools has been around for a while, but it now makes it possible for site owners to submit a new or updated URL for indexing. The process is simple and doesn’t require users to start at the beginning of the crawling process. After you fetch a URL, if the fetch is successful, you will see the option to submit the URL to the Google index.
A URL will typically be indexed within a day, but this doesn’t mean that every URL submitted will be indexed. Once a page has been crawled, Google evaluates whether the URL belongs in its index. As with any other type of discovery (XML sitemaps, internal and external links, RSS feeds, etc.), Google goes through another process to determine whether to keep the page in its index.
This is a great function to use if you’ve just launched a new site, added new pages to a site, or updated important or time-sensitive content on an existing page. You no longer have to wait for Google to discover the page naturally. Keep in mind that you are limited to 50 single-URL submissions per week, and to 10 submissions per month when submitting a URL along with the links on that page to other pages.
In addition to this update, Google has also updated the public “Add your URL to Google” form. It is now called the “Crawl URL form” and doesn’t require owner verification, but does still limit you to submitting 50 URLs per week.

The Value of Deep Linking

The question I would like to answer today is: ‘How valuable is it to build links to your site’s deep pages?’
The whole issue can be summed up in one sentence: it is VERY valuable to build links to deep pages on your site. Let’s explore why.
Think of your site as a large dam with many tributaries constantly feeding water into it. The tributaries represent all the links pointing to your home page, whether internal or external. Your home page is the dam: it is the largest and most popular place for people to link to your site, and therefore usually holds the most weight in the search engines. For the purposes of this analogy, let’s say that when the dam releases some water each month, that water filters out into the surrounding country, eventually making its way back into the tributaries that feed the dam itself (internal links). It’s a big circle. That is how PageRank gets distributed on your site. Your home page has a lot of ‘link juice’ pointing at it. That link juice eventually finds its way to some of the inner pages on your site, but it probably won’t reach all of them unless you have only a few pages one level from the home page.
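The dam analogy can be made concrete with a toy PageRank calculation. In this Python sketch, the four-page site and its link graph are entirely hypothetical; it simply shows how rank pools at the home page and thins out on pages buried deeper in the structure:

```python
# A toy PageRank calculation illustrating how 'link juice' flows from
# the home page to inner pages through internal links. The four-page
# site and its link graph are hypothetical.
links = {
    "home": ["about", "products"],      # home links to two inner pages
    "about": ["home"],
    "products": ["home", "deep-page"],  # only path to the deep page
    "deep-page": ["home"],
}

def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new = {}
        for page in graph:
            # Each linking page passes an equal share of its rank
            # to every page it links to.
            incoming = sum(
                ranks[src] / len(outs)
                for src, outs in graph.items() if page in outs
            )
            new[page] = (1 - damping) / n + damping * incoming
        ranks = new
    return ranks

ranks = pagerank(links)
# The home page accumulates the most rank; 'deep-page', reachable only
# through 'products', ends up with the least — which is exactly why
# building external links directly to deep pages is so valuable.
```

Adding an external link straight to "deep-page" in a graph like this raises its rank without waiting for juice to trickle down from the home page.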

Tuesday, September 13, 2011

VIII. Promoting Your Site to Increase Traffic

The main purpose of SEO is to make your site visible to search engines, leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all, buyers) is ultimately the goal of site promotion. To be fair, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience. We are not going to explore them in this tutorial; just keep in mind that search engines are not the only way to get visitors to your site, although they are a preferable choice and a relatively easy way to do it.

1. Submitting Your Site to Search Directories, Forums and Special Sites

After you have finished optimizing your new site, the time comes to submit it to search engines. Generally, with search engines you don't have to do anything special to get your site included in their indices – they will come and find you. It cannot be said exactly when they will visit your site for the first time, or at what intervals they will visit it later, but there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but do not expect that they will hop over right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the URLs of the submission pages of the three major search engines: Google, MSN, and Yahoo!.
In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites relevant to a given topic, they differ from search engines in several respects. First, search directories are usually maintained by humans, and the sites in them are reviewed for relevancy after they have been submitted. Second, search directories do not use crawlers to discover URLs, so you need to go to them and submit your site; but once you do this, you can stay there forever and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself), and here are the URLs of their submission pages: DMOZ and Yahoo!.
Sometimes posting a link to your site in the right forums or special sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in the fields of interest to you, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type “hardware forums” in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting in forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.

2. Specialized Search Engines

Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You can't imagine how many niches have specialized search engines – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.

3. Paid Ads and Submissions

We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure your site gets noticed, you can always resort to paid ads and submissions. Paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. Keep in mind, however, that users generally do not trust paid links as much as they do normal ones – in a sense, it looks like you are bribing the search engine to place you where you can't get on your own – so think twice about the pros and cons of paying to get listed.

VII. Static Versus Dynamic URLs

Based on the previous section, you might have gotten the impression that search engine algorithms thwart every effort designers make to create a gorgeous site. It has been explained why search engines do not like images, movies, applets and other extras. Now you might think that search engines are far too picky in disliking dynamic URLs as well. Honestly, though, users are not in love with URLs like http://domain.com/product.php?cid=1&pid=5 either, because such URLs do not say much about the contents of the page.
There are a couple of good reasons why static URLs score better than dynamic URLs. First, dynamic URLs are not always there – i.e., the page is generated on request after the user performs some action (fills in a form and submits it, or performs a search using the site's search engine). In a sense, such pages are nonexistent for search engines, because search engines index the Web by crawling it, not by filling in forms.
Second, even if a dynamic page has already been generated by a previous user request and is stored on the server, search engines might just skip it if it has too many question marks and other special symbols in it. Once upon a time search engines did not index dynamic pages at all, while today they do index them but generally slower than they index static pages.
The idea is not to revert to static HTML only. Database-driven sites are great, but it is much better if you serve your pages to search engines and users in a format they can easily handle. One solution to the dynamic URLs problem is called URL rewriting. There are special tools (different for different platforms and servers) that rewrite URLs in a friendlier format, so they appear in the browser like normal HTML pages – converting the cryptic text from the previous example into something more readable, like http://mydomain.com/product-categoryid-1-productid-5.
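On Apache servers, for example, such rewriting is commonly done with mod_rewrite. The rule below is only an illustrative sketch mirroring the example URLs above – the script name and parameters are assumptions, not a drop-in configuration:

```apache
# Illustrative .htaccess sketch (Apache mod_rewrite): map the friendly
# URL /product-categoryid-1-productid-5 back to the dynamic script.
RewriteEngine On
RewriteRule ^product-categoryid-([0-9]+)-productid-([0-9]+)$ product.php?cid=$1&pid=$2 [L]
```

Visitors and crawlers only ever see the friendly URL; the server translates it back to the dynamic query string internally.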

VI. Visual Extras and SEO

As already mentioned, search engines have no means to directly index extras like images, sounds, Flash movies, and JavaScript. Instead, they rely on you to provide a meaningful textual description, and based on it they can index these files. In a sense, the situation is similar to that of text 10 or so years ago – you provided a description in the meta tag, and search engines used this description to index and process your page. If technology advances further, one day it might be possible for search engines to index images, movies, etc., but for the time being this is just a dream.

1. Images

Images are an essential part of any Web page, and from a designer's point of view they are not an extra but a mandatory item for every site. However, here designers and search engines are at opposite poles, because for search engines every piece of information buried in an image is lost. When working with designers, it sometimes takes a while to explain that having textual links (with proper anchor text) instead of shiny images is not a whim, and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic performance and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, a compromise in its visual appearance cannot be avoided.

With all that said, the idea is not to skip images altogether. That is impossible nowadays – the result would be an ugly site. Rather, the idea is that images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). Most importantly, always provide a meaningful textual description of the image in the alt attribute of the <img> tag. The HTML specification does not require this, but search engines do. It also does not hurt to give meaningful names to the image files themselves, rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt text provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month puppy">. That said, don't go to extremes like writing 20-word alt texts for one-pixel images, because that looks suspicious and starts to smell like keyword stuffing.

2. Animation and Movies

The situation with animation and movies is similar to that with images – they are valuable from a designer's point of view but are not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage this is with search engines – it is a number-one rankings killer! And it gets even worse if you use Flash to tell a story that could be written in plain text, and hence crawled and indexed by search engines. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but an explanation of that file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content.
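The exclusion itself can be as simple as a robots.txt rule. This sketch assumes a hypothetical file name for the Flash movie:

```text
# Illustrative robots.txt sketch: exclude the original Flash movie
# (hypothetical file name) so only the HTML version gets indexed.
User-agent: *
Disallow: /intro.swf
```

With the .swf file disallowed, only the HTML version remains crawlable, avoiding the duplicate-content issue described above.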
There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that can be used by search engines. Until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie on the page.

3. Frames

It is good news that frames are slowly but surely disappearing from the Web. Five or ten years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulty indexing framed pages because the URL of the page stays the same no matter which of the separate frames is open. This was a problem for search engines: there were actually 3 or 4 pages but only one URL, while for a search engine 1 URL is 1 page. Of course, search engines can follow the links to the pages in the frameset and index them, but it is a hurdle for them.
If you still insist on using frames, make sure you provide a meaningful description of the site in the <noframes> tag. The following example is not for beginners, but even if you do not understand everything in it, just remember that the <noframes> tag is the place to provide an alternative version (or at least a short description) of your site for search engines and for users whose browsers do not support frames. If you decide to use the <noframes> tag, you may want to read more about it before you start.
Example: <noframes> <p> This site is best viewed in a browser that supports frames. </p><p> Welcome to our site for prospective dog adopters! Adopting a homeless dog is a most noble deed that will help save the life of the poor creature. </p></noframes>

4. JavaScript

This is another hot potato. Everybody knows that pure HTML cannot build complex sites with a lot of functionality (HTML was never intended to be a programming language for Web applications, so nobody expects it to handle writing to a database or even storing session information), which is why languages like JavaScript and PHP are used to enhance it. For now, search engines largely ignore the JavaScript they encounter on a page. As a result, links that exist only inside JavaScript code will most likely not be spidered. Also, if JavaScript sits in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the file, and spiders might just skip it and move on to the next site. For your information, there is a <noscript> tag that lets you provide an alternative for browsers that do not run the script, but because most of its applications are fairly complicated, it is beyond the scope of this tutorial.
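To keep JavaScript-driven navigation crawlable, one common pattern is to duplicate the important links in plain HTML. The file names below are hypothetical:

```html
<!-- Links written by JavaScript are invisible to most crawlers: -->
<script src="menu.js"></script> <!-- menu.js generates the navigation -->

<!-- A plain HTML fallback stays crawlable even when scripts are ignored: -->
<noscript>
  <a href="/dog-adopt.html">Adopt a dog</a>
  <a href="/dog-food.html">Dog food guide</a>
</noscript>
```

Crawlers that skip the script can still follow the plain links, and so can users whose browsers have JavaScript disabled.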

III. Backlinks – Another Important SEO Item




What are Backlinks?

In layman's terms, there are two types of links: inbound and outbound. Outbound links start from your site and lead to an external site, while inbound links, or backlinks, come from an external site to yours. For example, if cnn.com links to yourdomain.com, the link from cnn.com is a backlink (inbound) for yourdomain.com; from cnn.com's perspective, the same link is outbound. Backlinks are among the main building blocks of good Search Engine Optimization (SEO).

Why Backlinks Are Important

The number of backlinks is an indication of the popularity or importance of a website. Backlinks are important for SEO because some search engines, like Google, give more credit to websites that have a large number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query.
Therefore, when search engines calculate the relevance of a site to a keyword, they not only consider the number of backlinks to that site but also their quality. In order to determine the quality, a search engine considers the content of the sites. When backlinks to your site come from other sites, and those sites have content related to your site, these backlinks are considered more relevant to your site. If backlinks are found on sites with unrelated content, they are considered less relevant. The higher the relevance of backlinks, the greater their quality.
For example, if a webmaster has a website about how to rescue orphaned dogs and receives a backlink from another website about dogs, a search engine will weigh that link more heavily than, say, a link from a site about car racing. In short, the higher the relevance of the site linking back to your website, the better the quality of the backlink.
Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to modify your own web pages to make them more SEO friendly, it is a lot harder to influence other websites and get them to link to yours. This is why search engines regard backlinks as such an important factor. Furthermore, search engines' criteria for quality backlinks have become even tougher, thanks to unscrupulous webmasters trying to obtain backlinks through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide backlinks to websites. Such pages are called link farms; they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

Anchor Text

When a link incorporates a keyword into the text of the hyperlink, we call this anchor text. A link's anchor text may be one of the most powerful resources a webmaster has. Backlinks from multiple websites with the anchor text "orphaned dogs" would help your website rank higher for the keyword "orphaned dogs". Using your keyword in the link text is far more effective than links that read "click here", which say nothing about your website. The Backlink Anchor Text Analysis Tool can help you find your backlinks and the text being used to link to your website. If another website links to you but the anchor text is not being used well, you can request that the site change it to something that incorporates relevant keywords; this, too, will help boost your rankings.
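Continuing the dog example (the URL below is the hypothetical one used throughout this tutorial), the difference looks like this:

```html
<!-- Weak anchor text: tells search engines nothing about the target page -->
<a href="http://dog-adopt.net/">click here</a>

<!-- Keyword-rich anchor text: reinforces the "orphaned dogs" keyword -->
<a href="http://dog-adopt.net/">adopt orphaned dogs</a>
```

Both links take the visitor to the same page, but only the second one tells a search engine what that page is about.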

Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome.

1 The Backlink Builder Tool

When you enter the keywords of your choice, the Backlink Builder tool gives you a list of relevant sites from which you might get some backlinks.

2 Getting Listed in Directories

If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must, not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

3 Forums and Article Directories

Search engines generally index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, the backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit the forum or blog policy. Also, some administrators do not allow links in posts unless they are relevant. In rare cases (more an exception than a rule) the owner of a forum or blog may have banned search engines from indexing it, in which case posting backlinks there is pointless.

4 RSS Feeds

You can offer RSS feeds to interested sites for free. When another site publishes your RSS feed, you get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headlines and abstracts they read on the other site.

5 Affiliate programs

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be expensive because the affiliate commission is generally in the range of 10 to 30%. Still, if you run an affiliate program anyway, why not use it to get some more quality backlinks?

6 News Announcements and Press Releases

Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. Many sites publish news announcements and press releases for free or for a small fee. A professionally written press release about an important event can bring you many visitors, and a backlink from a respected site is a good boost to your SEO efforts. The tricky part is that you cannot issue a press release when there is nothing newsworthy, which is why news announcements and press releases are not a commodity way to build backlinks.

II. Keywords – the Most Important Item in SEO


Keywords are the most important SEO element for every search engine; they are what search strings are matched against. Choosing the right keywords to optimize for is thus the first and most crucial step in a successful SEO campaign. If you fail at this very first step, the road ahead is very bumpy and you will most likely waste your time and money. There are many ways to determine which keywords to optimize for, and usually the final list is made after careful analysis of what the online population is searching for, which keywords your competitors have chosen, and, above all, which keywords you feel describe your site best.

1. Choosing the Right Keywords to Optimize For

The time when you could easily top the results for a one-word search string is long gone. Now that the Web is so densely populated with sites, it is next to impossible to achieve consistent top rankings for a one-word search string. Achieving consistent top rankings for two-word or three-word search strings is a more realistic goal.
For instance, if you have a site about dogs, do NOT try to optimize for the keyword "dog" or "dogs". Instead, focus on keywords like "dog obedience training", "small dog breeds", "homemade dog food", or "dog food recipes". Success with very popular one- or two-word keywords is very difficult and often not worth the trouble; it is best to focus on less competitive, highly specific keywords.
The first thing you need to do is come up with keywords that describe the content of your website. Ideally, you know your users well and can correctly guess what search strings they are likely to use to find you. You can also try the Website Keyword Suggestions Tool below to come up with an initial list. Run your initial list through the Google Keyword Suggestion tool; you will get a related list of keywords, from which you can shortlist a few that seem relevant and have a decent global search volume.

When choosing the keywords to optimize for, consider not only the expected monthly number of searches but also the relevancy of those keywords to your website. Although narrow keywords get fewer searches, they are a lot more valuable than generic keywords because the searchers will be more interested in your offerings. Let's say you have a section on your website with advice on what to look for when adopting a dog. You might discover that the key phrase "adopt german shepherd" gives you better results than a keyword like "german shepherd dogs", because that page is of interest only to potential German Shepherd owners, not current ones. So, when you look at the number of searches per month, consider the searches that fit the theme of your site.

2. Keyword Density

After you have chosen the keywords that describe your site and are presumably of interest to your users, the next step is to make your site keyword-rich and to achieve a good keyword density for your target keywords. Keyword density, although no longer a very important factor in SEO, is a common measure of how relevant a page is. Generally, the idea is that the higher the keyword density, the more relevant the page is to the search string. The recommended density is 3-7% for the major two or three keywords and 1-2% for minor keywords. Try the Keyword Density Checker below to determine the keyword density of your website.
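As a rough illustration of how such a checker works, here is a minimal keyword-density calculation in Python. This is a simplified sketch: it counts whole-word matches only and assumes the markup has already been stripped from the page text.

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # count whole-word (or whole-phrase) occurrences
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # a multi-word keyword consumes n words per occurrence
    return 100.0 * hits * n / len(words)

page = "Adopt a dog today. Dog adoption saves lives, and every dog deserves a home."
print(round(keyword_density(page, "dog"), 1))  # 3 of 14 words -> about 21.4
```

A real checker would also strip HTML tags and weight keywords found in titles and headings more heavily, but the core ratio is the same.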

Although there are no strict rules, try optimizing for a reasonable number of keywords – 5 or 10 is fine. If you attempt to optimize for a list of 300, you will soon see that it is simply not possible to maintain a good keyword density for more than a few keywords without making the text sound artificial and stuffed. Worse, there are severe penalties (including a ban from the search engine) for keyword stuffing, because it is considered an unethical practice that tries to manipulate search results.

3. Keywords in Special Places

Keywords matter not only in quantity but in placement as well: keywords in the page title, the headings, and the first paragraphs count for more than keywords at the bottom of the page. The reason is that the URL (and especially the domain name), file and directory names, the page title, and section headings carry more weight than ordinary text on the page. Therefore, all else being equal, if you have the same keyword density as your competitors but also have keywords in the URL, this will boost your ranking considerably, especially with Yahoo!.

a. Keywords in URLs and File Names

The domain name and the whole URL of a site tell a lot about it. The presumption is that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as part of your domain name. For instance, if your site is mainly about adopting dogs, it is much better to name your dog site “dog-adopt.net” than “animal-care.org”, for example, because in the first case you have two major keywords in the URL, while in the second one you have no more than one potential minor keyword.
When hunting for keyword-rich domain names, don't get greedy. While from an SEO point of view it might seem better to have five keywords in the URL, just imagine how long and hard to memorize that URL would be. You need to strike a balance between keywords in the URL and site usability, and usability says that more than three words in the URL is too much.
You will probably not be able to come up with tons of good suggestions on your own. Additionally, even if you manage to think of a couple of good domain names, they might already be taken. In such cases, tools like the one below can come in very handy.
File names and directory names are also important. Often search engines will give preference to pages that have a keyword in the file name. For instance http://mydomain.com/dog-adopt.html is not as good as http://dog-adopt.net/dog-adopt.html but is certainly better than http://mydomain.com/animal-care.html. The advantage of keywords in file names over keywords in URLs is that they are easier to change, if you decide to move to another niche, for example.

b. Keywords in Page Titles

The page title is another special place, because the contents of the <title> tag usually get displayed in most search engines' results (including Google's). While the HTML specification does not require you to write anything in the <title> tag (you can leave it empty, and the title bar of the browser will read "Untitled Document" or similar), for SEO purposes you should not leave the <title> tag empty; write the page title in it instead.
Unlike URLs, with page titles you can get wordy. If we go on with the dog example, the <title> tag of the home page for the http://dog-adopt.net can include something like this: <title>Adopt a Dog – Save a Life and Bring Joy to Your Home</title>, <title>Everything You Need to Know About Adopting a Dog</title> or even longer.

c. Keywords in Headings

Headings normally separate a page into related subtopics. From a literary point of view it may be pointless to have a heading after every other paragraph, but from an SEO point of view it is very good to have many headings on a page, especially if they contain your keywords.
There are no technical length limits for the contents of the <h1>, <h2>, <h3>, ... <hn> tags, but common sense says that overly long headings are bad for page readability. So, as with URLs, be wise about heading length. Another issue to consider is how the heading will be displayed. A Heading 1 (<h1>) generally means a larger font size, so it is advisable to keep it under 7-8 words; otherwise it might wrap onto two or three lines, which looks bad and should be avoided if possible.
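Continuing the dog-adoption example, a keyword-conscious heading structure might look like this (all headings here are hypothetical):

```html
<h1>Adopt a Dog – Save a Life</h1>

<h2>Why Adopting a Homeless Dog Matters</h2>
<p>...</p>

<h2>How the Dog Adoption Process Works</h2>
<p>...</p>
```

Each heading works the main keywords ("adopt", "dog") into a short, readable phrase, and the <h2> tags break the page into crawlable subtopics under a single <h1>.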

I Introduction – What Is SEO

Whenever you enter a query in a search engine and hit 'enter', you get a list of web results that contain that query term. Users normally tend to visit websites at the top of this list, as they perceive those to be more relevant to the query. If you have ever wondered why some of these websites rank better than others, the answer is a powerful web marketing technique called Search Engine Optimization (SEO).
SEO is a technique that helps search engines find your site and rank it higher than the millions of other candidate sites for a search query. SEO thus helps you get traffic from search engines.
This SEO tutorial covers the essentials of Search Engine Optimization: what it is, how it works, and the differences in the ranking criteria of the major search engines.

1. How Search Engines Work

The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages are not. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can appreciate the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea of what a site is about. This brief explanation is not the most precise, because, as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.
First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or spider (Googlebot, in Google's case). Spiders follow links from one page to another and index everything they find on their way. Given the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or an existing page has been modified; sometimes crawlers may not visit your site for a month or two.
What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans: they do not see images, Flash movies, JavaScript, frames, password-protected pages, or protected directories. If you have tons of these on your site, you'd better run the Spider Simulator below to see whether these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed – in a word, they will be non-existent for search engines.

After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from which it can later be retrieved. Essentially, indexing means identifying the words and expressions that best describe the page and assigning the page to particular keywords. A human could never process such amounts of information, but search engines generally handle this task just fine. Sometimes they might not get the meaning of a page right, but if you help them by optimizing it, it will be easier for them to classify your pages correctly – and for you to get higher rankings.
When a search request comes, the search engine processes it – i.e. it compares the search string in the request with the indexed pages in its database. Since more than one page (in practice, millions of pages) will likely contain the search string, the search engine calculates the relevancy of each page in its index to the search string.
There are various algorithms to calculate relevancy. Each of these algorithms has different relative weights for common factors like keyword density, links, or metatags. That is why different search engines give different search results pages for the same search string. What is more, it is a known fact that all major search engines, like Yahoo!, Google, Bing, etc. periodically change their algorithms and if you want to keep at the top, you also need to adapt your pages to the latest changes. This is one reason (the other is your competitors) to devote permanent efforts to SEO, if you'd like to be at the top.
The last step in a search engine's activity is retrieving the results. Basically, this is nothing more than displaying them in the browser – the endless pages of search results, sorted from the most relevant to the least relevant sites.
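The crawl-index-retrieve pipeline described above can be sketched as a toy inverted index. This is a deliberately simplified model with made-up page names, not how any real search engine works:

```python
from collections import defaultdict

# Toy corpus: the "crawled" pages (URLs and contents are hypothetical).
pages = {
    "dog-adopt.html": "adopt a dog adopt a homeless dog",
    "dog-food.html":  "homemade dog food recipes",
    "cars.html":      "car racing news",
}

# "Indexing": map each word to the pages containing it, with a crude
# relevancy score equal to how often the word occurs on the page.
index = defaultdict(dict)  # word -> {page: occurrence count}
for url, text in pages.items():
    for word in text.split():
        index[word][url] = index[word].get(url, 0) + 1

def search(query):
    """'Processing' and 'retrieving': return matching pages, most relevant first."""
    hits = index.get(query, {})
    return sorted(hits, key=hits.get, reverse=True)

print(search("dog"))  # dog-adopt.html mentions "dog" twice, so it ranks first
```

Real engines score on many more signals (keyword placement, backlinks, freshness), but the basic lookup-then-rank structure is the same.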

2. Differences Between the Major Search Engines

Although the basic principle of operation is the same for all search engines, minor differences between them lead to major differences in results relevancy. Different factors matter to different search engines. There was a time when SEO experts joked that Bing's algorithms were intentionally made just the opposite of Google's. While this might hold a grain of truth, it is a matter of fact that the major search engines like different things, and if you plan to conquer more than one of them, you need to optimize carefully.
There are many examples of the differences between search engines. For instance, for Yahoo! and Bing, on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google sites are like wine – the older, the better – while Yahoo! generally has no expressed preference for sites and domains with tradition (i.e. older ones). Thus your site might need more time to mature before being admitted to the top in Google than in Yahoo!.
